WorldWideScience

Sample records for term earthquake prediction

  1. Localization of intermediate-term earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Kossobokov, V.G.; Keilis-Borok, V.I. (International Inst. of Earthquake Prediction Theory and Mathematical Geophysics, Moscow (USSR)); Smith, S.W. (Univ. of Washington, Seattle (USA))

    1990-11-10

    Relative seismic quiescence within a region which has already been diagnosed as having entered a Time of Increased Probability (TIP) for the occurrence of a strong earthquake can be used to refine the locality in which the earthquake may be expected to occur. A simple algorithm with parameters fitted from the data in Northern California preceding the 1980 magnitude 7.0 earthquake offshore from Eureka depicts relative quiescence within the region of a TIP. The procedure was tested, without readaptation of parameters, on 17 other strong earthquake occurrences in North America, Japan, and Eurasia, most of which were in regions for which a TIP had been previously diagnosed. The localizing algorithm successfully outlined a region within which the subsequent earthquake occurred for 16 of these 17 strong earthquakes. The area of prediction in each case was reduced significantly, ranging between 7% and 25% of the total area covered by the TIP.

  2. From a physical approach to earthquake prediction, towards long and short term warnings ahead of large earthquakes

    Science.gov (United States)

    Stefansson, R.; Bonafede, M.

    2012-04-01

    For 20 years the South Iceland Seismic Zone (SISZ) was a test site for multinational earthquake prediction research, partly bridging the gap between laboratory test samples and the huge transform zones of the Earth. The approach was to explore the physics of processes leading up to large earthquakes. The book Advances in Earthquake Prediction, Research and Risk Mitigation, by R. Stefansson (2011), published by Springer/PRAXIS, and an article in the August issue of the BSSA by Stefansson, M. Bonafede and G. Gudmundsson (2011) contain a good overview of the findings and further references, as well as examples of partially successful long- and short-term warnings based on such an approach. Significant findings are: Earthquakes that occurred hundreds of years ago left scars in the crust, expressed in volumes of heterogeneity that demonstrate the size of their faults. Rheology and stress heterogeneity within these volumes are significantly variable in time and space. Crustal processes in and near such faults may be observed through microearthquake information decades before the sudden onset of a new large earthquake. High-pressure fluids of mantle origin may, in response to strain, especially near plate boundaries, migrate upward into the brittle/elastic crust to play a significant role in modifying crustal conditions over both the long and the short term. Preparatory processes of various earthquakes cannot be expected to be the same. We learn about an impending earthquake by observing long-term preparatory processes at the fault, finding a constitutive relationship that governs the processes, and then extrapolating that relationship into nearby space and the near future. This is a deterministic approach in earthquake prediction research. Such extrapolations contain many uncertainties. However, the long-term pattern of observations of the pre-earthquake fault process will help us to put probability constraints on our extrapolations and our warnings.
The approach described is different from the usual

  3. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electromagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction on a time scale of about one week, and it is considered one of the most important and urgent topics for human beings. If short-term prediction were realized, casualties would be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the myth that EQ prediction by seismometers is impossible, the reasons why we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  4. Sociological aspects of earthquake prediction

    Science.gov (United States)

    Spall, H.

    1979-01-01

    Henry Spall talked recently with Denis Mileti, who is in the Department of Sociology, Colorado State University, Fort Collins, Colo. Dr. Mileti is a sociologist involved with research programs that study the socioeconomic impact of earthquake prediction.

  5. Is It Possible to Predict Strong Earthquakes?

    CERN Document Server

    Polyakov, Yuriy S; Solovyeva, Anna B; Timashev, Serge F

    2015-01-01

    The possibility of earthquake prediction is one of the key open questions in modern geophysics. We propose an approach based on the analysis of common short-term candidate precursors (2 weeks to 3 months prior to a strong earthquake) with the subsequent processing of brain activity signals generated in specific types of rats (kept in laboratory settings), which reportedly sense an impending earthquake a few days prior to the event. We illustrate the identification of short-term precursors using groundwater sodium-ion concentration data in the time frame from 2010 to 2014 (a major earthquake occurred on February 28, 2013), recorded at two different sites in the south-eastern part of the Kamchatka peninsula, Russia. The candidate precursors are observed as synchronized peaks in the nonstationarity factors, introduced within the flicker-noise spectroscopy framework for signal processing, for the high-frequency component of both time series. These peaks correspond to the local reorganizations of the underlying geoph...

  6. Earthquake prediction from China's mobile gravity data

    Directory of Open Access Journals (Sweden)

    Yiqing Zhu

    2015-03-01

    Full Text Available The relation between plate tectonics and earthquake evolution is analyzed systematically on the basis of 1998–2010 absolute and relative gravity data from the Crustal Movement Observation Network of China. Most earthquakes originated at plate boundaries or within fault zones. Tectonic deformation was most intense and exhibited discontinuity within the tectonically active fault zone because of differential movement; the stress accumulation produced an abrupt gravity change, which was further enhanced by the earthquake. The gravity data from mainland China since 2000 clearly reflected five major earthquakes (Ms > 7), all of which were better reflected than those before 2000. Regional gravity anomalies and a gravity gradient change were observed in the area around the epicenter about 2 or 3 years before each earthquake occurred, suggesting that gravity change may be a seismic precursor. Furthermore, in this study, the medium-term predictions of the Ms7.3 Yutian, Ms8.0 Wenchuan, and Ms7.0 Lushan earthquakes are analytically presented and evaluated, especially with respect to estimating earthquake locations.

  7. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    Full Text Available The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the anomaly or precursor concerned, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provides innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
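
    The probability gain mentioned in this abstract has a simple counting estimate. A minimal sketch with hypothetical counts (the function name, its arguments, and the numbers are illustrative assumptions, not taken from the paper):

```python
def probability_gain(n_events, n_alarms, n_hits, total_intervals):
    """Probability gain of a binary precursor over the background rate.

    G = P(event | alarm) / P(event), estimated from counts of
    equal time intervals with and without alarms and events.
    """
    p_background = n_events / total_intervals   # unconditional event rate
    p_conditional = n_hits / n_alarms           # event rate inside alarm intervals
    return p_conditional / p_background

# Hypothetical counts: 10 events in 1000 intervals; 50 alarm
# intervals capture 4 of the events.
g = probability_gain(n_events=10, n_alarms=50, n_hits=4, total_intervals=1000)
print(round(g, 1))  # prints 8.0
```

A gain above 1 means the precursor concentrates events better than chance; the abstract's point is that such a number is meaningful only when tested on new, independent observations.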

  8. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started about 15 years ago has already pushed global information storage capacity beyond 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task; it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done before claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the Error Diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null hypothesis as a metric of the alerted space, is evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained from such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
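
    The Error Diagram named in this abstract pairs the rate of failures-to-predict with the alerted fraction of the space-time volume; random guessing falls on the diagonal. A minimal sketch under hypothetical counts (the function names and numbers are this sketch's assumptions, not Molchan's notation):

```python
def error_diagram_point(n_missed, n_total_events, alerted_volume, total_volume):
    """One point (tau, nu) on the error diagram:
    tau = alerted fraction of the space-time volume,
    nu  = fraction of target earthquakes missed."""
    tau = alerted_volume / total_volume
    nu = n_missed / n_total_events
    return tau, nu

def skill_vs_random(tau, nu):
    """Vertical distance below the random-guessing diagonal nu = 1 - tau.
    Positive values indicate performance beyond random guessing."""
    return (1.0 - tau) - nu

# Hypothetical outcome: 1 of 17 target earthquakes missed,
# with alerts covering 20% of the space-time volume.
tau, nu = error_diagram_point(n_missed=1, n_total_events=17,
                              alerted_volume=0.2, total_volume=1.0)
print(round(skill_vs_random(tau, nu), 3))  # prints 0.741
```

Comparing (tau, nu) against the diagonal is exactly the "Seismic Roulette" comparison to random guessing described in the abstract.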

  9. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, a lot can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be produced by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  10. Current affairs in earthquake prediction in Japan

    Science.gov (United States)

    Uyeda, Seiya

    2015-12-01

    As of mid-2014, the main organizations of the earthquake (EQ hereafter) prediction program, including the Seismological Society of Japan (SSJ) and the MEXT Headquarters for EQ Research Promotion, hold the official position that they neither can nor want to make any short-term prediction. It is an extraordinary stance for responsible authorities when the nation, after the devastating 2011 M9 Tohoku EQ, most urgently needs whatever information may exist on forthcoming EQs. Japan's national project for EQ prediction started in 1965, but it has had no success. The main reason for this lack of success is the failure to capture precursors. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this stance was further fortified by the 2011 M9 Tohoku mega-quake. This paper tries to explain how this situation came about and suggests that it may in fact be a legitimate one which should have come a long time ago. Actually, substantial positive changes are taking place now. Some promising signs are arising even from cooperation of researchers with the private sector, and there is a move to establish an "EQ Prediction Society of Japan". From now on, maintaining high scientific standards in EQ prediction will be of crucial importance.

  11. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach thus provides a new means of predicting long-period strong ground motion.

  12. Haicheng, China, earthquake of 4 February 1975: the first successfully predicted major earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Adams, R.D.

    1976-07-01

    The earthquake of magnitude 7.3 that occurred near the town of Haicheng in Northeast China on Feb 4, 1975, was the first major earthquake anywhere in the world known to have been predicted with enough certainty for people to have been warned and measures taken for civil protection. These steps were successful in keeping the number of casualties small. This paper describes a visit to the affected area seven and a half months after the earthquake and discussions with Chinese scientists about their successful prediction methods. The prediction resulted from the synthesis of many types of investigation, but the main methods used for long-, mid-, and short-term prediction appear to have been based on studies of seismicity, deformation, and foreshocks, respectively.

  13. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), which led to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is suited to the observation of any surface deformation. This database is a cluster of information regarding the consequences of the earthquakes, grouped into property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, amongst others, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas, and design a preliminary model of emergency for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  14. On some methods for assessing earthquake predictions

    Science.gov (United States)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted for the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.

  15. A mathematical model for predicting earthquake occurrence ...

    African Journals Online (AJOL)

    We consider the continental crust under damage. We use observed microseism results from many seismic stations around the world, established to study the time series of the activities of the continental crust, with a view to predicting the possible time of occurrence of an earthquake. We consider microseism time series ...

  16. 76 FR 19123 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-04-06

    ....S. Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S... Earthquake Prediction Evaluation Council (NEPEC) will hold a 1-day meeting on April 16, 2011. The meeting... the Director of the U.S. Geological Survey on proposed earthquake predictions, on the completeness and...

  17. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

    Full Text Available The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors aimed at seeing the earthquake phenomenon in perspective, within the framework of a unified theory able to explain the causes of its genesis and the dynamics, rheology, and microphysics of its preparation, occurrence, postseismic relaxation, and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, up to now, what is lacking is the demonstration of a causal relationship (with explained physical processes) between data gathered simultaneously and continuously by space observations and ground-based measurements, and a search for correlations between them. In doing this, modern and/or new methods and technologies have to be adopted to try to solve the problem. Coordinated space- and ground-based observations imply available test sites on the Earth's surface to correlate ground data, collected by appropriate networks of instruments, with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a strong new theoretical scientific effort is necessary to try to understand the physics of the earthquake.

  18. 78 FR 64973 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2013-10-30

    ... Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey, Interior. ACTION: Notice of meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake... proposed earthquake predictions, on the completeness and scientific validity of the available data related...

  19. Raising the science awareness of first year undergraduate students via an earthquake prediction seminar

    Science.gov (United States)

    Gilstrap, T. D.

    2011-12-01

    The public is fascinated with and fearful of natural hazards such as earthquakes. After every major earthquake there is a surge of interest in earthquake science and earthquake prediction. Yet many people do not understand the challenges of earthquake prediction and the need to fund earthquake research. An earthquake prediction seminar is offered to first-year undergraduate students to improve their understanding of why earthquakes happen, how earthquake research is done and, more specifically, why it is so challenging to issue short-term earthquake predictions. Some of these students may become scientists, but most will not. For the majority this is an opportunity to learn how science research works and how it is related to policy and society. The seminar is seven weeks long, two hours per week, and has been taught every year for the last four years. The material is presented conceptually; there is very little quantitative work involved. The class starts with a field trip to the Randolph College Seismic Station, where students learn about seismographs and the different types of seismic waves. Students are then provided with basic background on earthquakes. They learn how to pick arrival times using real seismograms, how to use earthquake catalogues, and how to predict the arrival of an earthquake wave at any location on Earth. Next they learn about long-term, intermediate-term, short-term and real-time earthquake prediction. Discussions are an essential part of the seminar. Students are challenged to draw their own conclusions on the pros and cons of earthquake prediction. Time is designated to discuss the political and economic impact of earthquake prediction. At the end of the seven weeks students are required to write a paper and discuss the need for earthquake prediction. The class is not focused on the science alone but rather on the links between the science issues and their economic and political impact. Weekly homework assignments are used to aid and assess students' learning. Pre and

  20. Predictability of population displacement after the 2010 Haiti earthquake.

    Science.gov (United States)

    Lu, Xin; Bengtsson, Linus; Holme, Petter

    2012-07-17

    Most severe disasters cause large population movements. These movements make it difficult for relief organizations to efficiently reach people in need. Understanding and predicting the locations of affected people during disasters is key to effective humanitarian relief operations and to long-term societal reconstruction. We collaborated with the largest mobile phone operator in Haiti (Digicel) and analyzed the movements of 1.9 million mobile phone users during the period from 42 d before, to 341 d after the devastating Haiti earthquake of January 12, 2010. Nineteen days after the earthquake, population movements had caused the population of the capital Port-au-Prince to decrease by an estimated 23%. Both the travel distances and size of people's movement trajectories grew after the earthquake. These findings, in combination with the disorder that was present after the disaster, suggest that people's movements would have become less predictable. Instead, the predictability of people's trajectories remained high and even increased slightly during the three-month period after the earthquake. Moreover, the destinations of people who left the capital during the first three weeks after the earthquake were highly correlated with their mobility patterns during normal times, and specifically with the locations in which people had significant social bonds. For the people who left Port-au-Prince, the duration of their stay outside the city, as well as the time for their return, all followed a skewed, fat-tailed distribution. The findings suggest that population movements during disasters may be significantly more predictable than previously thought.

  1. Testing for the 'predictability' of dynamically triggered earthquakes in The Geysers geothermal field

    Science.gov (United States)

    Aiken, Chastity; Meng, Xiaofeng; Hardebeck, Jeanne

    2018-03-01

    The Geysers geothermal field is well known for being susceptible to dynamic triggering of earthquakes by large distant earthquakes, owing to the introduction of fluids for energy production. Yet, it is unknown if dynamic triggering of earthquakes is 'predictable' or whether dynamic triggering could lead to a potential hazard for energy production. In this paper, our goal is to investigate the characteristics of triggering and the physical conditions that promote triggering to determine whether or not triggering is in any way foreseeable. We find that, at present, triggering in The Geysers is not easily 'predictable' in terms of when and where based on observable physical conditions. However, triggered earthquake magnitude positively correlates with peak imparted dynamic stress, and larger dynamic stresses tend to trigger sequences similar to mainshock-aftershock sequences. Thus, we may be able to 'predict' what size earthquakes to expect at The Geysers following a large distant earthquake.

  2. 76 FR 69761 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2011-11-09

    ....S. Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) AGENCY: U.S. Geological Survey. ACTION: Notice of Meeting. SUMMARY: Pursuant to Public Law 96-472, the National Earthquake... Government. The Council shall advise the Director of the U.S. Geological Survey on proposed earthquake...

  3. Gambling scores for earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecast and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
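
    The reputation-point scheme described in this abstract can be sketched numerically for the simplest binary case. A minimal illustration (the function name, the fair-odds payout rule r*(1-p)/p, and the sample numbers are this sketch's assumptions, not the paper's formulation):

```python
def gambling_score(bets, outcomes, ref_probs):
    """Cumulative reputation change for a series of binary forecasts.

    bets      -- reputation points wagered on 'event occurs' per interval
    outcomes  -- True if an event actually occurred in that interval
    ref_probs -- reference-model (house) event probabilities per interval

    Fair-odds rule: a correct bet of r points against reference
    probability p pays r * (1 - p) / p; an incorrect bet loses r.
    Under this rule the expected gain is zero if the reference
    model is the true model.
    """
    score = 0.0
    for r, hit, p in zip(bets, outcomes, ref_probs):
        score += r * (1.0 - p) / p if hit else -r
    return score

# Hypothetical play: three 1-point alarms against a house
# probability of 0.1; one of the alarmed intervals has an event.
s = gambling_score([1, 1, 1], [False, True, False], [0.1, 0.1, 0.1])
print(round(s, 6))  # prints 7.0
```

The rare-event payout (9 points for a hit at p = 0.1) is what compensates the forecaster's risk: a lucky guess against an easy reference earns little, while a correct low-probability alarm earns a lot.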

  4. A test to evaluate the earthquake prediction algorithm, M8

    Science.gov (United States)

    Healy, John H.; Kossobokov, Vladimir G.; Dewey, James W.

    1992-01-01

    A test of the algorithm M8 is described. The test is constructed to meet four rules, which we propose to be applicable to the test of any method for earthquake prediction: 1. An earthquake prediction technique should be presented as a well-documented, logical algorithm that can be used by investigators without restrictions. 2. The algorithm should be coded in a common programming language and implementable on widely available computer systems. 3. A test of the earthquake prediction technique should involve future predictions with a black-box version of the algorithm in which potentially adjustable parameters are fixed in advance. The source of the input data must be defined, and ambiguities in these data must be resolved automatically by the algorithm. 4. At least one reasonable null hypothesis should be stated in advance of testing the earthquake prediction method, and it should be stated how this null hypothesis will be used to estimate the statistical significance of the earthquake predictions. The M8 algorithm has successfully predicted several destructive earthquakes, in the sense that the earthquakes occurred inside regions with linear dimensions from 384 to 854 km that the algorithm had identified as being in times of increased probability for strong earthquakes. In addition, M8 has successfully "post-predicted" high percentages of strong earthquakes in regions to which it has been applied in retroactive studies. The statistical significance of previous predictions has not been established, however, and post-prediction studies in general are notoriously subject to success enhancement through hindsight. Nor has it been determined how much more precise an M8 prediction might be than forecasts and probability-of-occurrence estimates made by other techniques. We view our test of M8 both as a means to better determine the effectiveness of M8 and as an experimental structure within which to make observations that might lead to improvements in the algorithm or
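
    Rule 4 asks for a pre-stated null hypothesis and a way to turn it into a significance estimate. One common choice, sketched here under the assumption that the null is random alerts covering a fixed fraction of the space-time volume (the function name and the sample counts are hypothetical, not from the M8 test itself):

```python
from math import comb

def p_value_random_guess(n_hits, n_events, alerted_fraction):
    """P(X >= n_hits) for X ~ Binomial(n_events, alerted_fraction):
    the chance of doing at least this well if target earthquakes fall
    into alerts purely at random, in proportion to the alerted
    fraction of the space-time volume."""
    p = alerted_fraction
    return sum(comb(n_events, k) * p**k * (1 - p)**(n_events - k)
               for k in range(n_hits, n_events + 1))

# Hypothetical outcome: 7 of 8 target earthquakes fell inside alerts
# that covered 30% of the monitored space-time volume.
print(f"{p_value_random_guess(7, 8, 0.3):.2e}")  # prints 1.29e-03
```

A small p-value rejects the random-guessing null; stating this test before the predictions are made is exactly what distinguishes a prospective test from the hindsight-prone "post-prediction" studies the abstract warns about.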

  5. Adjusting the M8 algorithm to earthquake prediction in the Iranian plateau

    Science.gov (United States)

    Mojarab, Masoud; Memarian, Hossein; Zare, Mehdi; Kossobokov, Vladimir

    2017-07-01

    Earthquake prediction is one of the challenging problems of seismology. The present study aims to set up a routine prediction of major earthquakes in the Iranian plateau using a modification of the intermediate-term middle-range algorithm M8, whose original version has demonstrated high performance in a real-time Global Test over the last two decades. An investigation of an earthquake catalog covering the entire Iranian plateau through 2012 has shown that a modification of the M8 algorithm, adjusted for the rather low level of earthquake occurrence reported in the region, is capable of targeting magnitude 7.5+ events. The occurrence of the April 16, 2013, M7.7 Saravan and the September 24, 2013, M7.7 Awaran earthquakes, after the time of writing this paper (14 months before the Saravan earthquake occurrence), confirmed the results of the investigation and demonstrated the need for further studies in this region. Earlier tests, applying M8 over all of Iran, suggested that the 2013 Saravan and Awaran earthquakes may precede a great earthquake of magnitude 8+ in the Makran region. To verify this statement, the algorithm M8 was applied once again to a catalog updated through September 2013. The result indicated that although the study region recently experienced two magnitude 7.5+ earthquakes, it remains prone to a major earthquake. The present study confirms the applicability of the M8 algorithm for predicting earthquakes in the Iranian plateau and establishes an opportunity for routine monitoring of seismic activity aimed at prediction of the largest earthquakes, which can play a significant role in mitigating damage due to natural hazards.

  6. Prediction of earthquakes: a data evaluation and exchange problem

    Energy Technology Data Exchange (ETDEWEB)

    Melchior, Paul

    1978-11-15

    Recent experiences in earthquake prediction are recalled. Precursor information seems to be available from geodetic measurements, hydrological and geochemical measurements, electric and magnetic measurements, purely seismic phenomena, and zoological phenomena; some new methods are proposed. A list of possible earthquake triggers is given. The dilatancy model is contrasted with a dry model; they seem to be equally successful. In conclusion, the space and time range of the precursors is discussed in relation to the magnitude of earthquakes. (RWR)

  7. Gambling score in earthquake prediction analysis

    Science.gov (United States)

    Molchan, G.; Romashkova, L.

    2011-03-01

    The number of successes and the space-time alarm rate are commonly used to characterize the strength of an earthquake prediction method and the significance of prediction results. It has recently been suggested to evaluate the forecaster's skill with a new characteristic, the gambling score (GS), which incorporates the difficulty of guessing each target event by assigning different weights to different alarms. We expand the parametrization of the GS and use the M8 prediction algorithm to illustrate the difficulties of the new approach in the analysis of prediction significance. We show that the level of significance strongly depends (1) on the choice of alarm weights, (2) on the partitioning of the entire alarm volume into component parts and (3) on the accuracy of the spatial rate measure of target events. These tools are at the disposal of the researcher and can affect the significance estimate. Formally, all reasonable GSs discussed here corroborate that the M8 method is non-trivial in the prediction of 8.0 ≤ M < 8.5 events, because the point estimates of the significance are in the range 0.5-5 per cent. However, the conservative estimate of 3.7 per cent based on the number of successes seems preferable owing to two circumstances: (1) it is based on relative values of the spatial rate and hence is more stable, and (2) the statistic of successes enables us to construct analytically an upper estimate of the significance that takes into account the uncertainty of the spatial rate measure.
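    One simple parametrization of a gambling score, in the spirit described above, pays each success in inverse proportion to how likely it was under a reference model of chance, and charges a unit stake for each failed alarm. This is a hedged illustration, not the authors' exact formulation; the reference probabilities are invented.

```python
def gambling_score(alarms):
    """alarms: iterable of (p_ref, success) pairs, where p_ref is the
    probability under the reference (chance) model that the alarm
    captures a target event. A success pays the fair-odds reward
    (1 - p_ref) / p_ref; a failure forfeits the unit stake."""
    return sum((1.0 - p) / p if hit else -1.0 for p, hit in alarms)

# Two hard-to-guess successes and one failed alarm (illustrative numbers).
score = gambling_score([(0.05, True), (0.10, True), (0.20, False)])
```

    The abstract's point is visible even in this toy version: the score, and any significance estimate built on it, depends directly on the reference probabilities assigned to each alarm.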

  8. The Long-term Impacts of Earthquakes on Economic Growth

    Science.gov (United States)

    Lackner, S.

    2016-12-01

    The social science literature has so far not reached a consensus on whether and how earthquakes impact economic growth in the long run. Several hypotheses have been suggested, and some even argue for a positive impact. A general weakness in the literature, however, is the predominant use of inadequate measures for the exogenous natural hazard of an earthquake. The most common problems are the lack of individual event size (e.g. an earthquake dummy or the number of events), the use of magnitude instead of a measure of surface shaking, and endogeneity issues when traditional qualitative intensity scales or actual impact data are used. Here we use peak ground acceleration (PGA) as the ground motion intensity measure and investigate the impacts of earthquake shaking on long-run economic growth. We construct a data set from USGS ShakeMaps that can be considered the universe of globally relevant earthquake ground shaking from 1973 to 2014. This data set is then combined with World Bank GDP data to conduct a regression analysis. Furthermore, the impacts of PGA on different industries and on other economic variables such as employment and education are also investigated. This will, on the one hand, help to identify the mechanism by which earthquakes impact long-run growth and, on the other, reveal potential impacts on welfare indicators that are not captured by GDP. This is the first application of global earthquake shaking data to investigate long-term earthquake impacts.

  9. Prospects for earthquake prediction and control

    Science.gov (United States)

    Healy, J.H.; Lee, W.H.K.; Pakiser, L.C.; Raleigh, C.B.; Wood, M.D.

    1972-01-01

    The San Andreas fault is viewed, according to the concepts of seafloor spreading and plate tectonics, as a transform fault that separates the Pacific and North American plates and along which relative movements of 2 to 6 cm/year have been taking place. The resulting strain can be released by creep, by earthquakes of moderate size, or (as near San Francisco and Los Angeles) by great earthquakes. Microearthquakes, as mapped by a dense seismograph network in central California, generally coincide with zones of the San Andreas fault system that are creeping. Microearthquakes are few and scattered in zones where elastic energy is being stored. Changes in the rate of strain, as recorded by tiltmeter arrays, have been observed before several earthquakes of about magnitude 4. Changes in fluid pressure may control the timing of seismic activity and make it possible to control natural earthquakes by controlling variations in fluid pressure in fault zones. An experiment in earthquake control is underway at the Rangely oil field in Colorado, where the rates of fluid injection and withdrawal in experimental wells are being controlled. © 1972.

  10. Turning the rumor of May 11, 2011 earthquake prediction In Rome, Italy, into an information day on earthquake hazard

    Science.gov (United States)

    Amato, A.; Cultrera, G.; Margheriti, L.; Nostro, C.; Selvaggi, G.; INGVterremoti Team

    2011-12-01

    headquarters until 9 p.m.: families, school classes with and without teachers, civil protection groups, journalists. This initiative, built up in a few weeks, received very large feedback, partly due to the media highlighting the presumed prediction. Although we could not rule out the possibility of a strong earthquake in central Italy (with effects in Rome), we tried to explain the meaning of short-term earthquake prediction vs. probabilistic seismic hazard assessment. Although many people remained fearful (many decided to take the day off and leave the town or stay in public parks), we contributed to reducing this fear and therefore the social cost of this strange Roman day. Moreover, another lesson learned is that these (fortunately sporadic) circumstances, when people's attention is high, are important opportunities for science communication. We thank all the INGV colleagues who contributed to the May 11 Open Day, in particular the Press Office, the Educational and Outreach Laboratory, the Graphics Laboratory and SissaMedialab. P.S.: no large earthquake happened.

  11. Application of Astronomic Time-latitude Residuals in Earthquake Prediction

    Science.gov (United States)

    Yanben, Han; Lihua, Ma; Hui, Hu; Rui, Wang; Youjin, Su

    2007-04-01

    After the earthquake (Ms = 6.1) that occurred in Luquan county of Yunnan province on April 18, 1985, the relationship between major earthquakes and astronomical time-latitude residuals (ATLR) of a photoelectric astrolabe at Yunnan Observatory was analyzed. ATLR are the residuals that remain after deducting the effects of the Earth's whole motion from observations of time and latitude. It was found that anomalies in the ATLR appeared before earthquakes in and around Yunnan, a seismically active region. The anomalies possibly arise from changes in the plumb line due to the motion of ground mass before earthquakes. Afterwards, based on studies of the anomalous characteristics and regularities of ATLR, we tried to provide warning information prior to the occurrence of a few major earthquakes in the region. Significant synchronous anomalies in the observatory's ATLR appeared before the magnitude 6.2 earthquake in Dayao county of Yunnan province on July 21, 2003. This again verified that the anomalies can possibly provide prediction information for strong earthquakes around the observatory.

  12. Long-term impact of earthquakes on sleep quality.

    Directory of Open Access Journals (Sweden)

    Daniela Tempesta

    Full Text Available PURPOSE: We investigated the impact of the 6.3 magnitude 2009 L'Aquila (Italy) earthquake on standardized self-report measures of sleep quality (Pittsburgh Sleep Quality Index, PSQI) and the frequency of disruptive nocturnal behaviours (Pittsburgh Sleep Quality Index-Addendum, PSQI-A) two years after the natural disaster. METHODS: Self-reported sleep quality was assessed in 665 L'Aquila citizens exposed to the earthquake and compared with a different sample (n = 754) of L'Aquila citizens tested 24 months before the earthquake. In addition, the sleep quality and disruptive nocturnal behaviours (DNB) of people exposed to the traumatic experience were compared with those of people who in the same period lived in different areas ranging between 40 and 115 km from the earthquake epicenter (n = 3574). RESULTS: The comparison between L'Aquila citizens before and after the earthquake showed a significant deterioration of sleep quality after exposure to the trauma. In addition, two years after the earthquake L'Aquila citizens showed the highest PSQI scores and the highest incidence of DNB compared to subjects living in the surroundings. Interestingly, above-threshold PSQI scores were found in participants living within 70 km of the epicenter, while trauma-related DNB was found in people living within 40 km. Multiple regressions confirmed that proximity to the epicenter is predictive of sleep disturbances and DNB, also suggesting a possible mediating effect of depression on PSQI scores. CONCLUSIONS: The psychological effects of an earthquake may be much more pervasive and long-lasting than its building destruction, persisting for years and involving a much larger population. Reduced sleep quality and an increased frequency of DNB after two years may be risk factors for the development of depression and posttraumatic stress disorder.

  13. Implications of fault constitutive properties for earthquake prediction

    Science.gov (United States)

    Dieterich, J.H.; Kilgore, B.

    1996-01-01

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance D(c), apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip-rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of D(c) apply to faults in nature. However, the scaling of D(c) is presently an open question, and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from the sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes in earthquake probability result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. In the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes, including the eventual mainshock. In the second, accelerating fault slip in the mainshock nucleation zone triggers foreshocks.
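    The Omori decay law mentioned above has a standard modified form, n(t) = K/(c + t)^p. A minimal numerical sketch follows; the parameter values are arbitrary illustrations, not values from the paper.

```python
def omori_rate(t, K=100.0, c=0.1, p=1.0):
    """Modified Omori law: aftershock rate K / (c + t)**p, t in days."""
    return K / (c + t) ** p

def expected_aftershocks(t1, t2, K=100.0, c=0.1, p=1.0, steps=10000):
    """Expected number of aftershocks between t1 and t2, by trapezoidal
    integration of the Omori rate."""
    dt = (t2 - t1) / steps
    total = 0.0
    for i in range(steps):
        a, b = t1 + i * dt, t1 + (i + 1) * dt
        total += 0.5 * (omori_rate(a, K, c, p) + omori_rate(b, K, c, p)) * dt
    return total
```

    For p = 1 the expected count integrates analytically to K ln((c + t2)/(c + t1)), which the quadrature reproduces; with K = 100 and c = 0.1 the first day yields about 240 events.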

  14. Researches on Application of GPS to Earthquake Monitoring and Prediction

    Directory of Open Access Journals (Sweden)

    Wanju BO

    2007-10-01

    Full Text Available The earliest research on the application of GPS to earthquake monitoring and prediction in China began in the 1980s, and was limited to learning relevant technology from other countries and performing tests with a few instruments. With improvements in data-processing software and falling hardware costs, several local GPS networks were gradually set up by the end of the 1990s, and more systematic GPS monitoring, data processing and application research have been carried out since. In this paper, 3 research examples of the application of GPS to earthquake monitoring and prediction are presented.

  15. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings

    Science.gov (United States)

    Stefansson, R.

    Earthquake prediction research in Iceland, applications for hazard assessments and warnings. The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake-prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general on where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions, by microearthquake information, by continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short term warnings. A very useful short term warning was issued twice in the year 2000, one for a sudden start of an eruption in Volcano Hekla February 26, and the other 25 hours before a second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short term warning, although not going to the public, was also issued before a magnitude 5 earthquake in November 1998. In the presentation it will be shortly described what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago, assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  16. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  17. Short-term earthquake probabilities during the L'Aquila earthquake sequence in central Italy, 2009

    Science.gov (United States)

    Falcone, G.; Murru, M.; Zhuang, J.; Console, R.

    2014-12-01

    We compare the forecasting performance of several statistical models that are used to describe the occurrence process of earthquakes, in forecasting short-term earthquake probabilities during the L'Aquila earthquake sequence in central Italy, 2009. These models include the Proximity to Past Earthquakes (PPE) model and different versions of the Epidemic Type Aftershock Sequence (ETAS) model. We used the information gains corresponding to the Poisson and binomial scores to evaluate the performance of these models. It is shown that all ETAS models work better than the PPE model. However, when comparing the different types of ETAS models, the one with the same fixed exponent coefficient α = 2.3 for both the productivity function and the scaling factor in the spatial response function performs better in forecasting the active aftershock sequence than the other models with different exponent coefficients when the Poisson score is adopted. These latter models perform better only when a lower magnitude threshold of 2.0 and the binomial score are used. The reason is likely that the catalog does not contain an event of magnitude similar to the L'Aquila main shock (Mw 6.3) in the training period (April 16, 2005 to March 15, 2009). In this case the a-value is underestimated, and thus the forecast seismicity is also underestimated when the productivity function is extrapolated to high magnitudes. These results suggest that the training catalog used for estimating the model parameters should include earthquakes of magnitudes similar to the main shock when forecasting seismicity during an aftershock sequence.
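    A temporal ETAS conditional intensity of the kind compared here can be sketched as follows. This is a generic textbook form, not the paper's implementation; the parameter values (including the fixed α = 2.3 mentioned above) are illustrative, and the spatial response is omitted.

```python
import math

def etas_rate(t, history, mu=0.1, K=0.05, alpha=2.3, c=0.01, p=1.1, m0=3.0):
    """Temporal ETAS conditional intensity: background rate mu plus
    Omori-decaying contributions from past events, with productivity
    growing exponentially with magnitude. history: list of (t_i, m_i)."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) * (t - t_i + c) ** (-p)
    return rate

# A main shock followed by a smaller event (illustrative times/magnitudes).
history = [(0.0, 6.3), (0.5, 4.0)]
r = etas_rate(1.0, history)
```

    Because the productivity term K e^{α(m - m0)} is extrapolated in magnitude, an underestimated productivity in the training catalog translates directly into an underestimated forecast after a large main shock, which is the failure mode the abstract describes.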

  18. 75 FR 63854 - National Earthquake Prediction Evaluation Council (NEPEC) Advisory Committee

    Science.gov (United States)

    2010-10-18

    ... Geological Survey National Earthquake Prediction Evaluation Council (NEPEC) Advisory Committee AGENCY: U.S... Earthquake Prediction Evaluation Council (NEPEC) will hold a 2-day meeting on November 3 and 4, 2010. The... the Director of the U.S. Geological Survey (USGS) on proposed earthquake predictions, on the...

  19. Interpreting Financial Market Crashes as Earthquakes: A New early Warning System for Medium Term Crashes

    OpenAIRE

    Gresnigt, Francine; Kole, Erik; Franses, Philip Hans

    2014-01-01

    This discussion paper has led to a publication in the Journal of Banking and Finance, 2015, 56, 123-139. We propose a modeling framework which allows for creating probability predictions on a future market crash in the medium term, like sometime in the next five days. Our framework draws upon noticeable similarities between stock returns around a financial market crash and seismic activity around earthquakes. Our model is incorporated in an Early Warning System for future crash days. Testing...

  20. Analysing earthquake slip models with the spatial prediction comparison test

    KAUST Repository

    Zhang, L.

    2014-11-10

    Earthquake rupture models inferred from inversions of geophysical and/or geodetic data exhibit remarkable variability due to uncertainties in modelling assumptions, the use of different inversion algorithms, or variations in data selection and data processing. A robust statistical comparison of different rupture models obtained for a single earthquake is needed to quantify the intra-event variability, both for benchmark exercises and for real earthquakes. The same approach may be useful to characterize (dis-)similarities in events that are typically grouped into a common class of events (e.g. moderate-size crustal strike-slip earthquakes or tsunamigenic large subduction earthquakes). For this purpose, we examine the performance of the spatial prediction comparison test (SPCT), a statistical test developed to compare spatial (random) fields by means of a chosen loss function that describes an error relation between a 2-D field ('model') and a reference model. We implement and calibrate the SPCT approach for a suite of synthetic 2-D slip distributions, generated as spatial random fields with various characteristics, and then apply the method to the results of a benchmark inversion exercise with a known solution. We find the SPCT to be sensitive to different spatial correlation lengths and different heterogeneity levels of the slip distributions. The SPCT approach proves to be a simple and effective tool for ranking the slip models with respect to a reference model.
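    The core of an SPCT-style comparison is a point-wise loss differential between two candidate models measured against a reference field. The sketch below uses a squared-error loss and a naive i.i.d. variance estimate; the actual SPCT accounts for spatial correlation in the differential field, which this toy version deliberately ignores.

```python
import math

def loss_differential(reference, model_a, model_b):
    """Point-wise loss differential D = (ref - a)**2 - (ref - b)**2.
    Negative values mean model_a fits the reference better at that point."""
    return [(r - a) ** 2 - (r - b) ** 2
            for r, a, b in zip(reference, model_a, model_b)]

def naive_test_statistic(diffs):
    """t-like statistic for mean(D) = 0, ignoring spatial correlation."""
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

# Toy 1-D "fields": model A matches the reference exactly, model B is flat.
ref = [1.0, 2.0, 3.0, 4.0]
d = loss_differential(ref, ref, [0.0, 0.0, 0.0, 0.0])
t_stat = naive_test_statistic(d)
```

    A strongly negative statistic favors model A; in the real test the variance estimate is inflated by the spatial correlation length of D, which is precisely the sensitivity the paper reports.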

  1. Artificial neural network model for earthquake prediction with radon monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Kuelahci, Fatih [Science and Art Faculty, Physics Department, Firat University, Elazig 23169 (Turkey)], E-mail: fatihkulahci@firat.edu.tr; Inceoez, Murat [Engineering Faculty, Geology Department, Firat University, Elazig 23169 (Turkey); Dogru, Mahmut [Science and Art Faculty, Physics Department, Firat University, Elazig 23169 (Turkey)], E-mail: mdogru@firat.edu.tr; Aksoy, Ercan [Engineering Faculty, Geology Department, Firat University, Elazig 23169 (Turkey); Baykara, Oktay [Education Faculty, Science Education Division, Firat University, Elazig 23169 (Turkey)

    2009-01-15

    Apart from linear monitoring studies of the relationship between radon and earthquakes, an artificial neural network (ANN) model approach is presented, starting from the non-linear changes of eight different parameters during earthquake occurrence. A three-layer Levenberg-Marquardt feedforward learning algorithm is used to model the earthquake prediction process in the East Anatolian Fault System (EAFS). The proposed ANN system employs an individual training strategy with fixed-weight and supervised models leading to estimations. The average relative error between the magnitudes of the earthquakes obtained by the ANN and the measured data is about 2.3%. The relative error between the test and earthquake data varies between 0% and 12%. In addition, factor analysis was applied to all data and the model output values to assess the statistical variation. A total variance of 80.18% was explained with four factors by this analysis. Consequently, it can be concluded that the ANN approach is a potential alternative to other models with complex mathematical operations.

  2. Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea

    KAUST Repository

    Sawlan, Zaid A

    2012-12-01

    Tsunami concerns have increased in the world after the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami. Consequently, tsunami models have been developed rapidly in the last few years. One of the advanced tsunami models is the GeoClaw tsunami model introduced by LeVeque (2011). This model is adaptive and consistent. Because of different sources of uncertainty in the model, observations are needed to improve model prediction through a data assimilation framework. The model inputs are earthquake parameters and topography. This thesis introduces a real-time tsunami forecasting method that combines a tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother. The filter is used for state prediction, while the smoother estimates the earthquake parameters. This method reduces the error produced by uncertain inputs. In addition, a state-parameter EnKF is implemented to estimate the earthquake parameters. Although the number of observations is small, the estimated parameters generate a better tsunami prediction than the model alone. Methods and results of prediction experiments in the Red Sea are presented, and the prospect of developing an operational tsunami prediction system in the Red Sea is discussed.
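    The filter step of such a scheme can be sketched for a scalar state with a stochastic (perturbed-observation) EnKF update. This is a generic textbook update, not the thesis code; the variable names and numbers are invented for illustration.

```python
import random

def enkf_update(ensemble, obs, obs_std, h=lambda x: x):
    """Perturbed-observation EnKF analysis step for a scalar state.
    ensemble: list of prior state samples; obs: observed value;
    h: observation operator mapping state to observation space."""
    n = len(ensemble)
    hx = [h(x) for x in ensemble]
    x_mean = sum(ensemble) / n
    hx_mean = sum(hx) / n
    cov_xh = sum((x - x_mean) * (y - hx_mean)
                 for x, y in zip(ensemble, hx)) / (n - 1)
    var_h = sum((y - hx_mean) ** 2 for y in hx) / (n - 1)
    gain = cov_xh / (var_h + obs_std ** 2)   # Kalman gain
    return [x + gain * (obs + random.gauss(0.0, obs_std) - y)
            for x, y in zip(ensemble, hx)]

random.seed(0)
prior = [0.0, 1.0, 2.0]                     # prior ensemble, far from the data
posterior = enkf_update(prior, 10.0, 0.5)   # analysis pulls it toward the observation
```

    In a state-parameter EnKF the state vector is augmented with the earthquake parameters, so this same update nudges parameters that are only indirectly observed through the water-level data.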

  3. A probabilistic neural network for earthquake magnitude prediction.

    Science.gov (United States)

    Adeli, Hojjat; Panakkat, Ashif

    2009-09-01

    A probabilistic neural network (PNN) is presented for predicting the magnitude of the largest earthquake in a pre-defined future time period in a seismic region using eight mathematically computed parameters known as seismicity indicators. The indicators considered are the time elapsed during a particular number (n) of significant seismic events before the month in question, the slope of the Gutenberg-Richter inverse power law curve for the n events, the mean square deviation about the regression line based on the Gutenberg-Richter inverse power law for the n events, the average magnitude of the last n events, the difference between the observed maximum magnitude among the last n events and that expected through the Gutenberg-Richter relationship known as the magnitude deficit, the rate of square root of seismic energy released during the n events, the mean time or period between characteristic events, and the coefficient of variation of the mean time. Prediction accuracies of the model are evaluated using three different statistical measures: the probability of detection, the false alarm ratio, and the true skill score or R score. The PNN model is trained and tested using data for the Southern California region. The model yields good prediction accuracies for earthquakes of magnitude between 4.5 and 6.0. The PNN model presented in this paper complements the recurrent neural network model developed by the authors previously, where good results were reported for predicting earthquakes with magnitude greater than 6.0.
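    Two of the eight indicators, the Gutenberg-Richter slope (b value) and the magnitude deficit, can be computed from a catalog roughly as follows. This is an illustrative reconstruction, not the authors' code; the least-squares fit over cumulative counts is one common convention.

```python
import math

def gutenberg_richter_fit(magnitudes, bin_width=0.1):
    """Least-squares fit of log10 N(>=m) = a - b*m over the observed
    magnitude range, giving the G-R slope (b value) indicator."""
    ms = sorted(magnitudes)
    xs, ys = [], []
    m = ms[0]
    while m <= ms[-1]:
        count = sum(1 for x in ms if x >= m)
        if count > 0:
            xs.append(m)
            ys.append(math.log10(count))
        m += bin_width
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = -sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    a = my + b * mx
    return a, b

def magnitude_deficit(magnitudes, a, b):
    """Observed maximum magnitude minus the maximum expected from G-R
    (the magnitude at which the expected count N drops to 1)."""
    return max(magnitudes) - a / b

# Synthetic catalog that follows G-R with b = 1 exactly (illustrative).
catalog = [4.0] * 90 + [5.0] * 9 + [6.0]
a, b = gutenberg_richter_fit(catalog, bin_width=1.0)
```

    For a perfectly Gutenberg-Richter catalog the fit recovers b exactly and the deficit vanishes; in real catalogs a nonzero deficit is one of the features fed to the PNN.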

  4. The Long-Term Outcomes of the Earthquakes on University Students Exposed to the 1999 Marmara Region Earthquakes

    OpenAIRE

    Esra Ceyhan; Aydogan Aykut Ceyhan

    2006-01-01

    In this research, the purpose is to investigate the health and memory problems, substance abuse status and fears of university students caused by the earthquakes. It was examined whether these long-term outcomes (health and memory problems, substance abuse status and fears) were related to gender, the loss of family members or significant others in the earthquake, and the status of sheltering (in tents or prefabricated homes). This research was carried out with 209 university students att...

  5. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    NARCIS (Netherlands)

    Cheong, S.A.; Tan, T.L.; Chen, C.-C.; Chang, W.-L.; Liu, Z.; Chew, L.Y.; Sloot, P.M.A.; Johnson, N.F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data.

  6. Jet streams anomalies as possible short-term precursors of earthquakes with M>6.0

    Directory of Open Access Journals (Sweden)

    Hong-Chun Wu

    2014-02-01

    Full Text Available Satellite thermal-image data revealed the existence of thermal fields connected with large linear structures and systems of crustal faults. The measuring height of the outgoing longwave radiation corresponds to the altitude range of jet streams. This work describes a possible link between strong earthquakes and jet streams in two regions. The front or tail ends of jet groups maintained their position for 6 or more hours in the vicinity of the epicenters of strong (M>6.0) earthquakes in 2006-2010. The probability of observing stationary jet stream behavior is estimated at 93.6% of the cases on one six-hour map and at 26.7% of cases on two adjacent maps. The median of the distribution of distances between epicenters and the relevant jet stream positions is 36.5 km. Estimated cumulative probabilities of prediction realization were 24.2% within 10 days, 48.4% within 20 days, 66.1% within 30 days, 87.1% within 40 days, 93.5% within 50 days and 100% within 70 days. The observed precursory effects are of considerable interest for possible use in real short-term prediction of earthquakes.

  7. Prediction model of earthquake with the identification of earthquake source polarity mechanism through the focal classification using ANFIS and PCA technique

    Science.gov (United States)

    Setyonegoro, W.

    2016-05-01

    Earthquake disasters have caused considerable casualties and material losses. This research aims to predict the return period of earthquakes together with identification of the earthquake source polarity mechanism, with a case study area in Sumatra. Earthquakes are predicted using the ANFIS technique trained on historical earthquake data. In this technique the historical data set is compiled into intervals of the daily average earthquake occurrence in a year. The output to be obtained is a model of the daily average return period of earthquake events in a year. After the return period model had been learned by ANFIS, polarity recognition was performed through image recognition techniques on the focal sphere using the principal component analysis (PCA) method. As a result, the model predicted a return period of earthquake events whose average monthly return period showed a correlation coefficient of 0.014562.

  8. Study of Earthquake Disaster Prediction System of Langfang city Based on GIS

    Science.gov (United States)

    Huang, Meng; Zhang, Dian; Li, Pan; Zhang, YunHui; Zhang, RuoFei

    2017-07-01

    In this paper, in view of China's need to improve its earthquake disaster prevention capability, an implementation plan for a GIS-based earthquake disaster prediction system for Langfang city is put forward. Based on a GIS spatial database, coordinate transformation technology, GIS spatial analysis technology and PHP development technology, a seismic damage factor algorithm is used to predict the damage to the city under earthquake disasters of different intensities. The earthquake disaster prediction system of Langfang city is built on a B/S (browser/server) architecture and provides damage-degree and spatial-distribution mapping with two-dimensional visualization, comprehensive query and analysis, and auxiliary decision-making functions to identify seismically weak parts of the city and support rapid warning. The system has transformed the city's earthquake disaster reduction work from static planning to dynamic management, and has improved the city's earthquake and disaster prevention capability.

  9. Turning the rumor of the May 11, 2011, earthquake prediction in Rome, Italy, into an information day on earthquake hazard

    Directory of Open Access Journals (Sweden)

    Concetta Nostro

    2012-07-01

    Full Text Available A devastating earthquake was predicted to hit Rome on May 11, 2011. This prediction was never officially released, but it grew on the internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this fed the credibility of the earthquake prediction. During the months preceding May 2011, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) was overwhelmed with requests for information about this prediction from the inhabitants of Rome and from tourists. Given the echo of this earthquake prediction, on May 11, 2011, the INGV decided to organize an Open Day at its headquarters in Rome, to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, to talk with journalists about this prediction and to present the Open Day. During this 'Day', 13 new videos were also posted on our YouTube/INGVterremoti channel to explain earthquake processes and hazards, and to provide periodic updates on seismicity in Italy from the seismicity monitoring room. On May 11, 2011, the INGV headquarters was peacefully invaded by over 3,000 visitors from 10:00 am to 9:00 pm: families, students with and without teachers, civil protection groups, and many journalists. This initiative, built up in a few weeks, received very large feedback and was a great opportunity to talk with journalists and the public about earthquake prediction and, more generally, about seismic risk in Italy.

  10. Long- and short-term operational earthquake forecasting in Italy before and after the recent L'Aquila earthquake

    Science.gov (United States)

    Marzocchi, Warner

    2010-05-01

    The recent large earthquake that devastated the city of L'Aquila on April 6, 2009, gave us a unique opportunity to check and push forward the Italian operational long- and short-term earthquake forecasting capability. Here, we describe how this experience brought to light quite new challenges and how we faced them. Four points deserve specific mention. First, this earthquake gave us the unique opportunity to check the status of Italian operational earthquake forecasting at long- and short-term scales, comparing the forecasts with real data in purely prospective testing. Second, it was the first time in which we provided, after the mainshock, daily one-day forecasts to the Civil Protection to best manage the crisis. Here, we discuss the scientific and practical problems encountered in providing such short-term forecasts, and the solutions adopted. Third, we discuss how the short-term probabilistic estimates have been used in practice to manage the crisis. Basically, this experience demonstrates an urgent need for a connection between probabilistic forecasts and decision-making, in order to establish, before crises, quantitative and transparent protocols for decision support. Fourth, we show how this event has forged new thoughts about operational earthquake forecasting, i.e., about how science can help to mitigate seismic risk, and pointed out new scientific challenges for seismologists.

  11. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467
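The abstract invokes the Gutenberg-Richter law as an equilibrium property of the catalog. As background, here is a minimal sketch, not taken from the paper, of estimating the G-R b-value from a catalog using Aki's maximum-likelihood formula; the completeness magnitude and the synthetic catalog are illustrative assumptions:

```python
import math
import random

def b_value(mags, mc):
    """Aki (1965) maximum-likelihood b-value for continuous magnitudes
    at or above the completeness magnitude mc. (A binned catalog would
    need Utsu's half-bin-width correction in the denominator.)"""
    m = [x for x in mags if x >= mc]
    return math.log10(math.e) / (sum(m) / len(m) - mc)

# Synthetic catalog: magnitudes above mc = 2.0 drawn from the exponential
# distribution implied by the Gutenberg-Richter law with b = 1.
random.seed(0)
beta = 1.0 * math.log(10)  # beta = b * ln(10)
mags = [2.0 + random.expovariate(beta) for _ in range(20000)]
print(round(b_value(mags, mc=2.0), 2))
```

With 20,000 synthetic events the estimate recovers b close to 1, as expected from the generating distribution.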

  12. Rapid decision tool to predict earthquake destruction in Sumatra by using first motion study

    Science.gov (United States)

    Bhakta, Shardul Sanjay

    The main idea of this project is to build an interactive and smart geographic information system tool that can help predict the intensity of real-time earthquakes on the island of Sumatra, Indonesia. The tool has an underlying intelligence to predict the intensity of an earthquake based on an analysis of similar earthquakes in the past in that specific region. Whenever an earthquake takes place in Sumatra, a first motion study is conducted; this determines its type, depth, latitude, and longitude. When the user inputs this information, the tool tries to find past earthquakes with a similar first motion study and depth, surveys them, and predicts whether the real-time earthquake can be disastrous or not. The tool has been developed in Java. I used MOJO (Map Objects Java Objects), a set of Java APIs created by ESRI, to show the map of Indonesia and earthquake locations in the form of points; the Indonesia map, the earthquake location points, and their correlation were all designed using MOJO, a powerful tool that made the design straightforward. The tool is easy to use, and the user has to input only a few parameters for the end result. I hope this tool justifies its use in the prediction of earthquakes and helps save lives in Sumatra.

  13. WHY WE CANNOT PREDICT STRONG EARTHQUAKES IN THE EARTH’S CRUST

    Directory of Open Access Journals (Sweden)

    Iosif L. Gufeld

    2011-01-01

    needed to address the issues raised in this publication, including problems and possibilities of prediction of earthquakes in the crust. Incontrovertible achievements of the Earth sciences are reviewed, considering specific features of seismic events and variations of various parameters of the lithosphere, the block structure of the lithosphere, and processes in the lithosphere. Much attention is given to analyses of the driving forces of the seismotectonic process. Studies of variations of parameters of the medium, including rapid (hourly or daily) changes, show that processes that predetermine the state of stresses or the energy capacity of the medium in the lithosphere (Figures 2 and 3) are overlooked. Analyses are based on processes of interaction between ascending flows of hydrogen and helium and the solid lithosphere. A consequence of such processes is gas porosity, which controls many parameters of the medium and the oscillation regime of the three-dimensional state of stresses of the block structures (Figures 6, 7, and 12), which impacts the dynamics of block movements. The endogenous activity of the lithosphere and its instability are controlled by degassing of light gases. The paper reviews processes of preparation for strong earthquakes in the crust with regard to the block structure of platform areas and subduction zones (Figures 13 and 14). It is demonstrated that the conventional methods yield ambiguous assessments of seismic hazard both in terms of time and locations of epicenter zones, and focal areas of subduction zones are out of control in principle. Processes that actually take place in the lithosphere are the causes of such ambiguity, i.e. the lack of any deterministic relations in the development of critical seismotectonic situations. Methods for identification of the geological medium characterized by continuously variable parameters are considered. Directions of fundamental studies of the seismic process and principles of seismic activity monitoring are

  15. The long-term geologic hazards in areas struck by large-magnitude earthquakes

    NARCIS (Netherlands)

    Wasowski, Janusz; Jibson, Randell W.; Huang, Runqiu; van Asch, Theo

    2014-01-01

    Large-magnitude earthquakes occur every year, but most hit remote and uninhabited regions and thus go unnoticed. Although populated areas are affected infrequently by large earthquakes, each time the outcomes are devastating in terms of life and property loss. The human and economic costs of natural

  16. Uncertainty, variability, and earthquake physics in ground‐motion prediction equations

    Science.gov (United States)

    Baltay, Annemarie S.; Hanks, Thomas C.; Abrahamson, Norm A.

    2017-01-01

    Residuals between ground‐motion data and ground‐motion prediction equations (GMPEs) can be decomposed into terms representing earthquake source, path, and site effects. These terms can be cast in terms of repeatable (epistemic) residuals and the random (aleatory) components. Identifying the repeatable residuals leads to a GMPE with reduced uncertainty for a specific source, site, or path location, which in turn can yield a lower hazard level at small probabilities of exceedance. We illustrate a schematic framework for this residual partitioning with a dataset from the ANZA network, which straddles the central San Jacinto fault in southern California. The dataset consists of more than 3200 1.15≤M≤3 earthquakes and their peak ground accelerations (PGAs), recorded at close distances (R≤20  km). We construct a small‐magnitude GMPE for these PGA data, incorporating VS30 site conditions and geometrical spreading. Identification and removal of the repeatable source, path, and site terms yield an overall reduction in the standard deviation from 0.97 (in ln units) to 0.44, for a nonergodic assumption, that is, for a single‐source location, single site, and single path. We give examples of relationships between independent seismological observables and the repeatable terms. We find a correlation between location‐based source terms and stress drops in the San Jacinto fault zone region; an explanation of the site term as a function of kappa, the near‐site attenuation parameter; and a suggestion that the path component can be related directly to elastic structure. These correlations allow the repeatable source location, site, and path terms to be determined a priori using independent geophysical relationships. Those terms could be incorporated into location‐specific GMPEs for more accurate and precise ground‐motion prediction.
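The residual partitioning described above can be illustrated with a toy calculation. The sketch below is an assumption-laden stand-in for the mixed-effects regressions actually used with GMPE residuals: it removes repeatable event and site terms from synthetic residuals (path terms are omitted for brevity) and shows the drop in the remaining standard deviation, analogous in spirit to the reported 0.97 to 0.44 reduction:

```python
import random
import statistics
from collections import defaultdict

def partition_residuals(records):
    """Split residuals (event_id, site_id, residual) into repeatable
    event terms, repeatable site terms, and a leftover component.
    A sequential-means stand-in for a proper mixed-effects fit."""
    by_event = defaultdict(list)
    for e, s, r in records:
        by_event[e].append(r)
    event_term = {e: statistics.fmean(v) for e, v in by_event.items()}

    by_site = defaultdict(list)
    for e, s, r in records:
        by_site[s].append(r - event_term[e])
    site_term = {s: statistics.fmean(v) for s, v in by_site.items()}

    leftover = [r - event_term[e] - site_term[s] for e, s, r in records]
    return event_term, site_term, leftover

# Synthetic ln-residuals: sigma = 0.5 (event), 0.4 (site), 0.3 (within)
random.seed(1)
events = {i: random.gauss(0.0, 0.5) for i in range(200)}
sites = {j: random.gauss(0.0, 0.4) for j in range(50)}
recs = [(i, j, events[i] + sites[j] + random.gauss(0.0, 0.3))
        for i in events for j in sites]

event_term, site_term, leftover = partition_residuals(recs)
total = [r for _, _, r in recs]
print(round(statistics.pstdev(total), 2), round(statistics.pstdev(leftover), 2))
```

The leftover standard deviation collapses to roughly the within-event component (0.3 here), which is the nonergodic gain the abstract describes.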

  17. Earthquakes

    Science.gov (United States)

    ... earthquake occurs in a populated area, it may cause property damage, injuries, and even deaths. If you live in a coastal area, there is the possibility of a tsunami. Damage from earthquakes can also lead to floods or fires. Although there are no guarantees of ...

  18. Characteristics of 2009 L'Aquila earthquake with an emphasis on earthquake prediction and geotechnical damage

    OpenAIRE

    Aydan, Ömer; Kumsar, Halil; Toprak, Selçuk; Barla, Giovanni

    2010-01-01

    The 2009 L'Aquila earthquake, with moment magnitude Mw 6.3, occurred in the Abruzzi Region of Central Italy. The earthquake caused the loss of 294 lives, and the casualties were particularly heavy in the old city of L'Aquila. The authors promptly investigated the damage due to this earthquake, which was caused by a normal fault, with the heavily damaged part being on the hanging-wall side of the causative fault. Approximately 2 weeks before the earthquake, a technician involved with radon mo...

  19. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    Science.gov (United States)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation- vs. quiescent-type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections: some fault sections exhibit activation-type triggering, while others are better characterized by quiescent-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.
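Alert-based metrics of the kind cited above are commonly summarized in a Molchan error diagram: the fraction of time spent on alert against the fraction of target events missed. A minimal, illustrative sketch (not the VQ implementation):

```python
def molchan_point(alert_windows, event_times, t_total):
    """One point on a Molchan error diagram: tau is the fraction of the
    study period spent on alert, nu the fraction of target events missed.
    A skill-free random forecast satisfies nu = 1 - tau on average."""
    tau = sum(b - a for a, b in alert_windows) / t_total
    hits = sum(any(a <= t < b for a, b in alert_windows) for t in event_times)
    nu = 1.0 - hits / len(event_times)
    return tau, nu

windows = [(10.0, 20.0), (40.0, 45.0)]   # alert intervals (days)
events = [12.0, 42.0, 70.0]              # target event times (days)
tau, nu = molchan_point(windows, events, t_total=100.0)
print(tau, round(nu, 2))  # → 0.15 0.33
```

Here the forecast catches two of three events while keeping alerts on only 15% of the time, i.e. well below the nu = 1 - tau line of a random forecast.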

  20. Predictive factors of depression symptoms among adolescents in the 18-month follow-up after Wenchuan earthquake in China.

    Science.gov (United States)

    Chui, Cheryl H K; Ran, Mao-Sheng; Li, Rong-Hui; Fan, Mei; Zhang, Zhen; Li, Yuan-Hao; Ou, Guo Jing; Jiang, Zhe; Tong, Yu-Zhen; Fang, Ding-Zhi

    2017-02-01

    Little is known about the change in, and risk factors for, depression among adolescent survivors after an earthquake. This study aimed to explore the change of depression and identify the predictive factors of depression among adolescent survivors after the 2008 Wenchuan earthquake in China. Depression among high school students was investigated at 6, 12, and 18 months after the Wenchuan earthquake, using the Beck Depression Inventory (BDI) to assess severity. Subjects included 548 student survivors in an affected high school. The rates of depression among the adolescent survivors at 6, 12, and 18 months after the earthquake were 27.3%, 42.9%, and 33.3%, respectively, for males, and 42.9%, 61.9%, and 53.4%, respectively, for females. Depression symptoms, trauma-related self-injury, suicidal ideation, and PTSD symptoms at the 6-month follow-up were significant predictive factors for depression at 18 months after the earthquake. This study highlights the need to consider disaster-related psychological sequelae and risk factors for depression symptoms in the planning and implementation of mental health services. Long-term mental and psychological support for victims of natural disasters is imperative.

  1. Earthquakes

    Science.gov (United States)

    Shedlock, Kaye M.; Pakiser, Louis Charles

    1998-01-01

    One of the most frightening and destructive phenomena of nature is a severe earthquake and its terrible aftereffects. An earthquake is a sudden movement of the Earth, caused by the abrupt release of strain that has accumulated over a long time. For hundreds of millions of years, the forces of plate tectonics have shaped the Earth as the huge plates that form the Earth's surface slowly move over, under, and past each other. Sometimes the movement is gradual. At other times, the plates are locked together, unable to release the accumulating energy. When the accumulated energy grows strong enough, the plates break free. If the earthquake occurs in a populated area, it may cause many deaths and injuries and extensive property damage. Today we are challenging the assumption that earthquakes must present an uncontrollable and unpredictable hazard to life and property. Scientists have begun to estimate the locations and likelihoods of future damaging earthquakes. Sites of greatest hazard are being identified, and definite progress is being made in designing structures that will withstand the effects of earthquakes.

  2. An application of earthquake prediction algorithm M8 in eastern ...

    Indian Academy of Sciences (India)

    On 23rd October 2011, an M7.3 earthquake near the Turkish city of Van, killed more than 600 people, injured over 4000, and left about 60,000 homeless. It demolished hundreds of buildings and caused great damages to thousand others in Van, Ercis, Muradiye, and Çaldıran. The earthquake's epicenter is located about 70 ...

  4. On Possibility To Using Deep-wells Geo-observatories For The Earthquake Prediction

    Science.gov (United States)

    Esipko, O. A.; Rosaev, A. E.

    The problem of earthquake prediction is of significant interest, and both internal and external factors need to be taken into account. Some publications attempt to correlate the times of seismic events with tides and argue that earthquake prediction based on observations of geophysical fields is possible. According to our study of the earthquake catalogue, the significant earthquake closest in time before the Spitak event (07.12.1988) occurred in the Caucasus on 23.09.1988, accompanied by an Afghanistan earthquake on 25.09.1988; an earthquake in Tajikistan followed Spitak on 22.01.1989. All these events took place at approximately the same phase of the monthly tide. On the other side, measurements in geo-observatories based on deep wells show strong correlations between variations of some geophysical fields and cosmic factors. We studied variations of the thermal field in the Tyrnyaus deep well (North Caucasus) before and after the Spitak earthquake and detected changes of the thermal field that may be related to the catastrophic event. Comparison of the corresponding isotherms shows that the mean thermal gradient decreased markedly just before the earthquake. The development of monitoring of geothermal field variations, the understanding of their nature, and methods to take seasonal gravitational and electromagnetic variations into account when detecting seismic variations would bring us closer to solving the forecast problem. The main conclusions are: 1) tidal forces were an important factor in the generation of the catastrophic Spitak earthquake; 2) monitoring of geophysical field variations in well-based geo-observatories in seismically active regions may help us understand how physical parameters change before an earthquake, which provides a basis for developing an earthquake prediction method.

  5. Earthquake prediction rumors can help in building earthquake awareness: the case of May the 11th 2011 in Rome (Italy)

    Science.gov (United States)

    Amato, A.; Arcoraci, L.; Casarotti, E.; Cultrera, G.; Di Stefano, R.; Margheriti, L.; Nostro, C.; Selvaggi, G.; May-11 Team

    2012-04-01

    Banner headlines in an Italian newspaper on May 11, 2011, read: "Absence boom in offices: the urban legend in Rome becomes psychosis". This was the effect of a large-magnitude earthquake prediction in Rome for May 11, 2011. This prediction was never officially released, but it grew on the Internet and was amplified by the media. It was erroneously ascribed to Raffaele Bendandi, an Italian self-taught natural scientist who studied planetary motions and related them to earthquakes. Indeed, around May 11, 2011, there was a planetary alignment, and this increased the credibility of the earthquake prediction. Given the echo of this earthquake prediction, INGV decided to organize an Open Day at its headquarters in Rome on May 11 (the same day the earthquake was predicted to happen) to inform the public about Italian seismicity and earthquake physics. The Open Day was preceded by a press conference two days before, attended by about 40 journalists from newspapers, local and national TV stations, press agencies, and web news magazines. Hundreds of articles appeared in the following two days, advertising the May 11 Open Day. On May 11 the INGV headquarters was peacefully invaded by over 3,000 visitors from 9 am to 9 pm: families, students, civil protection groups, and many journalists. The program included conferences on a wide variety of subjects (from the social impact of rumors to seismic risk reduction) and distribution of books and brochures, in addition to several activities: meetings with INGV researchers to discuss scientific issues, visits to the seismic monitoring room (open 24/7 all year), and guided tours through interactive exhibitions on earthquakes and the Earth's deep structure. During the same day, thirteen new videos were also posted on our youtube/INGVterremoti channel to explain the earthquake process and hazard, and to provide periodic real-time updates on seismicity in Italy. On May 11 no large earthquake happened in Italy. The initiative, built up in a few weeks, had a very large feedback

  6. Stress Concentration Phenomenon Before the 2011 M9.0 Tohoku-Oki Earthquake: its Implication for Earthquake Prediction

    Science.gov (United States)

    Zhang, Y. Q.; Xie, F. R.

    2014-12-01

    seismicity pattern, may provide some valuable information on the stages of stress accumulation, and thus may be used for estimation of earthquake risk. Keywords: Source Region, Stress Change, 2011 Tohoku Earthquake, Earthquake Prediction.

  7. Building self-consistent, short-term earthquake probability (STEP models: improved strategies and calibration procedures

    Directory of Open Access Journals (Sweden)

    Damiano Monelli

    2010-11-01

    We present here two self-consistent implementations of a short-term earthquake probability (STEP) model that produces daily seismicity forecasts for the area of the Italian national seismic network. Both implementations combine a time-varying and a time-invariant contribution, for which we assume that the instrumental Italian earthquake catalog provides the best information. For the time-invariant contribution, the catalog is declustered using the clustering technique of the STEP model; the smoothed seismicity model is generated from the declustered catalog. The time-varying contribution is what distinguishes the two implementations: (1) for one implementation (STEP-LG), the original model parameterization and estimation are used; (2) for the other (STEP-NG), the mean abundance method is used to estimate aftershock productivity. In the STEP-NG implementation, earthquakes with magnitude up to ML = 6.2 are expected to be less productive compared to the STEP-LG implementation, whereas larger earthquakes are expected to be more productive. We have retrospectively tested the performance of these two implementations and applied likelihood tests to evaluate their consistency with observed earthquakes. Both implementations were consistent with the observed earthquake data in space; STEP-NG performed better than STEP-LG in terms of forecast rates. More generally, we found that testing earthquake forecasts issued at regular intervals does not test the full power of clustering models, and future experiments should allow for more frequent forecasts starting at the times of triggering events.
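STEP-type forecasts rest on an aftershock-rate law of the Reasenberg-Jones form, in which the expected number of aftershocks decays with time as a modified Omori power law. A hedged sketch with generic parameter values that are purely illustrative, not the STEP-LG or STEP-NG fits:

```python
import math

def rj_expected_count(t_start, t_end, mainshock_mag, m_min,
                      a=-1.67, b=0.91, p=1.08, c=0.05):
    """Expected number of aftershocks with magnitude >= m_min between
    t_start and t_end (days after the mainshock) under a
    Reasenberg-Jones rate 10**(a + b*(M - m)) * (t + c)**(-p).
    The default parameters are generic illustrative values."""
    productivity = 10.0 ** (a + b * (mainshock_mag - m_min))
    if abs(p - 1.0) < 1e-9:
        time_term = math.log((t_end + c) / (t_start + c))
    else:
        time_term = ((t_end + c) ** (1.0 - p)
                     - (t_start + c) ** (1.0 - p)) / (1.0 - p)
    return productivity * time_term

# One-day forecasts for the first two days after an M 6.3 mainshock
day1 = rj_expected_count(0.0, 1.0, mainshock_mag=6.3, m_min=3.0)
day2 = rj_expected_count(1.0, 2.0, mainshock_mag=6.3, m_min=3.0)
print(round(day1, 1), round(day2, 1))
```

Issuing such a count each day, as the forecast window slides forward, is the essence of a daily STEP-style product; the count falls steeply from the first to the second day.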

  8. Predicting earthquakes by analyzing accelerating precursory seismic activity

    Science.gov (United States)

    Varnes, D.J.

    1989-01-01

    During 11 sequences of earthquakes that in retrospect can be classed as foreshocks, the accelerating rate at which seismic moment is released follows, at least in part, a simple equation. This equation (1) is dΣ/dt = C/(t_f − t)^n, where Σ is the cumulative sum, until time t, of the square roots of the seismic moments of individual foreshocks computed from reported magnitudes; C and n are constants; and t_f is a limiting time at which the rate of seismic moment accumulation becomes infinite. The possible time of a major foreshock or main shock, t_f, is found by the best fit of equation (1), or its integral, to step-like plots of Σ versus time, using successive estimates of t_f in linearized regressions until the maximum coefficient of determination, r², is obtained. Analyzed examples include sequences preceding earthquakes at Cremasta, Greece, 2/5/66; Haicheng, China, 2/4/75; Oaxaca, Mexico, 11/29/78; Petatlan, Mexico, 3/14/79; and Central Chile, 3/3/85. In 29 estimates of main-shock time, made as the sequences developed, the errors in 20 were less than one-half, and in 9 less than one-tenth, the time remaining between the time of the last data used and the main shock. Some precursory sequences, or parts of them, yield no solution. Two sequences appear to include in their first parts the aftershocks of a previous event; plots using the integral of equation (1) show that the sequences are easily separable into aftershock and foreshock segments. Synthetic seismic sequences of shocks at equal time intervals were constructed to follow equation (1), using four values of n. In each series the resulting distributions of magnitudes closely follow the linear Gutenberg-Richter relation log N = a − bM, and the product of n times b for each series is the same constant. In various forms and for decades, equation (1) has been used successfully to predict failure times of stressed metals and ceramics, landslides in soil and rock slopes, and volcanic
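The fitting procedure the abstract describes, trying successive estimates of the failure time in linearized regressions until r² is maximized, can be sketched as follows. This is a simplified illustration on noiseless synthetic rates, not Varnes's original code:

```python
import math

def fit_failure_time(times, rates, tf_grid):
    """For each candidate failure time t_f, regress log(rate) on
    log(t_f - t); keep the candidate with the largest coefficient of
    determination r^2. Returns (t_f, n, r2) for the best fit."""
    best = (None, None, -1.0)
    for tf in tf_grid:
        if tf <= max(times):
            continue  # t_f must lie beyond the observed data
        x = [math.log(tf - t) for t in times]
        y = [math.log(r) for r in rates]
        k = len(x)
        mx, my = sum(x) / k, sum(y) / k
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        syy = sum((yi - my) ** 2 for yi in y)
        r2 = sxy * sxy / (sxx * syy)
        if r2 > best[2]:
            best = (tf, -sxy / sxx, r2)  # slope of the fit is -n
    return best

# Noiseless synthetic rates obeying rate = C / (t_f - t)^n with t_f = 100
true_tf, n_true, C = 100.0, 1.5, 50.0
times = [float(t) for t in range(0, 95, 5)]
rates = [C / (true_tf - t) ** n_true for t in times]
tf_hat, n_hat, r2 = fit_failure_time(times, rates,
                                     [90.0 + 0.5 * k for k in range(41)])
print(tf_hat, round(n_hat, 2))  # → 100.0 1.5
```

On noiseless data the grid search recovers t_f and n exactly; with real foreshock catalogs the r² surface is far flatter, which is why the abstract reports a wide spread of prediction errors.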

  9. Exaggerated Claims About Success Rate of Earthquake Predictions: "Amazing Success" or "Remarkably Unremarkable"?

    Science.gov (United States)

    Kafka, A. L.; Ebel, J. E.

    2005-12-01

    On October 1, 2004, NASA announced on its web site, "Earthquake Forecast Program Has Amazing Success Rate." This announcement claimed that the Rundle-Tiampo earthquake forecast method has accurately predicted the locations of 15 of California's 16 largest earthquakes this decade. Since words like "amazing" carry a lot of meaning to consumers of scientific information, claims of "amazing success" should be limited only to cases where the success is truly amazing. We evaluated the statistical likelihood of the reported success rate of the Rundle-Tiampo prediction method by applying a cellular seismology approach to investigate whether proximity to past earthquakes is a sufficient hypothesis to yield the same level of success as the Rundle-Tiampo method. To delineate where to expect future earthquakes, we used the epicenters of the ANSS earthquake catalog for California from 1932 through 1999 with magnitude ≥ 4.0 ("before" earthquakes). We then tested how many of the 15 events that are shown on the NASA web page ("after" earthquakes) occurred near the "before" earthquake epicenters. We found that with only a 4 km radius around each "before" earthquake epicenter, we successfully forecast the locations of 13/15 (87%) of the "after" earthquakes, and with a 7 km radius we successfully forecast 14/15 (93%) of the earthquakes. The zones created by filling in a 7 km radius around the "before" epicenters cover 18% of the study area. The scorecard maps on the JPL "QuakeSim" web site show an 11 km margin of error for the epicenters of the forecast earthquakes. With an 11 km radius around the past epicenters (covering 31% of the map area), we catch 14/15 of the "after" earthquakes. We conclude that the success rate referred to in the NASA announcement is perhaps better characterized as "remarkably unremarkable", rather than "amazing." The 14/15 success rate for the earthquakes listed on the NASA scorecard is not a rigorous test of the Rundle-Tiampo method, since it appears that
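The cellular seismology baseline described above amounts to counting how many "after" epicenters fall within a given radius of any "before" epicenter. A minimal sketch with made-up coordinates; it uses planar distances for brevity, whereas a real test would use great-circle distances and also report the covered-area fraction:

```python
import math

def hit_fraction(before, after, radius_km):
    """Fraction of 'after' epicenters lying within radius_km of at least
    one 'before' epicenter (planar approximation)."""
    hits = sum(
        any(math.dist(a, b) <= radius_km for b in before) for a in after
    )
    return hits / len(after)

before = [(0.0, 0.0), (30.0, 40.0), (100.0, 0.0)]  # past epicenters (km)
after = [(1.0, 1.0), (31.0, 39.0), (60.0, 60.0)]   # later epicenters (km)
print(hit_fraction(before, after, radius_km=7.0))
```

The point of the study is that such a hit fraction must always be compared with the fraction of the map area the alert circles cover, otherwise a high score is uninformative.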

  10. First results from Japanese network for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Okado, Y.

    1984-12-06

    A very extensive high-quality digital network for micro-earthquake and ground-tilt observation has recently been completed in the south Kanto-Tokai area surrounding Tokyo. The first scientific results are now appearing and allow, among other things, a remarkable three-dimensional picture of the earthquake hypocenter distribution to be produced. The distribution maps out the boundaries of the three plates that meet under this region of Japan. The network is described and results on the distribution of earthquakes and cross-sections of hypocenters from July, 1979 to June, 1984 are given. 5 references, 3 figures.

  11. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    Science.gov (United States)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against an evacuation process recorded in a school classroom during the real 2013 Ya'an earthquake in China. The occupant casualties in the building under earthquakes are evaluated by coupling the simulation of the building collapse process by the finite element method, the occupant evacuation simulation, and the casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.
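The refined cellular automata model itself is not specified in the abstract. To give a flavor of the approach, here is a deliberately minimal evacuation automaton; the single exit, unit-speed occupants, and absence of walls, debris, or collapse coupling are all simplifying assumptions of this sketch:

```python
def evacuate(occupants, exit_cell, max_steps=1000):
    """Minimal cellular-automaton evacuation: each tick, every occupant
    takes one (possibly diagonal) step toward the single exit if the
    target cell is free, and leaves the room on reaching the exit.
    Occupants nearest the exit (Chebyshev distance) move first, so a
    blocked occupant waits one tick. Returns the tick at which the room
    empties, else None."""
    def cheb(cell):
        return max(abs(cell[0] - exit_cell[0]), abs(cell[1] - exit_cell[1]))

    occ = set(occupants)
    for step in range(1, max_steps + 1):
        new_occ = set()
        for x, y in sorted(occ, key=cheb):
            dx = (exit_cell[0] > x) - (exit_cell[0] < x)
            dy = (exit_cell[1] > y) - (exit_cell[1] < y)
            target = (x + dx, y + dy)
            if target == exit_cell:
                continue  # occupant reaches the door and leaves
            new_occ.add(target if target not in new_occ else (x, y))
        occ = new_occ
        if not occ:
            return step
    return None

print(evacuate([(3, 3), (5, 1), (9, 9)], exit_cell=(0, 0)))  # → 9
```

In a casualty model of the kind described, each tick of such an automaton would be synchronized with the structural collapse simulation, and occupants still inside failing regions would be scored against the casualty criteria.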

  12. 77 FR 53225 - National Earthquake Prediction Evaluation Council (NEPEC)

    Science.gov (United States)

    2012-08-31

    ... project intended to deliver an updated Uniform California Earthquake Rupture Forecast (UCERF3); and on the... School of Mines, 1711 Illinois Avenue, in Golden, Colorado 80401. The meeting will commence in the early...

  13. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    Science.gov (United States)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    The epidemic-type aftershock sequence (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake, the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible, and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasts, the application of a newly developed simulation algorithm to the Calabria region provided typical features of the seismicity in time, space, and magnitude, which can be compared with those of real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
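The ETAS model underlying these forecasts writes the conditional earthquake rate as a constant background term plus a magnitude-scaled Omori-Utsu contribution from every past event. A sketch with illustrative, unfitted parameter values:

```python
import math

def etas_intensity(t, history, mu=0.2, K=0.02, alpha=1.2,
                   c=0.01, p=1.1, m0=3.0):
    """ETAS conditional intensity at time t (days): a background rate mu
    plus one magnitude-scaled Omori-Utsu term per past event above the
    reference magnitude m0. All parameter values are illustrative."""
    rate = mu
    for t_i, m_i in history:
        if t_i < t:
            rate += (K * math.exp(alpha * (m_i - m0))
                     * (t - t_i + c) ** (-p))
    return rate

# Two past events: an M 5.5 at t = 0 and an M 4.2 aftershock at t = 1
history = [(0.0, 5.5), (1.0, 4.2)]
print(round(etas_intensity(2.0, history), 3))
```

Integrating this intensity over the next 24 hours (and over space, in the full space-time model) yields the daily expected event counts that an OEF system delivers.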

  14. [Medium- and long-term health effects of the L'Aquila earthquake (Central Italy, 2009) and of other earthquakes in high-income Countries: a systematic review].

    Science.gov (United States)

    Ripoll Gallardo, Alba; Alesina, Marta; Pacelli, Barbara; Serrone, Dario; Iacutone, Giovanni; Faggiano, Fabrizio; Della Corte, Francesco; Allara, Elias

    2016-01-01

    To compare the methodological characteristics of studies investigating the medium- and long-term health effects of the L'Aquila earthquake with the features of studies conducted after other earthquakes in high-income countries: a systematic comparison between the studies which evaluated the health effects of the L'Aquila earthquake (Central Italy, 6th April 2009) and those conducted after other earthquakes in comparable settings. Medline, Scopus, and 6 sources of grey literature were systematically searched. Inclusion criteria comprised measurement of health outcomes at least one month after the earthquake, investigation of earthquakes in high-income countries, and presence of at least one temporal or geographical control group. Out of 2,976 titles, 13 studies regarding the L'Aquila earthquake and 51 studies concerning other earthquakes were included. The L'Aquila and the Kobe/Hanshin-Awaji (Japan, 17th January 1995) earthquakes were the most investigated. Studies on the L'Aquila earthquake had a median sample size of 1,240 subjects, a median duration of 24 months, and most frequently used a cross-sectional design (7/13). Studies on other earthquakes had a median sample size of 320 subjects, a median duration of 15 months, and most frequently used a time-series design (19/51). The L'Aquila studies often focussed on mental health, while the earthquake effects on mortality, cardiovascular outcomes, and health systems were less frequently evaluated. A more intensive use of routine data could benefit future epidemiological surveillance in the aftermath of earthquakes.

  15. Quantitative prediction of strong motion for a potential earthquake fault

    Directory of Open Access Journals (Sweden)

    Shamita Das

    2010-02-01

    This paper describes a new method for calculating strong motion records for a given seismic region on the basis of the laws of physics, using information on the tectonics and physical properties of the earthquake fault. Our method is based on an earthquake model, called a «barrier model», which is characterized by five source parameters: fault length, width, maximum slip, rupture velocity, and barrier interval. The first three parameters may be constrained from plate tectonics, and the fourth parameter is roughly a constant. The most important parameter controlling the earthquake strong motion is the last one, the «barrier interval». There are three methods to estimate the barrier interval for a given seismic region: 1) surface measurement of slip across fault breaks, 2) model fitting with observed near- and far-field seismograms, and 3) scaling law data for small earthquakes in the region. The barrier intervals were estimated for a dozen earthquakes and four seismic regions by the above three methods. Our preliminary results for California suggest that the barrier interval may be determined if the maximum slip is given. The relation between the barrier interval and maximum slip varies from one seismic region to another. For example, the interval appears to be unusually long for Kilauea, Hawaii, which may explain why only scattered evidence of strong ground shaking was observed in the epicentral area of the Island of Hawaii earthquake of November 29, 1975. The stress drop associated with an individual fault segment, estimated from the barrier interval and maximum slip, lies between 100 and 1000 bars. These values are about one order of magnitude greater than those estimated earlier by the use of crack models without barriers. Thus, the barrier model can resolve, at least partially, the well-known discrepancy between the stress drops measured in the laboratory and those estimated for earthquakes.
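    The segment stress-drop estimate mentioned above can be illustrated with a minimal crack-type scaling sketch. The order-unity geometry constant, the rigidity value, and the example numbers are my assumptions, not the paper's actual relation:

```python
def stress_drop_bars(max_slip_m, barrier_interval_m, mu_pa=3.0e10, c=1.0):
    """Illustrative crack-type scaling: stress drop ~ c * mu * slip / length.

    Returns the stress drop in bars (1 bar = 1e5 Pa). The constant c is
    of order unity and depends on the assumed crack geometry; mu_pa is a
    typical crustal rigidity.
    """
    return c * mu_pa * max_slip_m / barrier_interval_m / 1.0e5

# 1 m of maximum slip over a 3 km barrier interval:
drop = stress_drop_bars(1.0, 3000.0)
```

    With these example numbers the estimate falls at the lower end of the 100-1000 bar range quoted in the abstract.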

  16. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    Science.gov (United States)

    Last, Mark; Rabinowitz, Nitzan; Leonard, Gideon

    2016-01-01

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.
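    The kind of moving-average features and median-exceedance labels described above can be sketched as follows; the window length, example numbers, and function name are illustrative assumptions, not the paper's actual indicators:

```python
import statistics

def yearly_features(counts, max_mags, window=3):
    """Illustrative features/labels in the spirit of the study.

    feature = moving average of yearly event counts over `window` years,
    label   = 1 if next year's maximum magnitude exceeds the region's
              median of yearly maxima, else 0.
    counts and max_mags are aligned per-year lists for one region.
    """
    med = statistics.median(max_mags)
    rows = []
    for i in range(window - 1, len(counts) - 1):
        feat = sum(counts[i - window + 1:i + 1]) / window
        label = 1 if max_mags[i + 1] > med else 0
        rows.append((feat, label))
    return rows

# Six years of toy data for one region:
rows = yearly_features([10, 12, 8, 20, 30, 5], [3.1, 3.5, 2.9, 4.2, 4.8, 3.0])
```

    Rows like these would then be split chronologically into training and test years, as the paper does with its 2006-2010 hold-out.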

  17. Predicting the Maximum Earthquake Magnitude from Seismic Data in Israel and Its Neighboring Countries.

    Directory of Open Access Journals (Sweden)

    Mark Last

    This paper explores several data mining and time series analysis methods for predicting the magnitude of the largest seismic event in the next year based on the previously recorded seismic events in the same region. The methods are evaluated on a catalog of 9,042 earthquake events, which took place between 01/01/1983 and 31/12/2010 in the area of Israel and its neighboring countries. The data was obtained from the Geophysical Institute of Israel. Each earthquake record in the catalog is associated with one of 33 seismic regions. The data was cleaned by removing foreshocks and aftershocks. In our study, we have focused on the ten most active regions, which account for more than 80% of the total number of earthquakes in the area. The goal is to predict whether the maximum earthquake magnitude in the following year will exceed the median of maximum yearly magnitudes in the same region. Since the analyzed catalog includes only 28 years of complete data, the last five annual records of each region (referring to the years 2006-2010) are kept for testing while using the previous annual records for training. The predictive features are based on the Gutenberg-Richter Ratio as well as on some new seismic indicators based on the moving averages of the number of earthquakes in each area. The new predictive features prove to be much more useful than the indicators traditionally used in the earthquake prediction literature. The most accurate result (AUC = 0.698) is reached by the Multi-Objective Info-Fuzzy Network (M-IFN) algorithm, which takes into account the association between two target variables: the number of earthquakes and the maximum earthquake magnitude during the same year.

  18. Update of the Graizer-Kalkan ground-motion prediction equations for shallow crustal continental earthquakes

    Science.gov (United States)

    Graizer, Vladimir; Kalkan, Erol

    2015-01-01

    A ground-motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration and 5-percent damped pseudo spectral acceleration response ordinates of maximum horizontal component of randomly oriented ground motions was developed by Graizer and Kalkan (2007, 2009) to be used for seismic hazard analyses and engineering applications. This GMPE was derived from the greatly expanded Next Generation of Attenuation (NGA)-West1 database. In this study, Graizer and Kalkan’s GMPE is revised to include (1) an anelastic attenuation term as a function of quality factor (Q0) in order to capture regional differences in large-distance attenuation and (2) a new frequency-dependent sedimentary-basin scaling term as a function of depth to the 1.5-km/s shear-wave velocity isosurface to improve ground-motion predictions for sites on deep sedimentary basins. The new model (GK15), developed to be simple, is applicable to the western United States and other regions with shallow continental crust in active tectonic environments and may be used for earthquakes with moment magnitudes 5.0–8.0, distances 0–250 km, average shear-wave velocities 200–1,300 m/s, and spectral periods 0.01–5 s. Directivity effects are not explicitly modeled but are included through the variability of the data. Our aleatory variability model captures inter-event variability, which decreases with magnitude and increases with distance. The mixed-effects residuals analysis shows that the GK15 reveals no trend with respect to the independent parameters. The GK15 is a significant improvement over Graizer and Kalkan (2007, 2009), and provides a demonstrable, reliable description of ground-motion amplitudes recorded from shallow crustal earthquakes in active tectonic regions over a wide range of magnitudes, distances, and site conditions.

  19. Predictability study on the aftershock sequence following the 2011 Tohoku-Oki, Japan, earthquake: first results

    Science.gov (United States)

    Nanjo, K. Z.; Tsuruoka, H.; Yokoi, S.; Ogata, Y.; Falcone, G.; Hirata, N.; Ishigaki, Y.; Jordan, T. H.; Kasahara, K.; Obara, K.; Schorlemmer, D.; Shiomi, K.; Zhuang, J.

    2012-08-01

    Although no deterministic and reliable earthquake precursor is known to date, we are steadily gaining insight into probabilistic forecasting that draws on space-time characteristics of earthquake clustering. Clustering-based models aiming to forecast earthquakes within the next 24 hours are under test in the global project 'Collaboratory for the Study of Earthquake Predictability' (CSEP). The 2011 March 11 magnitude 9.0 Tohoku-Oki earthquake in Japan provides a unique opportunity to test the existing 1-day CSEP models against its unprecedentedly active aftershock sequence. The original CSEP experiment performs tests after the catalogue is finalized to avoid bias due to poor data quality. However, this study differs from this tradition and uses the preliminary catalogue revised and updated by the Japan Meteorological Agency (JMA), which is often incomplete but is immediately available. This study is intended as a first step towards operability-oriented earthquake forecasting in Japan. Encouragingly, at least one model passed the test in most combinations of the target day and the testing method, although the models could not take account of the megaquake in advance and the catalogue used for forecast generation was incomplete. However, it can also be seen that all models have only limited forecasting power for the period immediately after the quake. Our conclusion does not change when the preliminary JMA catalogue is replaced by the finalized one, implying that the models perform stably over the catalogue replacement and are applicable to operational earthquake forecasting. However, we emphasize the need for further research on model improvement to assure the reliability of forecasts for the days immediately after the main quake. Seismicity is expected to remain high in all parts of Japan over the coming years. Our results present a way to answer the urgent need to promote research on time-dependent earthquake predictability to prepare for subsequent large

  20. On the possibility for earthquake prediction NETWORK in Balkan- Black Sea region

    Science.gov (United States)

    Mavrodiev, S. Cht

    The impressive development of the Earth sciences, based on new precise measurements of crustal parameters, permits estimates of the probabilities of earthquake risk. But predicting the time, epicenter and magnitude of an incoming earthquake is not a solved problem, and many scientists state that it is not solvable. The local "when" earthquake prediction is based on the connection between geomagnetic "quakes" and the next incoming minimum or maximum of the tidal gravitational potential. The probability time window for the predicted earthquake is +/-1 day for the minimum and +/-2 days for the maximum. A preliminary statistical estimate, based on the distribution of the time difference between predicted and occurred earthquakes for the period December 2002-September 2003 for the Sofia region, is given. Solving the earthquake prediction problem and creating its theory will require the efforts of a broad interdisciplinary science group, the use of near-real-time satellite GIS for data acquisition, visualization, archiving and analysis, and new techniques for solving nonlinear inverse problems step by step and for testing the adequacy of physical models and the reliability of predictions. The monitoring should include standard geodetic data, seismic hazard map development, and electromagnetic field monitoring under (electrical signals, as in the VAN method and its Thanassoulas variant), on (electropotential distribution, geomagnetic variations) and over (VLF and ULF, vertical electropotential distribution) the Earth's surface, as well as atmospheric effects (earthquake clouds, electrical charge distribution), the behavior of the Earth's radiation belts, and biological precursors. Statistical estimation of the reliability of time, epicenter and magnitude predictions is obligatory. The Balkan-Black Sea region is proposed as a polygon for testing the possibility of creating a short-term earthquake prediction NETWORK. The advantage of the proposal is that the geophysical seismic, geomagnetic

  1. Applications of the gambling score in evaluating earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2010-05-01

    This study presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some of his reputation points. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. This method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply this method to evaluate the performance of Shebalin's predictions made using the Reverse Tracing of Precursors (RTP) algorithm and of the predictions from the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare the probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
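    For the discrete case, the betting rule described above can be sketched in a few lines. The notation is my own, and the paper's continuous (point-process) extension is not covered:

```python
def gambling_score(bets, outcomes, ref_probs):
    """Illustrative gambling score for binary earthquake alarms.

    bets:      reputation points wagered on each forecast
    outcomes:  1 if the target earthquake occurred, else 0
    ref_probs: reference-model probability of occurrence for each forecast

    Fair rule: a success pays r * (1 - p0) / p0, a failure costs r, so the
    expected gain is zero if earthquakes follow the reference model.
    """
    total = 0.0
    for r, hit, p0 in zip(bets, outcomes, ref_probs):
        total += r * (1.0 - p0) / p0 if hit else -r
    return total

# One success against long odds outweighs several misses:
score = gambling_score([1, 1, 1], [1, 0, 0], [0.1, 0.1, 0.1])
```

    The payout ratio rewards forecasters more for predicting events that the reference model considers unlikely, which is what compensates for the risk taken.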

  2. Earthquakes which could have been predicted and were not

    Energy Technology Data Exchange (ETDEWEB)

    Petrushevskiy, B.A.

    1977-01-01

    An examination of the history of study of the seismicity and seismotectonics of the Kyzylkum indicates that the destructive earthquakes of April and May 1976 in the southwestern part of the desert, with epicenters not far from the Gazli settlement, should not be categorized as unexpected. In actuality, there were both seismic and seismogeological data on the possibility of the occurrence of local, quite strong earthquakes over almost the entire area of the Kyzylkum, including its southwestern part. The author feels that the principal factors hindering timely awareness of the peculiarities of seismicity in the Kyzylkum were, first, a serious neglect by the scientists concerned of the data published in the literature by their predecessors, and, second, the lack of teamwork between geologists and geophysicists, both in carrying out field work on the seismicity of the Kyzylkum and in processing the collected data.

  3. Long-Term Probabilistic Forecast for M ≥ 5.0 Earthquakes in Iran

    Science.gov (United States)

    Talebi, Mohammad; Zare, Mehdi; Peresan, Antonella; Ansari, Anooshiravan

    2017-04-01

    In this study, a long-term forecasting model is proposed to evaluate the probabilities of forthcoming M ≥ 5.0 earthquakes on a 0.2° grid for an area including the Iranian plateau. The model is built basically from smoothing the locations of preceding events, assuming a spatially heterogeneous and temporally homogeneous Poisson point process for seismicity. In order to calculate the expectations, the space distribution, from adaptively smoothed seismicity, has been scaled in time and magnitude by the average number of events over a 5-year forecasting horizon and a tapered magnitude distribution, respectively. The model has been adjusted and applied considering two earthquake datasets: a regional unified catalog (MB14) and a global catalog (ISC). Only the events with M ≥ 4.5 have been retained from the datasets, based on a preliminary completeness analysis. A set of experiments has been carried out, testing different options in the model application, and the average probability gains for target earthquakes have been estimated. By optimizing the model parameters, which increases the predictive power of the model, it is shown that a declustered catalog has an advantage over a non-declustered one, and a low magnitude threshold for the learning catalog can be preferred to a larger one. In order to examine the significance of the model results at the 95% confidence level, a set of retrospective tests, namely the L test, the N test, the R test, and the error diagram test, has been performed considering 13 target time windows. The error diagram test shows that the forecast results, obtained for both input catalogs, mostly fall outside the 5% critical region associated with a random guess. The L test and the N test could not reject the model for most of the time intervals (i.e., 85% and 62% of the time for the ISC and MB14 forecasts, respectively). Furthermore, after backwards extending the time span of the learning catalogs and repeating the L
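    Under the stated Poisson assumption, the per-cell probability of at least one target event follows directly from the scaled rate. A minimal sketch (variable names and the normalization convention are my assumptions, not the paper's code):

```python
import math

def cell_probability(smoothed_density, total_expected,
                     horizon_years=5.0, base_years=5.0):
    """Probability of at least one target event in a grid cell, assuming a
    Poisson process (illustrative of the forecast construction).

    smoothed_density: the cell's share of the smoothed-seismicity field
                      (all cells are assumed to sum to 1)
    total_expected:   average number of M >= 5.0 events per `base_years`
    """
    lam = smoothed_density * total_expected * (horizon_years / base_years)
    return 1.0 - math.exp(-lam)
```

    For example, a cell carrying 1% of the smoothed field with 20 expected events per 5-year window gets a per-window rate of 0.2 and hence a probability just above 18%.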

  4. Long-term perspectives on giant earthquakes and tsunamis at subduction zones

    Science.gov (United States)

    Satake, K.; Atwater, B.F.; ,

    2007-01-01

    Histories of earthquakes and tsunamis, inferred from geological evidence, aid in anticipating future catastrophes. This natural warning system now influences building codes and tsunami planning in the United States, Canada, and Japan, particularly where geology demonstrates the past occurrence of earthquakes and tsunamis larger than those known from written and instrumental records. Under favorable circumstances, paleoseismology can thus provide long-term advisories of unusually large tsunamis. The extraordinary Indian Ocean tsunami of 2004 resulted from a fault rupture more than 1000 km in length that included and dwarfed fault patches that had broken historically during lesser shocks. Such variation in rupture mode, known from written history at a few subduction zones, is also characteristic of earthquake histories inferred from geology on the Pacific Rim. Copyright © 2007 by Annual Reviews. All rights reserved.

  5. Short-term earthquake forecasting based on an epidemic clustering model

    Science.gov (United States)

    Console, Rodolfo; Murru, Maura; Falcone, Giuseppe

    2016-04-01

    The application of rigorous statistical tools, with the aim of verifying any prediction method, requires a univocal definition of the hypothesis, or the model, characterizing the concerned anomaly or precursor, so that it can be objectively recognized in any circumstance and by any observer. This is mandatory in order to move beyond the old-fashioned approach consisting only of retrospective anecdotal studies of past cases. A rigorous definition of an earthquake forecasting hypothesis should lead to the objective identification of particular sub-volumes (usually named alarm volumes) of the total time-space volume within which the probability of occurrence of strong earthquakes is higher than usual. The test of such a hypothesis needs the observation of a sufficient number of past cases upon which a statistical analysis is possible. This analysis should be aimed at determining the rate at which the precursor has been followed (success rate) or not followed (false alarm rate) by the target seismic event, or the rate at which a target event has been preceded (alarm rate) or not preceded (failure rate) by the precursor. The binary table obtained from this kind of analysis leads to the definition of the parameters of the model that achieve the maximum number of successes and the minimum number of false alarms for a specific class of precursors. The mathematical tools suitable for this purpose include the probability gain and the R-score, as well as popular plots such as the Molchan error diagram and the ROC diagram. Another tool for evaluating the validity of a forecasting method is the likelihood ratio (also named performance factor) of occurrence and non-occurrence of seismic events under different hypotheses. Whatever method is chosen for building up a new hypothesis, usually based on retrospective data, the final assessment of its validity should be carried out by a test on a new and independent set of observations.
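    The contingency-table rates described above can be computed directly from aligned alarm and event series. This is an illustrative sketch, not the authors' implementation:

```python
def alarm_statistics(alarms, events):
    """Contingency-table rates for a binary precursor test (illustrative).

    alarms, events: aligned 0/1 lists over time-space bins.
    Returns the success rate, false alarm rate, and miss (failure) rate
    used in Molchan error-diagram style evaluations.
    """
    hits = sum(1 for a, e in zip(alarms, events) if a and e)
    false_alarms = sum(1 for a, e in zip(alarms, events) if a and not e)
    misses = sum(1 for a, e in zip(alarms, events) if not a and e)
    n_alarms = hits + false_alarms
    n_events = hits + misses
    return {
        "success_rate": hits / n_alarms if n_alarms else 0.0,
        "false_alarm_rate": false_alarms / n_alarms if n_alarms else 0.0,
        "miss_rate": misses / n_events if n_events else 0.0,
    }

# Five bins: three alarms, three target events, two of them caught.
stats = alarm_statistics([1, 1, 0, 1, 0], [1, 0, 0, 1, 1])
```

    A Molchan diagram plots the miss rate against the fraction of time-space occupied by alarms as the model parameters are varied.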

  6. Ground Motion Prediction of Subduction Earthquakes using the Onshore-Offshore Ambient Seismic Field

    Science.gov (United States)

    Viens, L.; Miyake, H.; Koketsu, K.

    2014-12-01

    Seismic waves produced by earthquakes have caused extensive damage around the world and remain a real threat to human life. To reduce the seismic risk associated with future earthquakes, accurate ground motion predictions are required, especially for cities located atop sedimentary basins that can trap and amplify seismic waves. We focus this study on long-period ground motions produced by subduction earthquakes in Japan, which have the potential to damage large-scale structures such as high-rise buildings, bridges, and oil storage tanks. We extracted the impulse response functions from the ambient seismic field recorded by two stations, using one as a virtual source, without any preprocessing. This method allows recovery of reliable phases and relative, rather than absolute, amplitudes. To retrieve the corresponding Green's functions, the impulse response amplitudes need to be calibrated using observational records of an earthquake that happened close to the virtual source. We show that Green's functions can be extracted between offshore submarine cable-based sea-bottom seismographic observation systems deployed by JMA located atop subduction zones and on-land NIED/Hi-net stations. In contrast with physics-based simulations, this approach has the great advantage of predicting ground motions of moderate earthquakes (Mw ~5) at long periods in highly populated sedimentary basins without the need for any external information about the velocity structure.

  7. Long-term slip deficit and the forecasting of slip in future earthquakes

    Science.gov (United States)

    McCloskey, John; NicBhloscaidh, Mairead; Simao, Nuno

    2014-05-01

    In the last decade a series of devastating earthquakes have between them killed more than three-quarters of a million people. None of these events was formally forecast, and they have been repeatedly referred to as seismological 'surprises'. Here we argue that while earthquakes within the wide swath of diffuse deformation comprising the Alpine-Himalayan belt pose a particularly difficult set of challenges, earthquakes which are driven by high strain rates at plate boundaries and which have relatively short nominal recurrence times might be forecast if the data exist to perform long-term slip-deficit modelling and stress reconstruction. We show that two instrumentally recorded events on the Sumatran margin in 2007 and 2010 occurred in regions of high slip deficit identified by reconstruction of slip in the historical earthquakes of 1797 and 1833 under the Mentawai Islands, using more than 200 years of geodetic data recorded in the stratigraphy of coral micro-atolls growing there. In the presentation we will describe the data and a new Bayesian Monte Carlo slip reconstruction technique. The technique is based on the stochastic forward modelling of many slip distributions, each using the same set of elastic Green's functions to estimate, by superposition of contributions from each fault cell, the vertical displacement at the coral locations resulting from each simulated event. Every solution, weighted by its goodness of fit to the data, is added to a stack whose final values contain an estimate of the most likely distribution of slip in the historical earthquakes. Further, we estimate the Kullback-Leibler divergence over the fault area, providing a non-arbitrary assessment of the spatial distribution of information gain, identifying regions of low and high model confidence. We then model the long-term slip deficit on the megathrust assuming a zero of stress immediately after the 1652 Mentawai Islands earthquake.
We use the resulting slip deficit field to compute the entire

  8. Long-term changes in regular and low-frequency earthquake inter-event times near Parkfield, CA

    Science.gov (United States)

    Wu, C.; Shelly, D. R.; Johnson, P. A.; Gomberg, J. S.; Peng, Z.

    2012-12-01

    The temporal evolution of earthquake inter-event times may provide important clues to the timing of future events and the underlying physical mechanisms of earthquake nucleation. In this study, we examine inter-event times from 12-yr catalogs of ~50,000 earthquakes and ~730,000 LFEs in the vicinity of the Parkfield section of the San Andreas Fault. We focus on the long-term evolution of inter-event times after the 2003 Mw6.5 San Simeon and 2004 Mw6.0 Parkfield earthquakes. We find that inter-event times decrease by ~4 orders of magnitude after the Parkfield and San Simeon earthquakes and are followed by a long-term recovery with time scales of ~3 years and more than 8 years for earthquakes along and to the southwest of the San Andreas fault, respectively. The differing long-term recovery of the earthquake inter-event times is likely a manifestation of different aftershock recovery time scales that reflect the different tectonic loading rates in the two regions. We also observe a possible decrease of LFE inter-event times in some LFE families, followed by a recovery with time scales of ~4 months to several years. The drop in LFE recurrence times after the Parkfield earthquake is likely caused by a combination of the dynamic and positive static stresses induced by the Parkfield earthquake, and the long-term recovery in LFE recurrence time could be due to post-seismic relaxation or gradual recovery of the fault-zone material properties. Our ongoing work includes better constraining and understanding the physical mechanisms responsible for the observed long-term recovery in earthquake and LFE inter-event times.
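    The drop-and-recovery pattern in inter-event times described above can be tracked with a simple sliding-window statistic. This sketch is illustrative, not the authors' processing; the window length and toy catalog are my assumptions:

```python
def median_interevent(origin_times, window=5):
    """Median inter-event time over sliding windows of `window` consecutive
    intervals -- a simple way to track a drop and recovery in recurrence
    times (units are whatever the catalog uses, e.g. days).
    """
    ts = sorted(origin_times)
    dts = [t2 - t1 for t1, t2 in zip(ts, ts[1:])]
    medians = []
    for i in range(len(dts) - window + 1):
        chunk = sorted(dts[i:i + window])
        medians.append(chunk[window // 2])
    return medians

# Background seismicity, a burst of aftershocks, then recovery:
times = [0, 10, 20, 30, 30.1, 30.2, 30.3, 30.4, 40, 50, 60]
medians = median_interevent(times, window=5)
```

    The median drops sharply when the burst enters the window and climbs back toward the background value as the window slides past it, mimicking the catalog behavior the abstract describes on a much smaller scale.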

  9. Implementation of short-term prediction

    Energy Technology Data Exchange (ETDEWEB)

    Landberg, L.; Joensen, A.; Giebel, G. [and others

    1999-03-01

    This paper will give a general overview of the results from an EU JOULE funded project ("Implementing short-term prediction at utilities", JOR3-CT95-0008). Reference will be given to specialised papers where applicable. The goal of the project was to implement wind farm power output prediction systems in operational environments at a number of utilities in Europe. Two models were developed, one by Risoe and one by the Technical University of Denmark (DTU). Both prediction models used HIRLAM predictions from the Danish Meteorological Institute (DMI). (au) EFP-94; EU-JOULE. 11 refs.

  10. Real-time numerical shake prediction and updating for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Wei, Yongxiang; Huang, Yandan

    2017-12-01

    Ground motion prediction is important for earthquake early warning systems, because a region's peak ground motion indicates the potential disaster. In order to predict the peak ground motion quickly and precisely with limited station wave records, we propose a real-time numerical shake prediction and updating method. Our method first predicts the ground motion based on a ground motion prediction equation after P-wave detection at several stations, denoted as the initial prediction. In order to correct the error of the initial prediction, an updating scheme based on real-time simulation of wave propagation is designed. A data assimilation technique is incorporated to predict the distribution of seismic wave energy precisely. Radiative transfer theory and Monte Carlo simulation are used for modeling wave propagation in 2-D space, and the peak ground motion is calculated as quickly as possible. Our method has the potential to predict the shakemap, allowing the potential disaster to be anticipated before the real disaster happens. The 2008 MS8.0 Wenchuan earthquake is studied as an example to show the validity of the proposed method.

  11. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

    The paper deals with the prediction of global damage and future structural reliability with special emphasis on the sensitivity, bias and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake.

  12. Impact of great subduction earthquakes on the long-term forearc morphology, insight from mechanical modelling

    Science.gov (United States)

    Cubas, Nadaya

    2017-04-01

    The surge of great subduction earthquakes during the last fifteen years has provided numerous observations that require revisiting our understanding of the mechanics of large seismic events. For instance, we now have clear evidence that a significant part of the upper-plate deformation is permanently acquired. The link between great earthquakes and long-term deformation offers a new perspective for understanding relief construction. In addition, a better understanding of these relations could provide us with new constraints on earthquake mechanics. It is also of fundamental importance for seismic risk assessment. In this presentation, I will compile recent results obtained from mechanical modelling linking megathrust ruptures with upper-plate permanent deformation and discuss their impact. We will first show that, in good accordance with lab experiments, aseismic zones are characterized by frictions larger than or equal to 0.1, whereas seismic asperities have dynamic frictions lower than 0.05. This difference will control the long-term upper-plate morphology. The larger values along aseismic zones allow the wedge to reach the critical state and will lead to active thrust systems forming a relief. On the contrary, the low dynamic friction along seismic asperities will place the taper in the sub-critical domain, impeding any internal deformation. This will lead to the formation of forearc basins inducing negative gravity anomalies. Since aseismic zones have higher friction and a larger taper, fully creeping segments will tend to develop peninsulas. On the contrary, fully locked segments with low dynamic friction and a very low taper will favor subsiding coasts. The taper variation due to megathrust friction is also expressed through a correlation between coast-to-trench distance and forearc coupling (e.g., the Mexican and South American subduction zones). We will then discuss how variations of frictional properties along the megathrust can induce splay fault activation. For instance, we can

  13. Earthquake catalogs for the 2017 Central and Eastern U.S. short-term seismic hazard model

    Science.gov (United States)

    Mueller, Charles S.

    2017-01-01

    The U.S. Geological Survey (USGS) makes long-term seismic hazard forecasts that are used in building codes. The hazard models usually consider only natural seismicity; non-tectonic (man-made) earthquakes are excluded because they are transitory or too small. In the past decade, however, thousands of earthquakes related to underground fluid injection have occurred in the central and eastern U.S. (CEUS), and some have caused damage. In response, the USGS is now also making short-term forecasts that account for the hazard from these induced earthquakes. Seismicity statistics are analyzed to develop recurrence models, accounting for catalog completeness. In the USGS hazard modeling methodology, earthquakes are counted on a map grid, recurrence models are applied to estimate the rates of future earthquakes in each grid cell, and these rates are combined with maximum-magnitude models and ground-motion models to compute the hazard. The USGS published a forecast for the years 2016 and 2017. Here, we document the development of the seismicity catalogs for the 2017 CEUS short-term hazard model. A uniform earthquake catalog is assembled by combining and winnowing pre-existing source catalogs. The initial, final, and supporting earthquake catalogs are made available here.
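    As a rough illustration of the grid-count-and-recurrence step described above, the following sketch counts catalog events per map cell and scales the counts to annual rates at a target magnitude via the Gutenberg-Richter relation. The cell size, catalog, b-value, and magnitude thresholds are hypothetical, not the USGS implementation.

```python
import math

def cell_rates(catalog, cell_deg=0.5, years=10.0, b=1.0, m_count=3.0, m_target=5.0):
    """Count earthquakes of magnitude >= m_count per grid cell, then
    extrapolate to annual rates of magnitude >= m_target events."""
    counts = {}
    for lon, lat, mag in catalog:
        if mag >= m_count:
            key = (math.floor(lon / cell_deg), math.floor(lat / cell_deg))
            counts[key] = counts.get(key, 0) + 1
    # Gutenberg-Richter: N(>=m_target) = N(>=m_count) * 10**(-b*(m_target - m_count))
    scale = 10.0 ** (-b * (m_target - m_count))
    return {key: (n / years) * scale for key, n in counts.items()}

# Toy catalog: (lon, lat, magnitude); all three events fall in one cell.
catalog = [(-97.3, 35.5, 3.4), (-97.4, 35.6, 4.1), (-97.35, 35.55, 3.0)]
rates = cell_rates(catalog, cell_deg=0.5, years=10.0)
```

In a real forecast these gridded rates would then be combined with maximum-magnitude and ground-motion models, as the abstract describes.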

  14. The Virtual Quake earthquake simulator: a simulation-based forecast of the El Mayor-Cucapah region and evidence of predictability in simulated earthquake sequences

    Science.gov (United States)

    Yoder, Mark R.; Schultz, Kasey W.; Heien, Eric M.; Rundle, John B.; Turcotte, Donald L.; Parker, Jay W.; Donnellan, Andrea

    2015-12-01

    In this manuscript, we introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation versus quiescent type earthquake triggering. We show that VQ exhibits both behaviours separately for independent fault sections; some fault sections exhibit activation type triggering, while others are better characterized by quiescent type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA and northern Baja California, Mexico.

  15. Predicting the spatial extent of liquefaction from geospatial and earthquake specific parameters

    Science.gov (United States)

    Zhu, Jing; Baise, Laurie G.; Thompson, Eric M.; Wald, David J.; Knudsen, Keith L.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.

    2014-01-01

    The spatially extensive damage from the 2010-2011 Christchurch, New Zealand earthquake events is a reminder of the need for liquefaction hazard maps for anticipating damage from future earthquakes. Liquefaction hazard mapping has traditionally relied on detailed geologic mapping and expensive site studies. These traditional techniques are difficult to apply globally for rapid response or loss estimation. We have developed a logistic regression model to predict the probability of liquefaction occurrence in coastal sedimentary areas as a function of simple and globally available geospatial features (e.g., derived from digital elevation models) and standard earthquake-specific intensity data (e.g., peak ground acceleration). Some of the geospatial explanatory variables that we consider are taken from the hydrology community, which has a long tradition of using remotely sensed data as proxies for subsurface parameters. As a result of using high resolution, remotely sensed, and spatially continuous data as a proxy for important subsurface parameters such as soil density and soil saturation, and by using a probabilistic modeling framework, our liquefaction model inherently includes the natural spatial variability of liquefaction occurrence and provides an estimate of the spatial extent of liquefaction for a given earthquake. To provide a quantitative check on how the predicted probabilities relate to the spatial extent of liquefaction, we report the frequency of observed liquefaction features within a range of predicted probabilities. The percentage of liquefaction is the areal extent of observed liquefaction within a given probability contour. The regional model and the results show that there is a strong relationship between the predicted probability and the observed percentage of liquefaction. Visual inspection of the probability contours for each event also indicates that the pattern of liquefaction is well represented by the model.
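    A logistic regression of the kind described above can be sketched as follows. The coefficients and the exact predictor set are hypothetical stand-ins, not the published Zhu et al. values; the point is the functional form mapping shaking intensity and geospatial proxies to a liquefaction probability.

```python
import math

def liquefaction_probability(pga_g, cti, vs30,
                             b0=-5.0, b_pga=2.0, b_cti=0.3, b_vs=-0.004):
    """Logistic model: probability of liquefaction from peak ground
    acceleration (g), compound topographic index, and Vs30 (m/s).
    All coefficients are illustrative."""
    # linear predictor on log-transformed shaking plus geospatial proxies
    z = b0 + b_pga * math.log(pga_g) + b_cti * cti + b_vs * vs30
    return 1.0 / (1.0 + math.exp(-z))

p = liquefaction_probability(0.3, 6.0, 250.0)
```

Evaluating the model on a grid of such proxies yields the spatially continuous probability maps the abstract refers to.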

  16. Short-term wind power prediction

    DEFF Research Database (Denmark)

    Joensen, Alfred K.

    2003-01-01

    The present thesis consists of 10 research papers published during the period 1997-2002 together with a summary report. The objective of the work described in the thesis is to develop models and methods for calculation of high accuracy predictions of wind power generated electricity......, and to implement these models and methods in an on-line software application. The economical value of having predictions available is also briefly considered. The summary report outlines the background and motivation for developing wind power prediction models. The meteorological theory which is relevant...... where the Department of Informatics and Mathematical Modelling and the Department of Wind Energy and Atmospheric Physics have been two major participants. The first project entitled Implementing Short-term Prediction at Utilities , founded by the European Commission under the JOULE programme. The second...

  17. Conditional spectrum computation incorporating multiple causal earthquakes and ground-motion prediction models

    Science.gov (United States)

    Lin, Ting; Harmsen, Stephen C.; Baker, Jack W.; Luco, Nicolas

    2013-01-01

    The conditional spectrum (CS) is a target spectrum (with conditional mean and conditional standard deviation) that links seismic hazard information with ground-motion selection for nonlinear dynamic analysis. Probabilistic seismic hazard analysis (PSHA) estimates the ground-motion hazard by incorporating the aleatory uncertainties in all earthquake scenarios and resulting ground motions, as well as the epistemic uncertainties in ground-motion prediction models (GMPMs) and seismic source models. Typical CS calculations to date are produced for a single earthquake scenario using a single GMPM, but more precise use requires consideration of at least multiple causal earthquakes and multiple GMPMs that are often considered in a PSHA computation. This paper presents the mathematics underlying these more precise CS calculations. Despite requiring more effort to compute than approximate calculations using a single causal earthquake and GMPM, the proposed approach produces an exact output that has a theoretical basis. To demonstrate the results of this approach and compare the exact and approximate calculations, several example calculations are performed for real sites in the western United States. The results also provide some insights regarding the circumstances under which approximate results are likely to closely match more exact results. To facilitate these more precise calculations for real applications, the exact CS calculations can now be performed for real sites in the United States using new deaggregation features in the U.S. Geological Survey hazard mapping tools. Details regarding this implementation are discussed in this paper.
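    The exact calculation described above amounts to a weighted combination over causal earthquake scenarios and GMPMs. The sketch below shows only the conditional-mean step; the weights (deaggregation contributions of scenario/GMPM pairs) and ln-spectral-acceleration means are hypothetical numbers, not values from the paper.

```python
def cs_mean(weights, conditional_means):
    """Exact CS mean as a weighted sum of per-scenario conditional means:
    mu_CS = sum_k w_k * mu_k, with w_k the deaggregation weights."""
    assert abs(sum(weights) - 1.0) < 1e-9, "deaggregation weights must sum to 1"
    return sum(w * m for w, m in zip(weights, conditional_means))

# three scenario/GMPM pairs with hypothetical weights and ln-SA means
mu = cs_mean([0.6, 0.3, 0.1], [0.25, 0.40, 0.55])
```

The full calculation repeats this combination for the conditional standard deviation as well, which is where the exact and single-scenario approximate results diverge.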

  18. Long- and short-term triggering and modulation of mud volcano eruptions by earthquakes

    Science.gov (United States)

    Bonini, Marco; Rudolph, Maxwell L.; Manga, Michael

    2016-03-01

    Earthquakes can trigger the eruption of mud. We use eruptions in Azerbaijan, Italy, Romania, Japan, the Andaman Islands, Pakistan, Taiwan, Indonesia, and California to probe the nature of stress changes that induce new eruptions and modulate ongoing eruptions. Dynamic stresses produced by earthquakes are usually inferred to be the dominant triggering mechanism; however, static stress changes acting on the feeder systems of mud volcanoes may also play a role. In Azerbaijan, eruptions within 2-10 fault lengths from the epicenter are favored in the year following earthquakes where the static stress changes cause compression of the mud source and unclamp feeder dikes. In Romania, Taiwan, and some Italian sites, increased activity is also favored where the static stress changes act to unclamp feeder dikes, but responses occur within days. The eruption in the Andaman Islands, and those of the Niikappu mud volcanoes in Japan, are better correlated with the amplitude of dynamic stresses produced by seismic waves. Similarly, a new island that emerged off the coast of Pakistan in 2013 was likely triggered by dynamic stresses, enhanced by directivity. At the southern end of the Salton Sea, California, earthquakes increase the gas flux at small mud volcanoes; responses are best correlated with dynamic stresses. The comparison of responses in these nine settings indicates that dynamic stresses are most often correlated with triggering, although permanent stress changes as small as, and possibly smaller than, 0.1 bar may be sufficient to also influence eruptions. Unclamping stresses with magnitude similar to Earth tides (0.01 bar) persist over time and may play a role in triggering delayed responses. Unclamping stresses may be important contributors to short-term triggering only if they exceed 0.1-1 bar.

  19. Recent development of the earthquake strong motion-intensity catalog and intensity prediction equations for Iran

    Science.gov (United States)

    Zare, Mehdi

    2017-07-01

    This study aims to develop a new earthquake strong motion-intensity catalog as well as intensity prediction equations for Iran based on the available data. For this purpose, all the sites which had both recorded strong motion and intensity values throughout the region were first searched. Then, the data belonging to the 306 identified sites were processed, and the results were compiled as a new strong motion-intensity catalog. Based on this new catalog, two empirical equations between the values of intensity and the ground motion parameters (GMPs) for the Iranian earthquakes were calculated. In the first step, earthquake "intensity" was considered as a function of five independent GMPs, "Log (PHA)," "moment magnitude (MW)," "distance to epicenter," "site type," and "duration," and a multiple stepwise regression was calculated. Regarding the correlations between the parameters and the effectiveness coefficients of the predictors, Log (PHA) was recognized as the most effective parameter for earthquake "intensity," while the parameter "site type" was removed from the equations since it was determined to be the least significant variable. In the second step, a simple ordinary least squares (OLS) regression was fitted only between intensity and Log (PHA), which resulted in more over/underestimated intensity values compared with the results of the multiple intensity-GMPs regression. However, for rapid response purposes, the simple OLS regression may be more useful than the multiple regression because of its data availability and simplicity. In addition, based on 50 selected earthquakes, an empirical relation between the macroseismic intensity (I0) and MW was developed.
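    The second-step regression described above, intensity as a linear function of Log (PHA), can be sketched with a minimal OLS fit. The sample points below are made-up illustrations, not values from the Iranian catalog.

```python
def ols_fit(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

log_pha = [1.0, 1.5, 2.0, 2.5]      # hypothetical log10(PHA) values
intensity = [4.0, 5.0, 6.0, 7.0]    # hypothetical macroseismic intensities
a, b = ols_fit(log_pha, intensity)  # intensity ~= a + b * log10(PHA)
```

The multiple stepwise regression of the first step extends the same idea to the full predictor set before pruning the least significant variables.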

  20. A Strategy for Short-Term Earthquake Forecasting Based on Combined Ground and Space-Based Observations

    Science.gov (United States)

    Kafatos, M.; Papadopoulos, G. A.; Karastathis, V. K.; Minadakis, G.; Ouzounov, D.; Pulinets, S. A.; Tramutoli, V.; Tsinganos, K.

    2014-12-01

    No standard methodologies for the short-term (hours, days, few weeks) forecasting of earthquakes have been widely adopted so far. However, promising approaches from ground-based (e.g. foreshocks) and space-based (e.g. thermal anomalies) observations have been described. We propose to apply a multidisciplinary strategy by performing real-time experiments towards the identification of space-time windows having an increased probability, beyond chance, for the occurrence of strong earthquakes (M>5.5). This is a new collaborative study which will continue the best practices achieved in other projects such as the EU-FP7 PRE-EARTHQUAKE project and the ongoing ISSI project LAICa. The test region covers the whole of Greece, which has the highest seismicity in western Eurasia, while closer attention will be given to the Corinth Rift (Central Greece), an asymmetric half-graben of high seismicity opening rapidly with geodetic extension rates of up to about 15 mm yr-1. Ground-based observations will mainly include seismicity, magnetometer and radon measurements, while space observations will include those that may provide thermal anomalies, GPS and TEC. The strategy will include the development of a system operating on a real-time basis with strong tools and protocols for the collection, archiving and evaluation of the different types of data. The software part of the system may incorporate three basic interfaces implemented via open source technology: (1) the up-streaming software interface for the collection and archiving of data; (2) the backend real-time software interface incorporating all the available models; (3) the frontend WEBGIS software interface that will allow for data representation and mapping. The establishment of certain rules for issuing non-public seismic alerts is needed. Therefore, in this paper we will also discuss the significance of the proposed work for the issues of earthquake forecasting/prediction statements and what critical new

  1. Prediction of the area affected by earthquake-induced landsliding based on seismological parameters

    Directory of Open Access Journals (Sweden)

    O. Marc

    2017-07-01

    We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines.

  2. Prediction of the area affected by earthquake-induced landsliding based on seismological parameters

    Science.gov (United States)

    Marc, Odin; Meunier, Patrick; Hovius, Niels

    2017-07-01

    We present an analytical, seismologically consistent expression for the surface area of the region within which most landslides triggered by an earthquake are located (landslide distribution area). This expression is based on scaling laws relating seismic moment, source depth, and focal mechanism with ground shaking and fault rupture length and assumes a globally constant threshold of acceleration for onset of systematic mass wasting. The seismological assumptions are identical to those recently used to propose a seismologically consistent expression for the total volume and area of landslides triggered by an earthquake. To test the accuracy of the model we gathered geophysical information and estimates of the landslide distribution area for 83 earthquakes. To reduce uncertainties and inconsistencies in the estimation of the landslide distribution area, we propose an objective definition based on the shortest distance from the seismic wave emission line containing 95 % of the total landslide area. Without any empirical calibration the model explains 56 % of the variance in our dataset, and predicts 35 to 49 out of 83 cases within a factor of 2, depending on how we account for uncertainties on the seismic source depth. For most cases with comprehensive landslide inventories we show that our prediction compares well with the smallest region around the fault containing 95 % of the total landslide area. Aspects ignored by the model that could explain the residuals include local variations of the threshold of acceleration and processes modulating the surface ground shaking, such as the distribution of seismic energy release on the fault plane, the dynamic stress drop, and rupture directivity. Nevertheless, its simplicity and first-order accuracy suggest that the model can yield plausible and useful estimates of the landslide distribution area in near-real time, with earthquake parameters issued by standard detection routines.
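    The "within a factor of 2" accuracy measure used above to score the model can be sketched directly; the predicted/observed area pairs below are invented toy values (km^2), not cases from the 83-earthquake dataset.

```python
def within_factor_of_two(predicted, observed):
    """True if the prediction is within a factor of 2 of the observation."""
    return 0.5 <= predicted / observed <= 2.0

# hypothetical (predicted, observed) landslide distribution areas, km^2
pairs = [(1200.0, 900.0), (300.0, 800.0), (50.0, 60.0)]
hits = sum(within_factor_of_two(p, o) for p, o in pairs)
```

Repeating this count over the full dataset, with and without perturbed source depths, gives the 35-to-49-out-of-83 range reported in the abstract.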

  3. Nowcasting Earthquakes

    Science.gov (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined as the cumulative probability distribution P(n < n(t)) for the number of small earthquakes in the region. From the count of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)), a measure of the current level of progress through the large earthquake cycle in the defined region at the current time.
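    A toy sketch of the earthquake potential score: the empirical cumulative probability of small-earthquake counts per large-earthquake cycle, evaluated at the count accumulated since the last large event. The cycle counts below are invented, not catalog data.

```python
def eps(historical_counts, current_count):
    """Earthquake potential score: fraction of past large-earthquake cycles
    that completed with fewer small earthquakes than have occurred so far
    in the current cycle (an empirical CDF evaluated at current_count)."""
    return sum(1 for c in historical_counts if c < current_count) / len(historical_counts)

# hypothetical small-earthquake counts for ten completed cycles
past_cycle_counts = [120, 95, 150, 80, 200, 110, 130, 175, 90, 160]
score = eps(past_cycle_counts, current_count=140)
```

A score near 1 would indicate that most historical cycles had already ended by this point in the count, i.e. a late stage of the cycle.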

  4. GPS station short-term dynamic characteristics of micro displacement before Menyuan M6.4 earthquake

    Directory of Open Access Journals (Sweden)

    Wei Feng

    2016-07-01

    Continuous observation data from 24 GPS stations in the area (33.0°N–41.0°N, 95.0°E–105.0°E) are selected for this study (the period is from Jan. 1, 2015 to Jan. 20, 2016). The three components, NS, EW and UD, of the daily solutions are filtered by the Hilbert–Huang transform (HHT) in the frequency band 5.787 × 10^-7–7.716 × 10^-8 Hz (20–150 days in period). The short-term dynamic characteristics of micro displacement before the Menyuan M6.4 earthquake are then studied using temporal dependencies and cross-spectrum analysis. The results show that before the earthquake the horizontal undulatory motions were higher than the average level in the series data, which indicates a disturbance of regional stress before the earthquake. Three GPS stations on the Qinghai-Tibet Plateau, set perpendicular to the seismogenic fault, show consistent movement. The increase in amplitude of the horizontal micro motion observed before the quake is conducive to earthquake occurrence. However, we cannot be sure whether the undulatory motion triggered the earthquake. It is necessary to build more GPS continuous observation stations and to optimize the monitoring network so as to improve understanding of the short-term dynamic crustal variation before earthquakes.
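    The band-limiting step of an HHT-style analysis can be sketched as selecting the intrinsic mode functions whose dominant period falls inside the 20-150 day band and summing them. The mode series and periods below are invented; a real analysis would obtain them from empirical mode decomposition of the daily GPS solutions.

```python
def band_limited_sum(imfs_with_periods, p_min=20.0, p_max=150.0):
    """Keep intrinsic mode functions (IMFs) whose dominant period (days)
    lies in [p_min, p_max], and sum them sample-by-sample to reconstruct
    the band-passed displacement series."""
    kept = [imf for imf, period in imfs_with_periods if p_min <= period <= p_max]
    return [sum(vals) for vals in zip(*kept)] if kept else []

# hypothetical (IMF samples, dominant period in days) pairs
imfs = [([1.0, 2.0], 5.0),      # too short a period: rejected
        ([0.5, 0.5], 60.0),     # inside the 20-150 day band: kept
        ([0.1, 0.2], 400.0)]    # too long a period: rejected
filtered = band_limited_sum(imfs)
```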

  5. Real-time 3-D space numerical shake prediction for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Huang, Yandan; Wei, Yongxiang

    2017-12-01

    In earthquake early warning systems, real-time shake prediction through wave propagation simulation is a promising approach. Compared with traditional methods, it does not suffer from inaccurate estimation of source parameters. For computational efficiency, these methods assume that the wavefield propagates on the 2-D surface of the earth. In fact, since the seismic wave propagates in the 3-D sphere of the earth, 2-D modeling of the wavefield results in inaccurate wave estimation. In this paper, we propose a 3-D space numerical shake prediction method, which simulates wave propagation in 3-D space using radiative transfer theory, and incorporate a data assimilation technique to estimate the distribution of wave energy. The 2011 Tohoku earthquake is studied as an example to show the validity of the proposed model. The 2-D and 3-D space models are compared in this article, and the prediction results show that numerical shake prediction based on the 3-D space model can estimate real-time ground motion precisely, and that overprediction is alleviated when the 3-D space model is used.

  6. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    Science.gov (United States)

    Shapira, Stav; Novack, Lena; Bar-Dayan, Yaron; Aharonson-Daniel, Limor

    2016-01-01

    A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a possible decrease in the expected number of casualties.

  7. Operational Earthquake Forecasting: State of Knowledge and Guidelines for Implementation.

    OpenAIRE

    Koshun Yamaoka; Gerassimos Papadopoulos; Gennady Sobolev; Warner Marzocchi; Ian Main; Raul Madariaga; Paolo Gasparini; Yun-Tai Chen; Jordan, Thomas H.; Jochen Zschau

    2011-01-01

    Following the 2009 L'Aquila earthquake, the Dipartimento della Protezione Civile Italiana (DPC) appointed an International Commission on Earthquake Forecasting for Civil Protection (ICEF) to report on the current state of knowledge of short-term prediction and forecasting of tectonic earthquakes and to indicate guidelines for the utilization of possible forerunners of large earthquakes to drive civil protection actions, including the use of probabilistic seismic hazard analysis in the wake of a lar...

  8. Ground motion prediction and earthquake scenarios in the volcanic region of Mt. Etna (Southern Italy

    Science.gov (United States)

    Langer, Horst; Tusa, Giuseppina; Scarfi, Luciano; Azzaro, Raffaele

    2013-04-01

    One of the principal issues in the assessment of seismic hazard is the prediction of relevant ground motion parameters, e.g., peak ground acceleration, radiated seismic energy, and response spectra, at some distance from the source. Here we first present ground motion prediction equations (GMPE) for horizontal components for the area of Mt. Etna and adjacent zones. Our analysis is based on 4878 three-component seismograms related to 129 seismic events with local magnitudes ranging from 3.0 to 4.8, hypocentral distances up to 200 km, and focal depths shallower than 30 km. Accounting for the specific seismotectonic and geological conditions of the considered area, we have divided our data set into three sub-groups: (i) Shallow Mt. Etna Events (SEE), i.e., typically volcano-tectonic events in the area of Mt. Etna having a focal depth less than 5 km; (ii) Deep Mt. Etna Events (DEE), i.e., events in the volcanic region, but with a depth greater than 5 km; (iii) Extra Mt. Etna Events (EEE), i.e., purely tectonic events falling outside the area of Mt. Etna. The predicted PGAs for the SEE are lower than those predicted for the DEE and the EEE, reflecting their lower high-frequency energy content. We explain this observation as due to lower stress drops. The attenuation relationships are compared to the ones most commonly used, such as those by Sabetta and Pugliese (1987) for Italy, or Ambraseys et al. (1996) for Europe. Whereas our GMPEs are based on small earthquakes, the two above-mentioned attenuation relationships cover moderate to large magnitudes (up to 6.8 and 7.9, respectively). We show that the extrapolation of our GMPEs to magnitudes beyond the range covered by the data is misleading; at the same time, the aforementioned relationships fail to predict ground motion parameters for our data set. Despite these discrepancies, we can exploit our data for setting up scenarios for strong earthquakes for which no instrumental recordings are
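    A GMPE of the kind discussed above is typically a simple parametric function of magnitude and distance. The functional form below is a common attenuation-relation shape, and the coefficients are hypothetical placeholders, not the Mt. Etna values.

```python
import math

def predict_log_pga(mag, r_hypo_km, c0=-1.8, c1=0.45, c2=-1.6, h=5.0):
    """Illustrative GMPE form: log10(PGA) = c0 + c1*M + c2*log10(R + h),
    where M is local magnitude, R hypocentral distance (km), and h a
    near-source saturation term. All coefficients are made up."""
    return c0 + c1 * mag + c2 * math.log10(r_hypo_km + h)

lp = predict_log_pga(4.0, 20.0)  # log10(PGA) for an M4 event at 20 km
```

The regression step fits the coefficients by least squares to observed log-PGA values; as the abstract notes, extrapolating such fits beyond the magnitude range of the data is unreliable.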

  9. Intermediate-Term Declines in Seismicity at Two Volcanoes in Alaska Following the Mw7.9 Denali Fault Earthquake

    Science.gov (United States)

    McNutt, S. R.; Sanchez, J. J.; Moran, S. C.; Power, J. A.

    2002-12-01

    The Mw7.9 Denali Fault earthquake provided an opportunity to look for intermediate-term (days to weeks) responses of Alaskan volcanoes to shaking from a large regional earthquake. The Alaska Volcano Observatory monitors 24 volcanoes with seismic networks. We examined one station for each volcano, generally the closest (typically 5 km from the vent) unless noise, site response, or other factors made the data unusable. Data were digitally bandpass filtered between 0.8 and 5 Hz to reduce noise from microseisms and wind. Data for the period three days before to three days after the Mw7.9 earthquake were then plotted at a standard scale used for AVO routine monitoring. Shishaldin volcano, which has a background rate of several hundred seismic events per day on station SSLS, showed no change from before to after the earthquake. Veniaminof volcano, which has had recent mild eruptions and a rate of several dozen seismic events per day on station VNNF, suffered a drop in seismicity at the time of the earthquake by a factor of 2.5; this lasted for 15 days. We tested this result using a different station, VNSS, and a different method of counting (non-filtered data on helicorder records) and found the same result. We infer that Veniaminof's activity was modified by the Mw7.9 earthquake. Wrangell, the closest volcano, had a background rate of about 10 events per day. Data from station WANC could not be measured for 8 days after the Mw7.9 earthquake because the large number of aftershocks precluded identification of local seismicity. For the following eight days, however, its seismicity rate was 30 percent lower than before. While subtle, we infer that this may be related to the earthquake. It is known that Wrangell increased its heat output after the Mw9.2 Alaska earthquake of 1964 and again after the Ms7.1 St. Elias earthquake of 1979. The other 21 volcanoes showed no changes in seismicity from 3 days before to 3 days after the Mw7.9 event. We conclude that intermediate-term

  10. The effects of spatially varying earthquake impacts on mood and anxiety symptom treatments among long-term Christchurch residents following the 2010/11 Canterbury earthquakes, New Zealand.

    Science.gov (United States)

    Hogg, Daniel; Kingham, Simon; Wilson, Thomas M; Ardagh, Michael

    2016-09-01

    This study investigates the effects of disruptions to different community environments, community resilience and cumulated felt earthquake intensities on yearly mood and anxiety symptom treatments from the New Zealand Ministry of Health's administrative databases between September 2009 and August 2012. The sample includes 172,284 long-term residents from different Christchurch communities. Living in a better physical environment was associated with lower mood and anxiety treatment rates after the beginning of the Canterbury earthquake sequence whereas an inverse effect could be found for social community environment and community resilience. These results may be confounded by pre-existing patterns, as well as intensified treatment-seeking behaviour and intervention programmes in severely affected areas. Nevertheless, the findings indicate that adverse mental health outcomes can be found in communities with worse physical but stronger social environments or community resilience post-disaster. Also, they do not necessarily follow felt intensities since cumulative earthquake intensity did not show a significant effect. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. An Integrated and Interdisciplinary Model for Predicting the Risk of Injury and Death in Future Earthquakes.

    Directory of Open Access Journals (Sweden)

    Stav Shapira

    A comprehensive technique for earthquake-related casualty estimation remains an unmet challenge. This study aims to integrate risk factors related to characteristics of the exposed population and to the built environment in order to improve communities' preparedness and response capabilities and to mitigate future consequences. An innovative model was formulated based on a widely used loss estimation model (HAZUS) by integrating four human-related risk factors (age, gender, physical disability and socioeconomic status) that were identified through a systematic review and meta-analysis of epidemiological data. The common effect measures of these factors were calculated and entered into the existing model's algorithm using logistic regression equations. Sensitivity analysis was performed by conducting a casualty estimation simulation in a high-vulnerability risk area in Israel. The integrated model outcomes indicated an increase in the total number of casualties compared with the prediction of the traditional model; with regard to specific injury levels, an increase was demonstrated in the number of expected fatalities and in the severely and moderately injured, and a decrease was noted in the lightly injured. Urban areas with higher proportions of at-risk populations were found to be more vulnerable in this regard. The proposed model offers a novel approach that allows quantification of the combined impact of human-related and structural factors on the results of earthquake casualty modelling. Investing efforts in reducing human vulnerability and increasing resilience prior to the occurrence of an earthquake could lead to a possible decrease in the expected number of casualties.
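    Entering effect measures into a baseline model via logistic-regression equations, as described above, amounts to adjusting a baseline probability on the log-odds scale. The baseline probability and odds ratios below are invented illustrations, not the study's meta-analytic estimates.

```python
import math

def adjusted_probability(p_baseline, odds_ratios):
    """Shift a baseline casualty probability on the logit scale by the
    log odds ratios of the applicable risk factors, then map back to a
    probability."""
    logit = math.log(p_baseline / (1.0 - p_baseline))
    logit += sum(math.log(or_) for or_ in odds_ratios)
    return 1.0 / (1.0 + math.exp(-logit))

# hypothetical baseline HAZUS probability and two risk-factor odds ratios
p = adjusted_probability(0.02, [1.5, 1.2])
```

Summing such adjusted probabilities over the exposed population reproduces the kind of casualty-total shifts the sensitivity analysis reports.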

  12. Seismologically-consistent prediction of earthquake induced landsliding: Towards near-real time prediction of total landslide volume, total landslide area and regional area affected by earthquake-induced landsliding.

    Science.gov (United States)

    Marc, Odin; Hovius, Niels; Meunier, Patrick

    2017-04-01

    Earthquakes are an important trigger of landslides and can contribute significantly to sedimentary and organic matter fluxes. We present a new seismologically-consistent expression for the prediction of the total area and volume of populations of earthquake-induced landslides (Marc et al., 2016), as well as for their regional area of occurrence (Marc et al., 2017). This model implements essential seismic processes, linking key parameters such as ground acceleration, fault size, earthquake source depth and seismic moment. To assess the model we have compiled and normalized estimates of total landslide volume, total landslide area and regional area affected by landslides for 40, 17 and 83 earthquakes, respectively. We found that low landscape steepness systematically leads to overprediction of the total area and volume of landslides. When this effect is accounted for, the model is able to predict the landslide areas and associated volumes within a factor of 2 for about 70% of the cases in our databases. The prediction of the regional area affected does not require a calibration for landscape steepness and is within a factor of 2 for 60% of the database. For 7 out of 10 comprehensive inventories we show that our prediction compares well with the smallest region around the fault containing 95% of the total landslide area. This is a significant improvement on a previously published empirical expression based only on earthquake moment. We discuss additional parameters that seem to be required in the model to explain some outliers, such as exceptional rock mass strength in the epicentral area, shaking duration and other seismic source complexities ignored by the model. However, most cases in our catalogue seem to be relatively unaffected by these effects despite the variety of lithologies and tectonic settings they cover. 
This makes the model suitable for integration into landscape evolution models, and application to the assessment of secondary hazards and

  13. The potential of continuous, local atomic clock measurements for earthquake prediction and volcanology

    Directory of Open Access Journals (Sweden)

    Bondarescu Mihai

    2015-01-01

    Full Text Available Modern optical atomic clocks along with the optical fiber technology currently being developed can measure the geoid, which is the equipotential surface that extends the mean sea level on continents, to a precision that competes with existing technology. In this proceeding, we point out that atomic clocks have the potential to not only map the sea level surface on continents, but also look at variations of the geoid as a function of time with unprecedented timing resolution. The local time series of the geoid has a plethora of applications. These include potential improvement in the predictions of earthquakes and volcanoes, and closer monitoring of ground uplift in areas where hydraulic fracturing is performed.
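The sensitivity claimed for clock-based geoid monitoring follows from the gravitational redshift: two clocks separated by a height difference Δh run at fractionally different rates, Δf/f = gΔh/c². A quick back-of-the-envelope check (standard physics, not a computation from the proceeding itself):

```python
# Gravitational redshift between two clocks at different heights:
#   df/f = g * dh / c**2
g = 9.81        # m/s^2, surface gravity
c = 2.998e8     # m/s, speed of light

def fractional_shift(dh_m):
    """Fractional frequency difference for a height difference dh_m (meters)."""
    return g * dh_m / c**2

# 1 m of elevation corresponds to ~1.1e-16; resolving 1 cm of geoid change,
# as needed for uplift monitoring, therefore requires clocks stable
# at roughly the 1e-18 level.
```

This is why the abstract's "unprecedented timing resolution" hinges on modern optical clocks, whose fractional instabilities have reached the 10^-18 range.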

  14. Long‐term creep rates on the Hayward Fault: evidence for controls on the size and frequency of large earthquakes

    Science.gov (United States)

    Lienkaemper, James J.; McFarland, Forrest S.; Simpson, Robert W.; Bilham, Roger; Ponce, David A.; Boatwright, John; Caskey, S. John

    2012-01-01

    The Hayward fault (HF) in California exhibits large (Mw 6.5–7.1) earthquakes with short recurrence times (161±65 yr), probably kept short by a 26%–78% aseismic release rate (including postseismic). Its interseismic release rate varies locally over time, as we infer from many decades of surface creep data. Earliest estimates of creep rate, primarily from infrequent surveys of offset cultural features, revealed distinct spatial variation in rates along the fault, but no detectable temporal variation. Since the 1989 Mw 6.9 Loma Prieta earthquake (LPE), monitoring on 32 alinement arrays and 5 creepmeters has greatly improved the spatial and temporal resolution of creep rate. We now identify significant temporal variations, mostly associated with local and regional earthquakes. The largest rate change was a 6‐yr cessation of creep along a 5‐km length near the south end of the HF, attributed to a regional stress drop from the LPE, ending in 1996 with a 2‐cm creep event. North of there near Union City starting in 1991, rates apparently increased by 25% above pre‐LPE levels on a 16‐km‐long reach of the fault. Near Oakland in 2007 an Mw 4.2 earthquake initiated a 1–2 cm creep event extending 10–15 km along the fault. Using new better‐constrained long‐term creep rates, we updated earlier estimates of depth to locking along the HF. The locking depths outline a single, ∼50‐km‐long locked or retarded patch with the potential for an Mw∼6.8 event equaling the 1868 HF earthquake. We propose that this inferred patch regulates the size and frequency of large earthquakes on HF.

  15. The 26 January 2001 M 7.6 Bhuj, India, earthquake: Observed and predicted ground motions

    Science.gov (United States)

    Hough, S.E.; Martin, S.; Bilham, R.; Atkinson, G.M.

    2002-01-01

    Although local and regional instrumental recordings of the devastating 26 January 2001 Bhuj earthquake are sparse, the distribution of macroseismic effects can provide important constraints on the mainshock ground motions. We compiled available news accounts describing damage and other effects and interpreted them to obtain modified Mercalli intensities (MMIs) at >200 locations throughout the Indian subcontinent. These values were then used to map the intensity distribution throughout the subcontinent using a simple mathematical interpolation method. Although preliminary, the maps reveal several interesting features. Within the Kachchh region, the most heavily damaged villages are concentrated toward the western edge of the inferred fault, consistent with western directivity. Significant sediment-induced amplification is also suggested at a number of locations around the Gulf of Kachchh to the south of the epicenter. Away from the Kachchh region, intensities were clearly amplified significantly in areas that are along rivers, within deltas, or on coastal alluvium, such as mudflats and salt pans. In addition, we use fault-rupture parameters inferred from teleseismic data to predict shaking intensity at distances of 0-1000 km. We then convert the predicted hard-rock ground-motion parameters to MMI by using a relationship (derived from Internet-based intensity surveys) that assigns MMI based on the average effects in a region. The predicted MMIs are typically lower by 1-3 units than those estimated from news accounts, although they do predict near-field ground motions of approximately 80% g and potentially damaging ground motions on hard-rock sites to distances of approximately 300 km. 
For the most part, this discrepancy is consistent with the expected effect of sediment response, but it could also reflect other factors, such as unusually high building vulnerability in the Bhuj region and a tendency for media accounts to focus on the most dramatic damage, rather than
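The abstract does not specify which "simple mathematical interpolation method" was used to map the point MMI values; inverse-distance weighting is one plausible stand-in and can be sketched as:

```python
import numpy as np

def idw_interpolate(obs_xy, obs_val, grid_xy, power=2.0):
    """Inverse-distance-weighted interpolation of scattered point values
    (e.g., MMI assignments) onto arbitrary target points."""
    # pairwise distances: (n_grid, n_obs)
    d = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)            # guard against zero distance
    w = d ** (-power)                  # closer observations weigh more
    return (w @ obs_val) / w.sum(axis=1)

# Two observations with MMI 5 and 7; the midpoint interpolates to 6.
obs = np.array([[0.0, 0.0], [2.0, 0.0]])
mmi = np.array([5.0, 7.0])
mid = idw_interpolate(obs, mmi, np.array([[1.0, 0.0]]))
```

At an observation point the nearest weight dominates, so the interpolant honors the data; between points it blends them smoothly, which is the behavior one wants for a preliminary intensity map.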

  16. Inelastic spectra to predict period elongation of structures under earthquake loading

    DEFF Research Database (Denmark)

    Katsanos, Evangelos; Sextos, A.G.

    2015-01-01

    (expressed by the force reduction factor, Ry), post-yield stiffness (ay) and hysteretic laws are examined for a large number of strong motions. Constant-strength, inelastic spectra in terms of Tin/Tel are calculated to assess the extent of period elongation for various levels of structural inelasticity....... Moreover, the influence that structural characteristics (Ry, ay and degrading level) and strong-motion parameters (epicentral distance, frequency content and duration) exert on period lengthening are studied. Determined by regression analyses of the data obtained, simplified equations are proposed...... for period lengthening as a function of Ry and Tel. These equations may be used in the framework of the earthquake record selection and scaling....

  17. Long Term Ocean Bottom Seismographic Observation on the Rupture Areas of the Tonankai and Nankai Earthquakes

    Science.gov (United States)

    Yamada, T.; Kanazawa, T.; Shinohara, M.; Sakai, S.; Mochizuki, K.; Nakahigashi, K.

    2005-12-01

    The Nankai trough, which runs off the coast of southwestern Japan, is widely known for its systematic activity of large earthquakes with estimated magnitudes as large as 8. The recurrence period has been approximately 100 years since 1361, and the latest events were the 1944 Tonankai and the 1946 Nankai earthquakes. However, the number of smaller earthquakes observed by on-land seismic stations is very small, and their locations had not been resolved precisely. We started earthquake monitoring on the estimated rupture areas of the Tonankai and Nankai earthquakes using ocean bottom seismometers (OBSs), pop-up instruments capable of one year of continuous recording. We deployed 9 OBSs and began observation in 2003. We tried to retrieve them, redeployed 9 OBSs and deployed 14 new OBSs in 2004. In 2005, 23 OBSs were retrieved and redeployed. Moreover, we performed urgent aftershock observation of the 2004 Off Kii-Peninsula earthquake (Mw 7.4) that occurred on September 5, 2004. In this presentation, we focus on the first observation with 9 OBSs. This network covered the boundary zone between the Tonankai and Nankai earthquake rupture areas. The spacing between OBSs was about 20 km. Earthquakes were located after correction for the sedimentary layer estimated from active source data. One seismicity group, located near the trench axis, is about 20 km deep. The other seismicity, which lies along the subducting plate, may be in the uppermost mantle within the subducting Philippine Sea Plate, and extends more than 60 km from the trench axis. This work is a part of the Research on the Tonankai and Nankai earthquakes funded by the Ministry of Education, Culture, Sports, Science, and Technology, Japan.

  18. Interpreting Financial Market Crashes as Earthquakes : A New early Warning System for Medium Term Crashes

    NARCIS (Netherlands)

    F. Gresnigt (Francine); H.J.W.G. Kole (Erik); Ph.H.B.F. Franses (Philip Hans)

    2014-01-01

    We propose a modeling framework which allows for creating probability predictions of a future market crash in the medium term, like sometime in the next five days. Our framework draws upon noticeable similarities between stock returns around a financial market crash and seismic activity

  19. Short-term Prediction - An Overview

    DEFF Research Database (Denmark)

    Landberg, Lars; Giebel, Gregor; Nielsen, Henrik Aalborg

    2003-01-01

    This article gives an overview of the different methods used today for predicting the power output from wind farms on the 1-2 day time horizon. It describes the general set-up of such prediction systems and also gives examples of their performance. Copyright (C) 2003 John Wiley & Sons, Ltd....

  20. Prediction

    CERN Document Server

    Sornette, Didier

    2010-01-01

    This chapter first presents a rather personal view of some different aspects of predictability, going in crescendo from simple linear systems to high-dimensional nonlinear systems with stochastic forcing, which exhibit emergent properties such as phase transitions and regime shifts. Then, a detailed correspondence between the phenomenology of earthquakes, financial crashes and epileptic seizures is offered. The presented statistical evidence provides the substance of a general phase diagram for understanding the many facets of the spatio-temporal organization of these systems. A key insight is to organize the evidence and mechanisms in terms of two summarizing measures: (i) amplitude of disorder or heterogeneity in the system and (ii) level of coupling or interaction strength among the system's components. On the basis of the recently identified remarkable correspondence between earthquakes and seizures, we present detailed information on a class of stochastic point processes that has been found to be particu...

  1. Ground Motion Prediction for M7+ scenarios on the San Andreas Fault using the Virtual Earthquake Approach

    Science.gov (United States)

    Denolle, M.; Dunham, E. M.; Prieto, G.; Beroza, G. C.

    2013-05-01

    There is no clearer example of the increase in hazard due to prolonged and amplified shaking in sedimentary basins than the case of Mexico City in the 1985 Michoacan earthquake. It is critically important to identify what other cities might be susceptible to similar basin amplification effects. Physics-based simulations in 3D crustal structure can be used to model and anticipate those effects, but they rely on our knowledge of the complexity of the medium. We propose a parallel approach to validate ground motion simulations using the ambient seismic field. We compute the Earth's impulse response by combining the ambient seismic field and coda waves, enforcing causality and symmetry constraints. We correct the surface impulse responses to account for the source depth, mechanism and duration using a 1D approximation of the local surface-wave excitation. We call the new responses virtual earthquakes. We validate the ground motion predicted from the virtual earthquakes against moderate earthquakes in southern California. We then combine temporary seismic stations on the southern San Andreas Fault and extend the point-source approximation of the Virtual Earthquake Approach to model finite kinematic ruptures. We confirm the coupling between source directivity and amplification in downtown Los Angeles seen in simulations.
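The first step of this approach — retrieving an inter-station impulse response from the ambient field — rests on cross-correlating long noise records between station pairs. A bare-bones frequency-domain sketch (a simplification; the study additionally uses coda waves and causality/symmetry constraints):

```python
import numpy as np

def noise_crosscorrelation(u1, u2):
    """Cross-correlate two noise records via the frequency domain.
    A peak offset from zero lag reveals the inter-station travel time,
    which is how the correlation approximates the Green's function."""
    n = len(u1)
    spec = np.fft.rfft(u1) * np.conj(np.fft.rfft(u2))
    cc = np.fft.irfft(spec, n)
    return np.fft.fftshift(cc)          # move zero lag to index n // 2

# A record and a 5-sample-delayed copy of it correlate at lag -5.
rng = np.random.default_rng(0)
u1 = rng.standard_normal(256)
cc = noise_crosscorrelation(u1, np.roll(u1, 5))
```

In practice one stacks such correlations over months of data so that the coherent travel-time peak emerges above the incoherent noise floor.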

  2. Multiparameter monitoring of short-term earthquake precursors and its physical basis. Implementation in the Kamchatka region

    Directory of Open Access Journals (Sweden)

    Pulinets Sergey

    2016-01-01

    Full Text Available We apply an experimental approach to multiparameter monitoring of short-term earthquake precursors whose reliability was confirmed by the recently created Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model [1]. A key element of the model is the process of ion-induced nucleation (IIN) and the formation of cluster ions, occurring as a result of the ionization of the near-surface air layer by radon emanating from the Earth's crust within the earthquake preparation zone. This process is similar to the formation of droplet embryos in cloud formation under the action of galactic cosmic rays. The consequence of this process is the generation of a number of precursors that can be divided into two groups: (a) thermal and meteorological, and (b) electromagnetic and ionospheric. We demonstrate elements of prospective monitoring of some strong earthquakes in the Kamchatka region and statistical results for the chemical potential correction parameter for more than 10 years of observations for earthquakes with M≥6. As an experimental attempt, data from monitoring of Kamchatka volcanoes will also be demonstrated.

  3. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  4. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1997-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  5. Long-term effect of early-life stress from earthquake exposure on working memory in adulthood.

    Science.gov (United States)

    Li, Na; Wang, Yumei; Zhao, Xiaochuan; Gao, Yuanyuan; Song, Mei; Yu, Lulu; Wang, Lan; Li, Ning; Chen, Qianqian; Li, Yunpeng; Cai, Jiajia; Wang, Xueyi

    2015-01-01

    The present study aimed to investigate the long-term effect of 1976 Tangshan earthquake exposure in early life on performance of working memory in adulthood. A total of 907 study subjects born and raised in Tangshan were enrolled in this study. They were divided into three groups according to the dates of birth: infant exposure (3-12 months, n=274), prenatal exposure (n=269), and no exposure (born at least 1 year after the earthquake, n=364). The prenatal group was further divided into first, second, and third trimester subgroups based on the timing of exposure during pregnancy. Hopkins Verbal Learning Test-Revised and Brief Visuospatial Memory Test-Revised (BVMT-R) were used to measure the performance of working memory. Unconditional logistic regression analysis was used to analyze the influential factors for impaired working memory. The Hopkins Verbal Learning Test-Revised scores did not show significant difference across the three groups. Compared with no exposure group, the BVMT-R scores were slightly lower in the prenatal exposure group and markedly decreased in the infant exposure group. When the BVMT-R scores were analyzed in three subgroups, the results showed that the subjects whose mothers were exposed to earthquake in the second and third trimesters of pregnancy had significantly lower BVMT-R scores compared with those in the first trimester. Education level and early-life earthquake exposure were identified as independent risk factors for reduced performance of visuospatial memory indicated by lower BVMT-R scores. Infant exposure to earthquake-related stress impairs visuospatial memory in adulthood. Fetuses in the middle and late stages of development are more vulnerable to stress-induced damage that consequently results in impaired visuospatial memory. Education and early-life trauma can also influence the performance of working memory in adulthood.

  6. NGA-West 2 Equations for predicting PGA, PGV, and 5%-Damped PSA for shallow crustal earthquakes

    Science.gov (United States)

    Boore, David M.; Stewart, Jon P.; Seyhan, Emel; Atkinson, Gail M.

    2013-01-01

    We provide ground-motion prediction equations for computing medians and standard deviations of average horizontal component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions. The equations were derived from a global database with M 3.0–7.9 events. We derived equations for the primary M- and distance-dependence of the IMs after fixing the VS30-based nonlinear site term from a parallel NGA-West 2 study. We then evaluated additional effects using mixed effects residuals analysis, which revealed no trends with source depth over the M range of interest, indistinct Class 1 and 2 event IMs, and basin depth effects that increase and decrease long-period IMs for depths larger and smaller, respectively, than means from regional VS30-depth relations. Our aleatory variability model captures decreasing between-event variability with M, as well as within-event variability that increases or decreases with M depending on period, increases with distance, and decreases for soft sites.

  7. Short-term predictions in forex trading

    Science.gov (United States)

    Muriel, A.

    2004-12-01

    Using a kinetic equation that is used to model turbulence (Physica A, 1985-1988, Physica D, 2001-2003), we redefine variables to model the time evolution of the foreign exchange rates of three major currencies. We display live and predicted data for one period of trading in October, 2003.

  8. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    DeVries, Phoebe M. R.; Evans, Eileen L.

    2016-12-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ≤ 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
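The comparison at the heart of this study — testing whether two empirical distributions could come from the same population — can be illustrated with a minimal two-sample Kolmogorov-Smirnov statistic (a generic sketch, not the authors' code):

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample KS statistic: the maximum vertical distance between
    the empirical CDFs of samples a and b."""
    a, b = np.sort(a), np.sort(b)
    pts = np.concatenate([a, b])                       # evaluate at all samples
    cdf_a = np.searchsorted(a, pts, side="right") / len(a)
    cdf_b = np.searchsorted(b, pts, side="right") / len(b)
    return np.abs(cdf_a - cdf_b).max()

# Large-sample rule: reject "same distribution" at level alpha when the
# statistic exceeds c(alpha) * sqrt((n + m) / (n * m)), with c(0.05) ~ 1.358.
```

Identical samples give a statistic of 0 and fully separated samples give 1, so the statistic ranges over [0, 1] regardless of the units of the observations, which is what makes it convenient for comparing slip-rate distributions across faults.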

  9. Storage tanks under earthquake loading, a case study. Evaluation of Eurocode 8 predictions

    NARCIS (Netherlands)

    Pandohi-Mishre, P.R.; Courage, W.M.G.; Gresnigt, A.M.

    2000-01-01

    The behaviour under earthquake loading of an LNG (liquefied natural gas) tank with a diameter of 75 meters and a height of 32 meters has been studied analytically and numerically. Design rules for achieving a safe design, provided by Eurocode 8 'Design of structures for earthquake resistance - Part

  10. Long-term Ocean Bottom Monitoring for Shallow Slow Earthquakes in the Hyuga-nada, Nankai Subduction Zone

    Science.gov (United States)

    Yamashita, Y.; Shinohara, M.; Yamada, T.; Nakahigashi, K.; Shiobara, H.; Mochizuki, K.; Maeda, T.; Obara, K.

    2015-12-01

    The Hyuga-nada region, near the western end of the Nankai Trough in Japan, is one of the most active areas of shallow slow earthquakes in the world. Recently, ocean-bottom observation of offshore seismicity near the trench succeeded in detecting shallow tremor. The observed traces contained a complete episode lasting for one month that exhibited migration properties similar to those of deep tremor [Yamashita et al., 2015]. This activity was associated with shallow very-low-frequency earthquake (VLFE) activity documented by the land-based broadband seismic network. The coincidence between tremor and VLFE activities and the similarity of their migration patterns strongly resemble episodic tremor and slip episodes; this similarity suggests that the tremor activity on the shallow plate boundary may also be coupled with VLFEs and short-term slow slip events in this area. It is important to clarify the seismicity, including slow earthquakes, in order to understand slip behavior at a shallow plate boundary and to improve assessments of the possibility of the tsunamigenic megathrust earthquake that is anticipated to occur at the Nankai Trough. Motivated by these issues, we started long-term ocean-bottom monitoring in this area in May 2014 using 3 broadband and 7 short-period seismometers. In January 2015, we replaced the instruments and obtained the first data, which include minor shallow tremor and VLFE activity on June 1-3, 2014. Preliminary results of data processing show that the shallow tremor activity occurred at the northwestern part of the 2013 activity. The location corresponds to the point where the tremors stopped migrating further north and turned sharply eastward in the 2013 activity. On the other hand, clear tremor migration was not found in the 2014 activity. This local activity may imply that regional, small-scale heterogeneous structures such as a subducting seamount affect the activity pattern. During the 2014 observation, many ordinary earthquakes also

  11. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    Full Text Available The paper presents a novel approach to predict the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed. Ellipsoidal models of uncertainty, pertinent to earthquake excitation, are developed. The maximum response of a structure subjected to the earthquake excitation, within ellipsoidal modeling of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.

  12. Physically-Based Ground Motion Prediction and Validation A Case Study: Mid-sized Marmara Sea Earthquakes

    Science.gov (United States)

    Mert, A.

    2015-12-01

    In this study we have two main purposes. The first is to simulate five mid-sized earthquakes (Mw≈5.0) recorded in the Marmara region, which has a geologically complex and heterogeneous crustal structure. We synthesize ground motion for the full wave train on three components and apply a 'physics-based' solution of earthquake rupture. The simulation methodology is based on the studies by Hutchings et al. (2007) and Scognamiglio and Hutchings (2009). For each earthquake, we synthesized seismograms using 500 different rupture scenarios that were generated by Monte Carlo selection of parameters within a given range. Synthetic ground motion is a major challenge for seismic hazard assessment studies, especially after the adoption of the performance-based design approach in the earthquake-resistant design of engineering structures. Computing realistic time histories for different locations around the Marmara region can be helpful for engineering design, retrofitting existing structures, hazard and risk management studies, and developing new seismic codes and standards. The second purpose is to validate the synthetic seismograms against real seismograms. We follow the methodology presented by Anderson (2003) for validation. This methodology proposes a similarity score based on averages of the quality of fit of ground motion characteristics and uses a suite of measurements; namely, the synthetics are compared to real data by ten representative ground motion criteria. The applicability of the Empirical Green's function methodology and the physics-based solution of earthquake rupture has been assessed in terms of modeling in a complex geologic structure. Because the methodology produces source- and site-specific synthetic ground motion time histories, and the goodness-of-fit scores of the obtained synthetics fall in the 'fair' to 'good' range based on Anderson's score, we conclude that it can be used to produce ground motion that has not previously been recorded during a catastrophic earthquake

  13. On the Correction of Spatial and Statistical Uncertainties in Systematic Measurements of 222Rn for Earthquake Prediction

    Science.gov (United States)

    Külahcı, Fatih; Şen, Zekâi

    2013-12-01

    In earthquake prediction studies, the regional behaviour of accurate 222Rn measurements at a set of sites plays a significant role. Here, measurements are obtained using active and passive radon detector systems in an earthquake-active region of Turkey. Two new methods are proposed to explain the spatial behaviours and the statistical uncertainties in the 222Rn emission measurements along fault lines in relation to earthquake occurrence. The absolute point cumulative semivariogram (APCSV) and perturbation method (PM) help to depict the spatial distribution patterns of 222Rn, in addition to the joint effects of Kdr, the radon distribution coefficient, and the perturbation radon distribution coefficient (PRDC). The Kdr coefficient assists in identifying the spatial distributional behaviour of 222Rn concentrations and their migration along the Earth's surface layers. The PRDC considers not only the arithmetic averages but also the variances (or standard deviations) and the correlation coefficients, in addition to the size of the error among the 222Rn measurements. These methodologies are applied to 13,000 222Rn measurements, deemed sufficient for the characterization of tectonics in the Keban Reservoir along the East Anatolian Fault System (EAFS) in Turkey. The results are evaluated for the İçme earthquake (ML 5.4, depth 5.7 km, 23 June 2011), which occurred in the vicinity of the EAFS.
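As background to the APCSV method, the classical experimental semivariogram that such cumulative approaches build on can be estimated as follows. This is a generic Matheron-type estimator for spatial measurements (such as radon concentrations at monitoring sites), not the APCSV itself:

```python
import numpy as np

def experimental_semivariogram(xy, z, lags, tol=0.5):
    """Matheron estimator: gamma(h) = mean of squared value differences / 2,
    taken over station pairs whose separation is within tol of lag h."""
    i, j = np.triu_indices(len(z), k=1)          # all unordered station pairs
    d = np.linalg.norm(xy[i] - xy[j], axis=1)    # pairwise separations
    sq = (z[i] - z[j]) ** 2
    out = []
    for h in lags:
        mask = np.abs(d - h) < tol
        out.append(0.5 * sq[mask].mean() if mask.any() else np.nan)
    return np.array(out)

# Three collinear stations with measured values 0, 1, 2:
xy = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
gamma = experimental_semivariogram(xy, np.array([0.0, 1.0, 2.0]),
                                   lags=[1.0, 2.0], tol=0.1)
```

A rising gamma(h) indicates that nearby stations measure more similar values than distant ones, which is the spatial-dependence structure the radon methods exploit.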

  14. Long-term predictions using natural analogues

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, R.C. [Univ. of New Mexico, Albuquerque, NM (United States)]

    1995-09-01

    One of the unique and scientifically most challenging aspects of nuclear waste isolation is the extrapolation of short-term laboratory data (hours to years) to the long time periods (10^3-10^5 years) required by regulatory agencies for performance assessment. The direct validation of these extrapolations is not possible, but methods must be developed to demonstrate compliance with government regulations and to satisfy the lay public that there is a demonstrable and reasonable basis for accepting the long-term extrapolations. Natural systems (e.g., "natural analogues") provide perhaps the only means of partial "validation," as well as data that may be used directly in the models that are used in the extrapolation. Natural systems provide data on very large spatial (nm to km) and temporal (10^3-10^8 years) scales and in highly complex terranes in which unknown synergisms may affect radionuclide migration. This paper reviews the application (and most importantly, the limitations) of data from natural analogue systems to the "validation" of performance assessments.

  15. Predicted Surface Displacements for Scenario Earthquakes in the San Francisco Bay Region

    Science.gov (United States)

    Murray-Moraleda, Jessica R.

    2008-01-01

    In the immediate aftermath of a major earthquake, the U.S. Geological Survey (USGS) will be called upon to provide information on the characteristics of the event to emergency responders and the media. One such piece of information is the expected surface displacement due to the earthquake. In conducting probabilistic hazard analyses for the San Francisco Bay Region, the Working Group on California Earthquake Probabilities (WGCEP) identified a series of scenario earthquakes involving the major faults of the region, and these were used in their 2003 report (hereafter referred to as WG03) and the recently released 2008 Uniform California Earthquake Rupture Forecast (UCERF). Here I present a collection of maps depicting the expected surface displacement resulting from those scenario earthquakes. The USGS has conducted frequent Global Positioning System (GPS) surveys throughout northern California for nearly two decades, generating a solid baseline of interseismic measurements. Following an earthquake, temporary GPS deployments at these sites will be important to augment the spatial coverage provided by continuous GPS sites for recording postseismic deformation, as will the acquisition of Interferometric Synthetic Aperture Radar (InSAR) scenes. The information provided in this report allows one to anticipate, for a given event, where the largest displacements are likely to occur. This information is valuable both for assessing the need for further spatial densification of GPS coverage before an event and prioritizing sites to resurvey and InSAR data to acquire in the immediate aftermath of the earthquake. In addition, these maps are envisioned to be a resource for scientists in communicating with emergency responders and members of the press, particularly during the time immediately after a major earthquake before displacements recorded by continuous GPS stations are available.

  16. Long-term effect of early-life stress from earthquake exposure on working memory in adulthood

    Directory of Open Access Journals (Sweden)

    Li N

    2015-11-01

Na Li, Yumei Wang, Xiaochuan Zhao, Yuanyuan Gao, Mei Song, Lulu Yu, Lan Wang, Ning Li, Qianqian Chen, Yunpeng Li, Jiajia Cai, Xueyi Wang (Department of Psychiatry, The First Hospital of Hebei Medical University; Mental Health Institute of Hebei Medical University; Brain Ageing and Cognitive Neuroscience Laboratory, Hebei, People's Republic of China; Na Li and Yumei Wang contributed equally to this work). Objective: The present study aimed to investigate the long-term effect of early-life exposure to the 1976 Tangshan earthquake on working-memory performance in adulthood. Methods: A total of 907 subjects born and raised in Tangshan were enrolled in this study. They were divided into three groups according to date of birth: infant exposure (3–12 months old; n=274), prenatal exposure (n=269), and no exposure (born at least 1 year after the earthquake; n=364). The prenatal group was further divided into first-, second-, and third-trimester subgroups based on the timing of exposure during pregnancy. The Hopkins Verbal Learning Test-Revised (HVLT-R) and the Brief Visuospatial Memory Test-Revised (BVMT-R) were used to measure working-memory performance. Unconditional logistic regression analysis was used to identify influential factors for impaired working memory. Results: HVLT-R scores did not differ significantly across the three groups. Compared with the no-exposure group, BVMT-R scores were slightly lower in the prenatal exposure group and markedly lower in the infant exposure group. When the BVMT-R scores were analyzed in the three prenatal subgroups, subjects whose mothers were exposed to the earthquake in the second and third trimesters of pregnancy had significantly lower BVMT-R scores than those exposed in the first trimester. Education level and early-life earthquake exposure were identified as independent risk factors for reduced working-memory performance.

  17. A recent deep earthquake doublet in light of long-term evolution of Nazca subduction.

    Science.gov (United States)

    Zahradník, J; Čížková, H; Bina, C R; Sokos, E; Janský, J; Tavera, H; Carvalho, J

    2017-03-31

    Earthquake faulting at ~600 km depth remains puzzling. Here we present a new kinematic interpretation of two Mw7.6 earthquakes of November 24, 2015. In contrast to teleseismic analysis of this doublet, we use regional seismic data providing robust two-point source models, further validated by regional back-projection and rupture-stop analysis. The doublet represents segmented rupture of a ∼30-year gap in a narrow, deep fault zone, fully consistent with the stress field derived from neighbouring 1976-2015 earthquakes. Seismic observations are interpreted using a geodynamic model of regional subduction, incorporating realistic rheology and major phase transitions, yielding a model slab that is nearly vertical in the deep-earthquake zone but stagnant below 660 km, consistent with tomographic imaging. Geodynamically modelled stresses match the seismically inferred stress field, where the steeply down-dip orientation of compressive stress axes at ∼600 km arises from combined viscous and buoyant forces resisting slab penetration into the lower mantle and deformation associated with slab buckling and stagnation. Observed fault-rupture geometry, demonstrated likelihood of seismic triggering, and high model temperatures in young subducted lithosphere, together favour nanometric crystallisation (and associated grain-boundary sliding) attending high-pressure dehydration as a likely seismogenic mechanism, unless a segment of much older lithosphere is present at depth.

  18. Long term prediction of flood occurrence

    Directory of Open Access Journals (Sweden)

    C. Aguilar

    2016-05-01

How long a river remembers its past is still an open question. Perturbations occurring in large catchments may impact the flow regime for several weeks or months, providing a physical explanation for the occasional tendency of floods to occur in clusters. The research question explored in this paper may be stated as follows: can higher-than-usual river discharges in the low-flow season be associated with a higher probability of floods in the subsequent high-flow season? The physical explanation for such an association may be the presence of higher soil moisture storage at the beginning of the high-flow season, which may induce lower infiltration rates and therefore higher river runoff. Another possible explanation is climatic persistence, due to the presence of long-term properties in atmospheric circulation. We focus on the Po River at Pontelagoscuro, whose catchment area amounts to 71 000 km2. We look at the stochastic connection between average river flows in the pre-flood season and peak flows in the flood season by using a bivariate probability distribution. We found that the shape of the flood frequency distribution is significantly impacted by the river flow regime in the low-flow season. The proposed technique, which can be classified as a data-assimilation approach, may allow one to reduce the uncertainty associated with the estimation of flood probability.
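The bivariate connection the abstract describes can be illustrated as a conditional exceedance probability on a synthetic record with built-in catchment memory. The persistence model, thresholds, and numbers below are illustrative assumptions, not the Po at Pontelagoscuro data:

```python
import random

# Synthetic yearly record: a shared "catchment state" term induces the kind
# of persistence the abstract hypothesizes (soil moisture / climate memory),
# linking pre-flood-season flows to flood-season peaks.
random.seed(42)

years = []
for _ in range(5000):
    memory = random.gauss(0.0, 1.0)               # shared catchment state
    low_season = memory + random.gauss(0.0, 0.5)  # pre-flood-season mean flow
    peak = memory + random.gauss(0.0, 0.5)        # flood-season peak flow
    years.append((low_season, peak))

def cond_exceedance(pairs, wet_pre_season):
    """P(peak > 1 | pre-season flow above / below its long-term median)."""
    sel = [p for q, p in pairs if (q > 0.0) == wet_pre_season]
    return sum(1 for p in sel if p > 1.0) / len(sel)

p_wet = cond_exceedance(years, True)   # flood probability after a wet pre-season
p_dry = cond_exceedance(years, False)  # flood probability after a dry pre-season
```

With the shared `memory` term inducing persistence, the flood-season exceedance probability comes out markedly higher after a wet pre-flood season, mirroring the kind of dependence the paper exploits.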

  19. A NOVEL APPROACH FOR LONG TERM SOLAR RADIATION PREDICTION

    Directory of Open Access Journals (Sweden)

    Manju Khanna

    2018-10-01

With the present stress being laid on green energy worldwide, harnessing solar energy for commercial use makes sizing and long-term prediction of solar radiation important. However, with continuously changing environmental parameters, long-term prediction of solar radiation is difficult. Past research has carried out solar prediction only a few days ahead, which is insufficient for sizing and harnessing solar energy for commercial use. To close this gap, the present work applies the lifting wavelet transform together with ANFIS (adaptive neuro-fuzzy inference system) to predict radiation over long durations.
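The lifting wavelet transform mentioned above can be sketched with the simplest (Haar) lifting steps: a predict step producing detail coefficients and an update step producing a smoothed approximation, which would then feed a long-horizon regressor such as ANFIS. This is a minimal illustration, not the authors' implementation:

```python
def haar_lifting_forward(x):
    """One level of the Haar lifting wavelet transform.

    Split the signal into even/odd samples, predict the odd samples from the
    even ones (detail coefficients), then update the even samples so the
    approximation preserves the local mean.
    """
    even = x[0::2]
    odd = x[1::2]
    detail = [o - e for o, e in zip(odd, even)]         # predict step
    approx = [e + d / 2 for e, d in zip(even, detail)]  # update step
    return approx, detail

def haar_lifting_inverse(approx, detail):
    """Undo the lifting steps to reconstruct the original signal exactly."""
    even = [a - d / 2 for a, d in zip(approx, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):
        out.extend([e, o])
    return out
```

Applying the forward transform recursively to the approximation coefficients yields the multi-resolution decomposition on which a long-term predictor can be trained.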

  20. Long-Term Effects of the 2011 Japan Earthquake and Tsunami on Incidence of Fatal and Nonfatal Myocardial Infarction.

    Science.gov (United States)

    Nakamura, Motoyuki; Tanaka, Kentarou; Tanaka, Fumitaka; Matsuura, Yuuki; Komi, Ryousuke; Niiyama, Masanobu; Kawakami, Mikio; Koeda, Yorihiko; Sakai, Toshiaki; Onoda, Toshiyuki; Itoh, Tomonori

    2017-08-01

This study aimed to examine the long-term effects of the 2011 Japan earthquake and tsunami on the incidence of fatal and nonfatal myocardial infarction (MI). In the present study, the incidence of the 2 types of cardiac events was comprehensively recorded. The study area was divided into 2 zones based on the severity of tsunami damage, which was determined by the percentage of the inundated area within the residential area (r = 0.77). Exposure to the tsunami was associated with a continual increase in the incidence of fatal MI among disaster survivors.

  1. Child and Adolescent Mental Health in Haiti: Developing Long-Term Mental Health Services After the 2010 Earthquake.

    Science.gov (United States)

    Legha, Rupinder K; Solages, Martine

    2015-10-01

This article presents an overview of child and adolescent mental health in Haiti, emphasizing the role of structural violence and the factors shaping child protection. The 2010 Haiti earthquake is discussed as an acute on chronic event that highlighted the lack of pre-existing formal biomedical mental health services and worsened the impact of structural violence. Considerations for long-term, sustainable, culturally relevant child and adolescent mental health care in Haiti are also provided.

  2. Tohoku earthquake: a surprise?

    CERN Document Server

    Kagan, Yan Y

    2011-01-01

We consider three issues related to the 2011 Tohoku mega-earthquake: (1) how to evaluate the maximum earthquake size in subduction zones, (2) what the repeat time is for the largest earthquakes in the Tohoku area, and (3) what the possibilities are for short-term forecasts during the 2011 sequence. There are two quantitative methods which can be applied to estimate the maximum earthquake size: a statistical analysis of the available earthquake record and the moment conservation principle. The latter technique studies how much of the tectonic deformation rate is released by earthquakes. For the subduction zones, the seismic or historical record is not sufficient to provide a reliable statistical measure of the maximum earthquake. The moment conservation principle yields consistent estimates of maximum earthquake size: for all subduction zones the magnitude is of the order 9.0--9.7, and for the major subduction zones the maximum earthquake size is statistically indistinguishable. Starting in 1999 we have carried out...
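The moment conservation principle can be made concrete: balancing the tectonic moment accumulation rate against release in a single largest event gives a magnitude estimate. The shear modulus, fault area, convergence rate, and repeat time below are illustrative round numbers, not the paper's values:

```python
import math

def max_magnitude(mu_pa, area_m2, slip_rate_m_per_yr, repeat_time_yr, coupling=1.0):
    """Moment-conservation sketch: if the plate interface accumulates seismic
    moment at a steady rate, the largest earthquake consistent with that
    budget over a repeat time T has moment M0 = c * mu * A * s_rate * T,
    and moment magnitude Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    m0 = coupling * mu_pa * area_m2 * slip_rate_m_per_yr * repeat_time_yr
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Example: a 500 km x 200 km fully coupled interface, 8 cm/yr convergence,
# and a ~1000-year repeat time yields a magnitude in the 9+ range quoted above.
mw = max_magnitude(mu_pa=30e9, area_m2=500e3 * 200e3,
                   slip_rate_m_per_yr=0.08, repeat_time_yr=1000)
```

Under these assumptions the budget-limited magnitude is about Mw 9.5, consistent with the 9.0--9.7 range the abstract reports for subduction zones.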

  3. Intermediate-Term Declines in Seismicity at Mt. Wrangell and Mt. Veniaminof Volcanoes, Alaska, Following the November 3, 2002 Mw 7.9 Denali Fault Earthquake

    Science.gov (United States)

    Sanchez, J. J.; McNutt, S. R.

    2003-12-01

    On November 3, 2002 a Mw 7.9 earthquake ruptured segments of the Denali Fault and adjacent faults in interior Alaska providing a unique opportunity to look for intermediate-term (days to weeks) responses of Alaskan volcanoes to shaking from a large regional earthquake. The Alaska Volcano Observatory (AVO) monitors 24 volcanoes with seismograph networks. We examined one station per volcano, generally the closest to the vent (typically within 5 km) unless noise, or other factors made the data unusable. Data were digitally filtered between 0.8 and 5 Hz to enhance the signal-to-noise ratio. Data for the period four weeks before to four weeks after the Mw 7.9 earthquake were then plotted at a standard scale used for AVO routine monitoring. Mt. Veniaminof volcano, which has had recent mild eruptions and a rate of ten earthquakes per day on station VNNF, suffered a drop in seismicity by a factor of two after the earthquake; this lasted for 15 days. Wrangell, the closest volcano to the epicenter, had a background rate of about 16 earthquakes per day. Data from station WANC could not be measured for 3 days after the Mw 7.9 earthquake because the large number and size of aftershocks impeded identification of local earthquakes. For the following 30 days, however, its seismicity rate dropped by a factor of two. Seismicity then remained low for an additional 4 months at Wrangell, whereas that at Veniaminof returned to normal within weeks. The seismicity at both Mt. Veniaminof and Mt. Wrangell is dominated by low-frequency volcanic events. The detection thresholds for both seismograph networks are low and stations VNNF and WANC operated normally during the time of our study, thus we infer that the changes in seismicity may be related to the earthquake. It is known that Wrangell increased its heat output after the Mw 9.2 Alaska earthquake of 1964 and again after the Ms 7.1 St.Elias earthquake of 1979. 
The other volcanoes showed no changes in seismicity attributable to the Mw 7.9 earthquake.

  4. Predicting the long-term citation impact of recent publications

    NARCIS (Netherlands)

    Stegehuis, Clara; Litvak, Nelli; Waltman, Ludo

    2015-01-01

    A fundamental problem in citation analysis is the prediction of the long-term citation impact of recent publications. We propose a model to predict a probability distribution for the future number of citations of a publication. Two predictors are used: The impact factor of the journal in which a

  6. Parallel Earthquake Simulations on Large-Scale Multicore Supercomputers

    KAUST Repository

    Wu, Xingfu

    2011-01-01

Earthquakes are one of the most destructive natural hazards on our planet. Huge earthquakes striking offshore may cause devastating tsunamis, as evidenced by the 11 March 2011 Japan (moment magnitude Mw 9.0) and the 26 December 2004 Sumatra (Mw 9.1) earthquakes. Earthquake prediction (in terms of the precise time, place, and magnitude of a coming earthquake) is arguably unfeasible in the foreseeable future. To mitigate seismic hazards from future earthquakes in earthquake-prone areas, such as California and Japan, scientists have been using numerical simulations to study earthquake rupture propagation along faults and seismic wave propagation in the surrounding media on ever-advancing modern computers over the past several decades. In particular, ground-motion simulations for past and (possible) future significant earthquakes have been performed to understand the factors that affect ground shaking in populated areas, and to provide ground-shaking characteristics and synthetic seismograms for emergency preparation and the design of earthquake-resistant structures. These simulation results can guide the development of more rational seismic provisions, leading to safer, more efficient, and economical structures in earthquake-prone regions.

  7. Connecting depth limits of interseismic locking, microseismicity, and large earthquakes in models of long-term fault slip

    Science.gov (United States)

    Jiang, Junle; Lapusta, Nadia

    2017-08-01

Thickness of the seismogenic zone is commonly determined from the depth of microseismicity or from the fault locking depth inferred from geodetic observations. The relation between the two estimates and their connection to the depth limit of large earthquakes remain elusive. Here we explore the seismic and geodetic observables in models of faults governed by laboratory-based friction laws that combine quasi-static rate-and-state friction and enhanced dynamic weakening. Our models suggest that the transition between the locked and fully creeping regions can occur over a broad depth range. The effective locking depth, Delock, associated with concentrated loading that promotes microseismicity, is located at the top of this transition zone; the geodetic locking depth, Dglock, inverted from surface geodetic observations, corresponds to the depth at which the fault creeps at approximately half the long-term rate. Following large earthquakes, Delock either stays unchanged or becomes shallower as creep penetrates into the shallower locked areas, whereas Dglock deepens as the slip-deficit region expands to compensate for the afterslip. As a result, the two locking depths diverge in the late interseismic period, consistent with available seismic and geodetic observations from several major fault segments in Southern California. We find that Dglock provides a bound on the depth limit of large earthquakes in our models. However, the assumed layered distribution of fault friction and simple depth estimates are insufficient to characterize more heterogeneous faults, e.g., ones with significant along-strike variations. Improved observations and models are needed to illuminate the physical properties and seismic potential of fault zones.
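The geodetic locking depth discussed above is inferred from surface velocity profiles. A standard first-order forward model for a strike-slip fault is the Savage-Burford (1973) screw dislocation, not the rate-and-state simulations of the paper; the sketch below fits that model to a synthetic profile by grid search:

```python
import math

def interseismic_velocity(x_km, locking_depth_km, plate_rate_mm_yr):
    """Surface fault-parallel velocity at distance x from a strike-slip fault
    locked from the surface to `locking_depth_km` and creeping below it
    (screw-dislocation model of Savage & Burford, 1973)."""
    return (plate_rate_mm_yr / math.pi) * math.atan(x_km / locking_depth_km)

def fit_locking_depth(xs_km, vels_mm_yr, plate_rate_mm_yr):
    """Grid-search estimate of the geodetic locking depth (Dglock in the
    abstract's notation) from a fault-perpendicular velocity profile."""
    best_d, best_misfit = None, float("inf")
    for d10 in range(5, 301):  # candidate depths 0.5 .. 30.0 km in 0.1 km steps
        d = d10 / 10.0
        misfit = sum((v - interseismic_velocity(x, d, plate_rate_mm_yr)) ** 2
                     for x, v in zip(xs_km, vels_mm_yr))
        if misfit < best_misfit:
            best_d, best_misfit = d, misfit
    return best_d

# A synthetic profile generated with a 15 km locking depth is recovered:
xs = [-100, -50, -20, -5, 5, 20, 50, 100]
obs = [interseismic_velocity(x, 15.0, 35.0) for x in xs]
d_hat = fit_locking_depth(xs, obs, 35.0)
```

In real applications the inferred Dglock trades off against the assumed plate rate and, as the abstract stresses, against afterslip and along-strike heterogeneity.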

  8. AN EFFECTIVE HYBRID SUPPORT VECTOR REGRESSION WITH CHAOS-EMBEDDED BIOGEOGRAPHY-BASED OPTIMIZATION STRATEGY FOR PREDICTION OF EARTHQUAKE-TRIGGERED SLOPE DEFORMATIONS

    Directory of Open Access Journals (Sweden)

    A. A. Heidari

    2015-12-01

Earthquakes can pose serious hazards to natural slopes and land infrastructure. One of their chief consequences is landsliding, instigated by strong, sustained shaking. In this research, an efficient procedure is proposed to assist the prediction of earthquake-induced slope displacements (EIDS). A new hybrid chaos-embedded SVR-BBO strategy is implemented to predict the EIDS. For this purpose, a chaos paradigm is first combined with the initialization of biogeography-based optimization (BBO) to enhance the diversification and intensification capacity of the conventional BBO optimizer. Chaotic BBO is then used as the search scheme to find the best values of the support vector regression (SVR) parameters. The paper demonstrates the effectiveness of this new computing approach for predicting EIDS, and the outcomes affirm that the chaos-embedded SVR-BBO strategy can be employed effectively as a predictive tool for evaluating EIDS.
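The chaos-embedded search idea can be sketched with a logistic map generating candidate SVR hyperparameters (C and gamma on log scales): the map's ergodic, non-repeating orbit covers the search range more evenly than repeated pseudo-random restarts. The quadratic `loss` below is a stand-in for the cross-validation error of an actual SVR model, and the search ranges are illustrative assumptions, not the paper's setup:

```python
def logistic_map(x):
    """One step of the logistic map in its fully chaotic regime (r = 4)."""
    return 4.0 * x * (1.0 - x)

def chaotic_search(loss, lo, hi, x0=0.7, n_iter=200):
    """Minimize `loss` over the box [lo, hi]^2 using logistic-map samples.

    Two coupled chaotic sequences are rescaled into the search box; the best
    candidate seen is kept (in the paper this seeds/searches BBO habitats)."""
    x, y = x0, logistic_map(x0)
    best_p, best_val = None, float("inf")
    for _ in range(n_iter):
        x, y = logistic_map(x), logistic_map(y)
        p = (lo + (hi - lo) * x, lo + (hi - lo) * y)
        v = loss(p)
        if v < best_val:
            best_p, best_val = p, v
    return best_p, best_val

# Stand-in objective with a known minimum at (log10 C, log10 gamma) = (2, -1);
# a real application would evaluate SVR cross-validation error here.
loss = lambda p: (p[0] - 2.0) ** 2 + (p[1] + 1.0) ** 2
(best_c, best_g), best_val = chaotic_search(loss, lo=-3.0, hi=3.0)
```

The returned `(best_c, best_g)` would be exponentiated into the actual SVR `C` and `gamma`; the chaotic sequence is deterministic, so runs are reproducible.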

  9. Reply to “The VAN earthquake predictions,” by D. A. Rhoades and F. F. Evison

    Science.gov (United States)

    Varotsos, P.; Lazaridou, M.

    Rhoades and Evison [1996] indicated a technical flaw in the procedure of Mulargia and Gasperini [1992] and also made many useful remarks and fundamental suggestions on the correct way for performing a statistical evaluation of an earthquake prediction method. We think that these suggestions should be carefully followed by statisticians in the future. However, we do not agree with Rhoades and Evison's [1996] opinion that objective tests of the performance of the VAN-method, using independent data, cannot begin until the method and the null hypothesis have been fully formulated. In our opinion the main question in the present debate is whether our predictions can be ascribed to chance. Such a test has already been carried out in this issue by Aceves et al. [1996], although they make it clear that they have tested the significance of our predictions and not the overall success of our method.

  10. A Test of a Strong Ground Motion Prediction Methodology for the 7 September 1999, Mw=6.0 Athens Earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Hutchings, L; Ioannidou, E; Voulgaris, N; Kalogeras, I; Savy, J; Foxall, W; Stavrakakis, G

    2004-08-06

We test a methodology to predict the range of ground-motion hazard for a fixed-magnitude earthquake along a specific fault or within a specific source volume, and we demonstrate how to incorporate this into probabilistic seismic hazard analyses (PSHA). We modeled ground motion with empirical Green's functions. In testing our methodology with the 7 September 1999, Mw=6.0 Athens earthquake, we: (1) developed constraints on rupture parameters based on prior knowledge of earthquake rupture processes and sources in the region; (2) generated impulsive point shear-source empirical Green's functions by deconvolving out the source contribution of M < 4.0 aftershocks; (3) used aftershocks that occurred throughout the area and not necessarily along the fault to be modeled; (4) ran a sufficient number of scenario earthquakes to span the full variability of possible ground motion; (5) found that the distribution of our synthesized ground motions spans what actually occurred and is realistically narrow; (6) determined that one of our source models generates records that match the observed time histories well; (7) found that certain combinations of rupture parameters produced "extreme" ground motions at some stations; (8) identified that the "best fitting" rupture models occurred in the vicinity of 38.05° N, 23.60° E, with the center of rupture near 12 km depth and near-unilateral rupture towards the areas of high damage, consistent with independent investigations; and (9) synthesized strong-motion records in high-damage areas for which records of the earthquake were not obtained. We then developed a demonstration PSHA for a source region near Athens utilizing synthesized ground motion rather than traditional attenuation relations. We synthesized 500 earthquakes distributed throughout the source zone likely to have Mw=6.0 earthquakes near Athens. We assumed an average return period of 1000 years for this
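The empirical Green's function synthesis used in the steps above amounts, at its core, to lagging and summing small-event recordings, which already carry the true path and site effects. This toy sketch (the rupture geometry, lags, weights, and EGF trace are invented for illustration) shows only that core operation:

```python
def synthesize(egf, lags, weights):
    """Lag-and-sum EGF synthesis: out[t] = sum_i w_i * egf[t - lag_i].

    Each (lag, weight) pair represents one subevent on the scenario rupture;
    in practice weights follow moment scaling and lags follow rupture timing."""
    n = len(egf) + max(lags)
    out = [0.0] * n
    for lag, w in zip(lags, weights):
        for t, v in enumerate(egf):
            out[t + lag] += w * v
    return out

# Toy EGF pulse and three subevents rupturing away from the hypocentre:
egf = [0.0, 1.0, 0.5, 0.0]
motion = synthesize(egf, lags=[0, 2, 4], weights=[1.0, 1.0, 1.0])
```

Varying the lags and weights over many scenario ruptures is what produces the spread of synthesized ground motions described in steps (4)-(5).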

  11. The Effects of a Short-term Cognitive Behavioral Group Intervention on Bam Earthquake Related PTSD Symptoms in Adolescents

    Directory of Open Access Journals (Sweden)

    Fatemeh Naderi

    2009-04-01

Objective: Post-traumatic stress disorder (PTSD) may be the first reaction after disasters. Many studies have shown the efficacy of cognitive-behavioral therapy in the treatment of post-traumatic stress disorder. The main objective of this study is to evaluate the effect of group CBT in adolescent survivors of a large-scale disaster (the Bam earthquake). Methods: In a controlled trial, we evaluated the efficacy of a short-term method of group cognitive-behavioral therapy in adolescent survivors of the Bam earthquake who had PTSD symptoms and compared them with a control group. Adolescents who had severe PTSD or other psychiatric disorders that needed pharmacological interventions were excluded. We evaluated PTSD symptoms using the Post-traumatic Stress Scale (PSS) pre- and post-intervention and compared them with a control group. Results: 100 adolescents were included in the study and 15 were excluded during the intervention. The mean age of the participants was 14.6±2.1 years. The mean score of total PTSD symptoms and the symptoms of avoidance were reduced after the intervention, and the reduction was statistically significant. The mean changes in the re-experience and hyper-arousal symptoms of PTSD were not significant. Conclusion: Psychological debriefing and group cognitive-behavioral therapy may be effective in reducing some PTSD symptoms.

  12. Long-term slow slip events along the Nankai trough subduction zone after the 2011 Tohoku earthquake in Japan

    Science.gov (United States)

    Ozawa, Shinzaburo

    2017-04-01

The global navigation satellite system (GNSS) network in Japan has detected transient crustal deformation in regions along the Nankai trough subduction zone in southwest Japan from approximately 2013, after the 2011 Tohoku earthquake. Using the GNSS data, we estimated the spatiotemporal evolution of long-term aseismic slip along the Nankai trough. The result indicates that aseismic slip has occurred on the plate interface in the Bungo, northern Miyazaki, and southern Miyazaki regions, southwest Japan. The estimated time evolution between October 2013 and April 2015 shows the simultaneous occurrence of northern and southern Miyazaki slow slips with different durations followed by a Bungo slow slip in 2014. A southern Miyazaki slow slip occurred from approximately July 2015, which was followed by a northern Miyazaki slow slip and a Bungo slow slip in 2016. The 2016 Bungo slow slip occurred in a shallow area that did not slip at the time of the 2014 Bungo slow slip. The two different rupture processes from 2013 to 2015 and from 2015 to 2016 may be an important clue toward understanding subduction tectonics in southwest Japan. These interplate slow slip events are changing the stress state in favor of the occurrence of Nankai and Hyuga-nada earthquakes together with Tokai and Kii channel slow slips, which have been occurring since approximately 2013 and 2014, respectively.

  13. Unusual Animal Behavior Preceding the 2011 Earthquake off the Pacific Coast of Tohoku, Japan: A Way to Predict the Approach of Large Earthquakes

    Directory of Open Access Journals (Sweden)

    Hiroyuki Yamauchi

    2014-04-01

Unusual animal behaviors (UABs) have been observed before large earthquakes (EQs); however, their mechanisms are unclear. While information on UABs has been gathered after many EQs, few studies have focused on the proportion of animals showing UABs or on specific behaviors prior to EQs. On 11 March 2011, an EQ (Mw 9.0) occurred in Japan, which took about twenty thousand lives, including missing and killed persons. We surveyed UABs of pets preceding this EQ using a questionnaire. Additionally, we explored whether dairy cow milk yields varied before this EQ in particular locations. In the results, 236 of 1,259 dog owners and 115 of 703 cat owners observed UABs in their pets, with restless behavior being the most prominent change in both species. Most UABs occurred within one day of the EQ. The UABs showed a precursory relationship with epicentral distance. Interestingly, cow milk yields in a milking facility within 340 km of the epicenter decreased significantly about one week before the EQ, whereas cows in facilities farther away showed no significant decreases. Since both the pets' behavior and the dairy cows' milk yields were affected prior to the EQ, careful observation of them could contribute to EQ predictions.

  14. Human Cervicovaginal Fluid Biomarkers to Predict Term and Preterm Labour

    Directory of Open Access Journals (Sweden)

    Yujing Jan Heng

    2015-05-01

Preterm birth (PTB; birth before 37 completed weeks of gestation) remains the major cause of neonatal morbidity and mortality. The current generation of biomarkers predictive of PTB has limited utility. In pregnancy, the human cervicovaginal fluid (CVF) proteome is a reflection of the local biochemical milieu and is influenced by the physical changes occurring in the vagina, cervix and adjacent overlying fetal membranes. Term and preterm labour (PTL) share common pathways of cervical ripening, myometrial activation and fetal membrane rupture leading to birth. We therefore hypothesise that CVF biomarkers predictive of labour may be similar in both the term and preterm labour settings. In this review, we summarise some of the existing published literature as well as our team's breadth of work utilising the CVF for the discovery and validation of putative CVF biomarkers predictive of human labour. Our team established an efficient method for collecting serial CVF samples for optimal 2-dimensional gel electrophoresis resolution and analysis. We first embarked on CVF biomarker discovery for the prediction of spontaneous onset of term labour using 2D-electrophoresis and solution-array multiple-analyte profiling. 2D-electrophoretic analyses were subsequently performed on CVF samples associated with PTB. Several proteins have been successfully validated; they are associated with term labour and PTL and may be predictive of both. In addition, the measurement of these putative biomarkers was found to be robust to the influences of vaginal microflora and/or semen. The future development of a multiple-biomarker bedside test would help improve the prediction of PTB and the clinical management of patients.

  15. Physical approach to short-term wind power prediction

    CERN Document Server

    Lange, Matthias

    2006-01-01

This book offers an approach to the ultimate goal of short-term prediction of the power output of wind farms. It addresses scientists and engineers working in wind-energy-related R&D and industry, as well as graduate students and nonspecialist researchers in the fields of atmospheric physics and meteorology.

  16. Controls on the long term earthquake behavior of an intraplate fault revealed by U-Th and stable isotope analyses of syntectonic calcite veins

    Science.gov (United States)

    Williams, Randolph; Goodwin, Laurel; Sharp, Warren; Mozley, Peter

    2017-04-01

    U-Th dates on calcite precipitated in coseismic extension fractures in the Loma Blanca normal fault zone, Rio Grande rift, NM, USA, constrain earthquake recurrence intervals from 150-565 ka. This is the longest direct record of seismicity documented for a fault in any tectonic environment. Combined U-Th and stable isotope analyses of these calcite veins define 13 distinct earthquake events. These data show that for more than 400 ka the Loma Blanca fault produced earthquakes with a mean recurrence interval of 40 ± 7 ka. The coefficient of variation for these events is 0.40, indicating strongly periodic seismicity consistent with a time-dependent model of earthquake recurrence. Stochastic statistical analyses further validate the inference that earthquake behavior on the Loma Blanca was time-dependent. The time-dependent nature of these earthquakes suggests that the seismic cycle was fundamentally controlled by a stress renewal process. However, this periodic cycle was punctuated by an episode of clustered seismicity at 430 ka. Recurrence intervals within the earthquake cluster were as low as 5-11 ka. Breccia veins formed during this episode exhibit carbon isotope signatures consistent with having formed through pronounced degassing of a CO2 charged brine during post-failure, fault-localized fluid migration. The 40 ka periodicity of the long-term earthquake record of the Loma Blanca fault is similar in magnitude to recurrence intervals documented through paleoseismic studies of other normal faults in the Rio Grande rift and Basin and Range Province. We propose that it represents a background rate of failure in intraplate extension. The short-term, clustered seismicity that occurred on the fault records an interruption of the stress renewal process, likely by elevated fluid pressure in deeper structural levels of the fault, consistent with fault-valve behavior. 
The relationship between recurrence interval and inferred fluid degassing suggests that pore fluid pressure exerted a first-order control on the timing of fault failure.
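The periodicity claim above is quantified by the coefficient of variation (CV) of inter-event times: CV well below 1 indicates quasi-periodic recurrence, CV near 1 Poisson-like behaviour, and CV above 1 clustering. The event ages below are illustrative, not the Loma Blanca U-Th dates:

```python
def recurrence_stats(event_ages_ka):
    """Mean recurrence interval and coefficient of variation (CV) from a
    list of event ages ordered oldest to youngest (in ka)."""
    intervals = [a - b for a, b in zip(event_ages_ka[:-1], event_ages_ka[1:])]
    mean = sum(intervals) / len(intervals)
    var = sum((t - mean) ** 2 for t in intervals) / len(intervals)
    return mean, (var ** 0.5) / mean

# Illustrative quasi-periodic record: five events, ~40 ka apart.
mean_ri, cv = recurrence_stats([560, 520, 485, 440, 400])
```

A CV this far below 1, like the 0.40 reported for the Loma Blanca fault, argues for a time-dependent (stress-renewal) recurrence model rather than a Poisson one.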

  17. Short-term volcano-tectonic earthquake forecasts based on a moving mean recurrence time algorithm: the El Hierro seismo-volcanic crisis experience

    Science.gov (United States)

    García, Alicia; De la Cruz-Reyna, Servando; Marrero, José M.; Ortiz, Ramón

    2016-05-01

    Under certain conditions, volcano-tectonic (VT) earthquakes may pose significant hazards to people living in or near active volcanic regions, especially on volcanic islands; however, hazard arising from VT activity caused by localized volcanic sources is rarely addressed in the literature. The evolution of VT earthquakes resulting from a magmatic intrusion shows some orderly behaviour that may allow the occurrence and magnitude of major events to be forecast. Thus governmental decision makers can be supplied with warnings of the increased probability of larger-magnitude earthquakes on the short-term timescale. We present here a methodology for forecasting the occurrence of large-magnitude VT events during volcanic crises; it is based on a mean recurrence time (MRT) algorithm that translates the Gutenberg-Richter distribution parameter fluctuations into time windows of increased probability of a major VT earthquake. The MRT forecasting algorithm was developed after observing a repetitive pattern in the seismic swarm episodes occurring between July and November 2011 at El Hierro (Canary Islands). From then on, this methodology has been applied to the consecutive seismic crises registered at El Hierro, achieving a high success rate in the real-time forecasting, within 10-day time windows, of volcano-tectonic earthquakes.
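The MRT approach rests on Gutenberg-Richter statistics. A common maximum-likelihood b-value estimator and the implied mean recurrence time for a larger target magnitude can be sketched as follows; the catalogue numbers are illustrative assumptions, and this is not the authors' full algorithm:

```python
import math

def b_value(mags, m_min):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_min."""
    m = [x for x in mags if x >= m_min]
    return math.log10(math.e) / (sum(m) / len(m) - m_min)

def mean_recurrence_time(n_events, duration_days, b, m_min, m_target):
    """Mean recurrence time (days) of events >= m_target, extrapolating the
    observed rate at m_min with the G-R relation N(>=M) ~ 10**(-b*M)."""
    rate_min = n_events / duration_days
    rate_target = rate_min * 10 ** (-b * (m_target - m_min))
    return 1.0 / rate_target

# Toy swarm: 300 events of M >= 1.5 in 30 days with b = 1.0 implies an
# M >= 4.0 event roughly every ~32 days; a drop in b shortens that window.
t_major = mean_recurrence_time(300, 30, 1.0, 1.5, 4.0)
```

Tracking fluctuations of the fitted b-value through a swarm, as the MRT algorithm does, turns such extrapolated rates into time windows of increased probability of a major VT event.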

  18. An overview about the pre-earthquake signals observed in relation to Tohoku M9 EQ and the current status of EQ prediction study in Japan

    Science.gov (United States)

    Nagao, T.

    2012-12-01

    On 11 March 2011, M:9.0 Tohoku EQ, with a huge tsunami, occurred resulting in a devastation of the Pacific side of entire northeastern Japan. As of now, this EQ disaster turned into nuclear hazard raised by the Fukushima #1 nuclear power plant. Now "Fukushima" becomes very famous Japanese word like "Hiroshima". In the presentation, the author would like to summarize seismic, geodetic and electromagnetic pre-earthquake changes except ionospheric phenomena such as OLR and GPS-TEC anomalies. Seismic and geodetic anomalies: Concerning seismicity, many authors present seismic quiescence. Katasumata (2011, EPS, 63, 709-712) claimed more than 20 years seismic quiescence by using the JMA seismic catalog since 1965. The Institute of Statistical Mathematics also claimed notable seismic quiescence all over the Japanese island 15 years ago. Furthermore, the author's group applied the weighted coefficient method in time, space and magnitude, called the RTM method (Nagao et al, 2011, EPS, 63, 315-324). The RTM method also showed clear seismicity change almost two years before the EQ. Concerning b-value of the GR law, it many researchers stated that the b-value of the foreshock activity was very small (0.4). Tanaka (2012, GRL, 39, L00G26, doi:10.1029/2012GL051179) reported that strong long-term statistical correlations between tidally-induced stresses and earthquake occurrence times. The author considers this phenomenon was one of the most notable pre-earthquake changes before the EQ. Kato (2012, Science, 335, 705-708, doi: 10.1126/science.1215141) identified two distinct sequences of foreshocks migrating at rates of 2 to 10 kilometers per day along the trench axis toward the epicenter. According to the GPS base-line observations, GSI reported that non-steady state changes were started around 2003 in Tohoku region. Furthermore, large back-slip is also recognized around the epicentral area since 2007. Electromagnetic anomalies According to Hase (pers. 
comm.), after correction of
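
The foreshock b-value mentioned above is conventionally estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - (Mc - ΔM/2)), with Utsu's bin-width correction. A minimal sketch; the catalog and completeness magnitude Mc below are illustrative, not the JMA data:

```python
import math

def b_value_mle(mags, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with Utsu's correction
    for magnitude binning: only events with M >= Mc are used."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    # Utsu correction: shift Mc down by half the magnitude bin width
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

# Hypothetical catalog (illustrative magnitudes only)
catalog = [4.1, 4.3, 5.2, 4.0, 4.6, 4.9, 4.2, 5.5, 4.4, 4.7]
print(round(b_value_mle(catalog, mc=4.0), 2))  # → 0.68
```

A low b-value such as the 0.4 reported for the foreshocks indicates a deficit of small events relative to large ones.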

  19. Improving Transit Predictions of Known Exoplanets with TERMS

    Directory of Open Access Journals (Sweden)

    Mahadevan S.

    2011-02-01

Transiting planet discoveries have largely been restricted to the short-period or low-periastron-distance regimes because of the bias inherent in the geometric transit probability. By refining planetary orbital parameters, and hence shrinking transit windows, long-period planets become feasible targets for photometric follow-up. Here we describe the TERMS project, which monitors these host stars at predicted transit times.

  20. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Background Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. Methods We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey’s ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case – an 8.4 magnitude earthquake that hit southern Peru in 2001. Results and conclusions Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post- earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters. PMID:26090999
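
The exposure classification described above can be sketched as a simple bucketing of ShakeMap intensity at each child's area of residence; the cut-offs and records below are illustrative, not the study's:

```python
def exposure_group(mmi):
    """Bucket a ShakeMap intensity value at a child's area of residence
    into coarse exposure groups. Cut-offs are illustrative only."""
    if mmi is None or mmi < 4.0:
        return "unexposed"
    return "moderate" if mmi < 6.0 else "high"

# Hypothetical survey records: (child_id, intensity at residence)
records = [("a", 2.5), ("b", 5.1), ("c", 7.3)]
groups = {cid: exposure_group(m) for cid, m in records}
print(groups)  # → {'a': 'unexposed', 'b': 'moderate', 'c': 'high'}
```

Stunting prevalence can then be compared across the resulting groups, before and after the event.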

  2. Long-term time series prediction using OP-ELM.

    Science.gov (United States)

    Grigorievskiy, Alexander; Miche, Yoan; Ventelä, Anne-Mari; Séverin, Eric; Lendasse, Amaury

    2014-03-01

In this paper, an Optimally Pruned Extreme Learning Machine (OP-ELM) is applied to the problem of long-term time series prediction. Three known strategies for long-term time series prediction, i.e., Recursive, Direct and DirRec, are considered in combination with OP-ELM and compared with a baseline linear least-squares model and Least-Squares Support Vector Machines (LS-SVM). Among these three strategies, DirRec is the most time-consuming, and its use with nonlinear models like LS-SVM, where several hyperparameters need to be adjusted, leads to relatively heavy computations. It is shown that OP-ELM, although also a nonlinear model, allows reasonable computational time for the DirRec strategy. In all but one of our experiments, OP-ELM with the DirRec strategy outperforms the linear model with any strategy. In contrast to the proposed algorithm, LS-SVM behaves unstably without variable selection. It is also shown that there is no superior strategy for OP-ELM: any of the three can be the best. In addition, the prediction accuracy of an ensemble of OP-ELMs is studied, and it is shown that averaging the predictions of the ensemble can improve the accuracy (mean square error) dramatically.
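
The three multi-step strategies compared in the paper can be sketched with ordinary least squares standing in as a toy substitute for OP-ELM; the function names are mine, not the paper's:

```python
import numpy as np

def fit_linear(X, y):
    # Ordinary least squares; a toy stand-in for OP-ELM or LS-SVM
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def recursive_forecast(series, lag, horizon):
    """Recursive: one one-step model, fed its own predictions."""
    s = [float(v) for v in series]
    X = np.array([s[i:i + lag] for i in range(len(s) - lag)])
    w = fit_linear(X, np.array(s[lag:]))
    window, out = s[-lag:], []
    for _ in range(horizon):
        nxt = float(np.dot(window, w))
        out.append(nxt)
        window = window[1:] + [nxt]
    return out

def direct_forecast(series, lag, horizon):
    """Direct: a separate model per horizon step h, mapping the last
    `lag` values straight to x[t+h]."""
    s = np.asarray(series, float)
    preds = []
    for h in range(1, horizon + 1):
        X = np.array([s[i:i + lag] for i in range(len(s) - lag - h + 1)])
        w = fit_linear(X, s[lag + h - 1:])
        preds.append(float(s[-lag:] @ w))
    return preds

def dirrec_forecast(series, lag, horizon):
    """DirRec: a new model per step whose input window grows by one,
    so later models also see the earlier predictions."""
    s = np.asarray(series, float)
    inp, preds = list(s[-lag:]), []
    for h in range(1, horizon + 1):
        width = lag + h - 1
        X = np.array([s[i:i + width] for i in range(len(s) - width)])
        w = fit_linear(X, s[width:])
        preds.append(float(np.dot(inp, w)))
        inp.append(preds[-1])
    return preds
```

On a noiseless linear ramp all three reproduce the continuation exactly; their differences only show up on noisy, nonlinear series, which is what the paper studies.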

  3. Long-term associative learning predicts verbal short-term memory performance.

    Science.gov (United States)

    Jones, Gary; Macken, Bill

    2017-10-02

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term memory system separate from long-term knowledge. Using natural language corpora, we show experimentally and computationally that performance on three widely used measures of short-term memory (digit span, nonword repetition, and sentence recall) can be predicted from simple associative learning operating on the linguistic environment to which a typical child may have been exposed. The findings support the broad view that short-term verbal memory performance reflects the application of long-term language knowledge to the experimental setting.

  4. [Hyperbilirubinemia in full-term newborns. Predictive factors].

    Science.gov (United States)

    Carbonell Estrany, X; Botet Mussons, F; Figueras Aloy, J; Riu Godó, A

    1999-04-01

Nowadays, economic criteria lead to early maternal hospital discharge, even before 48 hours after delivery, producing an increase in neonatal readmissions for hyperbilirubinemia. We tried to predict which healthy term newborns may develop significant hyperbilirubinemia (≥17 mg/dl in the first 4 days of life). Bilirubin in umbilical cord blood, transcutaneous measurements of bilirubin at 24, 48 and between 60 and 96 hours of life, and bilirubin in blood obtained from heel-sticks at 96 hours were analyzed in 610 newborns. Moreover, serum bilirubin was determined at the same time points in 169 newborns submitted to blood extraction for other reasons. The transcutaneous bilirubinometer used was a Minolta/Air-Shields JM-102. Significant hyperbilirubinemia was present in 2.95% of the newborns. The correlation between serum and transcutaneous bilirubin was high (r = 0.92). Bilirubin values at 24 and 48 hours of ≥6 mg/dl and ≥9 mg/dl, respectively, predicted subsequent hyperbilirubinemia with a sensitivity of 100% at both time points, specificity of 47.5% and 64.3%, positive predictive values of 7.3% and 16.4%, respectively, and a negative predictive value of 100% for both. Transcutaneous measurement at 48 hours with a cut-off point of 13 (equivalent to a bilirubinemia of 9 mg/dl) predicted hyperbilirubinemia with a sensitivity of 94.4%, specificity of 51.7%, positive predictive value of 6.0% and negative predictive value of 99.6%. If the newborn presents a bilirubinemia ≥6 mg/dl at 24 hours, or ≥9 mg/dl or a transcutaneous measurement ≥13 at 48 hours, a new bilirubin measurement should be performed between 48 and 72 hours of life.
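
The reported sensitivity, specificity and predictive values follow from standard 2×2 screening-table arithmetic; a quick sketch with illustrative counts (not the study's raw data):

```python
def screening_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-test metrics, of the kind used to
    evaluate the 24 h and 48 h bilirubin cut-offs."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),  # positive predictive value
        "npv": tn / (tn + fn),  # negative predictive value
    }

# Illustrative counts only: 18 affected newborns all flagged (fn = 0),
# plus 90 false positives among 592 unaffected newborns.
m = screening_metrics(tp=18, fp=90, fn=0, tn=502)
print(m["sensitivity"], round(m["ppv"], 3))  # → 1.0 0.167
```

Note how a rare outcome (about 3% prevalence here) yields a low PPV even at 100% sensitivity, as in the abstract.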

  5. Summary of the GK15 ground‐motion prediction equation for horizontal PGA and 5% damped PSA from shallow crustal continental earthquakes

    Science.gov (United States)

Graizer, Vladimir; Kalkan, Erol

    2016-01-01

We present a revised ground-motion prediction equation (GMPE) for computing medians and standard deviations of peak ground acceleration (PGA) and 5%-damped pseudospectral acceleration (PSA) response ordinates of the horizontal component of randomly oriented ground motions, for use in seismic-hazard analyses and engineering applications. This GMPE is derived from the expanded Next Generation Attenuation (NGA)-West 1 database (see Data and Resources; Chiou et al., 2008). The revised model includes an anelastic attenuation term as a function of quality factor (Q0) to capture regional differences in far-source (beyond 150 km) attenuation, and a new frequency-dependent sedimentary-basin scaling term as a function of depth to the 1.5 km/s shear-wave velocity isosurface to improve ground-motion predictions at sites located on deep sedimentary basins. The new Graizer-Kalkan 2015 (GK15) model, developed to be simple, is applicable to the western United States and other similar shallow crustal continental regions in active tectonic environments, for earthquakes with moment magnitudes (M) 5.0-8.0, distances of 0-250 km, average shear-wave velocities in the upper 30 m (VS30) of 200-1300 m/s, and spectral periods (T) of 0.01-5 s. Our aleatory variability model captures interevent (between-event) variability, which decreases with magnitude and increases with distance. A mixed-effects residuals analysis reveals that GK15 has no trend with respect to the independent predictor parameters. Compared to our 2007-2009 GMPE, the PGA values are very similar, whereas the predicted spectral ordinates are larger at T < 0.2 s and smaller at longer periods.
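
The stated applicability limits can be encoded as a simple validity check before evaluating the GMPE; a convenience sketch (the function name is mine, and only the ranges quoted in the abstract are used):

```python
def gk15_applicable(m, r_km, vs30, t=None):
    """Check inputs against the applicability range stated for GK15:
    M 5.0-8.0, distance 0-250 km, VS30 200-1300 m/s, T 0.01-5 s."""
    ok = (5.0 <= m <= 8.0
          and 0.0 <= r_km <= 250.0
          and 200.0 <= vs30 <= 1300.0)
    if t is not None:  # spectral period check only for PSA ordinates
        ok = ok and 0.01 <= t <= 5.0
    return ok

print(gk15_applicable(6.5, 30.0, 760.0, 1.0))  # → True
print(gk15_applicable(4.2, 30.0, 760.0))       # → False (M too small)
```

Such guards are worth keeping explicit, since GMPEs extrapolate poorly outside their calibration ranges.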

  6. Long term (2004-2013) correlation analysis among SSTAs (Significant Sequences of TIR Anomalies) and Earthquakes (M>4) occurrence over Greece: examples of application within a multi-parametric system for continuous seismic hazard monitoring.

    Science.gov (United States)

    Tramutoli, Valerio; Coviello, Irina; Eleftheriou, Alexander; Filizzola, Carolina; Genzano, Nicola; Lacava, Teodosio; Lisi, Mariano; Makris, John P.; Paciello, Rossana; Pergola, Nicola; Satriano, Valeria; vallianatos, filippos

    2015-04-01

Real-time integration of multi-parametric observations is expected to contribute significantly to the development of operational systems for time-Dependent Assessment of Seismic Hazard (t-DASH) and earthquake short-term (days to weeks) forecast. A very preliminary step in this direction, however, is the identification of those parameters (chemical, physical, biological, etc.) whose anomalous variations can be, to some extent, associated with the complex process of preparation of major earthquakes. In this paper one of these parameters (the Earth's emitted radiation in the thermal infrared spectral region) is considered for its possible correlation with M≥4 earthquakes that occurred in Greece between 2004 and 2013. The RST (Robust Satellite Technique) data analysis approach and the RETIRA (Robust Estimator of TIR Anomalies) index were used first to define, and then to identify, Significant Sequences of TIR Anomalies (SSTAs) in 10 years (2004-2013) of daily TIR images acquired by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) on board the Meteosat Second Generation (MSG) satellite. Taking into account the physical models proposed to justify a correlation between TIR anomalies and earthquake occurrence, specific validation rules (in line with the ones used by the Collaboratory for the Study of Earthquake Predictability - CSEP - Project) were defined to drive the correlation analysis. The analysis shows that more than 93% of all identified SSTAs occur in the pre-fixed space-time window around the time and location of occurrence of (M≥4) earthquakes, with a false-positive rate smaller than 7%. The achieved results, and particularly the very low rate of false positives registered over such a long test period, seem already sufficient (at least) to qualify TIR anomalies (identified by the RST approach and RETIRA index) among the parameters to be considered in the framework of a multi-parametric approach to time-Dependent Assessment of
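
In the RST literature, the RETIRA index compares the current spatially de-trended TIR signal at each pixel with that pixel's historical statistics. A minimal numpy sketch, omitting the robust statistics, cloud masking and the restriction to homogeneous acquisition conditions (same month and hour) that the operational method requires:

```python
import numpy as np

def retira_index(images):
    """Simplified RETIRA-style index for the most recent scene.

    images : array (time, ny, nx) of co-located TIR scenes; the last
    scene is tested against the per-pixel history of the others."""
    images = np.asarray(images, float)
    # dT(x, t): deviation of each pixel from its scene's spatial mean
    dt = images - images.mean(axis=(1, 2), keepdims=True)
    mu = dt[:-1].mean(axis=0)      # historical mean of dT per pixel
    sigma = dt[:-1].std(axis=0)    # historical std of dT per pixel
    return (dt[-1] - mu) / sigma   # anomaly in historical-sigma units
```

Pixels whose index exceeds a fixed threshold (e.g. 2-4 sigma) would then be candidate TIR anomalies, and SSTAs are sequences of such anomalies persistent in space and time.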

  7. Anomalous crustal movements before great Wenchuan earthquake observed by GPS

    Directory of Open Access Journals (Sweden)

    Gu Guohua

    2011-05-01

Studies of GPS data carried out before and after the great Wenchuan earthquake of Ms 8.0 on May 12, 2008 show that anomalous crustal movements occurred before the earthquake. Data from 4 pre-earthquake observation sessions at a dense network of stations show prominent, broad-ranged long- and mid-term anomalies in horizontal displacements and strain and in vertical displacements. Data from the smaller number of reference stations with continuous GPS observations since 1999 in West and South China showed short-term preseismic anomalies in horizontal displacements. The detection of co-seismic horizontal displacements at these stations supports the existence of the pre-earthquake anomalies. Results of single-epoch solutions of data from continuous-observation stations near the epicenter also show large imminent anomalies in vertical displacements. Although the Wenchuan earthquake was not predicted, these results strongly indicate that GPS should be a main observation technique for long-term, mid-term, short-term and imminent earthquake prediction.

  8. Long-term weather predictability: Ural case study

    Science.gov (United States)

    Kubyshen, Alexander; Shopin, Sergey

    2016-04-01

The accuracy of state-of-the-art long-term meteorological forecasts (at the seasonal level) is still low. Here we present an approach (the RAMES method) that realizes a different forecasting methodology, providing a prediction horizon of up to 19-22 years with equal probabilities of determining parameters in every analyzed period [1]. The basic statements of the method are the following. 1. Long-term forecasting on the basis of numerical modeling of the global meteorological process is fundamentally impossible; an extended long-term prediction horizon can be obtained only by revealing and using the periodicity of meteorological situations at one point of observation. 2. The conventional calendar is unsuitable for generalizing meteorological data and revealing the cyclicity of meteorological processes. The RAMES method uses natural time intervals: one day, the synodic month and one year. A set of special calendars was developed using these natural periods and the Metonic cycle. 3. A long-term time series of meteorological data is not a uniform universal set; it is a sequence of 28 universal sets appropriately superseding each other in time. The specifics of the method are: 1. Use of an original research toolkit consisting of a set of calendars based on the Metonic cycle, and a set of charts (coordinate systems) for constructing sequence diagrams (of daily variability of a meteorological parameter during the analyzed year; of daily variability of a meteorological parameter using long-term dynamical time series of analogue periods; of monthly and yearly variability of the accumulated value of a meteorological parameter). 2. Identification and use of new virtual meteorological objects with several degrees of generalization, appropriately located in the coordinate systems used. 3. Integration of all calculations into a single technological scheme providing comparison and mutual verification of the results. 
During prolonged testing in the Ural region, it was
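
The Metonic cycle underlying the RAMES calendars rests on 19 tropical years being nearly equal to 235 synodic months; the near-coincidence is easy to check (the constants are standard mean values):

```python
TROPICAL_YEAR = 365.24219   # mean tropical year, days
SYNODIC_MONTH = 29.530589   # mean lunation, days

years_19 = 19 * TROPICAL_YEAR        # length of 19 tropical years
months_235 = 235 * SYNODIC_MONTH     # length of 235 lunations
print(round(years_19, 2), round(months_235, 2),
      round(months_235 - years_19, 2))  # → 6939.6 6939.69 0.09
```

The two periods agree to about two hours over 19 years, which is why lunar phases (and, in the RAMES premise, lunar-tied periodicities) recur on nearly the same calendar dates after one Metonic cycle.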

  9. Linking short- and long-term deformation along an active margin: regional tectono-geomorphic patterns in light of the 2010 Maule Chile earthquake (M8.8)

    Science.gov (United States)

    Jara-Muñoz, Julius; Melnick, Daniel; Brill, Dominik; Strecker, Manfred

    2014-05-01

Strongly coupled subduction zones are known to have generated some of the largest earthquakes on Earth (megathrust earthquakes). These regions are also associated with an array of tectonic landforms, including multiple marine and fluvial terraces, which are intimately coupled with the long-term effects of seismogenic processes. Understanding the parameters that control the along-strike propagation of megathrust earthquake ruptures, combined with the analysis of tectonic landforms, is thus fundamental for the assessment of seismic hazards and risk mitigation. Here we report on the 2010 Maule earthquake that ruptured ~500 km of the central Chile margin. Modeling of GPS data during the interseismic and co-seismic periods has revealed segmentation into two main areas of high slip release and coupling. However, the spatiotemporal persistence of these segments and their relation to mechanical properties of the forearc are still poorly understood. To elucidate the relationships between short-term rupture segments and long-term tectono-geomorphic entities of the forearc, we quantified permanent, long-term deformation using marine terraces in the Maule rupture zone and evaluated its relation to inter- and co-seismic patterns. We used the MIS-5 marine terrace, a ubiquitous geomorphic reference surface along the coast of central Chile, which we correlated with LiDAR images, field observations and new OSL ages. Furthermore, we evaluated the mechanisms of uplift by forward modeling of plate-boundary slip. Coeval terraces are sharply offset across discrete crustal faults and also deformed in areas of broad crustal warping with wavelengths of ~100 km, reflecting activity of deep-seated structures within the interplate zone, both at the southern and northern sectors of the Maule rupture, where uplift rates reach 1.8 mm/yr. The central part, in turn, is characterized by a lesser degree of permanent uplift. Based on the similarities between seismic-cycle deformation and historical

  10. Comparison of ground motions estimated from prediction equations and from observed damage during the M = 4.6 1983 Liège earthquake (Belgium)

    Directory of Open Access Journals (Sweden)

    D. García Moreno

    2013-08-01

On 8 November 1983 an earthquake of magnitude 4.6 damaged more than 16 000 buildings in the region of Liège (Belgium). The extraordinary damage produced by this earthquake, considering its moderate magnitude, is extremely well documented, giving the opportunity to compare the consequences of a recent moderate earthquake in a typical old city of Western Europe with scenarios obtained by combining strong ground motions and vulnerability modelling. The present study compares 0.3 s spectral accelerations estimated from ground-motion prediction equations typically used in Western Europe with those obtained locally by applying the statistical distribution of damaged masonry buildings to two fragility curves, one derived from the HAZUS programme of FEMA (FEMA, 1999) and another developed for high-vulnerability buildings by Lang and Bachmann (2004), and to a method proposed by Faccioli et al. (1999) relating the seismic vulnerability of buildings to the damage and ground motions. The results of this comparison reveal good agreement between the maximum spectral accelerations calculated from these vulnerability and fragility curves and those predicted from attenuation equations, suggesting peak ground accelerations for the epicentral area of the 1983 earthquake of 0.13–0.20 g (g: gravitational acceleration).
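
Fragility curves of the HAZUS type relate ground motion to damage probability through a lognormal CDF; a sketch with illustrative parameters, not those of the curves cited above:

```python
import math

def fragility(pga_g, theta=0.18, beta=0.5):
    """HAZUS-style lognormal fragility curve: probability of reaching
    a damage state given PGA in g. theta (median capacity) and beta
    (log standard deviation) are illustrative values only."""
    z = math.log(pga_g / theta) / beta
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# At the 0.13-0.20 g range suggested for the 1983 epicentral area:
for pga in (0.13, 0.20):
    print(round(fragility(pga), 2))  # prints 0.26 then 0.58
```

Inverting such curves against the observed proportion of damaged masonry buildings is, in essence, how the local ground-motion estimates in the study are obtained.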

  11. Earthquakes in British Columbia

    National Research Council Canada - National Science Library

    1991-01-01

This pamphlet provides information about the causes of earthquakes, where earthquakes occur, British Columbia plate tectonics, earthquake patterns, earthquake intensity, geology and earthquake impact...

  12. Prediction of long-term failure in Kevlar 49 composites

    Energy Technology Data Exchange (ETDEWEB)

    Gerstle, F.P. Jr.

    1982-01-01

Creep rupture data for Kevlar 49/epoxy usually exhibit considerable scatter: the coefficient of variation (CV) about the mean failure time at a given stress exceeds 100%. Quasi-static strength data, in contrast, show little scatter: <4% CV for pressure vessels and <10% for impregnated strands. In this paper, analysis of existing creep rupture data on Kevlar/epoxy vessels at four storage pressures has produced an interesting and useful result. A significant portion of the scatter in failure times for pressure vessels is due to spool-to-spool variation in the eight spools of Kevlar fiber used to wind the vessels. The rank order of mean times to failure was consistent over a pressure range from 3400 to 4300 psi, 68 to 86% of short-term burst. Also, the coefficient of variation about the mean failure time for each spool was less than that for the total sample. The statistical inference that the sample is nonhomogeneous was supported by a nonparametric check using the Kruskal-Wallis test and by a parametric analysis of variance. The rank order found in long-term tests did not unequivocally agree with static strength ranks; several spool sets were distinctly high or low. The implication is that, while static strengths are not valid predictors of long-term behavior, short-term creep rupture tests at high stress definitely are. The material difference that causes the spool-to-spool variations has not yet been identified for all eight spools. However, it appears that Kevlar behavior at lower pressures may be predicted through the use of curves fitted to the data for each spool. A power law relating failure time to pressure, t = t0 (p/p0)^m, was found to fit the data reasonably well. A further implication is that, both in composite vessel design and in creep rupture experiments, the pressure (or stress) level should be carefully controlled.
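
The power law t = t0 (p/p0)^m is linear in log-log space, so t0 and m can be recovered by ordinary least squares on log-transformed data; a sketch on synthetic (not the report's) data:

```python
import math

def fit_power_law(pressures, times, p0):
    """Least-squares fit of log t = log t0 + m * log(p / p0),
    returning (t0, m)."""
    xs = [math.log(p / p0) for p in pressures]
    ys = [math.log(t) for t in times]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    m = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    t0 = math.exp(ybar - m * xbar)
    return t0, m

# Synthetic noise-free failure times following t = 100 * (p/5000)^-20
ps = [3400, 3700, 4000, 4300]
ts = [100.0 * (p / 5000.0) ** -20 for p in ps]
t0, m = fit_power_law(ps, ts, p0=5000.0)
print(round(t0, 3), round(m, 3))  # → 100.0 -20.0
```

The large negative exponent illustrates why the stress level must be tightly controlled: a few percent change in pressure shifts the predicted failure time by an order of magnitude.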

  13. Can mine tremors be predicted? Observational studies of earthquake nucleation, triggering and rupture in South African mines

    CSIR Research Space (South Africa)

    Durrheim, RJ

    2012-05-01

Mining-induced earthquakes pose a risk to workers in deep mines, while natural earthquakes pose a risk to people living close to plate boundaries and even in stable continental regions. A five-year, ca. US$3 million Japan-SA collaborative project...

  14. Human short-term spatial memory: precision predicts capacity.

    Science.gov (United States)

    Banta Lavenex, Pamela; Boujon, Valérie; Ndarugendamwo, Angélique; Lavenex, Pierre

    2015-03-01

Here, we aimed to determine the capacity of human short-term memory for allocentric spatial information in a real-world setting. Young adults were tested on their ability to learn, on a trial-unique basis, and remember over a 1-min interval the location(s) of 1, 3, 5, or 7 illuminating pads among 23 pads distributed in a 4 m × 4 m arena surrounded by curtains on three sides. Participants had to walk to and touch the pads with their foot to illuminate the goal locations. In contrast to the predictions of classical slot models of working memory, whose capacity is limited to a fixed number of items (Miller's magical number 7 or Cowan's magical number 4), we found that the number of locations visited to find the goals was consistently about 1.6 times the number of goals, whereas the number of correct choices before erring and the number of errorless trials varied with memory load even when memory load was below the hypothetical memory capacity. In contrast to resource models of visual working memory, we found no evidence that memory resources were evenly distributed among unlimited numbers of items to be remembered. Instead, we found that memory for even one individual location was imprecise, and that memory performance for one location could be used to predict memory performance for multiple locations. Our findings are consistent with a theoretical model suggesting that the precision of memory for individual locations might determine the capacity of human short-term memory for spatial information.

  15. Predicting Short-term Performance of Multifocal Contact Lenses.

    Science.gov (United States)

    Diec, Jennie; Tilia, Daniel; Naduvilath, Thomas; Bakaraju, Ravi C

    2017-11-01

To investigate whether initial multifocal contact lens (MFCL) performance predicts short-term dispensing performance, we performed a retrospective analysis of 55 participants (Px) in a masked, crossover clinical trial using ACUVUE OASYS for Presbyopia and AIR OPTIX AQUA Multifocal. Subjective questionnaires were administered at the following times: initial fitting; two take-home questionnaires (THQ) completed between days 2 and 4; and at assessment, ≥5 days after fitting. Questionnaires covered vision clarity and lack of ghosting at distance, intermediate and near, at day/night time points, rated on a 1 to 10 scale (1-step increments, 10 most favorable). Vision stability, vision while driving, overall vision satisfaction, willingness to purchase and comfort, as well as acuity-based measures, were also collected. There were no statistical differences in comfort or vision at any distance, in vision stability, or in vision while driving at either time point between THQ and assessment (P > 0.05). However, there was a statistical decline in subjective overall vision satisfaction and comfort between the fitting and assessment visits (P < 0.001). Willingness to purchase remained the same at fitting and assessment in 68% of Px, whereas only 4% of Px converted to a positive willingness to purchase at assessment. The majority of acuity-based measures remained constant between the fitting and assessment visits. Initial performance at fitting was thus not able to predict short-term performance of MFCLs: subjective measures peaked at fitting and declined thereafter, whereas acuity-based measures remained constant. The utility of subjective rating tools may nonetheless aid practitioners in gauging the success of MFCL fitting.

  16. After the damages: Lessons learned from recent earthquakes for ground-motion prediction and seismic hazard assessment (C.F. Gauss Lecture)

    Science.gov (United States)

    Cotton, Fabrice

    2017-04-01

Recent damaging earthquakes (e.g., Japan 2011, Nepal 2015, Italy 2016) and the associated ground-shaking (ground-motion) records challenge the engineering models used to quantify seismic hazard. The goal of this presentation is to present the lessons learned from these recent events and discuss their implications for ground-motion prediction and probabilistic seismic hazard assessment. The following points will be particularly addressed: 1) Recent observations clearly illustrate the dependency of ground shaking on earthquake-source-related factors (e.g., fault properties and geometry, earthquake depth, directivity). The weaknesses of classical models and the impact of these factors on hazard evaluation will be analysed and quantified. 2) These observations also show that events of similar magnitude and style of faulting produce highly variable ground motions. We will analyse this variability and show that the exponential growth of recorded data gives a unique opportunity to quantify regional and between-event variations in shaking. Indeed, most seismic-hazard evaluations do not consider the regional specificities of earthquake or wave-propagation properties. There is little guidance in the literature on how this should be done, and we will show that this challenge is interdisciplinary, as structural geology, neotectonics and tomographic images can provide key understanding of these regional variations. 3) One of the key lessons of recent earthquakes is that extreme hazard scenarios and ground shaking are difficult to predict. In other words, we need to mobilize "scientific imagination" and define new strategies, based on the latest research results, to capture epistemic uncertainties and integrate them in engineering seismology projects. We will discuss these strategies and show an example of their implementation in the development of new seismic hazard maps of Europe (SHARE and SERA projects) and Germany.

  17. Prediction of long-term effects of juvenile idiopathic arthritis

    Directory of Open Access Journals (Sweden)

    Ye.M. Zaytseva

    2017-12-01

Background. Juvenile idiopathic arthritis (JIA) is characterized by chronic inflammation of the joints, with a progressive course and a strong tendency toward early disability. Long-term follow-up of patients with JIA allows us to trace the evolution of the disease and to assess the correlation between the course of the pathological process and the type of disease onset, age, sex, degree of activity and other factors, contributing to more exact methods for early detection of the adverse health consequences of JIA and to correct treatment. The purpose of the research was to improve the prognostic criteria for JIA through a retrospective chart review of patients with disease durations of 1 year and of 5-7-10 years. Materials and methods. The study included 47 children with the articular form of JIA, aged from 2 to 18 years. For diagnosis, clinical examination and standard laboratory and instrumental tests were used. Regression analysis was used to determine prognostic criteria, the informative value of the features and the predictive factors for each criterion. Results. It was established that for predicting the further 5-7-year course of JIA, the patient's gender, inflammation of the small joints of the hands and the presence of rheumatoid factor (RF) played an essential role; for a 10-year and longer prognosis, the presence of eye lesions and the numbers of both affected and active joints were important. The significance of the following parameters increased in the prognostic model: the presence of RF, the duration of morning stiffness, the radiological stage of the disease, and laboratory indicators such as erythrocyte sedimentation rate and level of C-reactive protein. For the 10-year prognosis of JIA, the effectiveness of initial treatment with methotrexate was beneficial. These models determine variants of the long-term course of JIA and include remission (up to 2 points), stabilization of the pathological process

  18. NGA-West2 equations for predicting vertical-component PGA, PGV, and 5%-damped PSA from shallow crustal earthquakes

    Science.gov (United States)

    Stewart, Jonathan P.; Boore, David M.; Seyhan, Emel; Atkinson, Gail M.

    2016-01-01

    We present ground motion prediction equations (GMPEs) for computing natural log means and standard deviations of vertical-component intensity measures (IMs) for shallow crustal earthquakes in active tectonic regions. The equations were derived from a global database with M 3.0–7.9 events. The functions are similar to those for our horizontal GMPEs. We derive equations for the primary M- and distance-dependence of peak acceleration, peak velocity, and 5%-damped pseudo-spectral accelerations at oscillator periods between 0.01 and 10 s. We observe pronounced M-dependent geometric spreading and region-dependent anelastic attenuation for high-frequency IMs. We do not observe significant region-dependence in site amplification. Aleatory uncertainty is found to decrease with increasing magnitude; within-event variability is independent of distance. Compared to our horizontal-component GMPEs, attenuation rates are broadly comparable (somewhat slower geometric spreading, faster apparent anelastic attenuation), VS30-scaling is reduced, nonlinear site response is much weaker, within-event variability is comparable, and between-event variability is greater.
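
The general structure described above (magnitude scaling, M-dependent geometric spreading, and apparent anelastic attenuation) can be sketched as follows. The coefficients here are invented placeholders chosen only to illustrate the shape of such a model, not the published NGA-West2 vertical-component values:

```python
import numpy as np

def ln_gmpe(M, R_rup, c0=-1.0, c1=1.2, c2=-0.4, h=5.0, b1=-1.1, b2=0.15, a0=-0.004):
    """Illustrative GMPE functional form: quadratic magnitude scaling,
    magnitude-dependent geometric spreading, and apparent anelastic
    attenuation.  All coefficients are placeholders for illustration."""
    R = np.sqrt(R_rup**2 + h**2)                 # effective distance with pseudo-depth h
    f_mag = c0 + c1 * M + c2 * (M - 6.0)**2      # magnitude scaling
    f_geom = (b1 + b2 * (M - 4.5)) * np.log(R)   # M-dependent geometric spreading
    f_anel = a0 * R                              # apparent anelastic attenuation
    return f_mag + f_geom + f_anel               # natural-log mean of the IM

# Larger magnitude gives larger predicted motion at a fixed distance,
# and motion attenuates with distance at a fixed magnitude.
assert ln_gmpe(7.0, 20.0) > ln_gmpe(5.0, 20.0)
assert ln_gmpe(6.0, 10.0) > ln_gmpe(6.0, 100.0)
```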

  19. Long-term psychological consequences among adolescent survivors of the Wenchuan earthquake in China: A cross-sectional survey six years after the disaster.

    Science.gov (United States)

    Tanaka, Eizaburo; Tsutsumi, Atsuro; Kawakami, Norito; Kameoka, Satomi; Kato, Hiroshi; You, Yongheng

    2016-11-01

    Most epidemiological studies on adolescent survivors' mental health have been conducted within 2 years after the disaster. Longer-term psychological consequences remain unclear. This study explored psychological symptoms in secondary school students who were living in Sichuan province 6 years after the Wenchuan earthquake. A secondary data analysis was performed on data from a final survey of survivors conducted 6 years after the Wenchuan earthquake as part of the five-year mental health and psychosocial support project. A total of 2641 participants were divided into three groups, according to the level of traumatic experience exposure during the earthquake (0, 1, and 2 or more). ANCOVA was used to compare the mean scores of the Symptom Checklist-90 (SCL-90) among the three groups, adjusting for covariates such as age, gender, ethnicity, having a sibling, parents' divorce, and socio-economic status. Logistic regression analysis was used to identify relationships between the traumatic experiences and suicidality after the disaster. Having two or more kinds of traumatic experiences was associated with higher psychological symptom scores on the SCL-90 (Cohen's d=0.23-0.33) and suicidal ideation (OR 1.98, 95% CI: 1.35-2.89) and attempts (OR 3.32, 95% CI: 1.65-6.68), as compared with having no traumatic experience. Causality cannot be inferred from this cross-sectional survey, and results may not generalize to other populations due to convenience sampling. Severely traumatized adolescent survivors of the earthquake may suffer from psychological symptoms even 6 years after the disaster. Long-term psychological support will be needed for these individuals. Copyright © 2016 Elsevier B.V. All rights reserved.
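
As a side note on the statistics quoted above, an odds ratio with a Wald 95% confidence interval of the kind reported (e.g., OR 1.98, 95% CI: 1.35-2.89) can be computed directly from a 2x2 exposure table. The counts below are invented for illustration and are not the study's data:

```python
import numpy as np

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a Wald 95% confidence interval from a 2x2 table:
    a/b = exposed cases/non-cases, c/d = unexposed cases/non-cases."""
    or_ = (a * d) / (b * c)
    se = np.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log odds ratio
    lo = np.exp(np.log(or_) - z * se)
    hi = np.exp(np.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts: 40 of 400 highly exposed vs. 25 of 500 unexposed
# reporting the outcome.
or_, lo, hi = odds_ratio_ci(40, 360, 25, 475)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```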

  20. Discussion of the design of satellite-laser measurement stations in the eastern Mediterranean under the geological aspect. Contribution to the earthquake prediction research by the Wegener Group and to NASA's Crustal Dynamics Project

    Science.gov (United States)

    Paluska, A.; Pavoni, N.

    1983-01-01

    Research conducted to determine the location of stations for measuring crustal dynamics and predicting earthquakes is discussed. Procedural aspects, extraregional kinematic tendencies, and regional tectonic deformation mechanisms are described.

  1. A first look at global flash drought: long term change and short term predictability

    Science.gov (United States)

    Yuan, Xing; Wang, Linying; Ji, Peng

    2017-04-01

    "Flash drought" became popular after the unexpected 2012 central USA drought, mainly due to its rapid development, low predictability and devastating impacts on water resources and crop yields. A pilot study by Mo and Lettenmaier (2015) found that flash drought, based on a definition of concurrent heat extreme, soil moisture deficit and evapotranspiration (ET) enhancement at the pentad scale, was in decline over the USA during the past 100 years. Meanwhile, recent work indicated that the occurrence of flash drought in China doubled during the past 30 years, and a severe flash drought in the summer of 2013 ravaged 13 provinces in southern China. As global warming increases the frequency of heat waves and accelerates the hydrological cycle, flash drought is expected to increase in general, but its trend might also be affected by interannual to decadal climate oscillations. To consolidate the hotspots of flash drought and the effects of climate change on flash drought, a global inventory is being conducted using multi-source observations (in-situ, satellite and reanalysis), CMIP5 historical simulations and future projections under different forcing scenarios, as well as global land surface hydrological modeling for key variables including surface air temperature, soil moisture and ET. In particular, a global picture of the flash drought distribution, the contribution of natural and anthropogenic forcings to global flash drought change, and the risk of global flash drought in the future will be presented. Besides investigating the long-term change of flash drought, providing reliable early warning is also essential to developing adaptation strategies. While regional drought early warning systems have been emerging in the recent decade, forecasting of flash drought is still at an exploratory stage due to limited understanding of flash drought predictability. Here, a set of sub-seasonal to seasonal (S2S) hindcast datasets are being used to assess the short term

  2. Short-term wind speed predictions with machine learning techniques

    Science.gov (United States)

    Ghorbani, M. A.; Khatibi, R.; FazeliFard, M. H.; Naghipour, L.; Makarynskyy, O.

    2016-02-01

    Hourly wind speed forecasting is presented by a modeling study with possible applications to practical problems including farming wind energy, aircraft safety and airport operations. Modeling techniques employed in this paper for such short-term predictions are based on the machine learning techniques of artificial neural networks (ANNs) and genetic expression programming (GEP). Recorded values of wind speed were used, which comprised 8 years of collected data at the Kersey site, Colorado, USA. The January data over the first 7 years (2005-2011) were used for model training; and the January data for 2012 were used for model testing. A number of model structures were investigated for the validation of the robustness of these two techniques. The prediction results were compared with those of a multiple linear regression (MLR) method and with the Persistence method developed for the data. The model performances were evaluated using the correlation coefficient, root mean square error, Nash-Sutcliffe efficiency coefficient and Akaike information criterion. The results indicate that forecasting wind speed is feasible using past records of wind speed alone, but the maximum lead time for the data was found to be 14 h. The results show that different techniques would lead to different results, where the choice between them is not easy. Thus, decision making has to be informed of these modeling results and decisions should be arrived at on the basis of an understanding of inherent uncertainties. The results show that both GEP and ANN are equally credible selections and even MLR should not be dismissed, as it has its uses.
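
The comparison of a multiple linear regression model against the persistence baseline, evaluated by RMSE as in the study, can be sketched on synthetic data. The AR(1)-style series below is only a stand-in for the Kersey wind record, not the actual data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic hourly wind speed with strong autocorrelation, standing in
# for a recorded series such as the Kersey, Colorado data.
n = 2000
w = np.empty(n)
w[0] = 5.0
for t in range(1, n):
    w[t] = 0.9 * w[t - 1] + 0.5 + rng.normal(0.0, 1.0)

lead = 1                                              # one-hour-ahead forecast
X = np.column_stack([w[:-lead], np.ones(n - lead)])   # lagged speed + intercept
y = w[lead:]

split = int(0.8 * (n - lead))                         # train/test split
coef, *_ = np.linalg.lstsq(X[:split], y[:split], rcond=None)  # MLR fit

mlr_pred = X[split:] @ coef
persistence_pred = X[split:, 0]   # persistence: forecast = last observed value

rmse = lambda p: np.sqrt(np.mean((y[split:] - p) ** 2))
print(f"MLR RMSE {rmse(mlr_pred):.3f} vs persistence RMSE {rmse(persistence_pred):.3f}")
```

On autocorrelated data like this, the fitted regression typically edges out persistence, mirroring the paper's point that even MLR should not be dismissed.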

  3. OPERATIONAL EARTHQUAKE FORECASTING. State of Knowledge and Guidelines for Utilization

    Directory of Open Access Journals (Sweden)

    Koshun Yamaoka

    2011-08-01

    Full Text Available Following the 2009 L'Aquila earthquake, the Dipartimento della Protezione Civile Italiana (DPC) appointed an International Commission on Earthquake Forecasting for Civil Protection (ICEF) to report on the current state of knowledge of short-term prediction and forecasting of tectonic earthquakes and to indicate guidelines for the utilization of possible forerunners of large earthquakes to drive civil protection actions, including the use of probabilistic seismic hazard analysis in the wake of a large earthquake. The ICEF reviewed research on earthquake prediction and forecasting, drawing from developments in seismically active regions worldwide. A prediction is defined as a deterministic statement that a future earthquake will or will not occur in a particular geographic region, time window, and magnitude range, whereas a forecast gives a probability (greater than zero but less than one) that such an event will occur. Earthquake predictability, the degree to which the future occurrence of earthquakes can be determined from the observable behavior of earthquake systems, is poorly understood. This lack of understanding is reflected in the inability to reliably predict large earthquakes in seismically active regions on short time scales. Most proposed prediction methods rely on the concept of a diagnostic precursor; i.e., some kind of signal observable before earthquakes that indicates with high probability the location, time, and magnitude of an impending event. Precursor methods reviewed here include changes in strain rates, seismic wave speeds, and electrical conductivity; variations of radon concentrations in groundwater, soil, and air; fluctuations in groundwater levels; electromagnetic variations near and above Earth's surface; thermal anomalies; anomalous animal behavior; and seismicity patterns. The search for diagnostic precursors has not yet produced a successful short-term prediction scheme. Therefore, this report focuses on operational

  4. Long-term predictions of minewater geothermal systems heat resources

    Science.gov (United States)

    Harcout-Menou, Virginie; De Ridder, Fjo; Laenen, Ben; Ferket, Helga

    2014-05-01

    Abandoned underground mines usually flood due to the natural rise of the water table. In most cases the process is relatively slow, giving the mine water time to equilibrate thermally with the surrounding rock massif. Typical mine water temperature is too low to be used for direct heating, but is well suited to be combined with heat pumps. For example, heat extracted from the mine can be used during winter for space heating, while the process can be reversed during summer to provide space cooling. Although not yet widespread, the use of low-temperature geothermal energy from abandoned mines has already been implemented in the Netherlands, Spain, the USA, Germany and the UK. Reliable reservoir modelling is crucial to predict how geothermal minewater systems will react to predefined exploitation schemes and to define the energy potential and development strategy of large-scale geothermal cold/heat-storage mine water systems. However, most numerical reservoir modelling software is developed for typical environments, such as porous media (among others, many codes developed for petroleum reservoirs or groundwater formations), and cannot be applied to mine systems. Indeed, mines are atypical environments that encompass different types of flow, namely porous media flow, fracture flow and open pipe flow, usually described with different modelling codes. Ideally, 3D models accounting for the subsurface geometry, geology, hydrogeology, thermal aspects and flooding history of the mine as well as long-term effects of heat extraction should be used. A new modelling approach is proposed here to predict the long-term behaviour of minewater geothermal systems in a reactive and reliable manner. The simulation method integrates concepts for heat and mass transport through various media (e.g., back-filled areas, fractured rock, fault zones). As a base, the standard software EPANET2 (Rossman 1999; 2000) was used. Additional equations for describing heat flow through the mine (both

  5. Data-Interpretation Methodologies for Non-Linear Earthquake Response Predictions of Damaged Structures

    Directory of Open Access Journals (Sweden)

    Yves Reuland

    2017-07-01

    Full Text Available Seismic exposure of buildings presents difficult engineering challenges. The principles of seismic design involve structures that sustain damage and still protect inhabitants. Precise and accurate knowledge of the residual capacity of damaged structures is essential for informed decision-making regarding clearance for occupancy after major seismic events. Unless structures are permanently monitored, modal properties derived from ambient vibrations are most likely the only source of measurement data that are available. However, such measurement data are linearly elastic and limited to a low number of vibration modes. Structural identification using hysteretic behavior models that exclusively relies on linear measurement data is a complex inverse engineering task that is further complicated by modeling uncertainty. Three structural identification methodologies that involve probabilistic approaches to data interpretation are compared: error-domain model falsification, Bayesian model updating with traditional assumptions as well as modified Bayesian model updating. While noting the assumptions regarding uncertainty definitions, the accuracy and robustness of identification and subsequent predictions are compared. A case study demonstrates limits on non-linear parameter identification performance and identification of potentially wrong prediction ranges for inappropriate model uncertainty distributions.

  6. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    Science.gov (United States)

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have rift-type geotectonic settings. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. Together, these observations may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than loading by plate-tectonic forces. The latter model generally underlies basic assumptions made in earthquake hazard assessment: that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model may therefore require re-examining the basic assumptions of hazard assessment.

  7. Maternal, Newborn, and Child Health After the 2015 Nepal Earthquakes: An Investigation of the Long-term Gendered Impacts of Disasters.

    Science.gov (United States)

    Brunson, Jan

    2017-12-01

    Introduction: Natural disasters in resource-poor countries have differential effects on socially disadvantaged groups such as women. In addition to the acute reproductive health needs of women during the immediate response phase of a disaster, research suggests that maternal, newborn, and child health (MNCH) may continue to be seriously impacted for numerous months, even years, after the event. Methods: This ethnographic field research investigates the impacts of the 2015 Nepal earthquakes on mothers and children under five on the 6-month anniversary of the earthquakes. Results: Though families were not channeling household funds away from health care expenses for pregnant and lactating women and children under five, the findings suggest that a delayed response by the Nepali government in administering funds for rebuilding, combined with an ongoing fuel crisis, was negatively impacting families' abilities to provide adequate shelter, warmth, cooking gas, and transportation for mothers and young children. This study highlights the importance of understanding the impacts of specific social and political contexts on intra-household family finances as they relate to MNCH, not just variables related to the disaster itself. Discussion: Future research and policies on MNCH during the long-term recovery period after a natural disaster such as the 2015 Nepal earthquakes should therefore take into account the social and political context, as well as institute multiple periodic assessments of MNCH in the first few years following the disaster.

  8. Long-Term b Value Variations of Shallow Earthquakes in New Zealand: A HMM-Based Analysis

    Science.gov (United States)

    Lu, Shaochuan

    2017-04-01

    The magnitude-frequency relationship is a fundamental statistic in seismology. Customarily, the temporal variations of b values in the magnitude-frequency distribution are demonstrated via the "sliding-window" approach. However, the window size is often tuned only empirically, which may cause difficulties in the interpretation of b value variability. In this study, a continuous-time hidden Markov model (HMM) is applied to characterize b value variations of New Zealand shallow earthquakes over decades. The HMM-based approach to b value estimation has some appealing properties over the popular sliding-window approach. The estimation of the b value is stable over a range of magnitude thresholds, which is ideal for the interpretation of b value variability. The overall b values of medium and large earthquakes across North Island and northern South Island in New Zealand vary roughly at a decade scale. It is noteworthy that periods of low b values are typically associated with the occurrences of major large earthquakes. The overall temporal variations of b values appear to prevail over many grid cells in space, as evidenced by a comparison of spatial b values between two periods of low and high b values, respectively. We also carry out a pre-seismic b value analysis for the recent Darfield earthquake and the Cook Strait swarm. It is suggested that the mainshock rupture is nucleated at the margin of, or right at, low b value asperities. In addition, a short period of pre-seismic b value decrease is observed in both cases. The overall time-varying behavior of b values over decades is an indication of a broad scale of time-varying behavior associated with the subduction process, probably related to the convergence rate of the plates. The advance in the method of b value estimation will enhance our understanding of earthquake occurrence and may lead to improved risk forecasting.
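
The b value at the core of such analyses is commonly estimated with Aki's (1965) maximum-likelihood formula (the paper's HMM machinery is a separate layer on top of this). A minimal sketch on a synthetic Gutenberg-Richter catalogue, not the New Zealand data:

```python
import numpy as np

def b_value_mle(mags, m_c):
    """Aki (1965) maximum-likelihood b value for magnitudes at or above
    the completeness magnitude m_c (continuous magnitudes, no binning
    correction)."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic Gutenberg-Richter catalogue with a true b value of 1.0:
# magnitudes above m_c are exponential with rate beta = b * ln(10).
rng = np.random.default_rng(42)
m_c, true_b = 4.0, 1.0
mags = m_c + rng.exponential(scale=1.0 / (true_b * np.log(10)), size=5000)

print(round(b_value_mle(mags, m_c), 2))   # should land near the true b of 1.0
```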

  9. Short-term changes in arterial inflammation predict long-term changes in atherosclerosis progression

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Philip [Massachusetts General Hospital and Harvard Medical School, Cardiology Division and Cardiac MR PET CT Program, Boston, MA (United States); McMaster University, Population Health Research Institute, Department of Medicine, and Department of Radiology, Hamilton, ON (Canada); Ishai, Amorina; Tawakol, Ahmed [Massachusetts General Hospital and Harvard Medical School, Cardiology Division and Cardiac MR PET CT Program, Boston, MA (United States); Mani, Venkatesh [Icahn School of Medicine at Mount Sinai School of Medicine, Translational and Molecular Imaging Institute and Department of Radiology, New York, NY (United States); Kallend, David [The Medicines Company, Parsippany, NJ (United States); Rudd, James H.F. [University of Cambridge, Division of Cardiovascular Medicine, Cambridge (United Kingdom); Fayad, Zahi A. [Icahn School of Medicine at Mount Sinai School of Medicine, Translational and Molecular Imaging Institute and Department of Radiology, New York, NY (United States); Icahn School of Medicine at Mount Sinai School of Medicine, Hess CSM Building Floor TMII, Rm S1-104, Translational and Molecular Imaging Institute and Department of Radiology, New York, NY (United States)

    2017-01-15

    It remains unclear whether changes in arterial wall inflammation are associated with subsequent changes in the rate of structural progression of atherosclerosis. In this sub-study of the dal-PLAQUE clinical trial, multi-modal imaging was performed using 18-fludeoxyglucose (FDG) positron emission tomography (PET, at 0 and 6 months) and magnetic resonance imaging (MRI, at 0 and 24 months). The primary objective was to determine whether increasing FDG uptake at 6 months predicted atherosclerosis progression on MRI at 2 years. Arterial inflammation was measured by the carotid FDG target-to-background ratio (TBR), and atherosclerotic plaque progression was defined as the percentage change in carotid mean wall area (MWA) and mean wall thickness (MWT) on MRI between baseline and 24 months. A total of 42 participants were included in this sub-study. The mean age of the population was 62.5 years, and 12 (28.6 %) were women. In participants with (vs. without) any increase in arterial inflammation over 6 months, the long-term changes in both MWT (% change MWT: 17.49 % vs. 1.74 %, p = 0.038) and MWA (% change MWA: 25.50 % vs. 3.59 %, p = 0.027) were significantly greater. Results remained significant after adjusting for clinical and biochemical covariates. Individuals with no increase in arterial inflammation over 6 months had no significant structural progression of atherosclerosis over 24 months as measured by MWT (p = 0.616) or MWA (p = 0.373). Short-term changes in arterial inflammation are associated with long-term structural atherosclerosis progression. These data support the concept that therapies that reduce arterial inflammation may attenuate or halt progression of atherosclerosis. (orig.)

  10. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available Many people die because of earthquakes every year. Therefore it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are highlighted to guide new research toward novel prediction methods.

  11. The NHV rehabilitation services program improves long-term physical functioning in survivors of the 2008 Sichuan earthquake: a longitudinal quasi experiment.

    Directory of Open Access Journals (Sweden)

    Xia Zhang

    Full Text Available BACKGROUND: Long-term disability following natural disasters significantly burdens survivors and the impacted society. Nevertheless, medical rehabilitation programming has been historically neglected in disaster relief planning. 'NHV' is a rehabilitation services program comprised of non-governmental organizations (NGOs) (N), local health departments (H), and professional rehabilitation volunteers (V), which aims to improve long-term physical functioning in survivors of the 2008 Sichuan earthquake. We aimed to evaluate the effectiveness of the NHV program. METHODS/FINDINGS: 510 of 591 enrolled earthquake survivors participated in this longitudinal quasi-experimental study (86.3%). The early intervention group (NHV-E) consisted of 298 survivors who received institutional-based rehabilitation (IBR) followed by community-based rehabilitation (CBR); the late intervention group (NHV-L) was comprised of 101 survivors who began rehabilitation one year later. The control group of 111 earthquake survivors did not receive IBR/CBR. Physical functioning was assessed using the Barthel Index (BI). Data were analyzed with a mixed-effects Tobit regression model. Physical functioning was significantly increased in the NHV-E and NHV-L groups at follow-up but not in the control group, after adjustment for gender, age, type of injury, and time to measurement. We found significant effects of both NHV (11.14, 95% CI 9.0-13.3) and spontaneous recovery (5.03, 95% CI 1.73-8.34). The effect of NHV-E (11.3, 95% CI 9.0-13.7) was marginally greater than that of NHV-L (10.7, 95% CI 7.9-13.6). It could, however, not be determined whether specific IBR or CBR program components were effective, since individual component exposures were not evaluated. CONCLUSION: Our analysis shows that NHV improved the long-term physical functioning of Sichuan earthquake survivors with disabling injuries. The comprehensive rehabilitation program benefitted the individual and society, rehabilitation services

  12. A Short Term Seismic Hazard Assessment in Christchurch, New Zealand, After the M 7.1, 4 September 2010 Darfield Earthquake: An Application of a Smoothing Kernel and Rate-and-State Friction Model

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2012-01-01

    Full Text Available The Mw 6.3, 21 February 2011 Christchurch, New Zealand, earthquake is regarded as an aftershock of the M 7.1, 4 September 2010 Darfield earthquake. However, it caused severe damage in downtown Christchurch. This circumstance points out the importance of aftershock sequences in seismic hazard evaluation and suggests re-evaluating seismic hazard immediately after a large earthquake occurs. For this purpose, we propose a probabilistic seismic hazard assessment (PSHA) that takes the short-term disturbance of the seismicity rate into account and can be easily applied in comparison with classical PSHA. In our approach, the treatment of the background seismicity rate is the same as in the zoneless approach, which uses a bandwidth function as a smoothing kernel in the neighboring region of earthquakes. The rate-and-state friction model, driven by the Coulomb stress changes of large earthquakes, is used to calculate the fault-interaction-based disturbance in seismicity rate for the PSHA. We apply this approach to evaluate the seismic hazard in Christchurch after the occurrence of the M 7.1, 4 September 2010 Darfield earthquake. Results show an increase of seismic hazard due to the stress increase in the region around the rupture plane, which extended to Christchurch. This provides a suitable basis for the application of a time-dependent PSHA using updated earthquake information.
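
The seismicity-rate disturbance imparted by a Coulomb stress change is usually written in Dieterich's (1994) rate-and-state form. A minimal sketch with illustrative parameter values, not those calibrated in the paper:

```python
import numpy as np

def seismicity_rate_ratio(t, dcfs, a_sigma=0.04, t_a=10.0):
    """Dieterich (1994) seismicity-rate change R(t)/r after a Coulomb
    stress step dcfs (MPa).  a_sigma (MPa) and the aftershock duration
    t_a (years) are illustrative values, not calibrated parameters."""
    gamma = (np.exp(-dcfs / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
    return 1.0 / gamma   # ratio of disturbed rate to background rate

# A positive stress step raises the rate immediately after the mainshock...
assert seismicity_rate_ratio(0.01, 0.1) > 1.0
# ...and the perturbation decays back to background at long times.
assert abs(seismicity_rate_ratio(100.0, 0.1) - 1.0) < 0.01
```

A time-dependent PSHA of the kind proposed would scale the background rate in each grid cell by this ratio before computing hazard.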

  13. Short-term prediction of local wind conditions

    DEFF Research Database (Denmark)

    Landberg, L.; Watson, S.J.

    1994-01-01

    It has been shown that Numerical Weather Prediction (NWP) models, combined with models (either physical or statistical) that take local effects into account, can be used to predict the wind locally better than the models commonly used today (e.g., persistence). By ''local'' is meant at on...

  14. Statistical validation of earthquake related observations

    Science.gov (United States)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into inflicted estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate", whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions, and therefore the scorecard of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance, rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, a sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
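
The Seismic Roulette procedure in steps (i)-(iv) amounts to a Monte Carlo significance test. In the sketch below the alarm fraction and hit count are invented for illustration:

```python
import numpy as np

def roulette_p_value(in_alarm, observed_hits, n_targets, n_trials=100_000, seed=1):
    """Monte Carlo form of the 'Seismic Roulette' null hypothesis: each of
    n_targets earthquakes lands on a catalogue sector drawn at random, so
    hits inside the alarm area are binomial with the alarm's share of
    sectors.  Returns the probability of >= observed_hits by chance."""
    rng = np.random.default_rng(seed)
    p_hit = in_alarm.mean()                           # fraction of sectors on alarm
    hits = rng.binomial(n_targets, p_hit, size=n_trials)
    return (hits >= observed_hits).mean()

# Toy example: the alarm covers 10% of catalogue sectors, yet 4 of 5
# target earthquakes fall inside it, which is very unlikely under the null.
in_alarm = np.zeros(1000, dtype=bool)
in_alarm[:100] = True
p = roulette_p_value(in_alarm, observed_hits=4, n_targets=5)
assert p < 0.01   # such a score would reject random coincidence
```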

  15. Earthquake forecasting in Italy, before and after Umbria-Marche seismic sequence 1997. A review of the earthquake occurrence modeling at different spatio-temporal-magnitude scales.

    Directory of Open Access Journals (Sweden)

    W. Marzocchi

    2008-06-01

    Full Text Available The main goal of this work is to review the scientific research carried out before and after the Umbria-Marche sequence related to earthquake forecasting/prediction in Italy. In particular, I focus attention on models that aim at addressing three main practical questions: was (is) Umbria-Marche a region with a high probability of occurrence of a destructive earthquake? Was precursory activity recorded before the mainshock(s)? What was our capability to model the spatio-temporal-magnitude evolution of that seismic sequence? The models are reviewed pointing out what we have learned after the Umbria-Marche earthquakes, in terms of physical understanding of the earthquake occurrence process and of improving our capability to forecast earthquakes and to track seismic sequences in real time.

  16. Comment on the report "Operational Earthquake Forecasting" by the International Commission on Earthquake Forecasting for Civil Protection

    Directory of Open Access Journals (Sweden)

    Stuart Crampin

    2012-04-01

    Full Text Available

    The recently published report Operational Earthquake Forecasting: State of Knowledge and Guidelines for Utilization by the International Commission on Earthquake Forecasting for Civil Protection (ICEF presupposes that there is no method for the short-term prediction of large earthquakes that has been demonstrated to be both reliable and skillful. This is no longer correct. Earthquakes can be deterministically stress-forecast by using shear-wave splitting to monitor stress-accumulation in the rock mass surrounding the earthquake source. This new understanding of fluid-rock deformation means that the recommendations of the ICEF Report are no longer appropriate. This comment reviews this new understanding and suggests that the way forward for operational earthquake forecasting in Italy is to install one or more controlled-source three-borehole Stress-Monitoring Sites and use shear-wave splitting to monitor stress-accumulation and stress-forecast all damaging (M ≥ 5) earthquakes in Italy.

  17. Prediction and evaluation of nonlinear site response with potentially liquefiable layers in the area of Nafplion (Peloponnesus, Greece) for a repeat of historical earthquakes

    Directory of Open Access Journals (Sweden)

    V. K. Karastathis

    2010-11-01

    Full Text Available We examine the possible non-linear behaviour of potentially liquefiable layers at selected sites located within the expansion area of the town of Nafplion, East Peloponnese, Greece. Input motion is computed for three scenario earthquakes, selected on the basis of historical seismicity data, using a stochastic strong ground motion simulation technique which takes into account the finite dimensions of the earthquake sources. Site-specific ground acceleration synthetics and soil profiles are then used to evaluate the liquefaction potential at the sites of interest. The activation scenario of the Iria fault, which is the closest one to Nafplion (M=6.4), is found to be the most hazardous in terms of liquefaction initiation. In this scenario almost all the examined sites exhibit liquefaction features at depths of 6–12 m. For scenario earthquakes at two more distant seismic sources (Epidaurus fault, M6.3; Xylokastro fault, M6.7), strong ground motion amplification phenomena by the shallow soft soil layer are expected to be observed.

  18. Post-Traumatic Stress Symptoms and Post-Traumatic Growth: Evidence from a Longitudinal Study following an Earthquake Disaster.

    Science.gov (United States)

    Chen, Jieling; Zhou, Xiao; Zeng, Min; Wu, Xinchun

    2015-01-01

    The current longitudinal study aims to examine the bidirectional relationship between post-traumatic stress symptoms (PTSS) and post-traumatic growth (PTG). One hundred twenty-two adults in the most severely affected area were investigated by self-report questionnaires at 12 months and 18 months after the Wenchuan Earthquake occurred in China. The autoregressive cross-lagged structural equation analysis revealed that PTG at 12 months post-earthquake negatively predicted PTSS at 18 months post-earthquake above and beyond PTSS stability, whereas PTSS at 12 months post-earthquake did not significantly predict subsequent PTG. Moreover, PTG at 12 months post-earthquake predicted fewer subsequent intrusion, numbing and hyper-arousal symptoms but not avoidance symptoms. Growth can play a role in reducing long-term post-traumatic stress symptoms, and the implications of a positive perspective in post-trauma circumstances are discussed.

  19. LONG TERM WIND SPEED PREDICTION USING WAVELET COEFFICIENTS AND SOFT COMPUTING

    Directory of Open Access Journals (Sweden)

    Manju Khanna

    2016-10-01

    Full Text Available Past research has largely addressed short-term prediction of wind speed. The present work deals with long-term wind speed prediction, required for hybrid power generation design and contract planning. Because the total database for long-term prediction is quite large, feature extraction via lifting wavelet coefficients is exploited, along with soft computing techniques for the time series data, which are stochastic in nature.
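The abstract does not specify which lifting scheme was used; as an illustration only, a minimal Haar lifting step splits a series into approximation (coarse trend) and detail (local fluctuation) coefficients, the kind of compact features a soft-computing predictor could consume. All data values are invented:

```python
def haar_lift(series):
    """One Haar lifting step: split an even-length series into
    approximation (coarse trend) and detail (local fluctuation) halves."""
    evens = series[0::2]
    odds = series[1::2]
    # Predict step: detail = odd - even (residual after predicting odd from even)
    detail = [o - e for o, e in zip(odds, evens)]
    # Update step: approximation = even + detail/2 (preserves the local mean)
    approx = [e + d / 2 for e, d in zip(evens, detail)]
    return approx, detail

# Toy daily wind-speed series (m/s); the approximation half (half the length
# of the input) could feed a long-term prediction model.
speeds = [5.0, 5.4, 6.1, 5.9, 4.8, 5.2, 6.5, 6.3]
approx, detail = haar_lift(speeds)
print(approx)
print(detail)
```

Repeating the step on the approximation half yields a multi-level decomposition, shrinking the feature set at each level.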

  20. Implications of the Regional Earthquake Likelihood Models test of earthquake forecasts in California

    Directory of Open Access Journals (Sweden)

    Michael Karl Sachs

    2012-09-01

    Full Text Available The Regional Earthquake Likelihood Models (RELM) test was the first competitive comparison of prospective earthquake forecasts. The test was carried out over 5 years from 1 January 2006 to 31 December 2010 over a region that included all of California. The test area was divided into 7682 0.1°x0.1° spatial cells. Each submitted forecast gave the predicted numbers of earthquakes Nemi larger than M=4.95 in 0.1 magnitude bins for each cell. In this paper we present a method that separates the forecast of the number of test earthquakes from the forecast of their locations. We first obtain the number Nem of forecast earthquakes in magnitude bin m. We then determine the conditional probability λemi=Nemi/Nem that an earthquake in magnitude bin m will occur in cell i. The summation of λemi over all 7682 cells is unity. A random (no skill) forecast gives equal values of λemi for all spatial cells and magnitude bins. The skill of a forecast, in terms of the location of the earthquakes, is measured by the success in assigning large values of λemi to the cells in which earthquakes occur and low values of λemi to the cells where earthquakes do not occur. Thirty-one test earthquakes occurred in 27 different combinations of spatial cells i and magnitude bins m; for each of these mi cells we identify which forecast had the highest value of λemi. We evaluate the performance of eleven submitted forecasts in two ways. First, we determine the number of mi cells for which the forecast λemi was the largest; the best forecast is the one with the highest number. Second, we determine the mean value of λemi over the 27 mi cells for each forecast; the best forecast has the highest mean value of λemi. The success of a forecast during the test period is dependent on the allocation of the probabilities λemi between the mi cells, since the sum over the mi cells is unity. We illustrate the forecast distributions of λemi and discuss their differences. We conclude that the RELM test was successful in
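The normalization at the core of the method above can be sketched in a few lines; the cell names, forecast rates, and observed cells below are invented for illustration and are not taken from the RELM data:

```python
# Convert forecast rates per cell into conditional location probabilities
# lambda_emi = N_emi / N_em, which sum to 1 over cells within a magnitude bin.
forecast = {                     # cell -> forecast number of quakes in one bin
    "cell_a": 0.6, "cell_b": 0.3, "cell_c": 0.1,
}
n_em = sum(forecast.values())    # total forecast count N_em in this bin
lam = {cell: n / n_em for cell, n in forecast.items()}
assert abs(sum(lam.values()) - 1.0) < 1e-9   # normalization check

# Second skill measure from the abstract: the mean lambda over the cells in
# which test earthquakes actually occurred (higher mean = better locations).
observed_cells = ["cell_a", "cell_c"]
mean_lam = sum(lam[c] for c in observed_cells) / len(observed_cells)
print(mean_lam)
```

Because the lambdas sum to one, concentrating probability on cells that later host earthquakes necessarily drains it from cells that stay quiet, which is what makes the measure a zero-sum skill comparison.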

  1. An Artificial Neural Network Based Short-term Dynamic Prediction of Algae Bloom

    Directory of Open Access Journals (Sweden)

    Yao Junyang

    2014-06-01

    Full Text Available This paper proposes a method of short-term prediction of algae bloom based on an artificial neural network. Firstly, principal component analysis is applied to water environmental factors in algae bloom raceway ponds to identify the main factors that influence the formation of algae blooms. Then, a short-term dynamic prediction model based on a neural network is built, with the current chlorophyll_a values as input and the chlorophyll_a values at the next time step as output. Simulation results show that the model can realize short-term prediction of algae bloom effectively.
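The input-output structure described (current chlorophyll_a in, next chlorophyll_a out) can be illustrated with a deliberately minimal one-neuron sketch; the series and learning settings are invented, and the paper's actual model is a multilayer network with PCA-selected environmental inputs:

```python
# Minimal one-step-ahead predictor: learn next_chla ~ f(current_chla) with a
# single linear neuron trained by per-sample gradient descent on a toy series.
series = [1.0, 1.2, 1.5, 1.9, 2.4, 3.0, 3.7, 4.5]   # toy chlorophyll_a values
pairs = list(zip(series[:-1], series[1:]))          # (current, next) pairs

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):                               # simple training loop
    for x, y in pairs:
        err = (w * x + b) - y                       # prediction error
        w -= lr * err * x                           # gradient step on weight
        b -= lr * err                               # gradient step on bias

predicted_next = w * series[-1] + b                 # one-step-ahead forecast
print(round(predicted_next, 2))
```

Replacing the single neuron with a hidden layer and feeding it several PCA components instead of one scalar would recover the shape of the model the abstract describes.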

  2. Major depressive disorder subtypes to predict long-term course.

    Science.gov (United States)

    van Loo, Hanna M; Cai, Tianxi; Gruber, Michael J; Li, Junlong; de Jonge, Peter; Petukhova, Maria; Rose, Sherri; Sampson, Nancy A; Schoevers, Robert A; Wardenaar, Klaas J; Wilcox, Marsha A; Al-Hamzawi, Ali Obaid; Andrade, Laura Helena; Bromet, Evelyn J; Bunting, Brendan; Fayyad, John; Florescu, Silvia E; Gureje, Oye; Hu, Chiyi; Huang, Yueqin; Levinson, Daphna; Medina-Mora, Maria Elena; Nakane, Yoshibumi; Posada-Villa, Jose; Scott, Kate M; Xavier, Miguel; Zarkov, Zahari; Kessler, Ronald C

    2014-09-01

    Variation in the course of major depressive disorder (MDD) is not strongly predicted by existing subtype distinctions. A new subtyping approach is considered here. Two data mining techniques, ensemble recursive partitioning and Lasso generalized linear models (GLMs), followed by k-means cluster analysis are used to search for subtypes based on index episode symptoms predicting subsequent MDD course in the World Mental Health (WMH) surveys. The WMH surveys are community surveys in 16 countries. Lifetime DSM-IV MDD was reported by 8,261 respondents. Retrospectively reported outcomes included measures of persistence (number of years with an episode, number of years with an episode lasting most of the year) and severity (hospitalization for MDD, disability due to MDD). Recursive partitioning found significant clusters defined by the conjunctions of early onset, suicidality, and anxiety (irritability, panic, nervousness-worry-anxiety) during the index episode. GLMs found additional associations involving a number of individual symptoms. Predicted values of the four outcomes were strongly correlated. Cluster analysis of these predicted values found three clusters having consistently high, intermediate, or low predicted scores across all outcomes. The high-risk cluster (30.0% of respondents) accounted for 52.9-69.7% of high persistence and severity, and it was most strongly predicted by index episode severe dysphoria, suicidality, anxiety, and early onset. A total symptom count, in comparison, was not a significant predictor. Despite being based on retrospective reports, results suggest that useful MDD subtyping distinctions can be made using data mining methods. Further studies are needed to test and expand these results with prospective data. © 2014 Wiley Periodicals, Inc.

  3. Next-Term Student Performance Prediction: A Recommender Systems Approach

    Science.gov (United States)

    Sweeney, Mack; Rangwala, Huzefa; Lester, Jaime; Johri, Aditya

    2016-01-01

    An enduring issue in higher education is student retention to successful graduation. National statistics indicate that most higher education institutions have four-year degree completion rates around 50%, or just half of their student populations. While there are prediction models which illuminate what factors assist with college student success,…

  4. Major depressive disorder subtypes to predict long-term course

    NARCIS (Netherlands)

    van Loo, Hanna M.; Cai, Tianxi; Gruber, Michael J.; Li, Junlong; de Jonge, Peter; Petukhova, Maria; Rose, Sherri; Sampson, Nancy A.; Schoevers, Robert A.; Wardenaar, Klaas J.; Wilcox, Marsha A.; Al-Hamzawi, Ali Obaid; Andrade, Laura Helena; Bromet, Evelyn J.; Bunting, Brendan; Fayyad, John; Florescu, Silvia E.; Gureje, Oye; Hu, Chiyi; Huang, Yueqin; Levinson, Daphna; Medina-Mora, Maria Elena; Nakane, Yoshibumi; Posada-Villa, Jose; Scott, Kate M.; Xavier, Miguel; Zarkov, Zahari; Kessler, Ronald C.

    BACKGROUND: Variation in the course of major depressive disorder (MDD) is not strongly predicted by existing subtype distinctions. A new subtyping approach is considered here. METHODS: Two data mining techniques, ensemble recursive partitioning and Lasso generalized linear models (GLMs), followed by

  5. Tectonic geomorphology of the 21 May 2003 Zemmouri earthquake area (Mw 6.8, Tell Atlas, Algeria) : An analysis of the long-term coastal uplift

    Science.gov (United States)

    Bagdi-Issaad, Souhila; Meghraoui, Mustapha; Nedjari, Ahmed

    2017-04-01

    Geomorphological, geological and structural markers attest to successive uplift during the late Quaternary along the Algerian coastal region, a section of the Africa-Eurasia plate boundary. Large and moderate shallow earthquakes with Mw ≥ 6 occurred on E-W to NE-SW active thrust-related-fold structures, among them the 21 May 2003 Zemmouri earthquake (Mw 6.8) that caused 0.5 m of uplift along 55 km of coastline. In this work, we study the correlation between the 2003 coseismic uplift and the long-term active deformation using the distribution of Quaternary marine and alluvial terraces, where indicators show three pre-2003 main notch levels formed in the last 21.9 ka along with five alluvial terrace levels formed in the Pleistocene. The analysis of the drainage system and related tectonic geomorphology along the coastal area covers over 500 small and large rivers that document the trend of present-day and past stream channels, their longitudinal profiles, the arrangement of Quaternary deposits and the response of river mouths to the successive past and recent uplift. The analysis of remote sensing images combined with a high-resolution Digital Elevation Model and field observations reveals a concave downward shape of most river profiles and river mouth deflections near the coastline. Data previously obtained on the coseismic deformation using coastal tectonics, seismology and geodetic (InSAR and GPS) investigations are combined with our analysis of coastal deformation. The results confirm the impact of the offshore thrust fault responsible for the coastal deformation through successive coseismic uplift, with an estimated average of 0.9 to 2.1 mm/year during the late Pleistocene - Holocene (Maouche et al., 2011). The short-term and long-term deformation and related surface slip distribution control the drainage system and related distribution of Quaternary deposits. Our results indicate how tectonic geomorphology can be a decisive marker for the identification of coastal active faults and

  6. Resource loss, self-efficacy, and family support predict posttraumatic stress symptoms: a 3-year study of earthquake survivors.

    Science.gov (United States)

    Warner, Lisa Marie; Gutiérrez-Doña, Benicio; Villegas Angulo, Maricela; Schwarzer, Ralf

    2015-01-01

    Social support and self-efficacy are regarded as coping resources that may facilitate readjustment after traumatic events. The 2009 Cinchona earthquake in Costa Rica serves as an example of such an event, used here to study resources that may prevent subsequent severity of posttraumatic stress symptoms. At Time 1 (1-6 months after the earthquake in 2009), N=200 survivors were interviewed, assessing resource loss, received family support, and posttraumatic stress response. At Time 2 in 2012, severity of posttraumatic stress symptoms and general self-efficacy beliefs were assessed. Regression analyses estimated the severity of posttraumatic stress symptoms accounted for by all variables. Moderator and mediator models were examined to understand the interplay of received family support and self-efficacy with posttraumatic stress symptoms. Baseline posttraumatic stress symptoms and resource loss (T1) accounted for significant but small amounts of the variance in the severity of posttraumatic stress symptoms (T2). The main effects of self-efficacy (T2) and social support (T1) were negligible, but social support buffered resource loss, indicating that only less supported survivors were affected by resource loss. Self-efficacy at T2 moderated the support-stress relationship, indicating that low levels of self-efficacy could be compensated for by higher levels of family support. Receiving family support at T1 enabled survivors to feel self-efficacious, underlining the enabling hypothesis. Receiving social support from relatives shortly after an earthquake was found to be an important coping resource, as it alleviated the association between resource loss and the severity of posttraumatic stress response, compensated for deficits of self-efficacy, and enabled self-efficacy, which was in turn associated with more adaptive adjustment 3 years after the earthquake.

  7. Coulomb Stress Changes near seismogenic zone after 2015 Lamjung (Nepal) Mw 7.8 Earthquake: an elastic model prediction

    Science.gov (United States)

    Cheng, X.

    2016-12-01

    After the Lamjung Mw 7.8 earthquake in Nepal, rearrangement of stresses in the crust commonly leads to subsequent damaging earthquakes. We carry out a calculation to investigate the Coulomb stress changes in the surrounding region. Using models of regional faults designed according to the south Tibet-Nepal structure, we present calculations of the coseismic stress changes that resulted from the 25 April 2015 Lamjung earthquake and show significant stress increases in some areas. We use the Coulomb 3.3 software to estimate Coulomb stress, and the calculations are conducted in an elastic half-space with a uniform isotropic elastic medium. We set the receiver faults to be the normal faults north of the seismogenic fault, according to tectonic features. The results show that the aftershocks are associated with a zone of increased Coulomb stress caused by the main rupture, and that the normal faults north of the seismogenic fault mainly show increased Coulomb stress, with the western normal faults having greater changes in Coulomb stress than the easternmost faults (cf. figure). This indicates greater risk from the western normal faults than from the easternmost faults, and these normal faults deserve attention in future studies. As for the seismogenic fault, which is considered to be the Main Himalaya Thrust (MHT), the western part, at 10-20 km, has clearly higher Coulomb stress changes, with values up to 0.1 bar (0.01 MPa). This indicates that stress may have transferred to the west of the MHT more easily than to the east. This large event might have produced a more complete rupture of the eastern part than of the western part, which is useful for identifying potential future rupture zones and for earthquake mitigation.
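The quantity computed by tools like Coulomb 3.3 is the Coulomb failure stress change, ΔCFS = Δτ + μ′Δσn, where Δτ is the shear stress change resolved in the slip direction, Δσn the normal stress change (unclamping positive), and μ′ an effective friction coefficient. A minimal sketch with illustrative numbers (not values from the Lamjung study):

```python
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault (MPa).
    d_shear_mpa: shear stress change resolved in the slip direction;
    d_normal_mpa: normal stress change, unclamping (fault-opening) positive.
    Positive result brings the receiver fault closer to failure."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# Illustrative changes resolved onto a receiver fault, in MPa.
dcfs = coulomb_stress_change(0.008, 0.005)
print(dcfs)   # about 0.01 MPa = 0.1 bar, the level cited in the abstract
```

The sign convention matters: a negative Δσn (clamping) can cancel a positive shear stress change, which is why the same mainshock raises ΔCFS on some receiver faults and lowers it on others.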

  8. Use of "Crowd-Sourcing" and other collaborations to solve the short-term, earthquake forecasting problem

    Science.gov (United States)

    Bleier, T.; Heraud, J. A.; Dunson, J. C.

    2015-12-01

    QuakeFinder (QF) and its international collaborators have installed and currently maintain 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. The data from these instruments are being analyzed for pre-quake signatures. This analysis consists of private research by QuakeFinder and by institutional collaborators (PUCP in Peru, NCU in Taiwan, PUCC in Chile, NOA in Greece, Syiah Kuala University in Indonesia, LASP at U of Colo., Stanford, and USGS). Recently, NASA Hq and QuakeFinder tried a new approach to help with the analysis of this huge (50+TB) data archive. A collaboration with Appirio/TopCoder, Harvard University, Amazon, QuakeFinder, and NASA Hq resulted in an open algorithm development contest called "Quest for Quakes" in which contestants (freelance algorithm developers) attempted to identify quakes from a subset of the QuakeFinder data (3TB). The contest included a $25K prize pool and contained 100 cases where earthquakes (and null sets) included data from up to 5 remote sites, near and far from quakes greater than M4. These data sets were made available through Amazon.com to hundreds of contestants over a two-week contest period. In a more traditional approach, several new algorithms were tried by actively sharing the QF data with universities over a longer period. These algorithms included Principal Component Analysis (PCA) and deep neural networks in an effort to automatically identify earthquake signals within typical, noise-filled environments. This presentation examines the pros and cons of employing these two approaches, from both logistical and scientific perspectives.

  9. Role of Subdural Electrocorticography in Prediction of Long-Term Seizure Outcome in Epilepsy Surgery

    Science.gov (United States)

    Asano, Eishi; Juhasz, Csaba; Shah, Aashit; Sood, Sandeep; Chugani, Harry T.

    2009-01-01

    Since prediction of long-term seizure outcome using preoperative diagnostic modalities remains suboptimal in epilepsy surgery, we evaluated whether interictal spike frequency measures obtained from extraoperative subdural electrocorticography (ECoG) recording could predict long-term seizure outcome. This study included 61 young patients (age…

  10. Possible deep fault slip preceding the 2004 Parkfield earthquake, inferred from detailed observations of tectonic tremor

    Science.gov (United States)

    Shelly, David R.

    2009-01-01

    Earthquake predictability depends, in part, on the degree to which sudden slip is preceded by slow aseismic slip. Recently, observations of deep tremor have enabled inferences of deep slow slip even when detection by other means is not possible, but these data are limited to certain areas and mostly the last decade. The region near Parkfield, California, provides a unique convergence of several years of high-quality tremor data bracketing a moderate earthquake, the 2004 magnitude 6.0 event. Here, I present detailed observations of tectonic tremor from mid-2001 through 2008 that indicate deep fault slip both before and after the Parkfield earthquake that cannot be detected with surface geodetic instruments. While there is no obvious short-term precursor, I find unidirectional tremor migration accompanied by elevated tremor rates in the 3 months prior to the earthquake, which suggests accelerated creep on the fault ∼16 km beneath the eventual earthquake hypocenter.

  11. Thermal Infrared Anomalies of Several Strong Earthquakes

    OpenAIRE

    Congxin Wei; Yuansheng Zhang; Xiao Guo; Shaoxing Hui; Manzhong Qin; Ying Zhang

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitude up to Ms7.0 by using satellite infrared remote sensing information. We used new types of data and method...

  12. Modeling Seismic Cycles of Great Megathrust Earthquakes Across the Scales With Focus at Postseismic Phase

    Science.gov (United States)

    Sobolev, Stephan V.; Muldashev, Iskander A.

    2017-12-01

    Subduction is a substantially multiscale process in which stresses are built by long-term tectonic motions, modified by sudden jerky deformations during earthquakes, and then restored by multiple subsequent relaxation processes. Here we develop a cross-scale thermomechanical model aimed at simulating the subduction process on time scales from one minute to a million years. The model employs elasticity, nonlinear transient viscous rheology, and rate-and-state friction. It generates spontaneous earthquake sequences and, by using an adaptive time step algorithm, recreates the deformation process as observed naturally during the seismic cycle and over multiple seismic cycles. The model predicts that viscosity in the mantle wedge drops by more than three orders of magnitude during a great earthquake with a magnitude above 9. As a result, the surface velocities just an hour or a day after the earthquake are controlled by viscoelastic relaxation in the several hundred km of mantle landward of the trench, and not by afterslip localized at the fault as is currently believed. Our model replicates the centuries-long seismic cycles exhibited by the greatest earthquakes and is consistent with the postseismic surface displacements recorded after the Great Tohoku Earthquake. We demonstrate that there is no contradiction between the extremely low mechanical coupling at the subduction megathrust in South Chile inferred from long-term geodynamic models and the occurrence of the largest earthquakes, like the Great Chile 1960 Earthquake.

  13. Learning from physics-based earthquake simulators: a minimal approach

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2017-04-01

    Physics-based earthquake simulators are aimed at generating synthetic seismic catalogs of arbitrary length, accounting for fault interaction, elastic rebound, realistic fault networks, and some simple earthquake nucleation process like rate-and-state friction. Through comparison of synthetic and real catalogs, seismologists can gain insight into the earthquake occurrence process. Moreover, earthquake simulators can be used to infer some aspects of the statistical behavior of earthquakes within the simulated region by analyzing timescales not accessible through observations. The development of earthquake simulators is commonly led by the approach "the more physics, the better", pushing seismologists towards increasingly Earth-like simulators. However, despite its immediate attractiveness, we argue that this kind of approach makes it more and more difficult to understand which physical parameters are really relevant to describe the features of the seismic catalog in which we are interested. For this reason, here we take the opposite, minimal approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple model may be more informative than a complex one for some specific scientific objectives, because it is more understandable. The model has three main components: the first is a realistic tectonic setting, i.e., a fault dataset of California; the other two are quantitative laws for earthquake generation on each single fault, and the Coulomb Failure Function for modeling fault interaction. The final goal of this work is twofold. On one hand, we aim to identify the minimum set of physical ingredients that can satisfactorily reproduce the features of the real seismic catalog, such as short-term seismic clustering, and to investigate the hypothetical long-term behavior and fault synchronization. On the other hand, we want to investigate the limits of predictability of the model itself.
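The "minimal simulator" idea can be caricatured in a few lines: faults load at a constant tectonic rate, fail and rebound when stress reaches strength, and transfer a fraction of the drop to their neighbours as a crude stand-in for Coulomb-stress interaction. All parameters and the loading law below are invented, far simpler than the paper's model:

```python
import random

random.seed(1)
n_faults, strength, transfer = 5, 1.0, 0.2
stress = [random.random() * strength for _ in range(n_faults)]
catalog = []                                   # (time_step, fault) events

for t in range(200):
    for i in range(n_faults):
        stress[i] += 0.01                      # tectonic loading per step
        if stress[i] >= strength:              # failure criterion
            catalog.append((t, i))
            stress[i] = 0.0                    # elastic rebound (full drop)
            for j in range(n_faults):          # static stress transfer
                if j != i:
                    stress[j] += transfer * strength / (n_faults - 1)

print(len(catalog), "synthetic events")
```

Even this toy exhibits the behaviours the abstract targets: stress transfer clusters events in time, and identical faults can drift into or out of synchronization, which is exactly why a minimal model makes the role of each ingredient easy to isolate.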

  14. Earthquakes in Zimbabwe | Clark | Zimbabwe Science News

    African Journals Online (AJOL)

    Earthquakes are one of the most destructive natural forces, in both human and economic terms. For example, since 1900, 10 earthquakes have occurred that each killed over 50 000 people. Earthquakes in modern industrialized areas can also be very costly, even if well designed and constructed buildings save many ...

  15. A preliminary evaluation of surface latent heat flux as an earthquake precursor

    Directory of Open Access Journals (Sweden)

    W. Zhang

    2013-10-01

    Full Text Available The relationship between variations in surface latent heat flux (SLHF) and marine earthquakes has been a popular subject of recent seismological studies. So far, there are two key problems: how to identify abnormal SLHF variations against complicated background signals, and how to ensure that an anomaly results from an earthquake. In this paper, we propose four adjustable parameters for identification, classify the relationship, and analyze SLHF changes several months before six marine earthquakes using daily SLHF data. Additionally, we quantitatively evaluate the long-term relationship between earthquakes and SLHF anomalies for the six study areas over a 20 yr period preceding each earthquake. The results suggest the following: (1) before the South Sandwich Islands, Papua, Samoa and Haiti earthquakes, the SLHF variations above their individual background levels have relatively low amplitudes and are difficult to consider as precursory anomalies; (2) after removing the clustering effect, most of the anomalies prior to these six earthquakes are not temporally related to any earthquake in each study area; (3) for each case apart from Haiti, more than half of the studied earthquakes, which were moderate and even devastating events (larger than Mw = 5.3), had no precursory variations in SLHF; and (4) the correlation between SLHF and seismic activity depends largely on data accuracy and parameter settings. Before any application of SLHF data to earthquake prediction, we suggest that anomaly-identifying standards be established based on long-term regional analysis to eliminate subjectivity. Furthermore, other factors that may result in SLHF variations should also be carefully considered.
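An anomaly-identifying standard of the kind the abstract argues for can be sketched as a simple threshold rule: flag a day as anomalous when its SLHF exceeds the background mean by k standard deviations. The series, units, and k below are invented; the paper's four adjustable parameters are richer than this single threshold:

```python
import statistics

def slhf_anomalies(daily_slhf, k=2.0):
    """Return indices of days whose SLHF exceeds background mean + k*sigma."""
    mean = statistics.fmean(daily_slhf)
    std = statistics.pstdev(daily_slhf)
    threshold = mean + k * std
    return [i for i, v in enumerate(daily_slhf) if v > threshold]

# Toy daily SLHF series (W/m^2) with one spike on day 6.
series = [110, 115, 108, 112, 109, 111, 150, 113, 110, 114]
print(slhf_anomalies(series))   # indices of days above background + 2 sigma
```

In practice the background statistics would come from the multi-year pre-event record rather than the test window itself, since including the spike in the background inflates the threshold; this is one reason the abstract stresses long-term regional analysis.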

  16. Mental Health Response in Haiti in the Aftermath of the 2010 Earthquake: A Case Study for Building Long-Term Solutions

    Science.gov (United States)

    Raviola, Giuseppe; Eustache, Eddy; Oswald, Catherine; Belkin, Gary S

    2012-01-01

    Significant challenges exist in providing safe, effective, and culturally sound mental health and psychosocial services when an unforeseen disaster strikes in a low-resource setting. We present here a case study describing the experience of a transnational team in expanding mental health and psychosocial services delivered by two health care organizations, one local (Zanmi Lasante) and one international (Partners in Health), acting collaboratively as part of the emergency response to the 2010 Haiti earthquake. In the year and a half following the earthquake, Zanmi Lasante and Partners in Health provided 20,000 documented individual and group appointments for mental health and psychosocial needs. During the delivery of disaster response services, the collaboration led to the development of a model to guide the expansion and scaling up of community-based mental health services in the Zanmi Lasante health care system over the long-term, with potential for broader scale-up in Haiti. This model identifies key skill packages and implementation rules for developing evidence-based pathways and algorithms for treating common mental disorders. Throughout the collaboration, efforts were made to coordinate planning with multiple organizations interested in supporting the development of mental health programs following the disaster, including national governmental bodies, nongovernmental organizations, universities, foreign academic medical centers, and corporations. The collaborative interventions are framed here in terms of four overarching categories of action: direct service delivery, research, training, and advocacy. This case study exemplifies the role of psychiatrists working in low-resource settings as public health program implementers and as members of multidisciplinary teams. (Harv Rev Psychiatry 2012;20:68–77.) PMID:22335184

  18. Analyses of computer programs for the probabilistic estimation of design earthquake and seismological characteristics of the Korean Peninsula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Gi Hwa [Seoul National Univ., Seoul (Korea, Republic of)

    1997-11-15

    The purpose of the present study is to develop predictive equations, adequate for the Korean Peninsula, from simulated motions, and to analyze and utilize the computer programs for the probabilistic estimation of design earthquakes. In Part I of the report, computer programs for the probabilistic estimation of design earthquakes are analyzed and applied to seismic hazard characterization of the Korean Peninsula. In Part II, available instrumental earthquake records are analyzed to estimate earthquake source characteristics and medium properties, which are incorporated into the simulation process, and earthquake records are simulated using the estimated parameters. Finally, predictive equations constructed from the simulations are given in terms of magnitude and hypocentral distance.
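Predictive equations "in terms of magnitude and hypocentral distance" conventionally take a form like ln(PGA) = c0 + c1·M − c2·ln(R) − c3·R, combining geometric spreading (the ln R term) and anelastic attenuation (the linear R term). The coefficients below are placeholders, not the study's fitted values:

```python
import math

def predict_pga(magnitude, r_hypo_km, c=(-5.0, 1.2, 1.0, 0.005)):
    """Generic ground-motion predictive equation sketch.
    ln(PGA) = c0 + c1*M - c2*ln(R) - c3*R, with R the hypocentral
    distance in km; output units are arbitrary for this illustration."""
    c0, c1, c2, c3 = c
    ln_pga = c0 + c1 * magnitude - c2 * math.log(r_hypo_km) - c3 * r_hypo_km
    return math.exp(ln_pga)

near = predict_pga(6.0, 10.0)
far = predict_pga(6.0, 100.0)
assert near > far              # predicted motion decays with distance
print(near, far)
```

Fitting c0..c3 by regression on the simulated records, per magnitude and distance range, is the step the report describes as constructing predictive equations from the simulation.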

  19. Earthquake geology: science, society and critical facilities

    Directory of Open Access Journals (Sweden)

    Christoph Grützner

    2014-02-01

    Full Text Available Earthquake geology studies the effects, the mechanics and the impacts of earthquakes in the geological environment. Its role is also to decode the fault history; its approach is therefore fault-specific, and its outcomes are of decisive value for seismic hazard assessment and planning. The term earthquake geology includes aspects of modern instrumental studies, tectonics and structural geology, historical surface deformation and tectonic geomorphology, whereas paleoseismology is considered part of earthquake geology [...].

  20. Analog earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Hofmann, R.B. [Center for Nuclear Waste Regulatory Analyses, San Antonio, TX (United States)]

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  1. Report by the 'Mega-earthquakes and mega-tsunamis' subgroup; Rapport du sous-groupe Sismique 'Megaseismes et megatsunamis'

    Energy Technology Data Exchange (ETDEWEB)

    Friedel, Jacques; Courtillot, Vincent; Dercourt, Jean; Jaupart, Claude; Le Pichon, Xavier; Poirier, Jean-Paul; Salencon, Jean; Tapponnier, Paul; Dautray, Robert; Carpentier, Alain; Taquet, Philippe; Blanchet, Rene; Le Mouel, Jean-Louis [Academie des sciences, 23, quai de Conti, 75006 Paris (France)]; BARD, Pierre-Yves [Observatoire des sciences de l'Univers de l'universite de Grenoble - OSUG, Universite Joseph Fourier, BP 53, 38041 Grenoble Cedex 9 (France)]; Bernard, Pascal; Montagner, Jean-Paul; Armijo, Rolando; Shapiro, Nikolai; Tait, Steve [Institut de physique du globe de Paris, 1, rue Jussieu - 75238 Paris cedex 05 (France)]; Cara, Michel [Ecole et Observatoire des sciences de la Terre de l'universite de Strasbourg - EOST, F-67084 Strasbourg cedex (France)]; Madariaga, Raul [Ecole normale superieure, 45, rue d'Ulm / 29, rue d'Ulm, F-75230 Paris cedex 05 (France)]; Pecker, Alain [Academie des technologies, Grand Palais des Champs Elysees - Porte C - Avenue Franklin D. Roosevelt - 75008 Paris (France)]; Schindele, Francois [CEA/DAM, DIF/DASE/SLDG, 91297 ARPAJON Cedex (France)]; Douglas, John [BRGM, 3 avenue Claude-Guillemin - BP 36009 - 45060 Orleans Cedex 2 (France)]

    2011-06-15

    This report comprises a presentation of scientific data on subduction earthquakes, on tsunamis and on the Tohoku earthquake. It proposes a detailed description of the French situation (in the West Indies, in metropolitan France, and in terms of soil response), and a discussion of social and economic issues (governance, seismic regulation and nuclear safety, para-seismic protection of constructions). The report is completed by other large documents: a presentation of data on the Japanese earthquake, a discussion of prediction and governance errors in the management of earthquake mitigation in Japan, and discussions of tsunami prevention, of research needs on accelerometers, and of the seismic risk in France.

  2. Earthquake forecast models for Italy based on the RI algorithm

    Directory of Open Access Journals (Sweden)

    Kazuyoshi Z. Nanjo

    2010-11-01

    Full Text Available This study provides an overview of relative-intensity (RI)-based earthquake forecast models that have been submitted for the 5-year and 10-year testing classes and the 3-month class of the Italian experiment within the Collaboratory for the Study of Earthquake Predictability (CSEP). The RI algorithm starts as a binary forecast system based on the working assumption that future large earthquakes are considered likely to occur at sites of higher seismic activity in the past. The measure of RI is simply a count of the number of past earthquakes, which is known as the RI of seismicity. To improve the RI forecast performance, we first expand the RI algorithm to become part of a general class of smoothed seismicity models. We then convert the RI representation from a binary system into a testable CSEP model that forecasts the numbers of earthquakes for the predefined magnitudes. Our parameter tuning for the CSEP models is based on the past seismicity. The final submission is a set of two numerical data files that were created by tuned 5-year and 10-year models and an executable computer code of a tuned 3-month model, to examine which testing class is more meaningful in terms of the RI hypothesis. The main purpose of our participation is to better understand the importance (or lack of importance) of RI of seismicity for earthquake forecastability.
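    The core of the RI measure, counting past earthquakes per spatial cell, can be sketched as below; the cell size and the toy epicenter list are illustrative choices, not the authors' tuned parameters.

```python
from collections import Counter

def relative_intensity(epicenters, cell=0.1):
    """Count past epicenters per (lon, lat) grid cell of width `cell` degrees.
    The per-cell count is the RI of seismicity: cells with more past events
    are forecast as more likely sites of future large earthquakes."""
    counts = Counter()
    for lon, lat in epicenters:
        counts[(int(lon // cell), int(lat // cell))] += 1
    return counts

# Two nearby past events fall in the same cell; one event lies elsewhere.
past = [(13.35, 42.35), (13.36, 42.36), (15.0, 38.1)]
ri = relative_intensity(past)
```

    Smoothing these counts over neighboring cells, as the study does to generalize the binary RI system, turns the map into a smoothed-seismicity forecast model.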

  3. Artificial intelligence to predict short-term wind speed

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Tiago; Soares, Joao; Ramos, Sergio; Vale, Zita [Polytechnic of Porto (Portugal). GECAD - ISEP]

    2012-07-01

    The use of renewable energy is increasing exponentially in many countries due to the introduction of new energy and environmental policies. This focus on energy and the environment makes the efficient integration of renewable energy into the electric power system extremely important. Several European countries have seen a high penetration of wind power, which gradually represents a significant share of electricity generation. The introduction of wind power into the power system poses new challenges for the power system operator because of the variability and uncertainty in weather conditions and, consequently, in wind power generation. As a result, dispatch scheduling carries a significant portion of uncertainty. In order to deal with the uncertainty in wind power and thereby improve the efficiency of the power system operator, wind power forecasting can be a useful tool. This paper proposes a data-mining-based methodology to forecast wind speed. The method applies data mining techniques to a real database of historical wind data. The paper includes a case study, based on a real database covering the last three years, to predict wind speed at 5-minute intervals. (orig.)

  4. The First Results of Testing Methods and Algorithms for Automatic Real Time Identification of Waveforms Introduction from Local Earthquakes in Increased Level of Man-induced Noises for the Purposes of Ultra-short-term Warning about an Occurred Earthquake

    Science.gov (United States)

    Gravirov, V. V.; Kislov, K. V.

    2009-12-01

    The chief hazard posed by earthquakes consists in their suddenness. The number of earthquakes annually recorded is in excess of 100,000; of these, over 1000 are strong ones. Great human losses usually occur because no devices exist for advance warning of earthquakes. It is therefore high time that mobile automatic information systems were developed for the analysis of seismic information at high levels of man-made noise. The systems should operate in real time with the minimum possible computational delays and be able to make fast decisions. The chief premise of the project is that sufficiently complete information about an earthquake can be obtained in real time by examining its first onset as recorded by a single seismic sensor or a local seismic array. The essential difference from existing systems consists in the following: analysis of local seismic data at high levels of man-made noise (that is, when the noise level may be above the seismic signal level), as well as self-contained operation. The algorithms developed during the project could be used with success in individual personal protection kits and for warning the population in earthquake-prone areas around the world. The system being developed in this project uses P and S waves as well. The difference in the velocities of these seismic waves permits a technique to be developed for identifying a damaging earthquake. Real-time analysis of first onsets yields the time that remains before surface waves arrive and the damage potential of those waves. Estimates show that, when the distance between the earthquake epicenter and the monitored site is of order 200 km, the time difference between the arrivals of P waves and surface waves will be about 30 seconds, which is quite sufficient to evacuate people from potentially hazardous spaces, insert moderators at nuclear power stations, interlock pipelines, stop transportation, and issue warnings to rescue services
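    The 30-second figure quoted above follows directly from the travel-time difference between the fast P waves that trigger detection and the slower, damaging surface waves. The velocities below are typical crustal values assumed for illustration; the abstract does not state which values the authors use.

```python
def warning_time_s(epicentral_km, v_p=6.5, v_surface=3.3):
    """Seconds between P-wave arrival (detection) and surface-wave arrival
    (damaging shaking), assuming typical crustal velocities in km/s."""
    return epicentral_km / v_surface - epicentral_km / v_p

# At ~200 km epicentral distance, roughly half a minute of warning is available.
t = warning_time_s(200.0)
```

    The warning window grows linearly with distance, which is why early-warning systems are most useful for sites well away from the epicenter.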

  5. Prediction of short-term and long-term VOC emissions from SBR bitumen-backed carpet under different temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Yang, S.; Chen, Q. [Massachusetts Inst. of Tech., Cambridge, MA (United States). Building Technology Program]; Bluyssen, P.M. [TNO Building and Construction Research, Delft (Netherlands)]

    1998-12-31

    This paper presents two models for volatile organic compound (VOC) emissions from carpet. One is a numerical model using the computational fluid dynamics (CFD) technique for short-term predictions, the other an analytical model for long-term predictions. The numerical model can (1) deal with carpets that are not new, (2) calculate the time-dependent VOC distributions in a test chamber or room, and (3) consider the temperature effect on VOC emissions. Based on small-scale chamber data, both models were used to examine the VOC emissions under different temperatures from polypropene styrene-butadiene rubber (SBR) bitumen-backed carpet. The short-term predictions show that the VOC emissions under different temperatures can be modeled solely by changing the carpet diffusion coefficients. A formulation of the Arrhenius relation was used to correlate the dependence of carpet diffusion coefficient with temperature. The long-term predictions show that it would take several years to bake out the VOCs, and temperature would have a major impact on the bake-out time.
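    The Arrhenius relation the paper uses to correlate the carpet diffusion coefficient with temperature can be sketched as follows; the reference diffusion coefficient, reference temperature, and activation energy below are hypothetical placeholders, not the paper's fitted values.

```python
import math

R_GAS = 8.314  # universal gas constant, J/(mol K)

def diffusion_coefficient(T_kelvin, D_ref=1e-12, T_ref=296.15, Ea=50e3):
    """Arrhenius scaling of a diffusion coefficient from a reference state:
    D(T) = D_ref * exp(-(Ea/R) * (1/T - 1/T_ref)).
    D_ref (m^2/s), T_ref (K) and Ea (J/mol) are illustrative assumptions."""
    return D_ref * math.exp(-(Ea / R_GAS) * (1.0 / T_kelvin - 1.0 / T_ref))

# Warmer rooms -> larger diffusion coefficient -> faster short-term emission
# and a shorter bake-out time, consistent with the paper's conclusion.
d_cool = diffusion_coefficient(293.15)   # 20 degC
d_warm = diffusion_coefficient(308.15)   # 35 degC
```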

  6. Short-term changes on MRI predict long-term changes on radiography in rheumatoid arthritis

    DEFF Research Database (Denmark)

    Peterfy, Charles; Strand, Vibeke; Tian, Lu

    2017-01-01

    OBJECTIVE: In rheumatoid arthritis (RA), MRI provides earlier detection of structural damage than radiography (X-ray) and more sensitive detection of intra-articular inflammation than clinical examination. This analysis was designed to evaluate the ability of early MRI findings to predict...

  7. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are required in particular to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for the Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both the Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study higher statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values of the skewness and the kurtosis increase for smaller magnitude thresholds, and increase still more strongly for small temporal subdivisions of the catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
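    The overdispersion argument can be checked directly from the closed-form moments of the two distributions. The parameter values below are illustrative, chosen only so that the two means match; the formulas themselves are the standard ones.

```python
import math

def poisson_moments(mu):
    """Mean, variance and skewness of a Poisson(mu) distribution."""
    return mu, mu, 1.0 / math.sqrt(mu)

def nbd_moments(r, p):
    """Mean, variance and skewness of the negative-binomial distribution
    (number of failures before r successes, success probability p)."""
    mean = r * (1 - p) / p
    var = r * (1 - p) / p ** 2
    skew = (2 - p) / math.sqrt(r * (1 - p))
    return mean, var, skew

# Match the means: NBD with r=2, p=0.2 has mean 8, the same as Poisson(8),
# but its variance and skewness are much larger (clustering/overdispersion).
p_mean, p_var, p_skew = poisson_moments(8.0)
n_mean, n_var, n_skew = nbd_moments(2.0, 0.2)
```

    This is exactly the property the number test exploits: for a clustered earthquake catalogue, variance exceeds the mean and the empirical skewness exceeds the Poisson value.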

  8. Short-term variability in body weight predicts long-term weight gain

    Science.gov (United States)

    Lowe, Michael R; Feig, Emily H; Winter, Samantha R; Stice, Eric

    2015-01-01

    Background: Body weight in lower animals and humans is highly stable despite a very large flux in energy intake and expenditure over time. Conversely, the existence of higher-than-average variability in weight may indicate a disruption in the mechanisms responsible for homeostatic weight regulation. Objective: In a sample chosen for weight-gain proneness, we evaluated whether weight variability over a 6-mo period predicted subsequent weight change from 6 to 24 mo. Design: A total of 171 nonobese women were recruited to participate in this longitudinal study in which weight was measured 4 times over 24 mo. The initial 3 weights were used to calculate weight variability with the use of a root mean square error approach to assess fluctuations in weight independent of trajectory. Linear regression analysis was used to examine whether weight variability in the initial 6 mo predicted weight change 18 mo later. Results: Greater weight variability significantly predicted amount of weight gained. This result was unchanged after control for baseline body mass index (BMI) and BMI change from baseline to 6 mo and for measures of disinhibition, restrained eating, and dieting. Conclusions: Elevated weight variability in young women may signal the degradation of body weight regulatory systems. In an obesogenic environment this may eventuate in accelerated weight gain, particularly in those with a genetic susceptibility toward overweight. Future research is needed to evaluate the reliability of weight variability as a predictor of future weight gain and the sources of its predictive effect. The trial on which this study is based is registered at clinicaltrials.gov as NCT00456131. PMID:26354535
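    The study's variability measure, a root-mean-square error of weights around each woman's weight trajectory, can be sketched as below. The trajectory is assumed here to be a least-squares linear trend over the measurement times; the abstract does not specify the exact trajectory model, so this is an illustrative reading.

```python
def weight_variability(times, weights):
    """RMSE of weights around a least-squares linear trend, capturing
    fluctuation in weight independent of the overall trajectory."""
    n = len(times)
    t_mean = sum(times) / n
    w_mean = sum(weights) / n
    sxx = sum((t - t_mean) ** 2 for t in times)
    slope = sum((t - t_mean) * (w - w_mean) for t, w in zip(times, weights)) / sxx
    resid = [w - (w_mean + slope * (t - t_mean)) for t, w in zip(times, weights)]
    return (sum(e * e for e in resid) / n) ** 0.5

# Two hypothetical participants measured at months 0, 3 and 6:
steady = weight_variability([0, 3, 6], [60.0, 60.5, 61.0])  # smooth gain
jumpy = weight_variability([0, 3, 6], [60.0, 63.0, 60.5])   # fluctuating
```

    Note that the steady gainer has near-zero variability despite an upward trend, while the fluctuating participant scores high even though her net change is small; it is this fluctuation, not the trend, that predicted later gain.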

  9. Possible Influence of Volcanic Activity on the Decadal Potential Predictability of the Natural Variability in Near-Term Climate Predictions

    Directory of Open Access Journals (Sweden)

    Hideo Shiogama

    2010-01-01

    Full Text Available Initialization based on data assimilation using historical observations possibly improves near-term climate predictions. Significant volcanic activity in the future is unpredictable and not assumed in future climate predictions. To examine the possible influence of unpredictable future volcanic activity on the decadal potential predictability of the natural variability, we performed a 2006-2035 climate prediction experiment with the assumption that the 1991 Mt. Pinatubo eruption would take place again in 2010. The Pinatubo forcing induced not only significant cooling responses but also considerable noise in the natural variability. The errors due to the Pinatubo forcing grew faster than those arising from imperfect knowledge of the observed state, leading to a rapid reduction of the decadal potential predictability of the natural variability.

  10. Predicting Short-Term Subway Ridership and Prioritizing Its Influential Factors Using Gradient Boosting Decision Trees

    Directory of Open Access Journals (Sweden)

    Chuan Ding

    2016-10-01

    Full Text Available Understanding the relationship between short-term subway ridership and its influential factors is crucial to improving the accuracy of short-term subway ridership prediction. Although there has been a growing body of studies on short-term ridership prediction approaches, limited effort is made to investigate short-term subway ridership prediction considering bus transfer activities and temporal features. To fill this gap, a relatively recent data mining approach called gradient boosting decision trees (GBDT) is applied to short-term subway ridership prediction and used to capture the associations with the independent variables. Taking three subway stations in Beijing as the cases, the short-term subway ridership and alighting passengers from its adjacent bus stops are obtained based on transit smart card data. To optimize the model performance with different combinations of regularization parameters, a series of GBDT models are built with various learning rates and tree complexities by fitting a maximum number of trees. The optimal model performance confirms that the gradient boosting approach can incorporate different types of predictors, fit complex nonlinear relationships, and automatically handle the multicollinearity effect with high accuracy. In contrast to other machine learning methods (or "black-box" procedures), the GBDT model can identify and rank the relative influences of bus transfer activities and temporal features on short-term subway ridership. These findings suggest that the GBDT model has considerable advantages in improving short-term subway ridership prediction in a multimodal public transportation system.
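    The GBDT mechanism the abstract describes, trees fitted stage-wise to residuals and shrunk by a learning rate, can be shown in miniature with one-dimensional decision stumps. This is a toy sketch, not the authors' Beijing model; the hour/ridership data and all parameter values are invented.

```python
def fit_stump(x, residuals):
    """Best single-split regression stump on a 1-D feature (least squares)."""
    best = None
    for t in sorted(set(x))[:-1]:
        left = [r for xi, r in zip(x, residuals) if xi <= t]
        right = [r for xi, r in zip(x, residuals) if xi > t]
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        sse = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or sse < best[0]:
            best = (sse, t, lm, rm)
    _, t, lm, rm = best
    return lambda xi, t=t, lm=lm, rm=rm: lm if xi <= t else rm

def gbdt_fit(x, y, n_trees=200, learning_rate=0.1):
    """Gradient boosting for squared loss: each stump fits current residuals,
    and its contribution is shrunk by the learning rate."""
    base = sum(y) / len(y)
    stumps = []
    pred = [base] * len(y)
    for _ in range(n_trees):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        stump = fit_stump(x, resid)
        stumps.append(stump)
        pred = [pi + learning_rate * stump(xi) for pi, xi in zip(pred, x)]
    return lambda xi: base + sum(learning_rate * s(xi) for s in stumps)

# Invented "riders vs. hour of day" data with morning and evening peaks.
hours = [6, 7, 8, 9, 17, 18, 19, 23]
riders = [120, 400, 900, 500, 850, 950, 400, 80]
model = gbdt_fit(hours, riders)
```

    Tuning the learning rate and tree complexity against held-out data, as the paper does, trades off fitting speed against overfitting.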

  11. Was the magnitude (M = 9.0R) of the mega-earthquake of Japan (11th of March, 2011) predictable? An analysis based on the Lithospheric Seismic Energy Flow Model (LSEFM)

    CERN Document Server

    Thanassoulas, C; Verveniotis, G

    2012-01-01

    The Tohoku EQ (11th of March, 2011, M = 9.0) in Japan falsified the proposed EQ magnitude range (M = 7.0 - 8.5) of the same seismogenic regional area that had been determined by the compiled hazard maps, study of historical data, or other probabilistic methods while a larger magnitude (M > 9.0) had been proposed for all subduction zones. The observed discrepancy between the proposed EQ magnitude range and the actual one of the Tohoku EQ is studied in this work in terms of the cumulative seismic energy release of the study area and by the use of the Lithospheric Seismic Energy Flow Model (LSEFM). The results indicate that the Tohoku mega-earthquake magnitude could be predicted quite accurately provided that a long past seismic history had been available for use by the LSEFM procedure. Moreover, the presence, of the missing historic 1855 EQ (7.0 < M < 8.0) from seismic catalogs, was predicted backwards by the LSEFM method and its existence was verified by the Ishibashi (2004) work on Japanese historic sei...

  12. Estimating the Maximum Possible Earthquake Magnitude Using Extreme Value Methodology : the Groningen Case

    NARCIS (Netherlands)

    Beirlant, J.; Kijko, Andrzej; Reykens, Tom; Einmahl, John

    2017-01-01

    The area-characteristic, maximum possible earthquake magnitude TM is required by the earthquake engineering community, disaster management agencies and the insurance industry. The Gutenberg-Richter law predicts that earthquake magnitudes M follow a truncated exponential distribution. In the
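    The truncated exponential form of the Gutenberg-Richter law can be illustrated by sampling magnitudes between a lower threshold m0 and the maximum magnitude via the inverse CDF. The b-value, threshold and truncation point below are illustrative, not the Groningen estimates.

```python
import math
import random

def sample_truncated_gr(n, m0=4.0, m_max=7.5, b=1.0, seed=42):
    """Sample magnitudes from the truncated exponential (Gutenberg-Richter)
    distribution on [m0, m_max] by inverse-CDF, with beta = b * ln(10)."""
    beta = b * math.log(10)
    rng = random.Random(seed)
    norm = 1.0 - math.exp(-beta * (m_max - m0))
    return [m0 - math.log(1.0 - rng.random() * norm) / beta for _ in range(n)]

mags = sample_truncated_gr(10_000)
# Small events dominate, and no sample can exceed the truncation point m_max;
# estimating that truncation point from data is the extreme-value problem.
```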

  13. Earthquake Facts

    Science.gov (United States)

    ... to the Atlantic Ocean, around Africa, Asia, and Australia, and under the Pacific Ocean to the west ... are similar to earthquakes, but occur within the ice sheet itself instead of the land underneath the ...

  14. Toward automated directivity estimates in earthquake moment tensor inversion

    OpenAIRE

    Huang, Hsin-Hua; Aso, Naofumi; Tsai, Victor C.

    2017-01-01

    Rapid estimates of earthquake rupture properties are useful for both scientific characterization of earthquakes and emergency response to earthquake hazards. Rupture directivity is a particularly important property to constrain since seismic waves radiated in the direction of rupture can be greatly amplified, and even moderate magnitude earthquakes can sometimes cause serious damage. Knowing the directivity of earthquakes is important for ground shaking prediction and hazard mitigation, and i...

  15. Early Seizure Frequency and Aetiology Predict Long-Term Medical Outcome in Childhood-Onset Epilepsy

    Science.gov (United States)

    Sillanpaa, Matti; Schmidt, Dieter

    2009-01-01

    In clinical practice, it is important to predict as soon as possible after diagnosis and starting treatment, which children are destined to develop medically intractable seizures and be at risk of increased mortality. In this study, we determined factors predictive of long-term seizure and mortality outcome in a population-based cohort of 102…

  16. Prediction of long-term success of orthopedic treatment in skeletal Class III malocclusions.

    Science.gov (United States)

    Choi, Yoon Jeong; Chang, Jeong Eun; Chung, Chooryung J; Tahk, Ji Hyun; Kim, Kyung-Ho

    2017-08-01

    We investigated the long-term success of orthopedic treatment in skeletal Class III malocclusions, established a model to predict its long-term success, and verified previously reported success rates and prediction models. Fifty-nine patients who underwent successful facemask treatment and were followed until growth completion were evaluated. After completion of growth, the patients were divided into successful and unsuccessful groups according to overjet, overbite, and facial profile. Pretreatment cephalometric measurements were compared between groups, and logistic regression analysis was used to identify the predictors of long-term success. Four previously published articles were selected to verify the success rate and predictability of the prediction models with regard to our patient sample. The treatment success rate was 62.7%. The AB-mandibular plane angle, Wits appraisal, and the articular angle were identified as predictors. The success rates differed according to success criteria and patient characteristics. The prediction models proposed by the 4 previous studies and our study showed similar predictabilities (61.0%-64.4%) for our patient sample. The predictability for the unsuccessful group was low. Our results suggest that no particular method or factor can predict the long-term success of orthopedic treatment for skeletal Class III malocclusion. Copyright © 2017 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  17. Comparison of Linear Prediction Methods in Terms of Sparsity, Stability and Robustness to Reverberation

    NARCIS (Netherlands)

    Koutrouvelis, A.I.; Heusdens, R.; Gaubitch, N.D.

    2014-01-01

    The aim of this paper is to provide an experimental evaluation of five linear prediction methods in terms of sparsity, stability and robustness to reverberation. Moreover, we show that all the considered methods can be derived from a general linear prediction optimization problem. It is empirically

  18. Rupture, waves and earthquakes

    Science.gov (United States)

    UENISHI, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable. PMID:28077808

  20. Earthquake impact scale

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from systematic analysis of past earthquake impacts and associated response levels, are quite effective in communicating predicted impact and the response needed after an event, through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for the yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, the yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries where local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries where prevalent earthquake-resistant construction practices greatly reduce building collapse and the resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
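    The dual thresholds quoted above map directly onto two small classifiers, one per criterion; the thresholds are taken from the abstract, and the function names are ours.

```python
ALERTS = ["green", "yellow", "orange", "red"]

def fatality_alert(fatalities):
    """Casualty-based EIS alert: yellow, orange and red are triggered at
    1, 100 and 1,000 estimated fatalities, respectively."""
    return ALERTS[sum(fatalities >= t for t in (1, 100, 1000))]

def loss_alert(losses_usd):
    """Economic EIS alert: yellow, orange and red are triggered at estimated
    losses of $1M, $100M and $1B, respectively."""
    return ALERTS[sum(losses_usd >= t for t in (1e6, 1e8, 1e9))]
```

    Per the text's rationale, the fatality-based alert would drive international response levels, while the loss-based alert is better suited to domestic events in countries with earthquake-resistant construction.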

  1. Computational prediction of protein function based on weighted mapping of domains and GO terms.

    Science.gov (United States)

    Teng, Zhixia; Guo, Maozu; Dai, Qiguo; Wang, Chunyu; Li, Jin; Liu, Xiaoyan

    2014-01-01

    In this paper, we propose a novel method, SeekFun, to predict protein function based on weighted mapping of domains and GO terms. Firstly, a weighted mapping of domains and GO terms is constructed according to GO annotations and domain composition of the proteins. The association strength between domain and GO term is weighted by symmetrical conditional probability. Secondly, the mapping is extended along the true paths of the terms based on GO hierarchy. Finally, the terms associated with resident domains are transferred to host protein and real annotations of the host protein are determined by association strengths. Our careful comparisons demonstrate that SeekFun outperforms the concerned methods on most occasions. SeekFun provides a flexible and effective way for protein function prediction. It benefits from the well-constructed mapping of domains and GO terms, as well as the reasonable strategy for inferring annotations of protein from those of its domains.
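    The weighted domain-to-GO mapping can be sketched as below. The exact symmetrical-conditional-probability formula is assumed here to be the product P(term|domain) * P(domain|term), an illustrative reading of the abstract; the toy training data and the max-score transfer rule are likewise our assumptions.

```python
from collections import Counter

def scp_weights(annotations):
    """Weight each (domain, GO term) pair by symmetrical conditional
    probability, taken here as P(term|domain) * P(domain|term)."""
    pair_n, dom_n, term_n = Counter(), Counter(), Counter()
    for domains, terms in annotations:  # one (domain set, GO-term set) per protein
        for d in domains:
            dom_n[d] += 1
            for g in terms:
                pair_n[(d, g)] += 1
        for g in terms:
            term_n[g] += 1
    return {(d, g): (n / dom_n[d]) * (n / term_n[g])
            for (d, g), n in pair_n.items()}

def predict_terms(domains, weights):
    """Transfer GO terms of a protein's resident domains to the host protein,
    scoring each term by its strongest association."""
    scores = Counter()
    for (d, g), w in weights.items():
        if d in domains:
            scores[g] = max(scores[g], w)
    return scores

# Toy data: domain D1 consistently co-occurs with GO:A.
train = [({"D1"}, {"GO:A"}), ({"D1"}, {"GO:A", "GO:B"}), ({"D2"}, {"GO:B"})]
w = scp_weights(train)
pred = predict_terms({"D1"}, w)
```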

  2. Earthquakes for Kids

    Science.gov (United States)

    ... dug across a fault to learn about past earthquakes. Science Fair Projects A GPS instrument measures slow movements of the ground. Become an Earthquake Scientist Cool Earthquake Facts Today in Earthquake History ...

  3. Earthquake Hazards Program: Earthquake Scenarios

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A scenario represents one realization of a potential future earthquake by assuming a particular magnitude, location, and fault-rupture geometry and estimating...

  4. A review of methods of extrapolating tidal model predictions to long term siltation effects in estuaries

    Energy Technology Data Exchange (ETDEWEB)

    1990-01-01

    Sites which have prospects for tidal power generation almost invariably have large expanses of potentially mobile sand banks, mud shoals and intertidal sand or mud flats. The construction and operation of a tidal power plant alters the tidal regime and hence the pattern of sediment transport within the sphere of influence of the engineering works. At present, there is no generally accepted method of predicting the long term effects of a major tidal power barrage on the equilibrium distribution of sediment in a large wide estuary. Detailed multi-dimensional mathematical models are criticised because they appear only to predict the immediate impact of the proposed engineering works on the sediment regime. The largely empirical regime technique is commonly used to predict changes in the cross-sectional area of the estuary. However, there is a less widely used, but much more powerful, technique whereby the results from short term predictions are extrapolated in an iterative process to predict long term siltation. There are no established, tested theories or standard methods for extrapolating results from short term siltation models. For this reason this review is based on an analysis of a number of case studies, which represent the only readily available source of experience with the problem. The case studies do not relate to tidal barrage projects, but they do illustrate and demonstrate the basic principles and problems associated with the extrapolation of short term predictions. (author).
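    The iterative-extrapolation principle, repeatedly applying a short-term siltation prediction to an updated bed, can be caricatured with a toy depth-dependent deposition loop; the linear rate law and all values below are purely illustrative, not a validated morphological model.

```python
def extrapolate_siltation(depth_m, equilibrium_m=5.0, rate=0.2, cycles=100):
    """Iteratively apply a short-term siltation step: each cycle deposits
    sediment in proportion to the excess depth over equilibrium (toy law),
    then the updated bed feeds the next short-term prediction."""
    history = [depth_m]
    for _ in range(cycles):
        deposition = rate * max(depth_m - equilibrium_m, 0.0)
        depth_m -= deposition
        history.append(depth_m)
    return history

h = extrapolate_siltation(12.0)
# The depth relaxes toward the equilibrium value rather than shoaling
# indefinitely, which is what a one-shot short-term model cannot capture.
```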

  5. Higher order predicted terms for an QCD observable, using PMS procedure

    Energy Technology Data Exchange (ETDEWEB)

    Bakniehl [Institute for Theoretical Physics-II, Hamburg University, Luruper Chaussee 149, 22761 Hamburg (Germany)]; Mirjalili, A [Institute for Studies in Theoretical Physics and Mathematics (IPM), 19395-5531, Tehran (Iran, Islamic Republic of)], E-mail: Mirjalili@ipm.ir

    2008-05-15

    In this letter, we first review the principle of minimal sensitivity (PMS) in the NLO, NNLO and higher-order approximations to find an optimized expression for the desired observable at a specific order. It is possible to expand the optimized quantities in terms of the quantities which exist in the standard approach of QCD for the observable. In this case we are able to obtain the predicted higher-order terms. The calculations indicate that the predicted terms in the NNLO and higher approximations are not unique.

  6. Temporal Prediction Errors Affect Short-Term Memory Scanning Response Time.

    Science.gov (United States)

    Limongi, Roberto; Silva, Angélica M

    2016-11-01

    The Sternberg short-term memory scanning task has been used to unveil cognitive operations involved in time perception. Participants produce time intervals during the task, and the researcher explores how task performance affects interval production - where time estimation error is the dependent variable of interest. The perspective of predictive behavior regards time estimation error as a temporal prediction error (PE), an independent variable that controls cognition, behavior, and learning. Based on this perspective, we investigated whether temporal PEs affect short-term memory scanning. Participants performed temporal predictions while they maintained information in memory. Model inference revealed that PEs affected memory scanning response time independently of the memory-set size effect. We discuss the results within the context of formal and mechanistic models of short-term memory scanning and predictive coding, a Bayes-based theory of brain function. We state the hypothesis that our finding could be associated with weak frontostriatal connections and weak striatal activity.

  7. Operational earthquake forecasting in California: A prototype system combining UCERF3 and CyberShake

    Science.gov (United States)

    Milner, K. R.; Jordan, T. H.; Field, E. H.

    2014-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about time-dependent earthquake probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To attain this goal, OEF must provide a complete description of the seismic hazard—ground motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic hazard analysis. We have combined the Third Uniform California Earthquake Rupture Forecast (UCERF3) of the Working Group on California Earthquake Probabilities (Field et al., 2014) with the CyberShake ground-motion model of the Southern California Earthquake Center (Graves et al., 2011; Callaghan et al., this meeting) into a prototype OEF system for generating time-dependent hazard maps. UCERF3 represents future earthquake activity in terms of fault-rupture probabilities, incorporating both Reid-type renewal models and Omori-type clustering models. The current CyberShake model comprises approximately 415,000 earthquake rupture variations to represent the conditional probability of future shaking at 285 geographic sites in the Los Angeles region (~236 million horizontal-component seismograms). This combination provides significant probability gains relative to OEF models based on empirical ground-motion prediction equations (GMPEs), primarily because the physics-based CyberShake simulations account for the rupture directivity, basin effects, and directivity-basin coupling that are not represented by the GMPEs.

  8. Diagnosis of Time of Increased Probability of strong earthquakes in different regions of the world: algorithm CN

    Science.gov (United States)

    Keilis-Borok, V. I.; Rotwain, I. M.

    An algorithm for intermediate-term earthquake prediction is suggested which allows diagnosis of Times of Increased Probability of strong earthquakes (TIPs). TIPs are declared for a time period of one year and an area with linear dimensions of a few hundred kilometers, and can be extended in time. The algorithm is based on the following traits of an earthquake flow: the level of seismic activity; its temporal variation; clustering of earthquakes in space and time; their concentration in space; and their long-range interaction. The algorithm is normalized so that it can be applied in various regions without readaptation. TIPs diagnosed by the algorithm precede ~80% of strong earthquakes and occupy, on average, ~24% of the total time.

  9. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  10. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  11. A comparison of the medium-term impact and recovery of the Pakistan floods and the Haiti earthquake: objective and subjective measures.

    Science.gov (United States)

    Weiss, William M; Kirsch, Thomas D; Doocy, Shannon; Perrin, Paul

    2014-06-01

    The 2010 Haiti earthquake and Pakistan floods were similar in their massive human impact. Although the specific events were very different, the humanitarian response to disasters is supposed to achieve the same ends. This paper contrasts the disaster effects and the medium-term responses. In January 2011, similarly structured population-based surveys were carried out in the most affected areas using stratified cluster designs (80×20 in Pakistan and 60×20 in Haiti) with probability proportional to size sampling. Displacement persisted in Haiti and Pakistan at 53% and 39% of households, respectively. In Pakistan, 95% of households reported damage to their homes and loss of income or livelihoods, and in Haiti, the rates were 93% and 85%, respectively. Frequency of displacement, and income or livelihood loss, were significantly higher in Pakistan, whereas disaster-related deaths or injuries were significantly more prevalent in Haiti. Given the rise in disaster frequency and costs, and the volatility of humanitarian funding streams as a result of the recent global financial crisis, it is increasingly important to measure the impact of humanitarian response against the goal of a return to normalcy.

  12. Predicting the liquefaction phenomena from shear velocity profiling: Empirical approach to 6.3 Mw, May 2006 Yogyakarta earthquake

    Science.gov (United States)

    Hartantyo, Eddy; Brotopuspito, Kirbani S.; Sismanto, Waluyo

    2015-04-01

    Liquefaction phenomena were reported after the 6.5 Mw earthquake that hit Yogyakarta province on the morning of 27 May 2006. Several researchers have reported the damage, casualties, and soil failure due to the quake, including mapping and analysis of the liquefaction phenomena, mostly based on SPT tests. This study attempts to map liquefaction susceptibility by means of shear-velocity profiling using a modified Multichannel Analysis of Surface Waves (MASW). This paper is a preliminary report using only several measured MASW points. An 8-channel seismic data logger with 4.5 Hz geophones was built for this purpose. Several different offsets were used to record the high and low frequencies of the surface waves. The phase-velocity diagrams were stacked in the frequency domain rather than in the time domain, for clearer and easier dispersion-curve picking. All codes are implemented in Matlab. From these procedures, a shear-velocity profile was obtained beneath each geophone spread. By mapping the minimum depth of the shallow water table, calculating PGA with soil classification, using an empirical formula for saturated soil weight from the shear-velocity profile, and calculating CRR and CSR at every depth, the liquefaction characteristics can be identified in every layer. From the acquired data, a liquefaction potential at some depths below the water table was obtained.
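
    The CRR/CSR screening mentioned in the abstract can be sketched with the standard simplified procedure. The sketch below is a generic textbook formulation (Seed-Idriss cyclic stress ratio with the Liao-Whitman depth-reduction factor, and the Andrus-Stokoe Vs-based resistance curve), not the authors' Matlab implementation, and the layer parameters are invented for illustration:

```python
def csr(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Cyclic stress ratio (Seed & Idriss simplified method),
    with the Liao & Whitman depth-reduction factor rd."""
    rd = 1.0 - 0.00765 * depth_m if depth_m <= 9.15 else 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

def crr_from_vs1(vs1, vs1_limit=215.0):
    """Cyclic resistance ratio from overburden-corrected shear-wave velocity
    (Andrus & Stokoe 2000 clean-sand curve, magnitude 7.5)."""
    if vs1 >= vs1_limit:
        return float("inf")              # too stiff to liquefy on this curve
    return 0.022 * (vs1 / 100.0) ** 2 + 2.8 * (1.0 / (vs1_limit - vs1) - 1.0 / vs1_limit)

# hypothetical saturated sand layer at 5 m depth under a_max = 0.3 g
# (total and effective vertical stresses in kPa, Vs1 in m/s)
factor_of_safety = crr_from_vs1(180.0) / csr(0.3, 90.0, 60.0, 5.0)
print(factor_of_safety)                  # < 1 flags the layer as liquefiable
```

    Repeating this calculation at every depth of a Vs profile yields the layer-by-layer screening the abstract describes.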

  13. Predicting the liquefaction phenomena from shear velocity profiling: Empirical approach to 6.3 Mw, May 2006 Yogyakarta earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Hartantyo, Eddy, E-mail: hartantyo@ugm.ac.id [PhD student, Physics Department, FMIPA, UGM. Sekip Utara Yogyakarta 55281 Indonesia (Indonesia); Brotopuspito, Kirbani S.; Sismanto; Waluyo [Geophysics Laboratory, FMIPA, Universitas Gadjah Mada, Sekip Utara Yogyakarta 55281 (Indonesia)

    2015-04-24

    Liquefaction phenomena were reported after the 6.5 Mw earthquake that hit Yogyakarta province on the morning of 27 May 2006. Several researchers have reported the damage, casualties, and soil failure due to the quake, including mapping and analysis of the liquefaction phenomena, mostly based on SPT tests. This study attempts to map liquefaction susceptibility by means of shear-velocity profiling using a modified Multichannel Analysis of Surface Waves (MASW). This paper is a preliminary report using only several measured MASW points. An 8-channel seismic data logger with 4.5 Hz geophones was built for this purpose. Several different offsets were used to record the high and low frequencies of the surface waves. The phase-velocity diagrams were stacked in the frequency domain rather than in the time domain, for clearer and easier dispersion-curve picking. All codes are implemented in Matlab. From these procedures, a shear-velocity profile was obtained beneath each geophone’s spread. By mapping the minimum depth of the shallow water table, calculating PGA with soil classification, using an empirical formula for saturated soil weight from the shear-velocity profile, and calculating CRR and CSR at every depth, the liquefaction characteristics can be identified in every layer. From the acquired data, a liquefaction potential at some depths below the water table was obtained.

  14. Short-term memory predictions across the lifespan: monitoring span before and after conducting a task.

    Science.gov (United States)

    Bertrand, Julie Marilyne; Moulin, Chris John Anthony; Souchay, Céline

    2017-05-01

    Our objective was to explore metamemory in short-term memory across the lifespan. Five age groups participated in this study: 3 groups of children (4-13 years old), and younger and older adults. We used a three-phase task: prediction-span-postdiction. For the prediction and postdiction phases, participants reported with a Yes/No response if they could recall in order a series of images. For the span task, they had to actually recall such series. From 4 years old, children have some ability to monitor their short-term memory and are able to adjust their prediction after experiencing the task. However, accuracy still improves significantly until adolescence. Although the older adults had a lower span, they were as accurate as young adults in their evaluation, suggesting that metamemory is unimpaired for short-term memory tasks in older adults.
    •We investigate metamemory for short-term memory tasks across the lifespan.
    •We find younger children cannot accurately predict their span length.
    •Older adults are accurate in predicting their span length.
    •People's metamemory accuracy was related to their short-term memory span.

  15. Phosphorylated IGFBP-1 in predicting successful vaginal delivery in post-term pregnancy.

    Science.gov (United States)

    Kosinska-Kaczynska, Katarzyna; Bomba-Opon, Dorota; Bobrowska, Katarzyna; Kozlowski, Szymon; Brawura-Biskupski-Samaha, Robert; Szymusik, Iwona; Wegrzyn, Piotr; Wielgos, Miroslaw

    2015-07-01

    To estimate whether phosphorylated IGFBP-1 (phIGFBP-1) in cervical secretions in term and post-term pregnancies can predict spontaneous onset of labor or vaginal delivery. A prospective cohort study of 167 women with singleton term and post-term pregnancies was conducted at the 1st Department of Obstetrics and Gynecology, Medical University of Warsaw, between 2013 and 2014. The phIGFBP-1 test (Actim Partus, Medix Biochemica), ultrasound cervical assessment and the Bishop score were analyzed in the study group. Spontaneous onset of labor was the primary and vaginal delivery the secondary outcome. In 32.5 % of patients, spontaneous uterine contractions appeared; 67.5 % of women delivered vaginally and 32.5 % had a cesarean section. The phIGFBP-1 test predicted spontaneous onset of labor (sensitivity 0.69, specificity 0.42) and successful vaginal delivery (0.67, 0.48). In the prediction of spontaneous onset of labor, ultrasound cervical assessment and phIGFBP-1 had comparable sensitivity, and in the prediction of successful vaginal birth all three tests had comparable sensitivity. The time from preinduction to spontaneous onset of labor was significantly shorter in women with a positive phIGFBP-1 test (13.65 ± 6.7 vs 20.75 ± 2.6 h; p = 0.006). A test for phIGFBP-1 presence might be an additional tool for predicting both spontaneous onset of labor and successful vaginal delivery in post-term pregnancies.

  16. Long-term prediction of polar motion using a combined SSA and ARMA model

    Science.gov (United States)

    Shen, Yi; Guo, Jinyun; Liu, Xin; Kong, Qiaoli; Guo, Linxi; Li, Wang

    2017-09-01

    To meet the need for real-time and high-accuracy predictions of polar motion (PM), the singular spectrum analysis (SSA) and the autoregressive moving average (ARMA) model are combined for short- and long-term PM prediction. According to the SSA results for PM and the SSA prediction algorithm, the principal components of PM were predicted by SSA, and the remaining components were predicted by the ARMA model. In applying this proposed method, multiple sets of PM predictions were made with lead times of two years, based on an IERS 08 C04 series. The observations and predictions of the principal components correlated well, and the SSA + ARMA model effectively predicted the PM. For 360-day lead time predictions, the root-mean-square errors (RMSEs) of PMx and PMy were 20.67 and 20.42 mas, respectively, which were less than the 24.46 and 24.78 mas predicted by IERS Bulletin A. The RMSEs of PMx and PMy in the 720-day lead time predictions were 28.61 and 27.95 mas, respectively.
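
    The decompose-then-forecast idea behind the SSA + ARMA model can be sketched as follows. This is a minimal numpy illustration on a synthetic series, not the IERS 08 C04 data; a plain least-squares AR model stands in for the ARMA step, and the `window`, `rank`, and `order` values are arbitrary illustrative choices:

```python
import numpy as np

def ssa_leading(x, window, rank):
    """Reconstruct the `rank` leading SSA components of series x
    (Hankel embedding -> SVD -> diagonal averaging)."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    low_rank = (u[:, :rank] * s[:rank]) @ vt[:rank]
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):                     # diagonal (Hankel) averaging
        recon[j:j + window] += low_rank[:, j]
        counts[j:j + window] += 1
    return recon / counts

def ar_forecast(x, order, steps):
    """Least-squares AR(order) fit followed by a recursive multi-step forecast."""
    lags = np.column_stack([x[order - k:len(x) - k] for k in range(1, order + 1)])
    coef, *_ = np.linalg.lstsq(lags, x[order:], rcond=None)
    hist = list(x)
    for _ in range(steps):
        hist.append(float(np.dot(coef, hist[-1:-order - 1:-1])))
    return np.array(hist[-steps:])

# synthetic periodic series standing in for a polar-motion component
rng = np.random.default_rng(0)
t = np.arange(400)
x = np.sin(2 * np.pi * t / 50) + 0.02 * rng.standard_normal(t.size)

principal = ssa_leading(x, window=60, rank=2)   # SSA part
residual = x - principal                        # left to the AR ("ARMA") part
pred = ar_forecast(principal, 4, 50) + ar_forecast(residual, 4, 50)
truth = np.sin(2 * np.pi * np.arange(400, 450) / 50)
rmse = np.sqrt(np.mean((pred - truth) ** 2))
print(rmse)
```

    As in the paper, the final prediction is the sum of the component forecasts.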

  17. Prediction of Human Phenotype Ontology terms by means of hierarchical ensemble methods.

    Science.gov (United States)

    Notaro, Marco; Schubach, Max; Robinson, Peter N; Valentini, Giorgio

    2017-10-12

    The prediction of human gene-abnormal phenotype associations is a fundamental step toward the discovery of novel genes associated with human disorders, especially when no genes are known to be associated with a specific disease. In this context the Human Phenotype Ontology (HPO) provides a standard categorization of the abnormalities associated with human diseases. While the problem of the prediction of gene-disease associations has been widely investigated, the related problem of gene-phenotypic feature (i.e., HPO term) associations has been largely overlooked, even though for most human genes no HPO term associations are known and despite the increasing application of the HPO to relevant medical problems. Moreover, most of the methods proposed in the literature are not able to capture the hierarchical relationships between HPO terms, thus resulting in inconsistent and relatively inaccurate predictions. We present two hierarchical ensemble methods that we formally prove to provide biologically consistent predictions according to the hierarchical structure of the HPO. The modular structure of the proposed methods, which consists of a "flat" learning first step and a hierarchical combination of the predictions in the second step, allows the predictions of virtually any flat learning method to be enhanced. The experimental results show that hierarchical ensemble methods are able to predict novel associations between genes and abnormal phenotypes with results that are competitive with state-of-the-art algorithms and with a significant reduction of the computational complexity. Hierarchical ensembles are efficient computational methods that guarantee biologically meaningful predictions that obey the true path rule, and can be used as a tool to improve and make consistent the HPO term predictions starting from virtually any flat learning method. The implementation of the proposed methods is available as an R package from the CRAN repository.
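
    The "true path rule" mentioned above requires that a gene's score for an HPO term never exceed its score for any ancestor of that term. A minimal sketch of one way to enforce this consistency over flat scores (a simple bottom-up max rule; the paper's two ensemble methods are more elaborate, and the tiny ontology below is invented):

```python
def true_path_correct(scores, parents):
    """Make flat per-term scores hierarchically consistent with the ontology:
    iterate until every ancestor scores at least as high as its descendants."""
    changed = True
    while changed:                       # fixpoint over the DAG
        changed = False
        for term, score in list(scores.items()):
            for parent in parents.get(term, ()):
                if scores[parent] < score:
                    scores[parent] = score
                    changed = True
    return scores

# toy ontology: HP:A is the root; HP:B and HP:C are its children; HP:D is under HP:B
parents = {"HP:B": ["HP:A"], "HP:C": ["HP:A"], "HP:D": ["HP:B"]}
flat = {"HP:A": 0.1, "HP:B": 0.2, "HP:C": 0.05, "HP:D": 0.9}
consistent = true_path_correct(dict(flat), parents)
print(consistent)
```

    Here HP:D's high flat score propagates up through HP:B to the root, so no term outranks its ancestors.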

  18. Predictive Validity of the Columbia-Suicide Severity Rating Scale for Short-Term Suicidal Behavior

    DEFF Research Database (Denmark)

    Conway, Paul Maurice; Erlangsen, Annette; Teasdale, Thomas William

    2016-01-01

    Objectives: Using the Columbia-Suicide Severity Rating Scale (C-SSRS), we examined the predictive and incremental predictive validity of past-month suicidal behavior and ideation for short-term suicidal behavior among adolescents at a high risk of suicide. Methods: The study was conducted in 2014...... behavior predicted subsequent suicidal behavior (actual attempts and suicidal behavior of any type, including preparatory acts, aborted, interrupted and actual attempts; mean follow-up of 80.8 days, SD = 52.4). Furthermore, we examined whether suicidal ideation severity and intensity incrementally...... predicted suicidal behavior at follow-up over and above suicidal behavior at baseline. Results: Actual suicide attempts at baseline strongly predicted suicide attempts at follow-up. Baseline suicidal ideation severity and intensity did not significantly predict future actual attempts over and above baseline...

  19. Predicting heat flow in the 2001 Bhuj earthquake (Mw=7.7) region of Kachchh (Western India), using an inverse recurrence method

    Directory of Open Access Journals (Sweden)

    V. P. Dimri

    2011-09-01

    Terrestrial heat flow is considered an important parameter in studying the regional geotectonic and geodynamic evolutionary history of any region. However, its distribution is still very uneven, and there is hardly any information available for many geodynamically important areas. In the present study, we provide a methodology to predict the surface heat flow in areas where detailed seismic information, such as the depth to the lithosphere-asthenosphere boundary (LAB) and the crustal structure, is known. The tool was first tested in several geotectonic blocks around the world and then used to predict the surface heat flow for the 2001 Bhuj earthquake region of Kachchh, India, which has been seismically active since historical times and where aftershock activity is still continuing nine years after the 2001 main event. Surface heat flow for this region is estimated to be about 61.3 mW m−2. Beneath this region, the heat flow input from the mantle as well as the temperature at the Moho are quite high, at around 44 mW m−2 and 630 °C, respectively, possibly due to thermal restructuring of the underlying crust and mantle lithosphere. In the absence of conventional data, the proposed tool may be used to estimate first-order heat flow in continental regions for geotectonic studies, as it is also unaffected by the subsurface climatic perturbations that percolate even down to 2000 m depth.

  20. First Results of the Regional Earthquake Likelihood Models Experiment

    OpenAIRE

    Schorlemmer, Danijel; Zechar, J. Douglas; Maximilian J. Werner; Field, Edward H.; Jackson, David D; Jordan, Thomas H.

    2010-01-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize e...

  1. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent. Such forewarnings may be based on changes in geologic processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region.

  2. Development of an integrated method for long-term water quality prediction using seasonal climate forecast

    Directory of Open Access Journals (Sweden)

    J. Cho

    2016-10-01

    The APEC Climate Center (APCC) produces climate prediction information utilizing a multi-climate model ensemble (MME) technique. In this study, four different downscaling methods, in accordance with the degree of utilizing the seasonal climate prediction information, were developed in order to improve predictability and to refine the spatial scale. These methods include: (1) the Simple Bias Correction (SBC) method, which directly uses APCC's dynamic prediction data with a 3 to 6 month lead time; (2) the Moving Window Regression (MWR) method, which indirectly utilizes dynamic prediction data; (3) the Climate Index Regression (CIR) method, which predominantly uses observation-based climate indices; and (4) the Integrated Time Regression (ITR) method, which uses predictors selected from both CIR and MWR. Then, a sampling-based temporal downscaling was conducted using the Mahalanobis distance method in order to create daily weather inputs to the Soil and Water Assessment Tool (SWAT) model. Long-term predictability of water quality within the Wecheon watershed of the Nakdong River Basin was evaluated. According to the Korean Ministry of Environment's Provisions of Water Quality Prediction and Response Measures, modeling-based predictability was evaluated by using 3-month lead prediction data issued in February, May, August, and November as model input of SWAT. Finally, an integrated approach, which takes into account various climate information and downscaling methods for water quality prediction, was presented. This integrated approach can be used to prevent potential problems caused by extreme climate in advance.

  3. Short-Term Wind Speed Prediction Using EEMD-LSSVM Model

    Directory of Open Access Journals (Sweden)

    Aiqing Kang

    2017-01-01

    A hybrid Ensemble Empirical Mode Decomposition (EEMD) and Least Square Support Vector Machine (LSSVM) model is proposed to improve short-term wind speed forecasting precision. The EEMD is firstly utilized to decompose the original wind speed time series into a set of subseries. Then the LSSVM models are established to forecast these subseries. The partial autocorrelation function is adopted to analyze the inner relationships between the historical wind speed series in order to determine the input variables of the LSSVM models for the prediction of every subseries. Finally, the superposition principle is employed to sum the predicted values of every subseries as the final wind speed prediction. The performance of the hybrid model is evaluated based on six metrics. Compared with LSSVM, Back Propagation Neural Networks (BP), Auto-Regressive Integrated Moving Average (ARIMA), a combination of Empirical Mode Decomposition (EMD) with LSSVM, and hybrid EEMD with ARIMA models, the wind speed forecasting results show that the proposed hybrid model outperforms these models in terms of the six metrics. Furthermore, scatter diagrams of predicted versus actual wind speed and histograms of prediction errors are presented to verify the superiority of the hybrid model in short-term wind speed prediction.
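
    The decompose/forecast/superpose pipeline can be sketched without the specialized libraries: below, a causal exponential moving average stands in for EEMD's mode decomposition, and lag-embedded ridge regression (a linear cousin of LSSVM) stands in for the LSSVM forecasters. The signal and all parameters are synthetic illustrations:

```python
import numpy as np

def ema(x, alpha=0.1):
    """Causal exponential moving average: a crude slow/fast split
    standing in for EEMD's intrinsic mode functions."""
    out = np.empty_like(x)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

def ridge_ar_forecast(x, order, steps, lam=1e-8):
    """Lag-embedded ridge regression (a linear stand-in for LSSVM),
    applied recursively for multi-step forecasting."""
    A = np.column_stack([x[order - k:len(x) - k] for k in range(1, order + 1)])
    coef = np.linalg.solve(A.T @ A + lam * np.eye(order), A.T @ x[order:])
    hist = list(x)
    for _ in range(steps):
        hist.append(float(np.dot(coef, hist[-1:-order - 1:-1])))
    return np.array(hist[-steps:])

def hybrid_forecast(x, steps, order=10):
    low = ema(x)                 # slow subseries
    high = x - low               # fast subseries
    # one model per subseries, then superpose the predictions
    return ridge_ar_forecast(low, order, steps) + ridge_ar_forecast(high, order, steps)

t = np.arange(600, dtype=float)
wind = np.sin(2 * np.pi * t / 40) + 0.5 * np.sin(2 * np.pi * t / 8)
pred = hybrid_forecast(wind, steps=24)
tf = np.arange(600, 624)
truth = np.sin(2 * np.pi * tf / 40) + 0.5 * np.sin(2 * np.pi * tf / 8)
rmse = np.sqrt(np.mean((pred - truth) ** 2))
print(rmse)
```

    The paper's actual pipeline swaps in EEMD subseries and kernel LSSVM models, but the structure (decompose, forecast each band, sum) is the same.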

  4. Improved Short-Term Clock Prediction Method for Real-Time Positioning

    Directory of Open Access Journals (Sweden)

    Yifei Lv

    2017-06-01

    The application of real-time precise point positioning (PPP) requires real-time precise orbit and clock products that should be predicted within a short time to compensate for the communication delay or data gap. Unlike orbit correction, clock correction is difficult to model and predict. The widely used linear model hardly fits long periodic trends with a small data set and exhibits significant accuracy degradation in real-time prediction when a large data set is used. This study proposes a new prediction model for maintaining short-term satellite clocks to meet the high-precision requirements of real-time clocks and provide clock extrapolation without interrupting the real-time data stream. Fast Fourier transform (FFT) is used to analyze the linear prediction residuals of real-time clocks. The periodic terms obtained through FFT are adopted in the sliding window prediction to achieve a significant improvement in short-term prediction accuracy. This study also analyzes and compares the accuracy of short-term forecasts (less than 3 h) by using observations of different lengths. Experimental results obtained from International GNSS Service (IGS) final products and our own real-time clocks show that the 3-h prediction accuracy is better than 0.85 ns. The new model can replace IGS ultra-rapid products in the application of real-time PPP. It is also found that there is a positive correlation between the prediction accuracy and the short-term stability of on-board clocks. Compared with the accuracy of the traditional linear model, the accuracy of the static PPP using the new model of the 2-h prediction clock in the N, E, and U directions is improved by about 50%. Furthermore, the static PPP accuracy of 2-h clock products is better than 0.1 m. When an interruption occurs in the real-time model, the accuracy of the kinematic PPP solution using the 1-h clock prediction product is better than 0.2 m, without significant accuracy degradation. This model is of practical
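
    The core of the method (fit a linear clock model, harvest dominant periodic terms from the FFT of its residuals, and extrapolate both) can be sketched in numpy. The 30-s synthetic clock series and the single harmonic below are illustrative assumptions, not IGS data:

```python
import numpy as np

def predict_clock(t, bias, t_future, n_harmonics=1):
    """Linear clock model plus dominant FFT periodic terms, extrapolated to t_future."""
    a, b = np.polyfit(t, bias, 1)                 # offset + drift
    resid = bias - (a * t + b)
    freqs = np.fft.rfftfreq(len(t), d=t[1] - t[0])
    spec = np.fft.rfft(resid)
    top = np.argsort(np.abs(spec[1:]))[::-1][:n_harmonics] + 1   # skip the DC bin

    def harmonics(tt):
        cols = []
        for i in top:
            cols += [np.sin(2 * np.pi * freqs[i] * tt), np.cos(2 * np.pi * freqs[i] * tt)]
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(harmonics(t), resid, rcond=None)
    return a * t_future + b + harmonics(t_future) @ coef

# synthetic 30-s clock series: offset + drift + one periodic term (seconds)
t = np.arange(1024) * 30.0
period = 1024 * 30.0 / 16                         # exactly 16 cycles in the window
bias = 1e-6 + 1e-9 * t + 2e-9 * np.sin(2 * np.pi * t / period)
t_fut = t[-1] + 30.0 * np.arange(1, 361)          # 3-h prediction horizon
pred = predict_clock(t, bias, t_fut)
truth = 1e-6 + 1e-9 * t_fut + 2e-9 * np.sin(2 * np.pi * t_fut / period)
a, b = np.polyfit(t, bias, 1)
rmse_hybrid = np.sqrt(np.mean((pred - truth) ** 2))
rmse_linear = np.sqrt(np.mean((a * t_fut + b - truth) ** 2))
print(rmse_hybrid, rmse_linear)   # the periodic terms sharply reduce the error
```

    Refitting this on a sliding window of recent epochs, as the paper does, keeps the periodic terms current as the clock behavior drifts.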

  5. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g. landslides, significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increase of vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess the earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can help in elaborating strategic countermeasure plans for earthquake risk mitigation in the city of Baku.
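
    The risk "convolution" of hazard, vulnerability, and exposure can be illustrated as a cell-wise combination over a city grid. The toy 2 x 2 layers below are invented for illustration and carry none of the Baku data:

```python
import numpy as np

# hypothetical normalized layers on a 2 x 2 city grid
pga = np.array([[0.2, 0.4],            # scenario peak ground acceleration, g
                [0.3, 0.5]])
vulnerability = np.array([[0.8, 0.6],  # building fragility / population index, 0..1
                          [0.4, 0.9]])
exposure = np.array([[0.5, 1.0],       # density of assets and facilities, 0..1
                     [0.7, 0.3]])

# risk as the cell-wise product of hazard, vulnerability and exposure,
# rescaled to a 0..1 relative index
risk = pga * vulnerability * exposure
risk_index = risk / risk.max()
print(np.round(risk_index, 2))
```

    Note how the highest-risk cell need not be the one with the highest PGA: a moderately shaken but exposed, fragile district can dominate, which is the point the abstract makes about vulnerability factors.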

  6. If pandas scream. an earthquake is coming

    Energy Technology Data Exchange (ETDEWEB)

    Magida, P.

    Feature article: Use of the behavior of animals to predict weather has spanned several ages and dozens of countries. While animals may behave in diverse ways to indicate weather changes, they all tend to behave in more or less the same way before earthquakes. The geophysical community in the U.S. has begun testing animal behavior before earthquakes. It has been determined that animals have the potential to act as accurate geosensors to detect earthquakes before they occur. (5 drawings)

  7. Prediction of hyperbilirubinemia by noninvasive methods in full-term newborns

    OpenAIRE

    Danijela Furlan; Lidija Žalec; Tatjana Pavlin; Mirjam Gradecki; Darinka Oštir Mevželj; Borut Bratanič

    2013-01-01

    Introduction: The noninvasive screening methods for bilirubin determination were studied prospectively in a group of full-term healthy newborns with the aim of early prediction of pathological neonatal hyperbilirubinemia. Laboratory determination of bilirubin (Jendrassik-Grof (JG)) was compared to the noninvasive transcutaneous bilirubin (TcBIL) together with the determination of bilirubin in cord blood.Methods: The study group consisted of 284 full-term healthy consecutively born infants in ...

  8. Long-term prediction of reading accuracy and speed: The importance of paired-associate learning

    DEFF Research Database (Denmark)

    Poulsen, Mads; Asmussen, Vibeke; Elbro, Carsten

    of reading comprehension and isolated sight word reading accuracy and speed. Results: PAL predicted unique variance in sight word accuracy, but not speed. Furthermore, PAL was indirectly linked to reading comprehension through sight word accuracy. RAN correlated with both accuracy and speed......, and these effects translated into indirect effects on reading comprehension. Phonological awareness and letter knowledge did not contribute uniquely to Grade 5 reading after control for PAL and RAN. Conclusions: PAL and RAN predict separate aspects of sight word reading: accuracy and speed. Both aspects appear...... relevant for reading comprehension. We discuss the lack of long-term prediction from phonological awareness and letter knowledge....

  9. Exploring the predictive power of interaction terms in a sophisticated risk equalization model using regression trees.

    Science.gov (United States)

    van Veen, S H C M; van Kleef, R C; van de Ven, W P M M; van Vliet, R C J A

    2017-05-23

    This study explores the predictive power of interaction terms between the risk adjusters in the Dutch risk equalization (RE) model of 2014. Due to the sophistication of this RE model and the complexity of the associations in the dataset (N = ~16.7 million), there are theoretically more than a million interaction terms. We used regression tree modelling, which has rarely been applied within the field of RE, to identify interaction terms that statistically significantly explain variation in observed expenses that is not already explained by the risk adjusters in this RE model. The interaction terms identified were used as additional risk adjusters in the RE model. We found evidence that interaction terms can improve the prediction of expenses overall and for specific groups in the population. However, the prediction of expenses for some other selective groups may deteriorate. Thus, interactions can reduce financial incentives for risk selection for some groups but may increase them for others. Furthermore, because regression trees are not robust, additional criteria are needed to decide which interaction terms should be used in practice. These criteria could be the right incentive structure for risk selection and efficiency, or the opinion of medical experts. Copyright © 2017 John Wiley & Sons, Ltd.
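
    The core regression-tree idea, scoring a candidate interaction by the reduction in within-group variance it achieves, can be sketched on toy data. All adjuster names and spending figures below are hypothetical illustrations, not the Dutch RE data:

    ```python
    from itertools import combinations

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    def best_interaction(rows, residuals, adjusters):
        """Find the pair of binary risk adjusters whose interaction (both = 1)
        best explains residual spending variance, tree-style: split the sample
        on the interaction and measure the drop in within-group variance."""
        base = variance(residuals)
        best = None
        for a, b in combinations(adjusters, 2):
            inside = [r for row, r in zip(rows, residuals) if row[a] and row[b]]
            outside = [r for row, r in zip(rows, residuals) if not (row[a] and row[b])]
            if not inside or not outside:
                continue
            within = (len(inside) * variance(inside) +
                      len(outside) * variance(outside)) / len(residuals)
            gain = base - within
            if best is None or gain > best[0]:
                best = (gain, (a, b))
        return best

    # Hypothetical data: residual expenses are high only when a person is both
    # elderly AND chronically ill -- an interaction no additive term captures.
    rows = [{"elderly": e, "chronic": c, "urban": u}
            for e in (0, 1) for c in (0, 1) for u in (0, 1) for _ in range(5)]
    residuals = [1000.0 if (row["elderly"] and row["chronic"]) else 0.0
                 for row in rows]
    gain, pair = best_interaction(rows, residuals, ["elderly", "chronic", "urban"])
    print(pair, gain)
    ```

    The search correctly selects the elderly-and-chronic interaction because splitting on it leaves both subgroups with zero residual variance.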

  10. Analysis of a Possibility of Electromagnetic Earthquake Triggering by Ionospheric Disturbations

    Science.gov (United States)

    Novikov, V.; Ruzhin, Y.

    2012-12-01

    It is well known that some ionospheric perturbations precede strong earthquakes, and there are attempts to detect and apply them as precursors for short-term earthquake prediction. In that case it is assumed that the processes of earthquake preparation in the lithosphere can produce disturbances in the ionosphere. On the other hand, theoretical, field, and laboratory experimental results obtained during research projects carried out in Russia within the last ten years demonstrate evidence of artificial electromagnetic triggering of earthquakes, in which the electric current density provided by special pulsed power systems at earthquake source depth (5-10 km), 10^-7 - 10^-8 A/m^2, is comparable with the density of telluric currents induced in the crust by ionospheric disturbances. In this case it may be supposed that some reported preseismic ionospheric anomalies provide a triggering effect for earthquake occurrence. To clarify the details of ionosphere-lithosphere coupling and the possibility of electromagnetic triggering of seismic events, the following were carried out: an analysis of ionospheric precursors of earthquakes; a statistical analysis of geomagnetic field variations and seismic activity; laboratory studies of the deformation dynamics of stressed rocks under electromagnetic impact; and a theoretical analysis of the possible mechanisms of interaction of rocks with an electromagnetic field, verified in laboratory experiments on special test equipment that simulates the behavior of a fault zone under external triggering factors. A model of electromagnetic triggering of seismic events by ionospheric electromagnetic perturbations is proposed, based on fluid migration to a fault in a critically stressed state due to the interaction of conductive fluid with telluric currents and the geomagnetic field. The possibility of developing a physical method of short-term earthquake prediction based on electromagnetic triggering effects is discussed.

  11. How Much Can the Total Aleatory Variability of Empirical Ground Motion Prediction Equations Be Reduced Using Physics-Based Earthquake Simulations?

    Science.gov (United States)

    Jordan, T. H.; Wang, F.; Graves, R. W.; Callaghan, S.; Olsen, K. B.; Cui, Y.; Milner, K. R.; Juve, G.; Vahi, K.; Yu, J.; Deelman, E.; Gill, D.; Maechling, P. J.

    2015-12-01

    Ground motion prediction equations (GMPEs) in common use predict the logarithmic intensity of ground shaking, lnY, as a deterministic value, lnYpred(x), conditioned on a set of explanatory variables x, plus a normally distributed random variable with a standard deviation σT. The latter accounts for the unexplained variability in the ground motion data used to calibrate the GMPE and is typically 0.5-0.7 in natural log units. Reducing this residual or "aleatory" variability is a high priority for seismic hazard analysis, because the probabilities of exceedance at high Y values go up rapidly with σT, adding costs to the seismic design of critical facilities to account for the prediction uncertainty. However, attempts to decrease σT by incorporating more explanatory variables into the GMPEs have been largely unsuccessful (e.g., Strasser et al., SRL, 2009). An alternative is to employ physics-based earthquake simulations that properly account for source directivity, basin effects, directivity-basin coupling, and other 3D complexities. We have explored the theoretical limits of this approach through an analysis of large (> 10^8) ensembles of 3D synthetic seismograms generated for the Los Angeles region by SCEC's CyberShake project using the new tool of averaging-based factorization (ABF, Wang & Jordan, BSSA, 2014). The residual variance obtained by applying GMPEs to the CyberShake dataset matches the frequency-dependence of σT obtained for the GMPE calibration dataset. The ABF analysis allows us to partition this variance into uncorrelated components representing source, path, and site effects. We show that simulations can potentially reduce σT by about one-third, which could lower the exceedance probabilities for high hazard levels at fixed x by orders of magnitude. Realizing this gain in forecasting probability would have a broad impact on risk-reduction strategies, especially for critical facilities such as large dams, nuclear power plants, and energy transportation
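
    The leverage that a smaller σT gives follows directly from the normal-residual model the record describes. A minimal sketch with hypothetical numbers (the median prediction, threshold, and σT values are illustrative): cutting σT by one-third lowers the exceedance probability at a fixed high hazard level by orders of magnitude.

    ```python
    import math

    def exceedance_prob(ln_y0, ln_y_pred, sigma_t):
        """P(lnY > ln_y0) for a GMPE with a normally distributed residual."""
        z = (ln_y0 - ln_y_pred) / sigma_t
        # Survival function of the standard normal via the complementary error function.
        return 0.5 * math.erfc(z / math.sqrt(2.0))

    # Hypothetical numbers: a hazard threshold three sigma above the median.
    ln_y_pred = 0.0
    sigma_t = 0.6                      # typical total aleatory std (natural log units)
    ln_y0 = ln_y_pred + 3.0 * sigma_t  # fixed high hazard level

    p_full = exceedance_prob(ln_y0, ln_y_pred, sigma_t)
    p_reduced = exceedance_prob(ln_y0, ln_y_pred, sigma_t * (2.0 / 3.0))  # sigma cut by a third
    print(p_full, p_reduced, p_full / p_reduced)
    ```

    Because the threshold is fixed while σ shrinks, the same ground-motion level sits further out in the tail, so the probability falls far faster than the one-third reduction in σ might suggest.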

  12. Chaotic analysis and short-term prediction of ozone pollution in Malaysian urban area

    Science.gov (United States)

    Hamid, Nor Zila Abd; Noorani, Mohd Salmi Md; Hamiza Adenan, Nur

    2017-09-01

    This study focuses on the analysis and prediction of hourly ozone (O3) pollution in a Malaysian urban area, namely Shah Alam, through a chaotic approach. This approach begins by detecting the chaotic behavior of the O3 pollution using the phase space plot and the Cao method. Then, the local mean approximation method is used for prediction purposes. The O3 pollution observed at Shah Alam is found to be chaotic in behavior. Due to this chaotic behavior, only short-term prediction is possible. Thus, one-hour-ahead prediction is carried out through the local mean approximation method. The prediction result shows that the correlation coefficient between the observed and predicted time series is close to one. This excellent result shows in particular that the local mean approximation method can be used to predict O3 pollution in urban areas. In general, the chaotic approach is a useful approach for analyzing and predicting O3 pollution time series.
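
    The local mean approximation method referred to above is, in its commonly described form, a nearest-neighbour predictor in the reconstructed phase space: embed the series, find the past states closest to the current one, and average their successors. A minimal pure-Python sketch under that assumption (the embedding dimension, neighbour count, and toy series are illustrative choices, not the Shah Alam data):

    ```python
    import math

    def local_mean_predict(series, dim=3, k=4):
        """One-step-ahead prediction by the local mean (nearest-neighbour) method.

        Embeds the series in `dim` dimensions, finds the k embedded states
        closest to the current state, and averages their successors.
        """
        # Build embedded vectors x_i = (s[i], ..., s[i+dim-1]) whose successor
        # s[i+dim] is known.
        states = [(series[i:i + dim], series[i + dim])
                  for i in range(len(series) - dim)]
        current = series[-dim:]
        dist = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
        neighbours = sorted(states, key=lambda sv: dist(sv[0], current))[:k]
        return sum(succ for _, succ in neighbours) / len(neighbours)

    # Toy deterministic series: the prediction for a quasi-periodic signal
    # should land close to the true next value.
    series = [math.sin(0.3 * t) for t in range(200)]
    print(local_mean_predict(series), math.sin(0.3 * 200))
    ```

    Because nearby states of a deterministic system have nearby futures, the averaged successors give a good one-step forecast; the error grows quickly with the horizon, which is why only short-term prediction is feasible for chaotic series.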

  13. Prediction of near-term breast cancer risk using a Bayesian belief network

    Science.gov (United States)

    Zheng, Bin; Ramalingam, Pandiyarajan; Hariharan, Harishwaran; Leader, Joseph K.; Gur, David

    2013-03-01

    Accurately predicting near-term breast cancer risk is an important prerequisite for establishing an optimal personalized breast cancer screening paradigm. In previous studies, we investigated and tested the feasibility of developing a unique near-term breast cancer risk prediction model based on a new risk factor associated with bilateral mammographic density asymmetry between the left and right breasts of a woman using a single feature. In this study we developed a multi-feature based Bayesian belief network (BBN) that combines bilateral mammographic density asymmetry with three other popular risk factors, namely (1) age, (2) family history, and (3) average breast density, to further increase the discriminatory power of our cancer risk model. A dataset involving "prior" negative mammography examinations of 348 women was used in the study. Among these women, 174 had breast cancer detected and verified in the next sequential screening examinations, and 174 remained negative (cancer-free). A BBN was applied to predict the risk of each woman having cancer detected six to 18 months later following the negative screening mammography. The prediction results were compared with those using single features. The prediction accuracy was significantly increased when using the BBN. The area under the ROC curve increased from an AUC=0.70 to 0.84 (p<0.01), while the positive predictive value (PPV) and negative predictive value (NPV) also increased from a PPV=0.61 to 0.78 and an NPV=0.65 to 0.75, respectively. This study demonstrates that a multi-feature based BBN can more accurately predict the near-term breast cancer risk than with a single feature.
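
    A BBN with one class node and conditionally independent feature nodes reduces to naive Bayes, which makes the multi-feature combination easy to sketch. The conditional probability tables below are hypothetical, not the study's fitted values:

    ```python
    def posterior_cancer(evidence, prior=0.5, cpt=None):
        """Posterior P(cancer | evidence) under a naive-Bayes network:
        each binary feature depends only on the class node."""
        # Hypothetical tables: (P(feature=True | cancer), P(feature=True | no cancer)).
        cpt = cpt or {
            "density_asymmetry": (0.70, 0.30),
            "age_over_50":       (0.60, 0.50),
            "family_history":    (0.25, 0.15),
            "high_density":      (0.55, 0.40),
        }
        p_pos, p_neg = prior, 1.0 - prior
        for feature, present in evidence.items():
            p_c, p_nc = cpt[feature]
            p_pos *= p_c if present else (1.0 - p_c)
            p_neg *= p_nc if present else (1.0 - p_nc)
        return p_pos / (p_pos + p_neg)

    high = posterior_cancer({"density_asymmetry": True, "age_over_50": True,
                             "family_history": True, "high_density": True})
    low = posterior_cancer({"density_asymmetry": False, "age_over_50": False,
                            "family_history": False, "high_density": False})
    print(high, low)
    ```

    The prior of 0.5 mirrors the balanced 174/174 case-control design; combining several weakly informative features spreads the posterior much further from the prior than any single feature could, which is the intuition behind the reported AUC gain.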

  14. A review on the young history of the wind power short-term prediction

    DEFF Research Database (Denmark)

    Costa, A.; Crespo, A.; Navarro, J.

    2008-01-01

    This paper makes a brief review of 30 years of history of wind power short-term prediction, from the first ideas and sketches on the theme to the actual state of the art of models and tools, giving emphasis to the most significant proposals and developments. The two principal lines of thought...

  15. Serum YKL-40 predicts long-term mortality in patients with stable coronary disease

    DEFF Research Database (Denmark)

    Harutyunyan, Marina; Gøtze, Jens P; Winkel, Per

    2013-01-01

    We investigated whether the inflammatory biomarker YKL-40 could improve the long-term prediction of death made by common risk factors plus high-sensitivity C-reactive protein (hs-CRP) and N-terminal-pro-B natriuretic peptide (NT-proBNP) in patients with stable coronary artery disease (CAD)....

  16. Predicting Changes in Cultural Sensitivity among Students of Spanish during Short-Term Study Abroad

    Science.gov (United States)

    Martinsen, Rob

    2011-01-01

    Short-term study abroad programs of less than a semester are becoming increasingly popular among undergraduate students in the United States. However, little research has examined the changes in students' cultural sensitivity through their participation in such programs or what factors may predict growth and improvement in such areas. This study…

  17. Admission body temperature predicts long-term mortality after acute stroke

    DEFF Research Database (Denmark)

    Kammersgaard, L P; Jørgensen, H S; Rungby, Jørgen

    2002-01-01

    Body temperature is considered crucial in the management of acute stroke patients. Recently hypothermia applied as a therapy for stroke has been demonstrated to be feasible and safe in acute stroke patients. In the present study, we investigated the predictive role of admission body temperature t...... to the long-term mortality in stroke patients....

  18. Serial-Order Short-Term Memory Predicts Vocabulary Development: Evidence from a Longitudinal Study

    Science.gov (United States)

    Leclercq, Anne-Lise; Majerus, Steve

    2010-01-01

    Serial-order short-term memory (STM), as opposed to item STM, has been shown to be very consistently associated with lexical learning abilities in cross-sectional study designs. This study investigated longitudinal predictions between serial-order STM and vocabulary development. Tasks maximizing the temporary retention of either serial-order or…

  19. Prediction of postpartum hemorrhage in women with gestational hypertension or mild preeclampsia at term

    NARCIS (Netherlands)

    Koopmans, Corine M.; van der Tuuk, Karin; Groen, Henk; Doornbos, Johannes P. R.; de Graaf, Irene M.; van der Salm, Pauline C. M.; Porath, Martina M.; Kuppens, Simone M. I.; Wijnen, Ella J.; Aardenburg, Robert; van Loon, Aren J.; Akerboom, Bettina M. C.; van der Lans, Peggy J. A.; Mol, Ben W. J.; van Pampus, Maria G.

    2014-01-01

    To assess whether postpartum hemorrhage can be predicted in women with gestational hypertension or mild preeclampsia at term. A cohort study in which we used data from our multicentre randomized controlled trial (HYPITAT trial). The study was conducted in 38 hospitals in the Netherlands between 2005

  1. Prediction of recurrence of hypertensive disorders of pregnancy in the term period, a retrospective cohort study

    NARCIS (Netherlands)

    van Oostwaard, Miriam F.; Langenveld, Josje; Schuit, Ewoud; Wigny, Kiki; van Susante, Hilde; Beune, Irene; Ramaekers, Roos; Papatsonis, Dimitri N. M.; Mol, Ben Willem J.; Ganzevoort, Wessel

    2014-01-01

    Objectives: To assess the recurrence risk of term hypertensive disease of pregnancy and to determine which potential risk factors are predictive of recurrence. Study design: We performed a retrospective cohort study in two secondary and one tertiary care hospitals in the Netherlands. We identified

  2. Temperamental factors predict long-term modifications of eating disorders after treatment.

    Science.gov (United States)

    Segura-García, Cristina; Chiodo, Dora; Sinopoli, Flora; De Fazio, Pasquale

    2013-11-07

    Eating Disorders (EDs) are complex psychiatric pathologies characterized by moderate to poor response to treatment. Criteria of remission and recovery are not yet well defined. At the same time, personality plays a key role among the factors that determine treatment outcome. The aim of the present research is to evaluate whether temperamental and character traits can predict the long-term outcome of EDs. A sample of 25 AN and 28 BN female patients were re-assessed face-to-face after a minimum 5-year follow-up through the SCID-I, EDI-2 and TCI-R. Regression analyses were performed to ascertain whether TCI-R dimensions at the first visit predict the long-term outcome. Clinical and psychopathological symptoms significantly decreased over time, and 23% of participants no longer received a categorical ED diagnosis after at least 5 years of follow-up. TCI-R dimensions failed to predict the absence of a DSM-IV-TR diagnosis in the long term, but Novelty Seeking, Harm Avoidance and Reward Dependence were shown to predict the clinical improvement of several EDI-2 scales. Our results support the idea that temperamental dimensions are relevant to the long-term improvement of clinical variables of EDs. Low Novelty Seeking is the strongest predictor of poor outcome.

  3. Using Forecasting to Predict Long-Term Resource Utilization for Web Services

    Science.gov (United States)

    Yoas, Daniel W.

    2013-01-01

    Researchers have spent years understanding resource utilization to improve scheduling, load balancing, and system management through short-term prediction of resource utilization. Early research focused primarily on single operating systems; later, interest shifted to distributed systems and, finally, into web services. In each case researchers…

  4. Swarm Intelligence-Based Hybrid Models for Short-Term Power Load Prediction

    Directory of Open Access Journals (Sweden)

    Jianzhou Wang

    2014-01-01

    Full Text Available Swarm intelligence (SI) is widely and successfully applied in the engineering field to solve practical optimization problems, and various hybrid models based on SI algorithms and statistical models have been developed to further improve predictive ability. In this paper, hybrid intelligent forecasting models based on cuckoo search (CS) together with singular spectrum analysis (SSA), time series, and machine learning methods are proposed for short-term power load prediction. The forecasting performance of the proposed models is augmented by a rolling multistep strategy over the prediction horizon. The test results demonstrate the outperformance of SSA and CS in tuning the seasonal autoregressive integrated moving average (SARIMA) and support vector regression (SVR) models for load forecasting, which indicates that both SSA-based data denoising and the SI-based intelligent optimization strategy can effectively improve a model's predictive performance. Additionally, the proposed CS-SSA-SARIMA and CS-SSA-SVR models provide very impressive forecasting results, demonstrating their strong robustness and universal forecasting capacity for short-term power load prediction 24 hours in advance.
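
    The rolling multistep strategy itself is model-agnostic: forecast one step, append the forecast to the history, refit, and repeat across the horizon. A minimal sketch with a least-squares AR(1) model standing in for the paper's SARIMA/SVR forecasters (the load series is a toy example, not real load data):

    ```python
    def fit_ar1(series):
        """Least-squares AR(1): s[t] ~ a * s[t-1] + c."""
        x, y = series[:-1], series[1:]
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        var = sum((xi - mx) ** 2 for xi in x)
        a = cov / var
        return a, my - a * mx

    def rolling_forecast(series, horizon):
        """Rolling multistep strategy: forecast one step, append the forecast
        to the history, refit, and repeat over the prediction horizon."""
        history = list(series)
        out = []
        for _ in range(horizon):
            a, c = fit_ar1(history)
            nxt = a * history[-1] + c
            out.append(nxt)
            history.append(nxt)  # roll the window forward
        return out

    # Toy load series decaying toward a base load of 100 (hypothetical numbers).
    series = [100 + 50 * (0.8 ** t) for t in range(30)]
    print(rolling_forecast(series, 3))
    ```

    On this exactly autoregressive toy series the fit is exact, so each rolled-forward step reproduces the true continuation; on real load data the refit at each step lets the model absorb its own forecasts as provisional observations.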

  5. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radionuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP be connected to a fast-running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  6. Seismic activity prediction using computational intelligence techniques in northern Pakistan

    Science.gov (United States)

    Asim, Khawaja M.; Awais, Muhammad; Martínez-Álvarez, F.; Iqbal, Talat

    2017-10-01

    An earthquake prediction study is carried out for the region of northern Pakistan. The prediction methodology includes interdisciplinary interaction of seismology and computational intelligence. Eight seismic parameters are computed based upon the past earthquakes. The predictive ability of these eight seismic parameters is evaluated in terms of information gain, which leads to the selection of six parameters to be used in prediction. Multiple computationally intelligent models have been developed for earthquake prediction using the selected seismic parameters. These models include feed-forward neural network, recurrent neural network, random forest, multi-layer perceptron, radial basis neural network, and support vector machine. The performance of every prediction model is evaluated, and McNemar's statistical test is applied to assess the statistical significance of the computational methodologies. The feed-forward neural network shows statistically significant predictions, with an accuracy of 75% and a positive predictive value of 78%, in the context of northern Pakistan.
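
    Information gain, the feature-selection criterion used above, is the reduction in label entropy achieved by conditioning on a feature. A small sketch with hypothetical discretised seismic parameters and labels (all values invented for illustration):

    ```python
    import math

    def entropy(labels):
        n = len(labels)
        probs = [labels.count(v) / n for v in set(labels)]
        return -sum(p * math.log2(p) for p in probs)

    def information_gain(feature, labels):
        """Reduction in label entropy from splitting on a discrete feature."""
        n = len(labels)
        gain = entropy(labels)
        for v in set(feature):
            subset = [l for f, l in zip(feature, labels) if f == v]
            gain -= len(subset) / n * entropy(subset)
        return gain

    # Hypothetical discretised seismic parameters vs. "large event followed" labels.
    labels  = [1, 1, 1, 1, 0, 0, 0, 0]
    b_value = ["low", "low", "low", "high", "high", "high", "high", "low"]  # informative
    noise   = ["a", "b", "a", "b", "a", "b", "a", "b"]                      # uninformative
    print(information_gain(b_value, labels), information_gain(noise, labels))
    ```

    Ranking the eight parameters by this score and keeping the top six is exactly the kind of filter-style selection the record describes.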

  8. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
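
    The short-term-average/long-term-average (STA/LTA) trigger described above can be sketched in a few lines; the per-minute tweet counts below are invented for illustration, not USGS data:

    ```python
    def sta_lta_detect(counts, sta_len=2, lta_len=20, threshold=5.0):
        """Flag indices where the short-term average of tweet counts exceeds
        `threshold` times the long-term average (classic STA/LTA trigger)."""
        detections = []
        for i in range(lta_len, len(counts)):
            sta = sum(counts[i - sta_len:i]) / sta_len
            lta = sum(counts[i - lta_len:i]) / lta_len
            if lta > 0 and sta / lta >= threshold:
                detections.append(i)
        return detections

    # Hypothetical per-minute counts of tweets containing "earthquake":
    # quiet background chatter, then a burst after a widely felt event.
    counts = [1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 2, 1,
              1, 1, 40, 55, 30, 12, 5, 2]
    print(sta_lta_detect(counts))
    ```

    The long window absorbs the burst slowly while the short window reacts immediately, so the ratio spikes at the onset of the burst and relaxes as the chatter decays, which is what makes the trigger both fast and resistant to slow fluctuations in background volume.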

  9. Short Term Prediction of PM10 Concentrations Using Seasonal Time Series Analysis

    Directory of Open Access Journals (Sweden)

    Hamid Hazrul Abdul

    2016-01-01

    Full Text Available Air pollution modelling is an important tool commonly used to make short-term and long-term predictions. Since air pollution has a great impact, especially on human health, prediction of air pollutant concentrations is needed to help local authorities give an early warning to people who are at risk of acute and chronic health effects from air pollution. Finding the best time series model allows predictions to be made accurately. This research was carried out to find the best time series model to predict the PM10 concentrations in Nilai, Negeri Sembilan, Malaysia. Considering two seasons, the wet season (northeast monsoon) and the dry season (southwest monsoon), seasonal autoregressive integrated moving average (SARIMA) models were used to find the most suitable model to predict the PM10 concentrations in Nilai, Negeri Sembilan, using three error measures. Based on the AIC statistic, results show that ARIMA(1, 1, 1) × (1, 0, 0)_12 is the most suitable model to predict PM10 concentrations in Nilai, Negeri Sembilan.
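
    Model selection by AIC, as used above, trades goodness of fit against parameter count. A sketch with a Gaussian least-squares form of the criterion; the residual sums of squares and parameter counts are invented for illustration, not the Nilai results:

    ```python
    import math

    def aic(n, rss, k):
        """Akaike information criterion for a Gaussian-error model fitted by
        least squares: AIC = n * ln(RSS / n) + 2k (constant terms dropped)."""
        return n * math.log(rss / n) + 2 * k

    # Hypothetical in-sample fits of two candidate PM10 models on n = 100 points:
    # the seasonal model fits better but spends one extra parameter.
    n = 100
    candidates = {
        "ARIMA(1,1,1)":            aic(n, rss=520.0, k=3),
        "ARIMA(1,1,1)x(1,0,0)_12": aic(n, rss=430.0, k=4),
    }
    best = min(candidates, key=candidates.get)
    print(candidates, best)
    ```

    The seasonal term wins here because its fit improvement outweighs the 2-point penalty for the extra parameter; when it does not, AIC prefers the simpler model.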

  10. A computational environment for long-term multi-feature and multi-algorithm seizure prediction.

    Science.gov (United States)

    Teixeira, C A; Direito, B; Costa, R P; Valderrama, M; Feldwisch-Drentrup, H; Nikolopoulos, S; Le Van Quyen, M; Schelter, B; Dourado, A

    2010-01-01

    The daily life of epilepsy patients is constrained by the possibility of occurrence of seizures. Until now, seizures cannot be predicted with sufficient sensitivity and specificity. Most seizure prediction studies have focused on a small number of patients and frequently assume unrealistic hypotheses. This paper adopts the view that for an appropriate development of reliable predictors one should consider long-term recordings and several features and algorithms integrated in one software tool. A computational environment, based on Matlab®, is presented, aiming to be an innovative tool for seizure prediction. It results from the need for a powerful and flexible tool for long-term EEG/ECG analysis by multiple features and algorithms. After being extracted, features can be subjected to several reduction and selection methods, and then used for prediction. The predictions can be conducted based on optimized thresholds or by applying computational intelligence methods. One important aspect is the integrated evaluation of the seizure prediction characteristic of the developed predictors.

  11. MultiLoc2: integrating phylogeny and Gene Ontology terms improves subcellular protein localization prediction

    Directory of Open Access Journals (Sweden)

    Kohlbacher Oliver

    2009-09-01

    Full Text Available Abstract. Background: Knowledge of the subcellular localization of proteins is crucial to proteomics, drug target discovery and systems biology, since localization and biological function are highly correlated. In recent years, numerous computational prediction methods have been developed. Nevertheless, there is still a need for prediction methods that show more robustness and higher accuracy. Results: We extended our previous MultiLoc predictor by incorporating phylogenetic profiles and Gene Ontology terms. Two different datasets were used for training the system, resulting in two versions of this high-accuracy prediction method. One version is specialized for globular proteins and predicts up to five localizations, whereas a second version covers all eleven main eukaryotic subcellular localizations. In a benchmark study with five localizations, MultiLoc2 performs considerably better than other methods for animal and plant proteins and comparably for fungal proteins. Furthermore, MultiLoc2 performs clearly better when using a second dataset that extends the benchmark study to all eleven main eukaryotic subcellular localizations. Conclusion: MultiLoc2 is an extensive high-performance subcellular protein localization prediction system. By incorporating phylogenetic profiles and Gene Ontology terms, MultiLoc2 yields higher accuracy compared to its previous version. Moreover, it outperforms other prediction systems in two benchmark studies. MultiLoc2 is available as a user-friendly and free web service at: http://www-bs.informatik.uni-tuebingen.de/Services/MultiLoc2.

  12. Severe accident source term characteristics for selected Peach Bottom sequences predicted by the MELCOR Code

    Energy Technology Data Exchange (ETDEWEB)

    Carbajo, J.J. [Oak Ridge National Lab., TN (United States)

    1993-09-01

    The purpose of this report is to compare in-containment source terms developed for NUREG-1159, which used the Source Term Code Package (STCP), with those generated by MELCOR to identify significant differences. For this comparison, two short-term depressurized station blackout sequences (with a dry cavity and with a flooded cavity) and a Loss-of-Coolant Accident (LOCA) concurrent with complete loss of the Emergency Core Cooling System (ECCS) were analyzed for the Peach Bottom Atomic Power Station (a BWR-4 with a Mark I containment). The results indicate that for the sequences analyzed, the two codes predict similar total in-containment release fractions for each of the element groups. However, the MELCOR/CORBH Package predicts significantly longer times for vessel failure and reduced energy of the released material for the station blackout sequences (when compared to the STCP results). MELCOR also calculated smaller releases into the environment than STCP for the station blackout sequences.

  13. Benevolent sexist beliefs predict perceptions of speakers and recipients of a term of endearment.

    Science.gov (United States)

    Boasso, Alyssa; Covert, Sarah; Ruscher, Janet B

    2012-01-01

    This study examined how endorsement of benevolent sexist ideologies predicts perceptions of requesters who use a term of endearment and of the female addressees who comply with their requests. Undergraduate women who previously completed the Benevolent Sexism Scale as part of the Ambivalent Sexism Inventory were randomly assigned to one of four groups. They watched one of four videos in which a female addressee responded to a request that either included or did not include the term of endearment "hon"; the requester was either male or female. Participants then rated both actors' social likeability. Among participants who watched a woman respond to a female requester who addressed her with the term "hon," benevolent sexism scores predicted liking for the female responder and disliking of the female requester. Findings reflect the dissatisfaction of women who are high in benevolent sexism with women who act outside of traditional gender role expectations.

  14. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should
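
The "low-probability environment" the abstract describes can be made concrete with a short sketch: under a Poisson assumption, even a large probability gain leaves the absolute probability over a days-to-weeks window small. The rates and the gain of 100 below are illustrative assumptions, not values from the paper.

```python
import math

def occurrence_probability(rate_per_year: float, window_days: float) -> float:
    """Probability of at least one event in the window, assuming a Poisson process."""
    return 1.0 - math.exp(-rate_per_year * window_days / 365.25)

# Illustrative assumptions: a long-term rate of one large event per 200 years,
# and a short-term probability gain of 100 during a seismic cluster.
long_term_rate = 1.0 / 200.0
gain = 100.0

p_long = occurrence_probability(long_term_rate, window_days=7)
p_short = occurrence_probability(long_term_rate * gain, window_days=7)
print(f"7-day probability, long-term model: {p_long:.5f}")
print(f"7-day probability, with 100x gain:  {p_short:.5f}")
```

Even with the hundredfold gain, the weekly probability in this sketch stays below one percent, which is the regime the abstract calls a low-probability environment.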

  15. The 1170 and 1202 CE Dead Sea Rift earthquakes and long-term magnitude distribution of the Dead Sea Fault zone

    Science.gov (United States)

    Hough, S.E.; Avni, R.

    2009-01-01

    In combination with the historical record, paleoseismic investigations have provided a record of large earthquakes in the Dead Sea Rift that extends back over 1500 years. Analysis of macroseismic effects can help refine magnitude estimates for large historical events. In this study we consider the detailed intensity distributions for two large events, in 1170 CE and 1202 CE, as determined from careful reinterpretation of available historical accounts, using the 1927 Jericho earthquake as a guide in their interpretation. In the absence of an intensity attenuation relationship for the Dead Sea region, we use the 1927 Jericho earthquake to develop a preliminary relationship based on a modification of the relationships developed in other regions. Using this relation, we estimate M7.6 for the 1202 earthquake and M6.6 for the 1170 earthquake. The uncertainties for both estimates are large and difficult to quantify with precision. The large uncertainties illustrate the critical need to develop a regional intensity attenuation relation. We further consider the distribution of magnitudes in the historic record and show that it is consistent with a b-value distribution with a b-value of 1. Considering the entire Dead Sea Rift zone, we show that the seismic moment release rate over the past 1500 years is sufficient, within the uncertainties of the data, to account for the plate tectonic strain rate along the plate boundary. The results reveal that an earthquake of M7.8 is expected within the zone on average every 1000 years. © 2011 Science From Israel/LPP Ltd.

  16. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  17. Long-Term Prediction of Severe Hypoglycemia in Type 1 Diabetes

    DEFF Research Database (Denmark)

    Henriksen, Marie Moth; Færch, Louise; Thorsteinsson, Birger

    2016-01-01

    BACKGROUND: Prediction of risk of severe hypoglycemia (SH) in patients with type 1 diabetes is important to prevent future episodes, but it is unknown if it is possible to predict the long-term risk of SH. The aim of the study is to assess if long-term prediction of SH is possible in type 1...... diabetes. METHODS: A follow-up study was performed with 98 patients with type 1 diabetes. At baseline and at follow-up, the patients filled in a questionnaire about diabetes history and complications, number of SH in the preceding year and state of awareness, and HbA1c and C-peptide levels were measured......-up. CONCLUSIONS: Long-term prediction of severe hypoglycemia in type 1 diabetes was not possible, although baseline hypoglycemia unawareness tended to remain a predictor for risk of SH at follow-up. Therefore, it is important repeatedly to assess the different risk factors of SH to determine the actual risk....

  18. Molecular constraints on synaptic tagging and maintenance of long-term potentiation: a predictive model.

    Science.gov (United States)

    Smolen, Paul; Baxter, Douglas A; Byrne, John H

    2012-01-01

    Protein synthesis-dependent, late long-term potentiation (LTP) and depression (LTD) at glutamatergic hippocampal synapses are well characterized examples of long-term synaptic plasticity. Persistent increased activity of protein kinase M ζ (PKMζ) is thought essential for maintaining LTP. Additional spatial and temporal features that govern LTP and LTD induction are embodied in the synaptic tagging and capture (STC) and cross capture hypotheses. Only synapses that have been "tagged" by a stimulus sufficient for LTP and learning can "capture" PKMζ. A model was developed to simulate the dynamics of key molecules required for LTP and LTD. The model concisely represents relationships between tagging, capture, LTD, and LTP maintenance. The model successfully simulated LTP maintained by persistent synaptic PKMζ, STC, LTD, and cross capture, and makes testable predictions concerning the dynamics of PKMζ. The maintenance of LTP, and consequently of at least some forms of long-term memory, is predicted to require continual positive feedback in which PKMζ enhances its own synthesis only at potentiated synapses. This feedback underlies bistability in the activity of PKMζ. Second, cross capture requires the induction of LTD to induce dendritic PKMζ synthesis, although this may require tagging of a nearby synapse for LTP. The model also simulates the effects of PKMζ inhibition, and makes additional predictions for the dynamics of CaM kinases. Experiments testing the above predictions would significantly advance the understanding of memory maintenance.
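
The feedback-driven bistability the model predicts can be illustrated with a minimal sketch (not the authors' actual model): a Hill-type self-synthesis term standing in for PKMζ enhancing its own synthesis, plus first-order degradation, yields two stable activity states.

```python
def simulate_pkmz(p0: float, k_syn=1.0, k_deg=1.0, K=0.4, dt=0.01, steps=5000) -> float:
    """Euler integration of a hypothetical bistable synthesis model:
        dP/dt = k_syn * P^2 / (K^2 + P^2) - k_deg * P
    All parameter values are illustrative assumptions, not from the paper."""
    p = p0
    for _ in range(steps):
        p += dt * (k_syn * p * p / (K * K + p * p) - k_deg * p)
    return p

# Below the unstable threshold (P = 0.2 for these parameters) the synapse
# relaxes to the basal state; above it, the positive feedback loop locks in
# the potentiated state (P = 0.8), mimicking maintained LTP.
print(round(simulate_pkmz(0.1), 3))
print(round(simulate_pkmz(0.3), 3))
```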

  1. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies appear before and after strong earthquakes; these have been interpreted as preseismic precursors for earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes up to Ms7.0, using satellite infrared remote sensing data together with new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The anomalies pass through two main stages, first expanding and later narrowing, and are readily extracted and identified with the "time-frequency relative power spectrum" method. (2) Each case exhibits distinct characteristic periods and magnitudes of anomalous thermal radiation. (3) The thermal radiation anomalies are closely related to geological structure. (4) The anomalies have distinctive characteristics in duration, spatial extent, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.
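
As a rough illustration of how a "time-frequency relative power spectrum" style measure could flag anomalous intervals, the sketch below compares per-window FFT power against a quiet-background average. The windowing, normalization, and synthetic data are our assumptions, not the authors' method.

```python
import numpy as np

def relative_power(series: np.ndarray, window: int, background_windows: int):
    """Split a radiation time series into windows, compute per-window FFT power,
    and normalize by the mean power of the first `background_windows` windows
    (taken here as the quiet background)."""
    n_win = len(series) // window
    segments = series[: n_win * window].reshape(n_win, window)
    power = np.abs(np.fft.rfft(segments, axis=1)) ** 2
    total = power[:, 1:].sum(axis=1)              # drop the DC term
    return total / total[:background_windows].mean()

rng = np.random.default_rng(0)
quiet = rng.normal(0.0, 1.0, 600)
# Hypothetical pre-seismic interval with an amplified oscillatory component.
anomalous = rng.normal(0.0, 1.0, 200) + 3.0 * np.sin(np.linspace(0, 40 * np.pi, 200))
ratios = relative_power(np.concatenate([quiet, anomalous]), window=100, background_windows=6)
print(ratios.round(2))   # the last two windows stand out against the background
```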

  2. Two Machine Learning Approaches for Short-Term Wind Speed Time-Series Prediction.

    Science.gov (United States)

    Ak, Ronay; Fink, Olga; Zio, Enrico

    2016-08-01

    The increasing liberalization of European electricity markets, the growing proportion of intermittent renewable energy being fed into the energy grids, and also new challenges in the patterns of energy consumption (such as electric mobility) require flexible and intelligent power grids capable of providing efficient, reliable, economical, and sustainable energy production and distribution. From the supplier side, particularly, the integration of renewable energy sources (e.g., wind and solar) into the grid imposes an engineering and economic challenge because of the limited ability to control and dispatch these energy sources due to their intermittent characteristics. Time-series prediction of wind speed for wind power production is a particularly important and challenging task, wherein prediction intervals (PIs) are preferable results of the prediction, rather than point estimates, because they provide information on the confidence in the prediction. In this paper, two different machine learning approaches to assess PIs of time-series predictions are considered and compared: 1) multilayer perceptron neural networks trained with a multiobjective genetic algorithm and 2) extreme learning machines combined with the nearest neighbors approach. The proposed approaches are applied for short-term wind speed prediction from a real data set of hourly wind speed measurements for the region of Regina in Saskatchewan, Canada. Both approaches demonstrate good prediction precision and provide complementary advantages with respect to different evaluation criteria.
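
A minimal sketch of the interval-producing idea (our simplification, not the authors' ELM or genetic-algorithm implementations): predict with the nearest training neighbors and take empirical quantiles of the neighborhood as a prediction interval.

```python
import numpy as np

def knn_prediction_interval(x_train, y_train, x_query, k=10, coverage=0.9):
    """Point prediction from the mean of the k nearest training targets, with a
    prediction interval from their empirical quantiles. This only captures the
    nearest-neighbors flavor of the second approach in the abstract."""
    idx = np.argsort(np.abs(x_train - x_query))[:k]
    neighbors = y_train[idx]
    lo, hi = np.quantile(neighbors, [(1 - coverage) / 2, (1 + coverage) / 2])
    return neighbors.mean(), lo, hi

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 500)
y = 5 + 2 * np.sin(x) + rng.normal(0, 0.3, 500)   # toy wind-speed relation
point, lo, hi = knn_prediction_interval(x, y, x_query=3.0)
print(round(point, 2), round(lo, 2), round(hi, 2))
```

The interval width, not just the point estimate, is what conveys confidence in the prediction, which is the abstract's motivation for preferring PIs.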

  3. Predicting short-term weight loss using four leading health behavior change theories

    Directory of Open Access Journals (Sweden)

    Barata José T

    2007-04-01

    Full Text Available Abstract Background This study was conceived to analyze how exercise and weight management psychosocial variables, derived from several health behavior change theories, predict weight change in a short-term intervention. The theories under analysis were the Social Cognitive Theory, the Transtheoretical Model, the Theory of Planned Behavior, and Self-Determination Theory. Methods Subjects were 142 overweight and obese women (BMI = 30.2 ± 3.7 kg/m2; age = 38.3 ± 5.8 y), participating in a 16-week University-based weight control program. Body weight and a comprehensive psychometric battery were assessed at baseline and at program's end. Results Weight decreased significantly (-3.6 ± 3.4%, p Conclusion The present models were able to predict 20–30% of variance in short-term weight loss and changes in weight management self-efficacy accounted for a large share of the predictive power. As expected from previous studies, exercise variables were only moderately associated with short-term outcomes; they are expected to play a larger explanatory role in longer-term results.

  4. Prediction of Sea Surface Temperature Using Long Short-Term Memory

    Science.gov (United States)

    Zhang, Qin; Wang, Hui; Dong, Junyu; Zhong, Guoqiang; Sun, Xin

    2017-10-01

    This letter adopts long short-term memory (LSTM) to predict sea surface temperature (SST); to our knowledge, this is the first attempt to use a recurrent neural network to solve the SST prediction problem and to make one-week and one-month daily predictions. We formulate SST prediction as a time series regression problem. LSTM is a special kind of recurrent neural network that introduces a gate mechanism into the vanilla RNN to prevent vanishing or exploding gradients. It has a strong ability to model the temporal relationships of time series data and can handle long-term dependencies well. The proposed network architecture is composed of two kinds of layers: an LSTM layer and a fully connected dense layer. The LSTM layer is utilized to model the time series relationship, and the fully connected layer maps the output of the LSTM layer to a final prediction. We explore the optimal settings of this architecture through experiments and report the prediction accuracy for the coastal seas of China to confirm the effectiveness of the proposed method. In addition, we also show its online updated characteristics.
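
The "time series regression" formulation can be sketched independently of the network itself: build (lookback window → value h steps ahead) pairs, which is the shape of input an LSTM layer consumes. The toy series and window sizes below are illustrative.

```python
import numpy as np

def make_supervised(series: np.ndarray, lookback: int, horizon: int):
    """Frame a temperature series as regression: each sample is `lookback` past
    values (the input sequence) and the target is the value `horizon` steps
    ahead (e.g. horizon=7 for a one-week daily prediction)."""
    X, y = [], []
    for i in range(len(series) - lookback - horizon + 1):
        X.append(series[i : i + lookback])
        y.append(series[i + lookback + horizon - 1])
    return np.array(X), np.array(y)

sst = 20.0 + 5.0 * np.sin(np.arange(400) * 2 * np.pi / 365)  # toy daily SST signal
X, y = make_supervised(sst, lookback=30, horizon=7)
print(X.shape, y.shape)  # (364, 30) (364,)
```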

  5. Ultra-Short-Term Wind Power Prediction Using a Hybrid Model

    Science.gov (United States)

    Mohammed, E.; Wang, S.; Yu, J.

    2017-05-01

    This paper aims to develop and apply a hybrid model of two data analytical methods, multiple linear regression and least squares (MLR&LS), for ultra-short-term wind power prediction (WPP), taking Northeast China electricity demand as an example. The data were obtained from historical records of wind power from an offshore region and from a wind farm of the wind power plant in the area. The WPP is achieved in two stages: first, the ratios of wind power are forecasted using the proposed hybrid method, and then these ratios are transformed to obtain the forecasted wind power values. The hybrid model combines the persistence method, MLR, and LS. The proposed method includes two prediction types, multi-point prediction and single-point prediction. WPP is tested against different models such as the autoregressive moving average (ARMA), autoregressive integrated moving average (ARIMA), and artificial neural network (ANN) models. Comparison of the results of these models confirms the validity of the proposed hybrid model in terms of error and correlation coefficient. Additionally, forecasting errors were computed and compared to improve understanding of how to depict highly variable WPP and the correlations between actual and predicted wind power.
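
The least-squares half of the MLR&LS idea reduces to estimating regression coefficients that minimize squared error. A minimal sketch on synthetic lagged ratios (the predictors, coefficients, and noise level are invented for illustration, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical predictors for the wind-power ratio: three persistence-style lags.
n = 200
lags = rng.uniform(0.2, 0.8, size=(n, 3))
true_coef = np.array([0.5, 0.3, 0.1])
ratio = lags @ true_coef + 0.05 + rng.normal(0, 0.01, n)

# Least-squares fit of the multiple linear regression: estimate coefficients
# by minimizing the sum of squared residuals.
A = np.column_stack([lags, np.ones(n)])        # add an intercept column
coef, *_ = np.linalg.lstsq(A, ratio, rcond=None)
predicted_ratio = A @ coef                     # stage 1; stage 2 would rescale
print(coef.round(3))                           # recovers ~[0.5, 0.3, 0.1, 0.05]
```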

  6. Standardizing the performance evaluation of short-term wind prediction models

    DEFF Research Database (Denmark)

    Madsen, Henrik; Pinson, Pierre; Kariniotakis, G.

    2005-01-01

    evaluation of model performance. This paper proposes a standardized protocol for the evaluation of short-term wind-power prediction systems. A number of reference prediction models are also described, and their use for performance comparison is analysed. The use of the protocol is demonstrated using results...... from both on-shore and off-shore wind farms. The work was developed in the frame of the Anemos project (EU R&D project) where the protocol has been used to evaluate more than 10 prediction systems....
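
Protocols of this kind typically score a candidate model against a reference such as persistence. A hedged sketch of that comparison (the actual Anemos protocol defines its own error measures; the series and models below are synthetic):

```python
import numpy as np

def rmse(pred: np.ndarray, actual: np.ndarray) -> float:
    return float(np.sqrt(np.mean((pred - actual) ** 2)))

rng = np.random.default_rng(2)
power = np.cumsum(rng.normal(0, 1, 500))               # toy wind-power series

actual = power[1:]
persistence = power[:-1]                               # reference: "no change"
model = actual + 0.5 * rng.normal(0, 1, len(actual))   # hypothetical forecaster

# Skill score against the persistence reference: positive values mean the
# candidate beats the reference, which is the comparison such protocols formalize.
skill = 1.0 - rmse(model, actual) / rmse(persistence, actual)
print(round(skill, 3))
```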

  7. A hybrid PSO-ANFIS approach for short-term wind power prediction in Portugal

    Energy Technology Data Exchange (ETDEWEB)

    Pousinho, H.M.I. [Department of Electromechanical Engineering, University of Beira Interior, R. Fonte do Lameiro, 6201-001 Covilha (Portugal); Mendes, V.M.F. [Department of Electrical Engineering and Automation, Instituto Superior de Engenharia de Lisboa, R. Conselheiro Emidio Navarro, 1950-062 Lisbon (Portugal); Catalao, J.P.S. [Department of Electromechanical Engineering, University of Beira Interior, R. Fonte do Lameiro, 6201-001 Covilha (Portugal); Center for Innovation in Electrical and Energy Engineering, Instituto Superior Tecnico, Technical University of Lisbon, Av. Rovisco Pais, 1049-001 Lisbon (Portugal)

    2011-01-15

    The increased integration of wind power into the electric grid, as nowadays occurs in Portugal, poses new challenges due to its intermittency and volatility. Wind power prediction plays a key role in tackling these challenges. The contribution of this paper is to propose a new hybrid approach, combining particle swarm optimization and adaptive-network-based fuzzy inference system, for short-term wind power prediction in Portugal. Significant improvements regarding forecasting accuracy are attainable using the proposed approach, in comparison with the results obtained with five other approaches. (author)

  8. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
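
The population-proportional rate argument can be reproduced approximately with a back-of-the-envelope sketch. The population endpoints are rounded public estimates and the linear interpolation is our simplification, which is why the result (about 8.4) differs slightly from the paper's 8.7±3.3.

```python
import numpy as np

# Rounded world-population estimates (assumptions, not from the paper).
pop_20 = np.interp(np.linspace(1900, 2000, 101), [1900, 2000], [1.65e9, 6.1e9])
pop_21 = np.interp(np.linspace(2000, 2100, 101), [2000, 2100], [6.1e9, 10.1e9])

observed_20 = 4   # earthquakes with >100,000 deaths observed in the 20th century

# Rate proportional to population: calibrate the constant on the 20th century,
# then apply it to the 21st-century population path.
rate_const = observed_20 / (pop_20.mean() * 100.0)     # events per person-year
expected_21 = rate_const * pop_21.mean() * 100.0
print(round(expected_21, 1))
```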

  9. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth, compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model (GEM), launched in 2009. At the very least, everyone should be able to learn what his or her risk is; at the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  10. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    Science.gov (United States)

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.
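
The Mohr-Coulomb failure stress change mentioned in the abstract combines shear and normal stress changes resolved on the receiver fault. A minimal sketch, with hypothetical values of the small order of magnitude the abstract implies:

```python
def coulomb_stress_change(d_shear_mpa: float, d_normal_mpa: float, mu_eff: float = 0.4) -> float:
    """Change in Mohr-Coulomb failure stress on a receiver fault:
        dCFS = d_tau + mu' * d_sigma_n   (unclamping, i.e. tension, positive).
    Positive dCFS moves the fault toward failure. mu_eff = 0.4 is a commonly
    assumed effective friction coefficient, not a value from this paper."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# Hypothetical SSE-induced changes of a few hundredths of an MPa: small
# compared with typical earthquake stress drops of order 1-10 MPa.
print(coulomb_stress_change(0.02, 0.01))
```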

  11. Prospective and retrospective evaluation of five-year earthquake forecast models for California

    Science.gov (United States)

    Strader, Anne; Schneider, Max; Schorlemmer, Danijel

    2017-10-01

    The Collaboratory for the Study of Earthquake Predictability was developed to prospectively test earthquake forecasts through reproducible and transparent experiments within a controlled environment. From January 2006 to December 2010, the Regional Earthquake Likelihood Models (RELM) Working Group developed and evaluated thirteen time-invariant prospective earthquake mainshock forecasts. The number, spatial and magnitude components of the forecasts were compared to the observed seismicity distribution using a set of likelihood-based consistency tests. In this RELM experiment update, we assess the long-term forecasting potential of the RELM forecasts. Additionally, we evaluate RELM forecast performance against the Uniform California Earthquake Rupture Forecast (UCERF2) and the National Seismic Hazard Mapping Project (NSHMP) forecasts, which are used for seismic hazard analysis for California. To test each forecast's long-term stability, we also evaluate each forecast from January 2006 to December 2015, which contains both five-year testing periods, and the 40-year period from January 1967 to December 2006. Multiple RELM forecasts, which passed the N-test during the retrospective (January 2006 to December 2010) period, overestimate the number of events from January 2011 to December 2015, although their forecasted spatial distributions are consistent with observed earthquakes. Both the UCERF2 and NSHMP forecasts pass all consistency tests for the two five-year periods; however, they tend to underestimate the number of observed earthquakes over the 40-year testing period. The smoothed seismicity model Helmstetter-et-al.Mainshock outperforms both United States Geological Survey (USGS) models during the second five-year experiment, and contains higher forecasted seismicity rates than the USGS models at multiple observed earthquake locations.
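
The likelihood-based consistency tests mentioned above include a number (N) test on the forecasted event count. A sketch of its usual Poisson form (the thresholds and two-tailed convention are common practice, not necessarily the exact procedure of this study):

```python
import math

def poisson_cdf(k: int, lam: float) -> float:
    """P(N <= k) for a Poisson(lam) count, summed term by term."""
    if k < 0:
        return 0.0
    term = total = math.exp(-lam)
    for i in range(1, k + 1):
        term *= lam / i
        total += term
    return total

def n_test(n_forecast: float, n_observed: int, alpha: float = 0.05):
    """Sketch of a CSEP-style N-test: the forecast is consistent with the
    observed count when that count falls in neither Poisson tail."""
    delta1 = 1.0 - poisson_cdf(n_observed - 1, n_forecast)   # P(N >= observed)
    delta2 = poisson_cdf(n_observed, n_forecast)             # P(N <= observed)
    return delta1, delta2, delta1 > alpha / 2 and delta2 > alpha / 2

# A forecast of 10 events fails against 18 observed (underprediction, as with
# the RELM models that overestimated rates in 2011-2015 the other way around),
# but passes against 9 observed.
print(n_test(10.0, 18))
print(n_test(10.0, 9))
```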

  12. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years later, one of the most painful events etched in Egyptians' memory. This is due not to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families left homeless). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground motion shaking of an earthquake of the same size (12 October 1992) and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damage could double or triple in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk; deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  13. Robust and Adaptive Online Time Series Prediction with Long Short-Term Memory

    Directory of Open Access Journals (Sweden)

    Haimin Yang

    2017-01-01

    Full Text Available Online time series prediction is the mainstream method in a wide range of fields, ranging from speech analysis and noise cancelation to stock market analysis. However, in the real world the data often contain many outliers as the length of a time series grows. These outliers can mislead the learned model if treated as normal points in the process of prediction. To address this issue, in this paper we propose a robust and adaptive online gradient learning method, RoAdam (Robust Adam), for long short-term memory (LSTM) to predict time series with outliers. This method tunes the learning rate of the stochastic gradient algorithm adaptively in the process of prediction, which reduces the adverse effect of outliers. It tracks the relative prediction error of the loss function with a weighted average by modifying Adam, a popular stochastic gradient algorithm for training deep neural networks. In our algorithm, a large value of the relative prediction error corresponds to a small learning rate, and vice versa. Experiments on both synthetic data and real time series show that our method achieves better performance than existing LSTM-based methods.
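
The inverse relationship between relative prediction error and learning rate can be sketched as follows. This captures only the qualitative rule from the abstract, not the paper's exact RoAdam update:

```python
def adaptive_lr(base_lr: float, rel_error: float, prev_rel_error: float) -> float:
    """Sketch of the RoAdam idea: shrink the learning rate when the relative
    prediction error jumps (a likely outlier), and keep the base rate when it
    does not. The specific scaling rule here is our assumption."""
    ratio = rel_error / max(prev_rel_error, 1e-12)
    return base_lr / max(ratio, 1.0)   # error spike -> proportionally smaller step

print(adaptive_lr(0.001, rel_error=5.0, prev_rel_error=1.0))  # outlier: lr / 5
print(adaptive_lr(0.001, rel_error=0.8, prev_rel_error=1.0))  # normal: base lr
```

An outlier thus produces a small, cautious gradient step instead of a large misleading one, which is the robustness mechanism the abstract describes.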

  14. Short-Term Coalmine Gas Concentration Prediction Based on Wavelet Transform and Extreme Learning Machine

    Directory of Open Access Journals (Sweden)

    Wu Xiang

    2014-01-01

    Full Text Available It is well known that coalmine gas concentration forecasting is very significant for ensuring the safety of mining. Owing to the high-frequency, nonstationary fluctuations and chaotic properties of the gas concentration time series, a forecasting model built on the raw data alone often fails to provide satisfying forecast results. A hybrid forecasting model that integrates the wavelet transform and the extreme learning machine (ELM), termed WELM (wavelet-based ELM), is proposed for coalmine gas concentration. Firstly, the proposed model employs the Mallat algorithm to decompose and reconstruct the gas concentration time series, isolating the low-frequency and high-frequency information. Then, an ELM model is built for the prediction of each component. Finally, the predicted values of the components are superimposed to obtain the predicted values of the original sequence. This method effectively separates the feature information of the gas concentration time series and takes full advantage of multiple ELM prediction models with different parameters, in a divide-and-conquer fashion. Comparative studies with existing prediction models indicate that the proposed model is very promising and can be implemented for one-step or multistep-ahead prediction.
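
The decompose-model-superimpose pipeline can be sketched with a one-level Haar transform standing in for the Mallat decomposition, and a trivial persistence forecast standing in for the per-component ELMs (both are simplifying stand-ins, not the paper's components):

```python
import numpy as np

def haar_decompose(x: np.ndarray):
    """One-level Haar wavelet transform: split the series into a smooth
    low-frequency approximation and a high-frequency detail component."""
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)
    return approx, detail

def haar_reconstruct(approx: np.ndarray, detail: np.ndarray) -> np.ndarray:
    x = np.empty(2 * len(approx))
    x[0::2] = (approx + detail) / np.sqrt(2)
    x[1::2] = (approx - detail) / np.sqrt(2)
    return x

gas = np.sin(np.linspace(0, 8 * np.pi, 64)) + 0.1 * np.random.default_rng(3).normal(size=64)
a, d = haar_decompose(gas)
assert np.allclose(haar_reconstruct(a, d), gas)   # perfect reconstruction

# WELM in miniature: forecast each component separately (persistence here),
# then superimpose the component forecasts by inverse transform.
forecast = haar_reconstruct(np.append(a[1:], a[-1]), np.append(d[1:], d[-1]))
print(forecast.shape)
```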

  15. Web-based decision support system to predict risk level of long term rice production

    Science.gov (United States)

    Mukhlash, Imam; Maulidiyah, Ratna; Sutikno; Setiyono, Budi

    2017-09-01

    Appropriate decision making in the risk management of rice production is very important in agricultural planning, especially for Indonesia, which is an agricultural country. Good decisions can be made only when the required supporting data are available and appropriate methods are used. This study aims to develop a decision support system to predict the risk level of rice production in several districts that are centers of rice production in East Java. The decision support system is web-based so that the information can be easily accessed and understood. Its components are data management, model management, and the user interface. The research uses OLS and Copula regression models: the OLS model is used to predict rainfall, while the Copula model is used to predict harvested area. Experimental results show that the models successfully predict the harvested area of rice production in the rice-producing districts of East Java at any given time, based on the conditions and climate of a region. Furthermore, the system can predict the amount of rice production together with its level of risk, and generates long-term predictions of the production risk level for the districts, which can be used as decision support by the authorities.

  16. Assessing long-term postseismic deformation following the M7.2 4 April 2010, El Mayor-Cucapah earthquake with implications for lithospheric rheology in the Salton Trough

    Science.gov (United States)

    Spinler, Joshua C.; Bennett, Richard A.; Walls, Chris; Lawrence, Shawn; González García, J. Javier

    2015-05-01

    The 4 April 2010 Mw 7.2 El Mayor-Cucapah (EMC) earthquake provides the best opportunity to date to study the lithospheric response to a large-magnitude (>M6) earthquake in the Salton Trough region through analysis of Global Positioning System (GPS) data. In conjunction with the EarthScope Plate Boundary Observatory (PBO), we installed six new continuous GPS stations in the months following the EMC earthquake to increase station coverage in the epicentral region of northern Baja California, Mexico. We modeled the pre-EMC deformation field using available campaign and continuous GPS data for southern California and northern Baja California and inferred a pre-EMC secular rate at each new station location. Through direct comparison of the pre- and post-EMC secular rates, we calculate long-term changes associated with viscoelastic relaxation in the Salton Trough region. We fit these velocity changes using numerical models employing an elastic upper crustal layer underlain by a viscoelastic lower crustal layer and a mantle half-space. Forward models that produce the smallest weighted sum of squared residuals have an upper mantle viscosity in the range 4-6 × 10^18 Pa s and a less well-resolved lower crustal viscosity in the range 2 × 10^19 to 1 × 10^22 Pa s. A high-viscosity lower crust, despite high heat flow in the Salton Trough region, is inconsistent with felsic composition and might suggest accretion of mafic lower crust associated with crustal spreading obscured by thick sedimentary cover.
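To get a rough feel for what such viscosities imply for postseismic timescales, the Maxwell relaxation time τ = η/G can be computed for the inferred upper-mantle range; the shear modulus G = 30 GPa below is an assumed typical value, not a number from the study:

```python
G = 30e9  # Pa, assumed typical shear modulus (illustrative, not from the study)
SECONDS_PER_YEAR = 3.156e7

# Upper-mantle viscosity range inferred by the forward models (Pa s):
taus = {eta: eta / G / SECONDS_PER_YEAR for eta in (4e18, 6e18)}
for eta, tau in taus.items():
    # Maxwell time tau = eta / G, converted to years
    print(f"eta = {eta:.0e} Pa s -> Maxwell relaxation time ~ {tau:.1f} yr")
```

With these assumptions the characteristic relaxation time is on the order of several years, consistent with measuring the relaxation signal as a long-term change in secular GPS rates.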

  17. Prediction of the long-term efficacy of STA-MCA bypass by DSC-PI

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-01-01

    Superficial temporal artery-middle cerebral artery (STA-MCA) bypass [1,2] is an important and effective type of surgical revascularization that is widely used in the treatment of ischemic cerebral artery disease. However, a means of predicting its postoperative efficacy has not been established [3,4]. The present study analyzes the correlation between preoperative perfusion parameters (obtained using dynamic susceptibility contrast-enhanced perfusion imaging, DSC-PI) and long-term postoperative prognosis (modified Rankin Scale, mRS, scores). The preoperative perfusion parameters were defined by a combination of perfusion-weighted imaging and the Alberta Stroke Program Early Computerized Tomography Score (PWI-ASPECTS) and included cerebral blood flow (CBF-ASPECTS), cerebral blood volume (CBV-ASPECTS), mean transit time (MTT-ASPECTS), and time to peak (TTP-ASPECTS). Preoperative and postoperative scores were determined for 33 patients who received a unilateral STA-MCA bypass in order to identify the most reliable predictive imaging index and to define the threshold value for a favorable clinical outcome. The results showed that all of the PWI-ASPECTS scores were significantly negatively correlated with clinical prognosis. Receiver operating characteristic (ROC) analysis of the preoperative parameters in relation to long-term prognosis showed that the area under the curve (AUC) was maximal for the CBF-ASPECTS score (P = 0.002). A preoperative score of less than six indicated a poor postoperative prognosis (sensitivity = 74.1%, specificity = 100%, AUC = 0.843). In conclusion, preoperative PWI-ASPECTS scores were found to be useful predictive indexes for the long-term prognosis of STA-MCA bypass patients, with higher scores indicating better postoperative long-term outcomes.
As the most valuable prognostic indicator, the preoperative CBF-ASPECTS score has potential for use as a major index in screening and outcome prediction of patients under consideration for STA-MCA bypass.

  18. Peripheral Quantitative Computed Tomography Predicts Humeral Diaphysis Torsional Mechanical Properties With Good Short-Term Precision.

    Science.gov (United States)

    Weatherholt, Alyssa M; Avin, Keith G; Hurd, Andrea L; Cox, Jacob L; Marberry, Scott T; Santoni, Brandon G; Warden, Stuart J

    2015-01-01

    Peripheral quantitative computed tomography (pQCT) is a popular tool for noninvasively estimating bone mechanical properties. Previous studies have demonstrated that pQCT provides precise estimates that are good predictors of actual bone mechanical properties at popular distal imaging sites (tibia and radius). The predictive ability and precision of pQCT at more proximal sites remain unknown. The aim of the present study was to explore the predictive ability and short-term precision of pQCT estimates of mechanical properties of the midshaft humerus, a site gaining popularity for exploring the skeletal benefits of exercise. Predictive ability was determined ex vivo by assessing the ability of pQCT-derived estimates of torsional mechanical properties in cadaver humeri (density-weighted polar moment of inertia [I(P)] and polar strength-strain index [SSI(P)]) to predict actual torsional properties. Short-term precision was assessed in vivo by performing 6 repeat pQCT scans at the level of the midshaft humerus in 30 young, healthy individuals (degrees of freedom = 150), with repeat scans performed by the same and different testers and on the same and different days to explore the influences of different testers and time between repeat scans on precision errors. I(P) and SSI(P) both independently predicted at least 90% of the variance in ex vivo midshaft humerus mechanical properties in cadaveric bones. Overall relative precision errors (root-mean-square coefficients of variation) for the in vivo measures of I(P) and SSI(P) at the midshaft humerus were low, indicating that pQCT estimates midshaft humerus mechanical properties with good short-term precision, with measures being robust against the influences of different testers and time between repeat scans. Copyright © 2015 The International Society for Clinical Densitometry. Published by Elsevier Inc. All rights reserved.
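The precision metric quoted, a root-mean-square coefficient of variation pooled across subjects with repeat scans, can be sketched as follows (the scan values below are invented for illustration):

```python
import math
import statistics

def rms_cv_percent(repeat_scans):
    """Root-mean-square coefficient of variation (%CV) across subjects,
    each subject contributing several repeat measurements."""
    cvs_sq = []
    for scans in repeat_scans:
        m = statistics.mean(scans)
        sd = statistics.stdev(scans)
        cvs_sq.append((sd / m) ** 2)
    return 100.0 * math.sqrt(statistics.mean(cvs_sq))

# Hypothetical SSI(P) values (mm^3) for 3 subjects, 3 repeat scans each.
scans = [
    [410.0, 414.0, 408.0],
    [385.0, 381.0, 388.0],
    [432.0, 436.0, 430.0],
]
print(round(rms_cv_percent(scans), 2))  # sub-1% precision error
```

Pooling squared CVs before taking the root weights each subject equally and penalizes occasional poorly reproduced subjects, which is why it is the standard precision summary in densitometry.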

  19. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd's Register Consulting AB, Sundbyberg (Sweden)

    2013-10-15

    The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radio-nuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, i.e., the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method for including experts' beliefs in a systematic way when defining the conditional probability tables (CPTs) of the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
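The core BBN idea, updating the probability of each pre-calculated source term as plant observations arrive, can be illustrated with a deliberately tiny made-up network; the node names, CPT values, and source terms below are hypothetical, not taken from the report:

```python
# Prior probabilities over two hypothetical pre-calculated source terms.
prior = {"ST1_small_release": 0.7, "ST2_large_release": 0.3}

# Hypothetical CPT: P(containment pressure reads high | source term).
cpt_pressure_high = {"ST1_small_release": 0.2, "ST2_large_release": 0.9}

def posterior(evidence_high):
    """Bayes update of the source-term distribution given one observation."""
    unnorm = {}
    for st, p in prior.items():
        like = cpt_pressure_high[st] if evidence_high else 1 - cpt_pressure_high[st]
        unnorm[st] = p * like
    z = sum(unnorm.values())
    return {st: v / z for st, v in unnorm.items()}

# Observing high containment pressure shifts probability toward the
# large-release source term.
post = posterior(evidence_high=True)
print(post)
```

A real RASTEP network chains many such CPTs over accident-progression nodes, but every update step reduces to this prior-times-likelihood normalization.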

  20. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
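RELM-style comparisons score each gridded forecast by a joint likelihood of the observed counts, commonly assuming an independent Poisson count in each cell; a minimal sketch with invented forecasts over four cells:

```python
import math

def poisson_log_likelihood(expected, observed):
    """Joint log-likelihood of a gridded forecast, assuming an independent
    Poisson count in each cell: sum of -lambda + n*log(lambda) - log(n!)."""
    ll = 0.0
    for lam, n in zip(expected, observed):
        ll += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return ll

# Two hypothetical forecasts (expected event counts per cell) and the
# observed counts for the test period.
forecast_a = [0.5, 0.1, 1.2, 0.2]
forecast_b = [0.2, 0.2, 0.8, 0.8]
observed   = [1,   0,   2,   0]

# Forecast A places its rate where the events occurred, so it scores higher.
print(poisson_log_likelihood(forecast_a, observed) >
      poisson_log_likelihood(forecast_b, observed))  # True
```

The actual RELM evaluation adds formal likelihood tests (e.g. comparing each forecast against simulated catalogs), but the per-cell Poisson score is the common core.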

  1. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
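The short-term-average/long-term-average trigger the authors describe can be sketched on a tweet-frequency series; the window lengths, threshold, and series below are illustrative, not the tuned values from the study:

```python
def sta_lta_triggers(counts, sta_len=3, lta_len=12, threshold=4.0):
    """Flag indices where the short-term average of a per-minute tweet
    count exceeds `threshold` times the long-term average of the window
    immediately preceding it (a basic STA/LTA trigger)."""
    triggers = []
    for i in range(sta_len + lta_len - 1, len(counts)):
        sta_start = i - sta_len + 1
        sta = sum(counts[sta_start:i + 1]) / sta_len
        lta = sum(counts[sta_start - lta_len:sta_start]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Baseline chatter of ~2 "earthquake" tweets per minute, then a felt event.
series = [2, 1, 2, 3, 2, 2, 1, 2, 2, 3, 2, 2, 40, 55, 30, 8, 4, 2]
print(sta_lta_triggers(series))  # [14, 15]
```

Computing the LTA over the window that precedes the STA window keeps a sudden burst from inflating its own baseline, which is why the trigger fires cleanly on the spike and releases as the chatter decays.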

  2. Ethical aspects of a predictive test for Huntington's Disease: A long term perspective.

    Science.gov (United States)

    Andersson, Petra Lilja; Petersén, Åsa; Graff, Caroline; Edberg, Anna-Karin

    2016-08-01

    A predictive genetic test for Huntington's disease can be used before any symptoms are apparent, but there is only sparse knowledge about the long-term consequences of a positive test result. Such knowledge is important in order to gain a deeper understanding of families' experiences. The aim of the study was to describe a young couple's long-term experiences and the consequences of a predictive test for Huntington's disease. A descriptive case study design was used with a longitudinal narrative life history approach. The study was based on 18 interviews with a young couple, covering a period of 2.5 years and starting 6 months after the disclosure of the test results showing the woman to be a carrier of the gene causing Huntington's disease. Even though the study was extremely sensitive, with potential harm constantly having to be balanced against the benefits, the couple had a strong wish to contribute to increased knowledge about people in their situation. The study was approved by the ethics committee. The results show that the long-term consequences were devastating for the family. The 3-year period was characterized by anxiety, repeated suicide attempts, financial difficulties and eventually divorce. By offering a predictive test, the healthcare system takes on an ethical and moral responsibility. Once the test result is disclosed, the individual and the family cannot live without the knowledge it brings. Support is needed from a long-term perspective and should involve counselling concerning the family's everyday life, including important decision-making, reorientation towards a new outlook on the future and the meaning of life. As health professionals, our ethical and moral responsibility thus embraces not only the phase directly connected to the actual genetic test but also a commitment to provide support to help the family deal with the long-term consequences of the test. © The Author(s) 2015.

  3. Rapid prediction of long-term rates of contaminant desorption from soils and sediments.

    Science.gov (United States)

    Johnson, M D; Weber, W J

    2001-01-15

    A method using heated and superheated (subcritical) water is described for rapid prediction of long-term desorption rates from contaminated geosorbents. Rates of contaminant release are measured at temperatures between 75 and 150 degrees C using a dynamic water desorption technique. The subcritical desorption rate data are then modeled to calculate apparent activation energies, and these activation energies are used to predict desorption behaviors at any desired ambient temperature. Predictions of long-term release rates based on this methodology were found to correlate well with experimental 25 degrees C desorption data measured over periods of up to 640 days, even though the 25 degrees C desorption rates were observed to vary by up to 2 orders of magnitude for different geosorbent types and initial solid phase contaminant loading levels. Desorption profiles measured under elevated temperature and pressure conditions closely matched those at 25 degrees C and ambient pressure, but the time scales associated with the high-temperature measurements were up to 3 orders of magnitude lower. The subcritical water technique rapidly estimates rates of desorption-resistant contaminant release as well as those for more labile substances. The practical implications of the methodology are significant because desorption observed under field conditions and ambient temperatures typically proceeds over periods of months or years, while the high temperature experiments used for prediction of such field desorption phenomena can be completed within periods of only hours or days.
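The extrapolation step rests on the Arrhenius relation: rate constants measured at two elevated temperatures yield an apparent activation energy, which then predicts the rate at ambient temperature. A sketch with hypothetical rate constants (the numbers are invented, not data from the study):

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def activation_energy(k1, T1, k2, T2):
    """Apparent activation energy from rate constants at two temperatures,
    assuming the Arrhenius form k = A * exp(-Ea / (R*T))."""
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

def extrapolate_rate(k_ref, T_ref, Ea, T_target):
    """Predict the rate constant at T_target from one reference point."""
    return k_ref * math.exp(-Ea / R * (1.0 / T_target - 1.0 / T_ref))

# Hypothetical desorption rate constants (1/day) at 75 and 150 degrees C.
k75, k150 = 0.12, 4.8
Ea = activation_energy(k75, 348.15, k150, 423.15)
k25 = extrapolate_rate(k75, 348.15, Ea, 298.15)
print(f"Ea ~ {Ea / 1000:.0f} kJ/mol, predicted k(25 C) ~ {k25:.2e} per day")
```

With these invented inputs the 25 °C rate comes out roughly two orders of magnitude below the 150 °C rate, matching the abstract's observation that the high-temperature experiments compress months-to-years of ambient desorption into hours or days.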

  4. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms.

  5. Important meteorological variables for statistical long-term air quality prediction in eastern China

    Science.gov (United States)

    Zhang, Libo; Liu, Yongqiang; Zhao, Fengjun

    2017-09-01

    Weather is an important factor for air quality. While there has been increasing attention to long-term (monthly and seasonal) air pollution, such as regional hazes from land-clearing fires during El Niño, the weather-air quality relationships are much less understood at long-term than at short-term (daily and weekly) scales. This study aims to fill this gap by analyzing correlations between meteorological variables and air quality at various timescales. A regional correlation scale was defined to measure the longest time with significant correlations at a substantially large number of sites. The air quality index (API) and five meteorological variables during 2001-2012 at 40 eastern China sites were used. The results indicate that the API is correlated with precipitation negatively and with air temperature positively across eastern China, and with wind, relative humidity and air pressure with spatially varied signs. The major areas with significant correlations vary among meteorological variables. The correlations are significant not only at short-term but also at long-term scales, and the important variables differ between the two types of scales. The concurrent regional correlation scales reach the seasonal scale for air temperature and relative humidity. Precipitation, which was found to be the most important variable for short-term air quality conditions, and air pressure are not important for long-term air quality. The lagged correlations are much smaller in magnitude than the concurrent correlations, and their regional correlation scales reach long term only for wind speed and relative humidity. It is concluded that wind speed should be considered a primary predictor for statistical prediction of long-term air quality in a large region over eastern China. Relative humidity and temperature are also useful predictors, but at less significant levels.

  6. On the prediction of long term creep strength of creep resistant steels

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Mi; Wang, Qiao; Song, Xin-Li; Jia, Juan; Xiang, Zhi-Dong [Wuhan University of Science and Technology (China). The State Key Laboratory of Refractories and Metallurgy

    2016-02-15

    When the conventional power law creep equation is applied to rationalise the creep data of creep resistant steels, its parameters depend strongly on stress and temperature and hence cannot be used to predict long term creep properties. Here, it is shown that this problem can be resolved if the equation is modified to satisfy two boundary conditions, i.e. when σ (stress) = 0, ε_min (minimum creep rate) = 0, and when σ = σ_TS (tensile strength at creep temperature T), ε_min = ∞. This can be achieved by substituting the reference stress σ_0 in the conventional equation with the term (σ_TS − σ). The new power law creep equation describing the stress and temperature dependence of minimum creep rate can then be applied to predict long term creep strength from data of short term measurements. This is demonstrated using the creep and tensile strength data measured for 11Cr-2W-0.4Mo-1Cu-Nb-V steel (tube).
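The modification described can be written out as follows (a reconstruction from the abstract's description; A, n and Q_c denote the usual fitted pre-factor, stress exponent and apparent activation energy):

```latex
% Conventional power law (Norton-type) creep equation:
\dot{\varepsilon}_{\min} = A \left( \frac{\sigma}{\sigma_{0}} \right)^{n}
  \exp\!\left( -\frac{Q_{c}}{RT} \right)

% Modified form, with the reference stress \sigma_{0} replaced by
% (\sigma_{TS} - \sigma):
\dot{\varepsilon}_{\min} = A \left( \frac{\sigma}{\sigma_{TS} - \sigma} \right)^{n}
  \exp\!\left( -\frac{Q_{c}}{RT} \right)
```

The modified form satisfies both boundary conditions by construction: the rate vanishes at σ = 0 and diverges as σ approaches σ_TS, which is what decouples the fitted parameters from stress and temperature.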

  7. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  8. Mid- and long-term runoff predictions by an improved phase-space reconstruction model.

    Science.gov (United States)

    Hong, Mei; Wang, Dong; Wang, Yuankun; Zeng, Xiankui; Ge, Shanshan; Yan, Hengqian; Singh, Vijay P

    2016-07-01

    In recent years, the phase-space reconstruction method has commonly been used for mid- and long-term runoff predictions. However, the traditional phase-space reconstruction method still needs improvement. Using a genetic algorithm to improve the phase-space reconstruction, a new nonlinear model of monthly runoff is constructed. The new model does not rely heavily on embedding dimensions. Recognizing that the rainfall-runoff process is complex and affected by a number of factors, more variables (e.g. temperature and rainfall) are incorporated in the model. In order to detect the possible presence of chaos in the runoff dynamics, the chaotic characteristics of the model are also analyzed, which shows that the model can represent the nonlinear and chaotic characteristics of the runoff. The model is tested for its forecasting performance in four types of experiments using data from six hydrological stations on the Yellow River and the Yangtze River. Results show that the medium- and long-term runoff is satisfactorily forecasted at the hydrological stations. Not only is the forecasted trend accurate, but the mean absolute percentage error is also no more than 15%. Moreover, the forecast results for wet years and dry years are both good, which means that the improved model can, to some extent, overcome the traditional "wet years and dry years predictability barrier." The model's forecasts for different regions are all good, showing the universality of the approach. Compared with selected conceptual and empirical methods, the model exhibits greater reliability and stability in long-term runoff prediction. Our study provides a new way of thinking about the association between monthly runoff and other hydrological factors, as well as a new method for predicting monthly runoff. Copyright © 2015 Elsevier Inc. All rights reserved.
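The reconstruction step underlying such models is time-delay embedding: each scalar observation is expanded into a state vector of lagged values. A minimal sketch (the series, embedding dimension and delay are invented; the study tunes such choices with a genetic algorithm):

```python
def delay_embed(series, dim=3, tau=1):
    """Phase-space reconstruction by time-delay embedding: each state
    vector is (x[t], x[t - tau], ..., x[t - (dim-1)*tau])."""
    start = (dim - 1) * tau
    return [[series[t - j * tau] for j in range(dim)]
            for t in range(start, len(series))]

# Toy monthly-runoff-like series; a predictor would then be fitted on the
# reconstructed state vectors rather than the raw scalar series.
runoff = [3.1, 2.8, 2.5, 2.9, 3.6, 4.2, 3.9, 3.3]
states = delay_embed(runoff, dim=3, tau=2)
print(states[0])  # [3.6, 2.5, 3.1]
```

By Takens-style embedding arguments, a suitable (dim, tau) pair unfolds the attractor of the underlying dynamics, which is what lets a nonlinear predictor exploit the chaotic structure the abstract describes.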

  9. Bee-venom allergy in children: long-term predictive value of standardized challenge tests.

    Science.gov (United States)

    Schuetze, Georg E; Forster, Johannes; Hauk, Pia J; Friedl, Katrin; Kuehr, Joachim

    2002-02-01

    Venom immunotherapy (VIT) is able to protect insect venom-allergic patients against life-threatening sting reactions. Standardized sting challenges can be used as a diagnostic tool to check whether VIT is required. No data are available on the long-term predictive value of sting challenge tests. The purpose of this study was to investigate the long-term predictive value of sequential bee-sting challenges with respect to the ability to predict future sting reactions in bee-venom (BV) allergic children. Between 1988 and 1992, 92 BV-allergic children had been challenged with sequential bee stings at intervals of 2-6 weeks to determine the necessity of VIT. In 1996, all 92 families were followed up using standardized telephone interviews. Until the follow-up, 61 children (66.3%) had experienced at least one natural bee sting. Based on the results of the initial challenge tests, 13 of the 61 patients had been started on VIT. Two of these 13 (15.4%) developed systemic reactions to stings occurring 1 year after completion of 5 years of VIT, of which one was mild and one was severe. Among the 48 re-stung patients who were not treated with VIT, three children (6.3%) experienced mild systemic reactions, whereas 45 children reported no more than a local reaction. The long-term predictive value of sequential bee-sting challenge tests for systemic reactions in children not treated with VIT remained at a level of 93.8% (95% confidence interval: 82.8-98.7%) even over a period of more than 6 years. Based on these data, we conclude that sequential bee-sting challenges are a powerful tool to determine the necessity for VIT in BV-allergic children.
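The headline 93.8% figure is the proportion of untreated, re-stung children with no systemic reaction, which can be reproduced directly from the counts in the abstract:

```python
# Of 48 re-stung children not treated with VIT, 45 reported no more than
# a local reaction; the challenge tests' predictive value is 45/48.
no_systemic, total_untreated = 45, 48
predictive_value = no_systemic / total_untreated
print(f"{100 * predictive_value:.1f}%")  # 93.8%
```

The exact 95% confidence interval quoted in the abstract (82.8-98.7%) comes from a binomial (Clopper-Pearson-style) interval on these same counts, which requires a beta distribution rather than the simple ratio above.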

  10. Self-discrepancy: Long-term test-retest reliability and test-criterion predictive validity.

    Science.gov (United States)

    Watson, Neill; Bryan, Brandon C; Thrash, Todd M

    2016-01-01

    Long-term test-retest reliability and predictive test-criterion evidence of validity of scores on measures of the real-ideal self-discrepancy and of the real-ought self-discrepancy were tested over periods of 1 year and 3 years. A sample of 184 undergraduates completed 3 instruments, each measuring the 2 self-discrepancies, at 2 time points 1 year apart: the idiographic Self-Concept Questionnaire-Personal Constructs, the nonidiographic Self-Concept Questionnaire-Conventional Constructs, and the content-free Abstract Measures. A separate sample of 141 undergraduates completed the instruments 3 years apart. Both samples completed 3 depression instruments and 3 anxiety instruments at the second time point. Results of analyses using latent variables modeled with 3 observed variables showed substantial statistically significant test-retest reliabilities and significant test-criterion prediction of anxiety and depression on the real-ideal and real-ought discrepancy measures over both time periods. Results for the observed variables showed significant 1-year and 3-year reliabilities for scores on all self-discrepancy measures, as well as significant 1-year and 3-year predictive validity for scores on all self-discrepancy measures, except the abstract measure of real-ought discrepancy in predicting scores on all depression measures and on at least 1 anxiety measure. The findings support very strong long-term stabilities of the self-discrepancy personality constructs and their long-term associations with anxiety and depression. (c) 2016 APA, all rights reserved.

  11. Predicting fractures using bone mineral density: a prospective study of long-term care residents.

    Science.gov (United States)

    Broe, K E; Hannan, M T; Kiely, D K; Cali, C M; Cupples, L A; Kiel, D P

    2000-01-01

    Bone mineral density (BMD) has been shown to predict fracture risk in community-dwelling older persons; however, no comparable prospective study has been performed in the long-term care setting where the role of BMD testing is uncertain. To determine the ability of a single BMD measurement to predict the risk of subsequent fracture in long-term care residents, we designed a prospective study in a 725-bed long-term care facility. A total of 252 Caucasian nursing home residents (mean age 88 years, 74% women) were recruited between 1992 and 1998. BMD of the hip, radius or both sites was measured using dual-energy X-ray absorptiometry. Participants were followed through September 1999 for the occurrence of fracture. Cox proportional hazards regression models were constructed to determine the relationship between BMD and the risk of fracture controlling for potentially confounding variables. Sixty-three incident osteoporotic fractures occurred during a median follow-up time of 2.3 years. The multivariate-adjusted risk of fracture for each standard deviation decrease in BMD was 2.82 (95% CI 1.81-4.42) at the total hip, 2.79 (95% CI 1.69-4.61) at the femoral neck, 2.26 (95% CI 1.51-3.38) at the trochanter, 1.83 (95% CI 1.14-2.94) at the radial shaft and 1.84 (95% CI 1.21-2.80) at the ultradistal radius. Subjects in the lowest age-specific quartile of femoral neck BMD had over 4 times the incidence of fracture compared with those in the highest quartile. BMD at either hip or radius was a predictor of osteoporotic fracture, although in women, radial BMD did not predict fracture. Knowledge of BMD in long-term care residents provides important information on subsequent fracture risk.

  12. EEG background features that predict outcome in term neonates with hypoxic ischaemic encephalopathy: A structured review.

    Science.gov (United States)

    Awal, Md Abdul; Lai, Melissa M; Azemi, Ghasem; Boashash, Boualem; Colditz, Paul B

    2016-01-01

    Hypoxic ischaemic encephalopathy (HIE) is a significant cause of mortality and morbidity in the term infant. Electroencephalography (EEG) is a useful tool in the assessment of newborns with HIE. This systematic review of published literature identifies those background features of EEG in term neonates with HIE that best predict neurodevelopmental outcome. A literature search was conducted using the PubMed, EMBASE and CINAHL databases from January 1960 to April 2014. Studies included in the review described recorded EEG background features, neurodevelopmental outcomes at a minimum age of 12 months and were published in English. Pooled sensitivities and specificities of EEG background features were calculated and meta-analyses were performed for each background feature. Of the 860 articles generated by the initial search strategy, 52 studies were identified as potentially relevant. Twenty-one studies were excluded as they did not distinguish between different abnormal background features, leaving 31 studies from which data were extracted for the meta-analysis. The most promising neonatal EEG features are: burst suppression (sensitivity 0.87 [95% CI (0.78-0.92)]; specificity 0.82 [95% CI (0.72-0.88)]), low voltage (sensitivity 0.92 [95% CI (0.72-0.97)]; specificity 0.99 [95% CI (0.88-1.0)]), and flat trace (sensitivity 0.78 [95% CI (0.58-0.91)]; specificity 0.99 [95% CI (0.88-1.0)]). Burst suppression, low voltage and flat trace in the EEG of term neonates with HIE most accurately predict long term neurodevelopmental outcome. This structured review and meta-analysis provides quality evidence of the background EEG features that best predict neurodevelopmental outcome. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
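The sensitivity and specificity figures pooled in such a meta-analysis each come from a 2x2 outcome table per study; a sketch of the per-study computation (the counts below are invented, chosen only to land near the pooled burst-suppression estimates):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table: true/false positives
    against true/false negatives for a binary predictor of outcome."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical single-study counts for burst suppression predicting an
# adverse neurodevelopmental outcome.
sens, spec = sensitivity_specificity(tp=26, fn=4, tn=41, fp=9)
print(round(sens, 2), round(spec, 2))  # 0.87 0.82
```

A meta-analysis then combines these per-study pairs (typically with a bivariate random-effects model) rather than simply averaging them, since sensitivity and specificity trade off within each study.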

  13. Predicted body weight relationships for protective ventilation - unisex proposals from pre-term through to adult.

    Science.gov (United States)

    Martin, Dion C; Richards, Glenn N

    2017-05-23

    The lung-protective ventilation bundle has been shown to reduce mortality in adult acute respiratory distress syndrome (ARDS). This concept has expanded to other areas of acute adult ventilation and is recommended for pediatric ventilation. A component of lung-protective ventilation relies on a prediction of lean body weight from height. The predicted body weight (PBW) relationship employed in the ARDS Network trial is considered valid only for adults, with a dedicated formula required for each sex. No agreed PBW formula applies to smaller body sizes. This analysis investigated whether it might be practical to derive a unisex PBW formula spanning all body sizes, while retaining relevance to established adult protective ventilation practice. Historic population-based growth charts were adopted as a reference for lean body weight, from pre-term infant through to adult median weight. The traditional ARDSNet PBW formulae acted as the reference for prevailing protective ventilation practice. Error limits for derived PBW models were relative to these references. The ARDSNet PBW formulae typically predict weights heavier than the population median, therefore no single relationship could satisfy both references. Four alternate piecewise-linear lean body-weight predictive formulae were presented for consideration, each with different balance between the objectives. The 'PBWuf + MBW' model is proposed as an appropriate compromise between prevailing practice and simplification, while also better representing lean adult body-weight. This model applies the ARDSNet 'female' formula to both adult sexes, while providing a tight fit to median body weight at smaller statures down to pre-term. The 'PBWmf + MBW' model retains consistency with current practice over the adult range, while adding prediction for small statures.
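For reference, the traditional ARDSNet predicted-body-weight formulae discussed above are linear in height with a sex-specific intercept. A minimal sketch (heights in cm, weights in kg); the paper's piecewise unisex models spanning pre-term to adult are not reproduced here:

```python
def pbw_ardsnet(height_cm, sex):
    """ARDSNet predicted body weight (kg) from height (cm).

    Male:   50.0 + 0.91 * (height_cm - 152.4)
    Female: 45.5 + 0.91 * (height_cm - 152.4)
    """
    base = 50.0 if sex == "male" else 45.5
    return base + 0.91 * (height_cm - 152.4)

# The 'PBWuf' idea applies the female formula to both adult sexes:
def pbw_unisex(height_cm):
    return pbw_ardsnet(height_cm, "female")
```

Lung-protective tidal volumes are then prescribed per kilogram of predicted, not actual, body weight (e.g. 6 mL/kg PBW).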

  14. Early Posttransplant Tryptophan Metabolism Predicts Long-term Outcome of Human Kidney Transplantation.

    Science.gov (United States)

    Vavrincova-Yaghi, Diana; Seelen, Marc A; Kema, Ido P; Deelman, Leo E; van der Heuvel, Marius C; Breukelman, Henk; Van den Eynde, Benoit J; Henning, Rob H; van Goor, Harry; Sandovici, Maria

    2015-08-01

    Chronic transplant dysfunction (CTD) is the leading cause of long-term loss of the renal allograft. So far, no single test is available to reliably predict the risk for CTD. Monitoring of tryptophan (trp) metabolism through indoleamine 2,3-dioxygenase (IDO) has been previously proposed to predict acute rejection of human kidney transplants. Here, we investigate the potential of IDO/trp degradation along the kynurenine (kyn) pathway to predict the long-term outcome of human kidney transplantation. During the 2-year follow-up, blood, urine, and kidney biopsies were collected from 48 renal transplant patients. Concentrations of kyn and trp in serum and urine were measured at 2 weeks, 6 months, and 2 years after transplantation. The kynurenine-to-tryptophan ratio was calculated as an estimate of trp degradation. To evaluate histological changes and IDO expression, periodic acid-Schiff staining and immunohistochemistry for IDO, respectively, were performed on biopsies taken at 6 months and 2 years. Two years after transplantation, kyn/trp was increased in urine and decreased in serum as compared to 2-week values. In 2-year biopsies, IDO expression was mainly found in infiltrating inflammatory cells and in the glomeruli. The urine level of trp 2 weeks after transplantation predicted the serum creatinine 6 months and the estimated creatinine clearance 2 years after transplantation. Additionally, serum level of kyn 6 months after transplantation predicted the serum creatinine 2 years after transplantation. Early serum and urine levels of trp and kyn may offer a novel route for early detection of patients at risk for developing CTD.
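The kyn/trp ratio used above as a proxy for IDO activity is a direct quotient of the two concentrations. A minimal sketch with hypothetical values; both inputs must share units:

```python
def kyn_trp_ratio(kyn, trp):
    """Kynurenine/tryptophan ratio as an estimate of trp degradation.

    kyn and trp must share units (e.g. umol/L); the ratio is often
    reported multiplied by 1000 in the clinical literature.
    """
    if trp <= 0:
        raise ValueError("tryptophan concentration must be positive")
    return kyn / trp

# Hypothetical serial samples: a rising ratio implies more trp degradation.
baseline = kyn_trp_ratio(2.0, 60.0)
follow_up = kyn_trp_ratio(3.0, 50.0)
```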

  15. Quantitative MRI predicts long-term structural and functional outcome after experimental traumatic brain injury.

    Science.gov (United States)

    Immonen, Riikka J; Kharatishvili, Irina; Gröhn, Heidi; Pitkänen, Asla; Gröhn, Olli H J

    2009-03-01

    In traumatic brain injury (TBI), the initial impact causes immediate damage and also launches a cascade of slowly progressive secondary damage. The resulting chronic disabilities vary greatly and can emerge several years later. The aim of this study was to find predictive factors for the long-term outcome using multiparametric, non-invasive magnetic resonance imaging (MRI) methodology and a clinically relevant rat model of fluid-percussion-induced TBI. Our results demonstrated that multiparametric quantitative MRI (T2, T1rho, the trace of the diffusion tensor Dav, the extent of the hyperintense lesion, and intracerebral hemorrhage) acquired during the acute and subacute phases 3 h, 3 days, 9 days and 23 days post-injury has the potential to predict the functional and histopathological outcome 6 to 12 months later. The acute Dav changes in the ipsilateral hippocampus correlated with the chronic spatial learning and memory impairment evaluated using the Morris water maze, and these early quantitative MRI parameters help to predict the long-term outcome after experimental TBI.

  16. Predicting long-term risk for relationship dissolution using nonparametric conditional survival trees.

    Science.gov (United States)

    Kliem, Sören; Weusthoff, Sarah; Hahlweg, Kurt; Baucom, Katherine J W; Baucom, Brian R

    2015-12-01

    Identifying risk factors for divorce or separation is an important step in the prevention of negative individual outcomes and societal costs associated with relationship dissolution. Programs that aim to prevent relationship distress and dissolution typically focus on changing processes that occur during couple conflict, although the predictive ability of conflict-specific variables has not been examined in the context of other factors related to relationship dissolution. The authors examine whether emotional responding and communication during couple conflict predict relationship dissolution after controlling for overall relationship quality and individual well-being. Using nonparametric conditional survival trees, the study at hand simultaneously examined the predictive abilities of physiological (systolic and diastolic blood pressure, heart rate, cortisol) and behavioral (fundamental frequency; f0) indices of emotional responding, as well as observationally coded positive and negative communication behavior, on long-term relationship stability after controlling for relationship satisfaction and symptoms of depression. One hundred thirty-six spouses were assessed after participating in a randomized clinical trial of a relationship distress prevention program as well as 11 years thereafter; 32.5% of the couples' relationships had dissolved by follow-up. For men, the only significant predictor of relationship dissolution was cortisol change score (p = .012). For women, only f0 range was a significant predictor of relationship dissolution (p = .034). These findings highlight the importance of emotional responding during couple conflict for long-term relationship stability. (c) 2015 APA, all rights reserved.

  17. c-Fos expression predicts long-term social memory retrieval in mice.

    Science.gov (United States)

    Lüscher Dias, Thomaz; Fernandes Golino, Hudson; Moura de Oliveira, Vinícius Elias; Dutra Moraes, Márcio Flávio; Schenatto Pereira, Grace

    2016-10-15

    The way the rodent brain generally processes socially relevant information is rather well understood. How social information is stored into long-term social memory, however, is still under debate. Here, brain c-Fos expression was measured after adult mice were exposed to familiar or novel juveniles and expression was compared in several memory and socially relevant brain areas. The machine-learning algorithm Random Forest was then used to predict the social interaction category of adult mice based on c-Fos expression in these areas. Interaction with a familiar conspecific altered brain activation in the olfactory bulb, amygdala, hippocampus, lateral septum and medial prefrontal cortex. Remarkably, Random Forest was able to predict interaction with a familiar juvenile with 100% accuracy. Activity in the olfactory bulb, amygdala, hippocampus and the medial prefrontal cortex was crucial to this prediction. From our results, we suggest that long-term social memory depends on initial social olfactory processing in the medial amygdala and its output connections, acting synergistically with non-social contextual integration by the hippocampus and with top-down modulation of primary olfactory structures by the medial prefrontal cortex. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Predicting outcome in term neonates with hypoxic-ischaemic encephalopathy using simplified MR criteria

    Energy Technology Data Exchange (ETDEWEB)

    Jyoti, Rajeev; O'Neil, Ross [Canberra Hospital, Medical Imaging, Canberra, ACT (Australia)

    2006-01-01

    MRI is an established investigation in the evaluation of neonates with suspected hypoxic-ischaemic encephalopathy (HIE). However, its role as a predictor of neurodevelopmental outcome remains complex. To establish reproducible simplified MR criteria and evaluate their role in predicting neurodevelopmental outcome in term neonates with HIE. Term neonates with suspected HIE had MRI at 7-10 days of age. MR scans were interpreted according to new simplified criteria by two radiologists blinded to the clinical course and outcome. The new simplified criteria allocated grade 1 to cases with no central and less than 10% peripheral change, grade 2 to those with less than 30% central and/or 10-30% peripheral area change, and grade 3 to those with more than 30% central or peripheral change. MRI changes were compared with clinical neurodevelopmental outcome evaluated prospectively at 1 year of age. Neurodevelopmental outcome was based upon the DQ score (revised Griffith's) and cerebral palsy on neurological assessment. Of 20 subjects, all those showing severe (grade 3) MR changes (35%) died or had poor neurodevelopmental outcome. Subjects with a normal MR scan or with scans showing only mild (grade 1) MR changes (55%) had normal outcomes. One subject showing moderate (grade 2) changes on MRI had a moderate outcome (5%), while another had an atypical pattern of MR changes with a normal outcome (5%). Assessment of full-term neonates with suspected HIE using the simplified MR criteria is highly predictive of neurodevelopmental outcome. (orig.)

  19. Short term, high fat-feeding induced changes in white adipose tissue gene expression are highly predictive for long term changes

    NARCIS (Netherlands)

    Voigt, A.; Agnew, K.; Schothorst, van E.M.; Keijer, J.; Klaus, S.

    2013-01-01

    Scope - We aimed to evaluate the predictability of short-term (5 days) changes in epididymal white adipose tissue (eWAT) gene expression for long-term (12 weeks) changes induced by high-fat diet (HFD) feeding. Methods and results - Mice were fed semisynthetic diets containing 10 (low-fat diet) or 40

  20. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    Science.gov (United States)

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  1. Hazard Assessment and Early Warning of Tsunamis: Lessons from the 2011 Tohoku earthquake

    Science.gov (United States)

    Satake, K.

    2012-12-01

    The March 11, 2011 Tohoku earthquake (M 9.0) was the largest earthquake in Japanese history, and one of the best recorded subduction-zone earthquakes in the world. In particular, various offshore geophysical observations revealed large horizontal and vertical seafloor movements, and the tsunami was recorded on high-quality, high-sampling gauges. Analysis of such tsunami waveforms shows a temporal and spatial slip distribution during the 2011 Tohoku earthquake. The fault rupture started near the hypocenter and propagated into both deep and shallow parts of the plate interface. A very large slip of ~25 m off Miyagi on the deep part of the plate interface corresponds to an interplate earthquake of M 8.8, similar in location and size to the 869 Jogan earthquake model, and was responsible for the large tsunami inundation in the Sendai and Ishinomaki plains. A huge slip of more than 50 m occurred on the shallow part near the trench axis ~3 min after the earthquake origin time. This delayed shallow rupture (M 8.8) was similar to the 1896 "tsunami earthquake," and was responsible for the large tsunami on the northern Sanriku coast, measured ~100 km north of the largest slip. Thus the Tohoku earthquake can be decomposed into an interplate earthquake and a triggered "tsunami earthquake." The Japan Meteorological Agency issued a tsunami warning 3 minutes after the earthquake and saved many lives. However, its initial estimates of tsunami height were too low, because the earthquake magnitude was initially estimated as M 7.9, hence the computed tsunami heights were lower. The JMA is attempting to improve the tsunami warning system, including technical developments to estimate the earthquake size within a few minutes by using various and redundant information, to deploy and utilize offshore tsunami observations, and to issue a warning based on the worst-case scenario if the possibility of a giant earthquake exists. Predicting the triggering of another large earthquake would still be a challenge.

  2. Granular Model of Long-Term Prediction for Energy System in Steel Industry.

    Science.gov (United States)

    Zhao, Jun; Han, Zhongyang; Pedrycz, Witold; Wang, Wei

    2016-02-01

    Sound energy scheduling and allocation is of paramount significance for the current steel industry, and the quantitative prediction of energy media is regarded as the prerequisite for such challenging tasks. In this paper, a long-term prediction for the energy flows is proposed by using a granular computing-based method that considers industrial-driven semantics and granulates the initial data based on the specificity of manufacturing processes. When forming information granules on a basis of experimental data, we propose to deal with the unequal-length temporal granules by exploiting dynamic time warping, which becomes instrumental to the realization of the prediction model. The model engages the fuzzy C-means clustering method. To quantify the performance of the proposed method, real-world industrial energy data coming from a steel plant in China are employed. The experimental results demonstrate that the proposed method is superior to some other data-driven methods and becomes capable of satisfying the requirements of the practically viable prediction.
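Dynamic time warping, used above to compare unequal-length temporal granules, aligns two sequences by minimizing the cumulative pointwise cost over all monotone alignments. A minimal pure-Python sketch (the paper's granular model adds fuzzy C-means clustering on top of such a distance):

```python
def dtw(a, b):
    """Dynamic time warping distance between two sequences of floats.

    Classic O(len(a) * len(b)) dynamic program; no warping-window
    constraint is applied in this sketch.
    """
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of: insertion, deletion, or match.
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]
```

Because the alignment may stretch one series against the other, two granules of different lengths but similar shape get a small distance.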

  3. Three Revised Kalman Filtering Models for Short-Term Rail Transit Passenger Flow Prediction

    Directory of Open Access Journals (Sweden)

    Pengpeng Jiao

    2016-01-01

    Short-term prediction of passenger flow is very important for the operation and management of a rail transit system. Based on the traditional Kalman filtering method, this paper puts forward three revised models for real-time passenger flow forecasting. First, the paper introduces the historical prediction error into the measurement equation and formulates a revised Kalman filtering model based on error correction coefficient (KF-ECC). Second, this paper employs the deviation between real-time passenger flow and corresponding historical data as state variable and presents a revised Kalman filtering model based on historical deviation (KF-HD). Third, the paper integrates nonparametric regression forecast into the traditional Kalman filtering method using a Bayesian combined technique and puts forward a revised Kalman filtering model based on Bayesian combination and nonparametric regression (KF-BCNR). A case study is implemented using statistical passenger flow data of rail transit line 13 in Beijing during a one-month period. The reported prediction results show that KF-ECC improves the applicability to historical trend, KF-HD achieves excellent accuracy and stability, and KF-BCNR yields the best performances. Comparisons among different periods further indicate that results during peak periods outperform those during nonpeak periods. All three revised models are accurate and stable enough for on-line predictions, especially during the peak periods.
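The Kalman-filter core shared by all three revised models can be illustrated with a minimal scalar filter under a random-walk state model; the paper's refinements (error correction coefficients, historical deviations, Bayesian combination) would sit on top of a sketch like this:

```python
class ScalarKalman:
    """Minimal scalar Kalman filter with a random-walk state model.

    x0: initial state estimate; p0: initial estimate variance;
    q: process-noise variance; r: measurement-noise variance.
    """

    def __init__(self, x0, p0, q, r):
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def step(self, z):
        # Predict: x_k = x_{k-1} + w, so only the variance grows.
        self.p += self.q
        # Update: blend prediction and measurement z by the Kalman gain.
        k = self.p / (self.p + self.r)
        self.x += k * (z - self.x)
        self.p *= (1.0 - k)
        return self.x

# Hypothetical usage: smooth a noisy passenger-count stream.
f = ScalarKalman(x0=0.0, p0=1.0, q=0.01, r=1.0)
for z in (9.0, 11.0, 10.0):
    estimate = f.step(z)
```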

  4. Using Long-Short-Term-Memory Recurrent Neural Networks to Predict Aviation Engine Vibrations

    Science.gov (United States)

    ElSaid, AbdElRahman Ahmed

    This thesis examines building viable Recurrent Neural Networks (RNN) using Long Short Term Memory (LSTM) neurons to predict aircraft engine vibrations. The different networks are trained on a large database of flight data records obtained from an airline containing flights that suffered from excessive vibration. RNNs can provide a more generalizable and robust method for prediction over analytical calculations of engine vibration, as analytical calculations must be solved iteratively based on specific empirical engine parameters, and this database contains multiple types of engines. Further, LSTM RNNs provide a "memory" of the contribution of previous time series data which can further improve predictions of future vibration values. LSTM RNNs were used over traditional RNNs, which suffer from vanishing/exploding gradients when trained with backpropagation. The study managed to predict vibration values for 1, 5, 10, and 20 seconds in the future, with 2.84%, 3.3%, 5.51%, and 10.19% mean absolute error, respectively. These neural networks provide a promising means for the future development of warning systems so that suitable actions can be taken before the occurrence of excess vibration to avoid unfavorable situations during flight.
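An LSTM neuron maintains a cell state regulated by forget, input, and output gates, which is what gives it the "memory" of earlier time-series samples mentioned above. A minimal scalar-cell sketch (illustrative only; the thesis's networks are full multi-unit LSTM layers trained on flight data):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell(x, h_prev, c_prev, w):
    """One step of a scalar LSTM cell (1-D input, 1-D state).

    w maps gate name ('f', 'i', 'g', 'o') -> (w_x, w_h, bias).
    Returns the new hidden state h and cell state c.
    """
    def gate(name, act):
        wx, wh, b = w[name]
        return act(wx * x + wh * h_prev + b)

    f = gate("f", sigmoid)    # forget gate: how much old cell state to keep
    i = gate("i", sigmoid)    # input gate: how much new candidate to write
    g = gate("g", math.tanh)  # candidate cell state
    o = gate("o", sigmoid)    # output gate: how much cell state to expose
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c
```

The additive update `c = f * c_prev + i * g` is what lets gradients flow across many time steps without vanishing.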

  5. Improving protein disorder prediction by deep bidirectional long short-term memory recurrent neural networks.

    Science.gov (United States)

    Hanson, Jack; Yang, Yuedong; Paliwal, Kuldip; Zhou, Yaoqi

    2017-03-01

    Capturing long-range interactions between structural but not sequence neighbors of proteins is a long-standing challenging problem in bioinformatics. Recently, long short-term memory (LSTM) networks have significantly improved the accuracy of speech and image classification problems by remembering useful past information in long sequential events. Here, we have implemented deep bidirectional LSTM recurrent neural networks in the problem of protein intrinsic disorder prediction. The new method, named SPOT-Disorder, has steadily improved over a similar method using a traditional, window-based neural network (SPINE-D) in all datasets tested without separate training on short and long disordered regions. Independent tests on four other datasets including the datasets from critical assessment of structure prediction (CASP) techniques and >10 000 annotated proteins from MobiDB, confirmed SPOT-Disorder as one of the best methods in disorder prediction. Moreover, initial studies indicate that the method is more accurate in predicting functional sites in disordered regions. These results highlight the usefulness combining LSTM with deep bidirectional recurrent neural networks in capturing non-local, long-range interactions for bioinformatics applications. SPOT-disorder is available as a web server and as a standalone program at: http://sparks-lab.org/server/SPOT-disorder/index.php . j.hanson@griffith.edu.au or yuedong.yang@griffith.edu.au or yaoqi.zhou@griffith.edu.au. Supplementary data is available at Bioinformatics online.

  6. Adaptive ultra-short-term wind power prediction based on risk assessment

    DEFF Research Database (Denmark)

    Xue, Yusheng; Yu, Chen; Li, Kang

    2016-01-01

    A risk assessment based adaptive ultra-short-term wind power prediction (USTWPP) method is proposed in this paper. The method first extracts features from the historical data, and splits every wind power time series (WPTS) into several subsets defined by their stationary patterns. A WPTS that does not match any of the stationary patterns is then included in a subset of non-stationary patterns. Every WPTS subset is then related to a USTWPP model that is specially selected and optimized offline based on the proposed risk assessment index. For online applications, the pattern of the last short WPTS is first recognized, and the relevant prediction model is applied for USTWPP. Experimental results confirm the efficacy of the proposed method.

  7. A Short-Term Photovoltaic Power Prediction Model Based on an FOS-ELM Algorithm

    Directory of Open Access Journals (Sweden)

    Jidong Wang

    2017-04-01

    With the increasing proportion of photovoltaic (PV) power in power systems, the problem of its fluctuation and intermittency has become more prominent. To reduce the negative influence of the use of PV power, we propose a short-term PV power prediction model based on the online sequential extreme learning machine with forgetting mechanism (FOS-ELM), which can constantly replace outdated data with new data. We use historical weather data and historical PV power data to predict the PV power in the next period of time. The simulation result shows that this model has the advantages of a short training time and high accuracy. This model can help the power dispatch department schedule generation plans as well as support spatial and temporal compensation and coordinated power control, which is important for the security and stability as well as the optimal operation of power systems.

  8. Short- versus long-term prediction of dementia among subjects with low and high educational levels.

    Science.gov (United States)

    Chary, Emilie; Amieva, Hélène; Pérès, Karine; Orgogozo, Jean-Marc; Dartigues, Jean-François; Jacqmin-Gadda, Hélène

    2013-09-01

    Using simple measures of cognition and disability in a prospective community-living cohort of normal elderly persons, the main objectives of our study were to distinguish short- and long-term predictors for dementia according to educational level and to propose a tool for early detection of subjects at high risk of dementia. Data derived from the French cohort study Paquid (Personnes Agées QUID), which included 3777 subjects, older than 65 years of age, who were followed for a 20-year period. The risk of dementia at 3 years and 10 years was estimated by logistic regression for repeated measures combining data from all the 3- and 10-year windows throughout the follow-up. Predictors included disability assessed by the number of dependent items among four instrumental activities of daily living (IADLs), four neuropsychological tests, five Mini-Mental State Examination (MMSE) subtests, and four items of subjective memory complaints. Of the 2882 included subjects, the number of IADLs remained a predictor of short- and long-term conversion to dementia for those with low educational level (combined with only one cognitive test) whereas the best predictors for more educated subjects combined subjective memory complaints and memory and executive function tests. The episodic memory subtest was the only predictive MMSE subtest. In the high-education-level group, the areas under the receiver-operating characteristic curve of the selected models were 0.85 for 3-year prediction and 0.78 for 10-year prediction. Early predictors of dementia are different according to educational level. Among subjects reaching the secondary school level, early detection of those at high risk of dementia is possible with good predictive performance, with a few simple objective and subjective cognitive evaluations. Copyright © 2013 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
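The areas under the receiver-operating characteristic curve reported above can be computed non-parametrically as the probability that a randomly chosen case scores higher than a randomly chosen non-case (the Mann-Whitney statistic). A minimal sketch with hypothetical labels and risk scores:

```python
def auc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) statistic.

    labels: 1 for cases (e.g. converted to dementia), 0 for non-cases.
    scores: predicted risk for each subject, same order as labels.
    """
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0     # case ranked above non-case
            elif p == n:
                wins += 0.5     # ties count half
    return wins / (len(pos) * len(neg))
```

An AUC of 0.85, as in the 3-year high-education model, means a case outranks a non-case 85% of the time.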

  9. From Points to Forecasts: Predicting Invasive Species Habitat Suitability in the Near Term

    Directory of Open Access Journals (Sweden)

    Tracy R. Holcombe

    2010-05-01

    We used near-term climate scenarios for the continental United States to model 12 invasive plant species. We created three potential habitat suitability models for each species using maximum entropy modeling: (1) current; (2) 2020; and (3) 2035. Area under the curve values for the models ranged from 0.92 to 0.70, with 10 of the 12 being above 0.83, suggesting strong and predictable species-environment matching. Change in area between the current potential habitat and 2035 ranged from a potential habitat loss of about 217,000 km2 to a potential habitat gain of about 133,000 km2.

  10. Prediction of duration of active labor in nulliparous women at term.

    Science.gov (United States)

    Incerti, Maddalena; Locatelli, Anna; Ghidini, Alessandro; Ciriello, Elena; Malberti, Silvia; Consonni, Sara; Pezzullo, John C

    2008-02-01

    We have assessed the independent predictors of duration of active labor in nulliparous women at term. Using a cohort of 1067 nulliparae in spontaneous labor at > 37.0 weeks with singleton fetuses in vertex presentation, multivariate analysis was used to identify independent predictors of duration of active labor. Duration of active labor was 4.1 +/- 2.4 hours. Stepwise linear regression selected 10 independent predictors of duration of active labor, including gestational age at delivery, amniotomy, and oxytocin augmentation. Ten variables are independent predictors of duration of active labor; when incorporated in a prediction formula they account for > 50% of the variability of duration of labor in nulliparous women.

  11. Predicting the short-term risk of diabetes in HIV-positive patients

    DEFF Research Database (Denmark)

    Petoumenos, Kathy; Worm, Signe Westring; Fontas, Eric

    2012-01-01

    HIV-positive patients receiving combination antiretroviral therapy (cART) frequently experience metabolic complications such as dyslipidemia and insulin resistance, as well as lipodystrophy, increasing the risk of cardiovascular disease (CVD) and diabetes mellitus (DM). Rates of DM and other glucose-associated disorders among HIV-positive patients have been reported to range between 2 and 14%, and in an ageing HIV-positive population, the prevalence of DM is expected to continue to increase. This study aims to develop a model to predict the short-term (six-month) risk of DM in HIV-positive patients.

  12. Predicting and mitigating future biodiversity loss using long-term ecological proxies

    Science.gov (United States)

    Fordham, Damien A.; Akçakaya, H. Resit; Alroy, John; Saltré, Frédérik; Wigley, Tom M. L.; Brook, Barry W.

    2016-10-01

    Uses of long-term ecological proxies in strategies for mitigating future biodiversity loss are too limited in scope. Recent advances in geochronological dating, palaeoclimate reconstructions and molecular techniques for inferring population dynamics offer exciting new prospects for using retrospective knowledge to better forecast and manage ecological outcomes in the face of global change. Opportunities include using fossils, genes and computational models to identify ecological traits that caused species to be differentially prone to regional and range-wide extinction, test if threatened-species assessment approaches work and locate habitats that support stable ecosystems in the face of shifting climates. These long-term retrospective analyses will improve efforts to predict the likely effects of future climate and other environmental change on biodiversity, and target conservation management resources most effectively.

  13. Worldwide impact of aerosol's time scale on the predicted long-term concentrating solar power potential.

    Science.gov (United States)

    Ruiz-Arias, Jose A; Gueymard, Christian A; Santos-Alamillos, Francisco J; Pozo-Vázquez, David

    2016-08-10

    Concentrating solar technologies, which are fuelled by the direct normal component of solar irradiance (DNI), are among the most promising solar technologies. Currently, the state-of-the-art methods for DNI evaluation use datasets of aerosol optical depth (AOD) with only coarse (typically monthly) temporal resolution. Using daily AOD data from both site-specific observations at ground stations as well as gridded model estimates, a methodology is developed to evaluate how the calculated long-term DNI resource is affected by using AOD data averaged over periods from 1 to 30 days. It is demonstrated here that the use of monthly representations of AOD leads to systematic underestimations of the predicted long-term DNI up to 10% in some areas with high solar resource, which may result in detrimental consequences for the bankability of concentrating solar power projects. Recommendations for the use of either daily or monthly AOD data are provided on a geographical basis.
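The systematic underestimation from coarse AOD follows from the nonlinear (roughly exponential) dependence of DNI on AOD: by Jensen's inequality, attenuating once with the period-mean AOD is not the same as averaging the daily attenuated values. A toy Beer-Lambert-style sketch (the constants and AOD values are illustrative, not the paper's radiative model):

```python
import math

def dni_proxy(aod, airmass=1.5):
    """Toy Beer-Lambert-style attenuation: DNI scales with exp(-airmass * AOD).

    Real DNI models include Rayleigh scattering, water vapor, ozone, and
    other terms; this proxy isolates the aerosol nonlinearity only.
    """
    return math.exp(-airmass * aod)

daily_aod = [0.05, 0.05, 0.6]  # two clean days, one hazy day (hypothetical)

# Average the daily DNI values (uses full temporal resolution):
mean_from_daily = sum(dni_proxy(a) for a in daily_aod) / len(daily_aod)

# Attenuate once with the period-mean AOD (the coarse-AOD shortcut):
from_monthly_mean = dni_proxy(sum(daily_aod) / len(daily_aod))
```

Because `exp` is convex, `from_monthly_mean` comes out below `mean_from_daily`: the shortcut underestimates the long-term DNI, in the same direction as the bias reported above.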

  14. Predicting long-term disability outcomes in patients with MS treated with teriflunomide in TEMSO.

    Science.gov (United States)

    Sormani, Maria Pia; Truffinet, Philippe; Thangavelu, Karthinathan; Rufi, Pascal; Simonson, Catherine; De Stefano, Nicola

    2017-09-01

    To predict long-term disability outcomes in TEMSO core (NCT00134563) and extension (NCT00803049) studies in patients with relapsing forms of MS treated with teriflunomide. A post hoc analysis was conducted in a subgroup of patients who received teriflunomide in the core study, had MRI and clinical relapse assessments at months 12 (n = 552) and 18, and entered the extension. Patients were allocated risk scores for disability worsening (DW) after 1 year of teriflunomide treatment: 0 = low risk; 1 = intermediate risk; and 2-3 = high risk, based on the occurrence of relapses (0 to ≥2) and/or active (new and enlarging) T2-weighted (T2w) lesions (≤3 or >3) after the 1-year MRI. Patients in the intermediate-risk group were reclassified as responders or nonresponders (low or high risk) according to relapses and T2w lesions on the 18-month MRI. Long-term risk (7 years) of DW was assessed by Kaplan-Meier survival curves. In patients with a score of 2-3, the risk of 12-week-confirmed DW over 7 years was significantly higher vs those with a score of 0 (hazard ratio [HR] = 1.96, p = 0.0044). Patients reclassified as high risk at month 18 (18.6%) had a significantly higher risk of DW vs those in the low-risk group (81.4%; HR = 1.92; p = 0.0004). Over 80% of patients receiving teriflunomide were classified as low risk (responders) and had a significantly lower risk of DW than those at increased risk (nonresponders) over 7 years of follow-up in TEMSO. Close monitoring of relapses and active T2w lesions after short-term teriflunomide treatment predicts a differential rate of subsequent DW long term. TEMSO, NCT00134563; TEMSO extension, NCT00803049.
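Kaplan-Meier survival curves like those used above for time to disability worsening step down at each event time by the fraction of still-at-risk patients who progress, with censored patients leaving the risk set without forcing a step. A minimal estimator sketch with hypothetical follow-up data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimates at each distinct event time.

    times: follow-up time per subject; events: 1 = event observed,
    0 = censored. Returns a list of (time, survival) pairs.
    """
    s = 1.0
    curve = []
    for t in sorted(set(times)):
        # d: events at time t; n: subjects still at risk just before t.
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        n = sum(1 for ti in times if ti >= t)
        if d:
            s *= 1.0 - d / n
            curve.append((t, s))
    return curve

# Hypothetical cohort: events at years 1 and 3, one patient censored at 2.
km = kaplan_meier([1, 2, 3], [1, 0, 1])
```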

  15. Density-dependent microbial turnover improves soil carbon model predictions of long-term litter manipulations

    Science.gov (United States)

    Georgiou, Katerina; Abramoff, Rose; Harte, John; Riley, William; Torn, Margaret

    2017-04-01

    Climatic, atmospheric, and land-use changes all have the potential to alter soil microbial activity via abiotic effects on soil or mediated by changes in plant inputs. Recently, many promising microbial models of soil organic carbon (SOC) decomposition have been proposed to advance understanding and prediction of climate and carbon (C) feedbacks. Most of these models, however, exhibit unrealistic oscillatory behavior and SOC insensitivity to long-term changes in C inputs. Here we diagnose the sources of instability in four models that span the range of complexity of these recent microbial models, by sequentially adding complexity to a simple model to include microbial physiology, a mineral sorption isotherm, and enzyme dynamics. We propose a formulation that introduces density-dependence of microbial turnover, which acts to limit population sizes and reduce oscillations. We compare these models to results from 24 long-term C-input field manipulations, including the Detritus Input and Removal Treatment (DIRT) experiments, to show that there are clear metrics that can be used to distinguish and validate the inherent dynamics of each model structure. We find that widely used first-order models and microbial models without density-dependence cannot readily capture the range of long-term responses observed across the DIRT experiments as a direct consequence of their model structures. The proposed formulation improves predictions of long-term C-input changes, and implies greater SOC storage associated with CO2-fertilization-driven increases in C inputs over the coming century compared to common microbial models. Finally, we discuss our findings in the context of improving microbial model behavior for inclusion in Earth System Models.
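    The structural argument (linear microbial turnover pins steady-state SOC regardless of inputs, while density-dependent turnover restores input sensitivity) can be reproduced with a two-pool toy model; all parameter values below are illustrative, not fitted to the DIRT data:

```python
def simulate(I, beta, T=2000.0, dt=0.05):
    """Minimal SOC-microbe model; C is soil carbon, B microbial biomass.
    Microbial turnover scales as B**beta (beta=1: linear; beta=2:
    density-dependent). All parameter values are illustrative."""
    Vmax, Km, eps, d = 1.0, 100.0, 0.5, 0.05
    C, B = 50.0, 10.0
    for _ in range(int(T / dt)):
        uptake = Vmax * B * C / (Km + C)
        turnover = d * B ** beta
        C += dt * (I + turnover - uptake)   # litter inputs + necromass - uptake
        B += dt * (eps * uptake - turnover)
    return C

# Doubling litter inputs: with linear turnover, steady-state C barely moves
# (the well-known input insensitivity); with density dependence it responds.
c_lin = (simulate(1.0, 1), simulate(2.0, 1))
c_dd = (simulate(1.0, 2), simulate(2.0, 2))
```

    With beta=1 the steady-state uptake rate per microbe is fixed by the turnover parameters, so doubling inputs only doubles biomass and leaves C unchanged; with beta=2 the extra mortality caps the biomass response and the carbon pool itself grows.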

  16. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake ForecastingCivil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  17. Reduced Right Ventricular Function Predicts Long-Term Cardiac Re-Hospitalization after Cardiac Surgery.

    Directory of Open Access Journals (Sweden)

    Leela K Lella

    The significance of right ventricular ejection fraction (RVEF), independent of left ventricular ejection fraction (LVEF), following isolated coronary artery bypass grafting (CABG) and valve procedures remains unknown. The aim of this study is to examine the significance of abnormal RVEF by cardiac magnetic resonance (CMR), independent of LVEF, in predicting outcomes of patients undergoing isolated CABG and valve surgery. From 2007 to 2009, 109 consecutive patients (mean age, 66 years; 38% female) were referred for pre-operative CMR. Abnormal RVEF and LVEF were defined by prespecified thresholds. Outcomes beyond 30 days included cardiac re-hospitalization, worsening congestive heart failure and mortality. Mean clinical follow-up was 14 months. Forty-eight patients had reduced RVEF (mean 25%) and 61 patients had normal RVEF (mean 50%) (p<0.001). Fifty-four patients had reduced LVEF (mean 30%) and 55 patients had normal LVEF (mean 59%) (p<0.001). Patients with reduced RVEF had a higher incidence of long-term cardiac re-hospitalization than patients with normal RVEF (31% vs. 13%, p<0.05). Abnormal RVEF was a predictor of long-term cardiac re-hospitalization (HR 3.01 [CI 1.5-7.9], p<0.03). Reduced LVEF did not influence long-term cardiac re-hospitalization. Abnormal RVEF is a stronger predictor of long-term cardiac re-hospitalization than abnormal LVEF in patients undergoing isolated CABG and valve procedures.

  18. Ability of the MACRO model to predict long-term leaching of metribuzin and diketometribuzin.

    Science.gov (United States)

    Rosenbom, Annette E; Kjaer, Jeanne; Henriksen, Trine; Ullum, Marlene; Olsen, Preben

    2009-05-01

    In a regulatory context, numerical models are increasingly employed to quantify leaching of pesticides and their metabolites. Although the ability of these models to accurately simulate leaching of pesticides has been evaluated, little is known about their ability to accurately simulate long-term leaching of metabolites. A Danish study on the dissipation and sorption of metribuzin, involving both monitoring and batch experiments, concluded that desorption and degradation of metribuzin and leaching of its primary metabolite diketometribuzin continued for 5-6 years after application, posing a risk of groundwater contamination. That study provided a unique opportunity for evaluating the ability of the numerical model MACRO to accurately simulate long-term leaching of metribuzin and diketometribuzin. When calibrated and validated with respect to water and bromide balances and applied assuming equilibrium sorption and first-order degradation kinetics as recommended in the European Union pesticide authorization procedure, MACRO was unable to accurately simulate the long-term fate of metribuzin and diketometribuzin; the concentrations in the soil were underestimated by many orders of magnitude. By introducing alternative kinetics (a two-site approach), we captured the observed leaching scenario, thus underlining the necessity of accounting for the long-term sorption and dissipation characteristics when using models to predict the risk of groundwater contamination.
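    The gap between the two degradation assumptions can be sketched numerically. The first-order model is the standard regulatory assumption; the two-site variant below is an illustrative stand-in for the paper's two-site kinetics, with invented rate constants:

```python
import numpy as np

def first_order(M0, k, t):
    # Single pool, first-order decay (the standard regulatory assumption).
    return M0 * np.exp(-k * t)

def two_site(M0, k, alpha, f_fast, t, dt=0.01):
    """Two-site sketch (illustrative parameterization): a fraction f_fast is
    immediately available and degrades first-order at rate k; the remainder
    sits on a kinetic sorption site and desorbs slowly at rate alpha."""
    fast, slow = M0 * f_fast, M0 * (1.0 - f_fast)
    for _ in range(int(t / dt)):
        desorbed = alpha * slow * dt
        slow -= desorbed
        fast += desorbed - k * fast * dt
    return fast + slow

years = 6.0  # the monitoring window reported for metribuzin
resid_fo = first_order(100.0, k=4.0, t=years)                      # rates per year
resid_ts = two_site(100.0, k=4.0, alpha=0.3, f_fast=0.5, t=years)
```

    After six years the single-pool model has driven the residue to a vanishing level, while the slowly desorbing site still holds a few percent of the applied mass, the kind of orders-of-magnitude discrepancy the study reports.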

  19. Synaptic Transmission Optimization Predicts Expression Loci of Long-Term Plasticity.

    Science.gov (United States)

    Costa, Rui Ponte; Padamsey, Zahid; D'Amour, James A; Emptage, Nigel J; Froemke, Robert C; Vogels, Tim P

    2017-09-27

    Long-term modifications of neuronal connections are critical for reliable memory storage in the brain. However, their locus of expression (pre- or postsynaptic) is highly variable. Here we introduce a theoretical framework in which long-term plasticity performs an optimization of the postsynaptic response statistics toward a given mean with minimal variance. Consequently, the state of the synapse at the time of plasticity induction determines the ratio of pre- and postsynaptic modifications. Our theory explains the experimentally observed expression loci of the hippocampal and neocortical synaptic potentiation studies we examined. Moreover, the theory predicts presynaptic expression of long-term depression, consistent with experimental observations. At inhibitory synapses, the theory suggests a statistically efficient excitatory-inhibitory balance in which changes in inhibitory postsynaptic response statistics specifically target the mean excitation. Our results provide a unifying theory for understanding the expression mechanisms and functions of long-term synaptic transmission plasticity.
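    The core claim, that reaching a target mean response with minimal variance dictates where plasticity is expressed, can be illustrated with the standard binomial synapse model (N release sites, release probability p, quantal amplitude q; the numbers are arbitrary):

```python
import numpy as np

N = 10           # release sites (assumed)
target = 5.0     # target mean postsynaptic response, arbitrary units

p = np.linspace(0.1, 0.9, 9)     # candidate presynaptic release probabilities
q = target / (N * p)             # postsynaptic quantal size that fixes the mean
var = N * p * (1 - p) * q ** 2   # binomial variance = target**2 * (1-p) / (N*p)

best_p = p[np.argmin(var)]       # variance is minimized at the highest p
```

    For a fixed mean N*p*q, the variance equals target^2*(1-p)/(N*p), which falls as p rises; a synapse starting from low p therefore reaches the target most reliably through presynaptic potentiation, consistent with the state-dependence of the expression locus described above.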

  20. Vegetation cover, tidal amplitude and land area predict short-term marsh vulnerability in Coastal Louisiana

    Science.gov (United States)

    Schoolmaster, Donald; Stagg, Camille L.; Sharp, Leigh Anne; McGinnis, Tommy S.; Wood, Bernard; Piazza, Sarai

    2018-01-01

    The loss of coastal marshes is a topic of great concern, because these habitats provide tangible ecosystem services and are at risk from sea-level rise and human activities. In recent years, significant effort has gone into understanding and modeling the relationships between the biological and physical factors that contribute to marsh stability. Simulation-based process models suggest that marsh stability is the product of a complex feedback between sediment supply, flooding regime and vegetation response, resulting in elevation gains sufficient to match the combination of relative sea-level rise and losses from erosion. However, there have been few direct, empirical tests of these models, because long-term datasets that capture sufficient numbers of marsh loss events in the context of a rigorous monitoring program are rare. We use a multi-year data set collected by the Coastwide Reference Monitoring System (CRMS) that includes transitions of monitored vegetation plots to open water to build and test a predictive model of near-term marsh vulnerability. We found that despite the conclusions of previous process models, elevation change had no ability to predict the transition of vegetated marsh to open water. However, we found that the processes that drive elevation change were significant predictors of transitions. Specifically, vegetation cover in the prior year, land area in the surrounding 1 km2 (an estimate of marsh fragmentation), and the interaction of tidal amplitude and position in the tidal frame were all significant factors predicting marsh loss. This suggests that 1) elevation change is likely a better predictor of marsh loss at time scales longer than we consider in this study and 2) the significant predictive factors affect marsh vulnerability through pathways other than elevation change, such as resistance to erosion. In addition, we found that, while sensitivity of marsh vulnerability to the predictive factors varied spatially across coastal Louisiana
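    A model of the kind described, vegetated-to-open-water transitions driven by prior vegetation cover, surrounding land area, and tidal amplitude interacting with position in the tidal frame, can be sketched as a logistic regression. Every coefficient below is hypothetical, chosen only to match the signs reported:

```python
import math

def p_open_water(veg_cover, land_frac, tidal_amp, tidal_pos,
                 b0=-4.0, b_veg=-3.0, b_land=-2.0, b_tide=2.5):
    """Probability that a vegetated plot transitions to open water.

    Hypothetical logistic model: coefficients are invented, chosen only to
    match the reported signs (more vegetation cover and more surrounding
    land lower the risk; tidal amplitude acts through its interaction with
    position in the tidal frame)."""
    eta = (b0 + b_veg * veg_cover + b_land * land_frac
           + b_tide * tidal_amp * tidal_pos)
    return 1.0 / (1.0 + math.exp(-eta))

healthy = p_open_water(veg_cover=0.9, land_frac=0.8, tidal_amp=0.3, tidal_pos=0.2)
fragmented = p_open_water(veg_cover=0.2, land_frac=0.1, tidal_amp=0.3, tidal_pos=0.9)
```

    Note that elevation change does not appear as a predictor, mirroring the study's finding that the drivers of elevation change, rather than elevation change itself, carry the near-term signal.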

  1. The hidden simplicity of subduction megathrust earthquakes

    Science.gov (United States)

    Meier, M.-A.; Ampuero, J. P.; Heaton, T. H.

    2017-09-01

    The largest observed earthquakes occur on subduction interfaces and frequently cause widespread damage and loss of life. Understanding the rupture behavior of megathrust events is crucial for earthquake rupture physics, as well as for earthquake early-warning systems. However, the large variability in behavior between individual events seemingly defies a description with a simple unifying model. Here we use three source time function (STF) data sets for subduction zone earthquakes, with moment magnitude Mw ≥ 7, and show that such large ruptures share a typical universal behavior. The median STF is scalable between events with different sizes, grows linearly, and is nearly triangular. The deviations from the median behavior are multiplicative and Gaussian—that is, they are proportionally larger for larger events. Our observations suggest that earthquake magnitudes cannot be predicted from the characteristics of rupture onsets.
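    The median behavior described, a near-triangular STF whose area equals the seismic moment, can be sketched directly; the duration prefactor (with duration scaling as M0^(1/3)) and the multiplicative-noise level are assumptions for illustration:

```python
import numpy as np

def median_stf(Mw, n=1001):
    """Isosceles-triangle source time function whose area equals the moment."""
    M0 = 10 ** (1.5 * Mw + 9.1)        # seismic moment, N*m (IASPEI convention)
    T = 5.0e-6 * M0 ** (1.0 / 3.0)     # duration, s (assumed prefactor)
    t = np.linspace(0.0, T, n)
    peak = 2.0 * M0 / T                # triangle area = (1/2) * T * peak = M0
    return t, peak * (1.0 - np.abs(2.0 * t / T - 1.0))

t, stf = median_stf(8.0)
moment = np.sum(0.5 * (stf[1:] + stf[:-1]) * np.diff(t))  # trapezoid integral

# Deviations from the median are multiplicative and Gaussian, i.e. lognormal
# scatter around the median moment-rate curve (noise level assumed).
rng = np.random.default_rng(1)
stf_observed = stf * np.exp(0.3 * rng.standard_normal(t.size))
```

    Because the scatter is multiplicative, large events deviate proportionally more in absolute terms, which is one reason rupture onsets carry no usable magnitude information.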

  2. The California Earthquake Advisory Plan: A history

    Science.gov (United States)

    Roeloffs, Evelyn A.; Goltz, James D.

    2017-01-01

    Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October, 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution.

  3. Earthquake Hazard Assessment: an Independent Review

    Science.gov (United States)

    Kossobokov, Vladimir

    2016-04-01

    Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and different accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of a high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null-hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e. the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of the SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risks for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
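    The Error Diagram and the Seismic Roulette null hypothesis can be made concrete with a toy catalog: each strategy reduces to a point (tau, nu), the alerted fraction of the space-time volume versus the fraction of target events missed, and random guessing lies on the diagonal nu = 1 - tau. The catalog and strategies below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic catalog: 80% of events cluster in the subregion x < 0.3 of a
# unit space-time volume (illustrative numbers).
n = 500
x = np.where(rng.random(n) < 0.8, rng.random(n) * 0.3, rng.random(n))

def molchan_point(alerted_fraction, events_hit):
    """One point of the error diagram: (tau, nu) = (alerted fraction, miss rate)."""
    return alerted_fraction, 1.0 - events_hit.mean()

# Strategy A (random guess): alert a random 30% of the volume.
tau_a, nu_a = molchan_point(0.3, rng.random(n) < 0.3)
# Strategy B (exploits the clustering): alert exactly the subregion x < 0.3.
tau_b, nu_b = molchan_point(0.3, x < 0.3)
```

    Strategy A reproduces the diagonal within sampling error; strategy B falls well below it, and the error sum nu + tau summarizes both failure rates, which is exactly the kind of advance certification the abstract calls for before any hazard claim.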

  4. Ordinary kriging approach to predicting long-term particulate matter concentrations in seven major Korean cities

    Directory of Open Access Journals (Sweden)

    Sun-Young Kim

    2014-09-01

    Objectives: Cohort studies of associations between air pollution and health have used exposure prediction approaches to estimate individual-level concentrations. A common prediction method used in Korean cohort studies is ordinary kriging. In this study, performance of ordinary kriging models for long-term particulate matter less than or equal to 10 μm in diameter (PM10) concentrations in seven major Korean cities was investigated with a focus on spatial prediction ability. Methods: We obtained hourly PM10 data for 2010 at 226 urban-ambient monitoring sites in South Korea and computed annual average PM10 concentrations at each site. Given the annual averages, we developed ordinary kriging prediction models for each of the seven major cities and for the entire country by using an exponential covariance reference model and a maximum likelihood estimation method. For model evaluation, cross-validation was performed and mean square error and R-squared (R2) statistics were computed. Results: Mean annual average PM10 concentrations in the seven major cities ranged between 45.5 and 66.0 μg/m3 (standard deviation=2.40 and 9.51 μg/m3, respectively). Cross-validated R2 values in Seoul and Busan were 0.31 and 0.23, respectively, whereas the other five cities had R2 values of zero. The national model produced a higher cross-validated R2 (0.36) than those for the city-specific models. Conclusions: In general, the ordinary kriging models performed poorly for the seven major cities and the entire country of South Korea, but the model performance was better in the national model. To improve model performance, future studies should examine different prediction approaches that incorporate PM10 source characteristics.
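    Ordinary kriging with an exponential covariance model reduces to solving one small linear system per prediction location. A minimal sketch (the sill, range, and toy five-site network are invented; the study instead fits these parameters by maximum likelihood to 226 sites):

```python
import numpy as np

def ordinary_kriging(coords, values, targets, sill=1.0, corr_range=50.0,
                     nugget=1e-10):
    """Minimal ordinary kriging with exponential covariance
    C(h) = sill * exp(-h / corr_range). Parameter values are illustrative."""
    n = len(coords)
    h = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = sill * np.exp(-h / corr_range) + nugget * np.eye(n)
    K[:n, n] = 1.0   # unbiasedness constraint (Lagrange multiplier column)
    K[n, :n] = 1.0
    preds = []
    for tgt in targets:
        d = np.linalg.norm(coords - tgt, axis=1)
        c = np.append(sill * np.exp(-d / corr_range), 1.0)
        w = np.linalg.solve(K, c)
        preds.append(w[:n] @ values)
    return np.array(preds)

# Toy monitoring network: annual-average PM10 at five "sites" (made-up values).
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
pm10 = np.array([55.0, 60.0, 50.0, 66.0, 58.0])
pred = ordinary_kriging(coords, pm10, np.array([[0.0, 0.0], [5.0, 6.0]]))
```

    With a near-zero nugget the predictor honors the data exactly at monitoring sites; the cross-validated R2 reported in the abstract measures how well such interpolation transfers to held-out sites.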

  5. Ordinary kriging approach to predicting long-term particulate matter concentrations in seven major Korean cities.

    Science.gov (United States)

    Kim, Sun-Young; Yi, Seon-Ju; Eum, Young Seob; Choi, Hae-Jin; Shin, Hyesop; Ryou, Hyoung Gon; Kim, Ho

    2014-01-01

    Cohort studies of associations between air pollution and health have used exposure prediction approaches to estimate individual-level concentrations. A common prediction method used in Korean cohort studies is ordinary kriging. In this study, performance of ordinary kriging models for long-term particulate matter less than or equal to 10 μm in diameter (PM10) concentrations in seven major Korean cities was investigated with a focus on spatial prediction ability. We obtained hourly PM10 data for 2010 at 226 urban-ambient monitoring sites in South Korea and computed annual average PM10 concentrations at each site. Given the annual averages, we developed ordinary kriging prediction models for each of the seven major cities and for the entire country by using an exponential covariance reference model and a maximum likelihood estimation method. For model evaluation, cross-validation was performed and mean square error and R-squared (R2) statistics were computed. Mean annual average PM10 concentrations in the seven major cities ranged between 45.5 and 66.0 μg/m3 (standard deviation=2.40 and 9.51 μg/m3, respectively). Cross-validated R2 values in Seoul and Busan were 0.31 and 0.23, respectively, whereas the other five cities had R2 values of zero. The national model produced a higher cross-validated R2 (0.36) than those for the city-specific models. In general, the ordinary kriging models performed poorly for the seven major cities and the entire country of South Korea, but the model performance was better in the national model. To improve model performance, future studies should examine different prediction approaches that incorporate PM10 source characteristics.

  6. Great Sumatra Earthquake registers on electrostatic sensor

    Science.gov (United States)

    Röder, Helmut; Schuhmann, Wolfram; Büttner, Ralf; Zimanowski, Bernard; Braun, Thomas; Boschi, Enzo

    Strong electrical signals corresponding to the Mw = 9.3 earthquake of 26 December 2004, which occurred at 0058:50.7 UTC off the west coast of northern Sumatra, Indonesia, were recorded by an electrostatic sensor (a device that detects short-term variations in Earth's electrostatic field) at a seismic station in Italy, which had been installed to study the influence of local earthquakes on a new landslide monitoring system. Electrical signals arrived at the station practically instantaneously and were detected up to several hours before the onset of the Sumatra earthquake (Figure 1), as well as before local quakes. The corresponding seismic signals (p-waves) arrived 740 seconds after the start of the earthquake. Because the electrical signals travel at the speed of light, electrical monitoring for the global detection of very strong earthquakes could be an important tool in significantly increasing the hazard alert window.

  7. Prediction of hyperbilirubinemia by noninvasive methods in full-term newborns

    Directory of Open Access Journals (Sweden)

    Danijela Furlan

    2013-02-01

    Introduction: The noninvasive screening methods for bilirubin determination were studied prospectively in a group of full-term healthy newborns with the aim of early prediction of pathological neonatal hyperbilirubinemia. Laboratory determination of bilirubin (Jendrassik-Grof, JG) was compared to noninvasive transcutaneous bilirubin (TcBIL), together with the determination of bilirubin in cord blood. Methods: The study group consisted of 284 full-term healthy consecutively born infants in the period from March to June 2011. The whole group was divided into a group of physiological (n=199) and a group of pathological hyperbilirubinemia (n=85) according to the level of total bilirubin (cut-off 220 μmol/L). Bilirubin in cord blood (CbBIL) and from capillary blood at the age of three days was determined according to the JG method; on the 3rd day TcBIL was also measured with a Bilicheck bilirubinometer. The Kolmogorov-Smirnov and Mann-Whitney tests were used for the statistical analysis. Results: Bilirubin concentrations differed statistically significantly between the groups of newborns with physiological (n=199) and pathological (n=85) hyperbilirubinemia (CbBIL: p<0.001; 3rd-day control sample: p<0.001; TcBIL: p<0.001). Using the cut-off value of cord blood bilirubin of 28 μmol/L, we could predict the development of pathological hyperbilirubinemia with 98.8% prognostic specificity, and with 100% sensitivity that newborns will not require phototherapy (all irradiated newborns were taken into account). We confirmed an excellent agreement between bilirubin concentrations determined by the TcBIL and JG methods for both groups of healthy full-term newborns. Conclusion: Based on our results, we could recommend that determination of cord blood bilirubin in combination with the measurement of TcBIL be implemented into practice for early prediction of pathological hyperbilirubinemia in full-term healthy newborns. The advantages of both methods in the routine
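    Screening performance for a single cut-off of this kind reduces to a two-by-two table. A sketch with an invented mini-cohort (the function and the data are illustrative; only the 28 μmol/L cut-off comes from the abstract):

```python
def screen_performance(cord_bilirubin, pathological, cutoff=28.0):
    """Sensitivity and specificity of a cord-blood bilirubin cut-off for
    predicting later pathological hyperbilirubinemia."""
    pairs = list(zip(cord_bilirubin, pathological))
    tp = sum(1 for b, p in pairs if b >= cutoff and p)
    fn = sum(1 for b, p in pairs if b < cutoff and p)
    tn = sum(1 for b, p in pairs if b < cutoff and not p)
    fp = sum(1 for b, p in pairs if b >= cutoff and not p)
    return tp / (tp + fn), tn / (tn + fp)

# Invented example values (umol/L) and outcomes, for illustration only.
bili = [20, 25, 27, 30, 35, 40, 29, 31]
path = [False, False, True, True, True, True, False, True]
sens, spec = screen_performance(bili, path)
```
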

  8. Markers of preparatory attention predict visual short-term memory performance.

    Science.gov (United States)

    Murray, Alexandra M; Nobre, Anna C; Stokes, Mark G

    2011-05-01

    Visual short-term memory (VSTM) is limited in capacity. Therefore, it is important to encode only visual information that is most likely to be relevant to behaviour. Here we asked which aspects of selective biasing of VSTM encoding predict subsequent memory-based performance. We measured EEG during a selective VSTM encoding task, in which we varied parametrically the memory load and the precision of recall required to compare a remembered item to a subsequent probe item. On half the trials, a spatial cue indicated that participants only needed to encode items from one hemifield. We observed a typical sequence of markers of anticipatory spatial attention: early directing-attention negativity (EDAN), anterior directing-attention negativity (ADAN), and late directing-attention positivity (LDAP), as well as of VSTM maintenance: contralateral delay activity (CDA). We found that individual differences in preparatory brain activity (EDAN/ADAN) predicted cue-related changes in recall accuracy, indexed by memory-probe discrimination sensitivity (d'). Importantly, our parametric manipulation of memory-probe similarity also allowed us to model the behavioural data for each participant, providing estimates of the quality of the memory representation and the probability that an item could be retrieved. We found that selective encoding primarily increased the probability of accurate memory recall, and that ERP markers of preparatory attention predicted the cue-related changes in recall probability.
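    The discrimination index used here, d', is computed from hit and false-alarm rates via the inverse normal CDF; a minimal sketch with arbitrary example rates:

```python
from statistics import NormalDist

def d_prime(hit_rate, false_alarm_rate):
    """Sensitivity index d' = z(H) - z(FA); rates must lie strictly in (0, 1)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(false_alarm_rate)

# Example: 84% hits with 16% false alarms gives d' of about 2.
dp = d_prime(0.84, 0.16)
```
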

  9. Sensitivity Analysis of Wavelet Neural Network Model for Short-Term Traffic Volume Prediction

    Directory of Open Access Journals (Sweden)

    Jinxing Shen

    2013-01-01

    In order to achieve a more accurate and robust traffic volume prediction model, the sensitivity of the wavelet neural network model (WNNM) is analyzed in this study. Based on real loop detector data provided by the traffic police detachment of Maanshan, the WNNM is examined with different numbers of input neurons, different numbers of hidden neurons, and traffic volumes aggregated over different time intervals. The test results show that the performance of the WNNM depends heavily on the network parameters and the time interval of the traffic volume. The WNNM with 4 input neurons and 6 hidden neurons is the optimal predictor, with better accuracy, stability, and adaptability, and prediction is best when the traffic volume is aggregated over 15-minute intervals. In addition, the optimized WNNM is compared with the widely used back-propagation neural network (BPNN). The comparison results indicate that the WNNM produces much lower values of MAE, MAPE, and VAPE than the BPNN, showing that the WNNM performs better on short-term traffic volume prediction.
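    The three error metrics used for the comparison can be written down directly. MAE and MAPE are standard; VAPE is taken here as the variance of the absolute percentage error, its usual reading in the traffic-forecasting literature (an assumption, since the abstract does not expand the acronym):

```python
import numpy as np

def mae(y, yhat):
    # Mean absolute error, in the units of the traffic counts.
    return np.mean(np.abs(y - yhat))

def mape(y, yhat):
    # Mean absolute percentage error, in percent.
    return np.mean(np.abs((y - yhat) / y)) * 100

def vape(y, yhat):
    # Assumed definition: variance of the absolute percentage error.
    return np.var(np.abs((y - yhat) / y)) * 100

# Toy 15-minute traffic volumes vs. predictions (made-up numbers).
y = np.array([120.0, 150.0, 90.0, 200.0])
yhat = np.array([110.0, 160.0, 95.0, 190.0])
```
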

  10. Long Term Precipitation Patterns Assessment and Prediction over the Mediterranean Basin

    Science.gov (United States)

    Elkadiri, R.; Sultan, M.; Momm, H.; Elbayoumi, T.; Milewski, A.; Blair, Z.; Briana, V.

    2016-12-01

    We developed and applied an integrated approach to construct predictive tools with lead times of 1 to 24 months to forecast monthly precipitation amounts over the Mediterranean Basin. The study was applied over eight climatically different and homogeneous pilot sites. The following steps were conducted: (1) acquire, assess and inter-correlate temporal remote sensing-based precipitation products (e.g. The CPC Merged Analysis of Precipitation [CMAP], Integrated Multi-Satellite Retrievals for GPM [IMERG]) throughout the investigation period (1980 to 2016), (2) acquire and assess monthly values for all of the climatic indices influencing the regional and global climatic patterns (e.g., Northern Atlantic Oscillation [NOI], Southern Oscillation Index [SOI], and Tropical North Atlantic Index [TNA]); and (3) apply data mining methods (e.g. neural networks, genetic algorithms) to extract relationships between the observed precipitation and the controlling factors (i.e. climatic indices with multiple lead-time periods) and use predictive tools to forecast monthly precipitation. Preliminary results indicate that by using the period from January 2006 until January 2016 for model testing and the period from January 1980 until December 2005 for model training, precipitation can be successfully predicted with lead times from 1 to 24 months over the test sites with an accuracy ranging between 52% and 98%. If our final efforts are successful, our findings could lead the way to the development and implementation of long-term water management scenarios for the Mediterranean region.
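    Step (3)'s supervised setup, predicting monthly precipitation from climate indices at lead times of 1 to 24 months, amounts to building a lagged feature matrix. A sketch with invented index values (a real run would use NOI, SOI, TNA, and the other indices named above):

```python
import numpy as np

def lagged_features(series, leads):
    """Rows: target months; columns: the index value `lead` months earlier.
    Only months with all requested lags available are kept."""
    max_lead = max(leads)
    rows = [[series[t - lead] for lead in leads]
            for t in range(max_lead, len(series))]
    return np.array(rows)

# Invented monthly climate-index series (stand-in for e.g. an SOI record).
index = np.arange(10, dtype=float)
X = lagged_features(index, leads=[1, 3, 6])  # predictors at 1-, 3-, 6-month leads
y = index[6:] * 0.5 + 2.0                    # toy precipitation target, aligned
```

    Any of the data-mining methods mentioned (neural networks, genetic algorithms) can then be trained on (X, y), with the train/test split done chronologically as in the study.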

  11. Short-term solar flare prediction using multi-model integration method

    Science.gov (United States)

    Liu, Jin-Fu; Li, Fei; Wan, Jie; Yu, Da-Ren

    2017-03-01

    A multi-model integration method is proposed to develop a multi-source and heterogeneous model for short-term solar flare prediction. Different prediction models are constructed on the basis of predictors extracted from a pool of observation databases. The outputs of the base models are normalized first, because the established models extract predictors from many data resources using different prediction methods. Weighted integration of the base models is then used to develop a multi-model integrated model (MIM), with the weights assigned to the single models optimized by a genetic algorithm. Seven base models and data from Solar and Heliospheric Observatory/Michelson Doppler Imager longitudinal magnetograms are used to construct the MIM, and its performance is evaluated by cross validation. Experimental results showed that the MIM outperforms any individual model in nearly every data group, and the richer the diversity of the base models, the better the performance of the MIM. Thus, integrating more diversified models, such as an expert system, a statistical model and a physical model, will greatly improve the performance of the MIM.
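    The normalize-then-weight scheme can be sketched end to end. The base-model outputs and outcomes below are invented, and plain random search stands in for the genetic algorithm to keep the example short:

```python
import numpy as np

rng = np.random.default_rng(0)

# Outputs of three hypothetical base flare-prediction models on 8 events,
# each on its own scale, plus observed outcomes (all numbers invented).
raw = np.array([[0.2, 0.9, 0.1, 0.8, 0.3, 0.7, 0.2, 0.9],
                [10., 80., 20., 70., 30., 90., 10., 60.],
                [0.4, 0.8, 0.3, 0.9, 0.2, 0.6, 0.3, 0.7]])
obs = np.array([0., 1., 0., 1., 0., 1., 0., 1.])

# Step 1: normalize each base model's output to [0, 1].
norm = (raw - raw.min(axis=1, keepdims=True)) / np.ptp(raw, axis=1, keepdims=True)

def loss(w):
    w = w / w.sum()                       # weights form a convex combination
    return np.mean((w @ norm - obs) ** 2)

# Step 2: optimize the weight set; random search substitutes for the GA.
best_w, best_loss = np.ones(3) / 3, loss(np.ones(3))
for _ in range(2000):
    w = rng.random(3)
    if loss(w) < best_loss:
        best_w, best_loss = w / w.sum(), loss(w)
```

    The search can only improve on the equal-weight ensemble it starts from, which is the practical argument for optimizing the weights rather than simply averaging the base models.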

  12. Neural activity in the hippocampus predicts individual visual short-term memory capacity.

    Science.gov (United States)

    von Allmen, David Yoh; Wurmitzer, Karoline; Martin, Ernst; Klaver, Peter

    2013-07-01

    Although the hippocampus had traditionally been thought to be exclusively involved in long-term memory, recent studies raised controversial explanations of why hippocampal activity emerges during short-term memory tasks. For example, it has been argued that long-term memory processes might contribute to performance within a short-term memory paradigm when memory capacity has been exceeded. It is still unclear, though, whether neural activity in the hippocampus predicts visual short-term memory (VSTM) performance. To investigate this question, we measured BOLD activity in 21 healthy adults (age range 19-27 yr, nine males) while they performed a match-to-sample task requiring processing of object-location associations (delay period = 900 ms; set size conditions 1, 2, 4, and 6). Based on individual memory capacity (estimated by Cowan's K-formula), two performance groups were formed (high and low performers). Within whole-brain analyses, we found a robust main effect of "set size" in the posterior parietal cortex (PPC). In line with a "set size × group" interaction in the hippocampus, a subsequent Finite Impulse Response (FIR) analysis revealed divergent hippocampal activation patterns between performance groups: low performers (mean capacity = 3.63) elicited increased neural activity at set size two, followed by a drop in activity at set sizes four and six, whereas high performers (mean capacity = 5.19) showed an incremental activity increase with larger set size (maximal activation at set size six). Our data demonstrate that performance-related neural activity in the hippocampus emerged below the capacity limit. In conclusion, we suggest that hippocampal activity reflected successful processing of object-location associations in VSTM. Neural activity in the PPC might have been involved in attentional updating.
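    Cowan's K, used above to estimate individual capacity and split the performance groups, is a one-line formula for single-probe change-detection designs:

```python
def cowan_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K capacity estimate for a single-probe change-detection task:
    K = S * (hit rate - false alarm rate)."""
    return set_size * (hit_rate - false_alarm_rate)

# E.g. at set size 6, 80% hits and 15% false alarms give K close to 4 items.
k = cowan_k(6, 0.80, 0.15)
```

    The group means reported above (3.63 vs. 5.19) are averages of exactly this quantity across each participant's set-size conditions.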

  13. Antioxidant defenses predict long-term survival in a passerine bird.

    Directory of Open Access Journals (Sweden)

    Nicola Saino

    BACKGROUND: Normal and pathological processes entail the production of oxidative substances that can damage biological molecules and harm physiological functions. Organisms have evolved complex mechanisms of antioxidant defense, and any imbalance between oxidative challenge and antioxidant protection can depress fitness components and accelerate senescence. While the role of oxidative stress in pathogenesis and aging has been studied intensively in humans and model animal species under laboratory conditions, there is a dearth of knowledge on its role in shaping life-histories of animals under natural selection regimes. Yet, given the pervasive nature and likely fitness consequences of oxidative damage, it can be expected that the need to secure efficient antioxidant protection is powerful in molding the evolutionary ecology of animals. Here, we test whether overall antioxidant defense varies with age and predicts long-term survival, using a wild population of a migratory passerine bird, the barn swallow (Hirundo rustica), as a model. METHODOLOGY/PRINCIPAL FINDINGS: Plasma antioxidant capacity (AOC) of breeding individuals was measured using standard protocols and annual survival was monitored over five years (2006-2010) on a large sample of selection episodes. AOC did not covary with age in longitudinal analyses after discounting the effect of selection. AOC positively predicted annual survival independently of sex. Individuals were highly consistent in their relative levels of AOC, implying the existence of additive genetic variance and/or environmental (including early maternal) components consistently acting through their lives. CONCLUSIONS: Using longitudinal data we showed that high levels of antioxidant protection positively predict long-term survival in a wild animal population. Present results are therefore novel in disclosing a role for antioxidant protection in determining survival under natural conditions, strongly demanding for more

  14. CREATING THE KULTUK POLYGON FOR EARTHQUAKE PREDICTION: VARIATIONS OF (234U/238U) AND 87SR/86SR IN GROUNDWATER FROM ACTIVE FAULTS AT THE WESTERN SHORE OF LAKE BAIKAL

    Directory of Open Access Journals (Sweden)

    S. V. Rasskazov

    2015-01-01

Introduction. Determinations of (234U/238U) in groundwater samples are used for monitoring current deformations in active faults (parentheses denote activity ratio units). The equilibrium activity ratio (234U/238U) = γ ≈ 1 corresponds to the atomic ratio 234U/238U ≈ 5.47×10–5. This parameter may vary due to higher contents of the 234U nuclide in groundwater as a result of rock deformation. This effect, discovered by P.I. Chalov and V.V. Cherdyntsev, was described in [Cherdyntsev, 1969, 1973; Chalov, 1975; Chalov et al., 1990; Faure, 1989]. In the 1970s and 1980s, only quite laborious methods were available for measuring uranium isotopic ratios. Today it is possible to determine concentrations and isotopic ratios of uranium by express analytical techniques using inductively coupled plasma mass spectrometry (ICP-MS) [Halicz et al., 2000; Shen et al., 2002; Cizdziel et al., 2005; Chebykin et al., 2007]. Sets of samples can be efficiently analysed by ICP-MS, and regularly collected uranium isotope values can be systematized at a new quality level for the purposes of earthquake prediction. In this study of (234U/238U) in groundwater at the Kultuk polygon, we selected stations of the highest sensitivity, which can ensure proper monitoring of the tectonic activity of the Obruchev and Main Sayan faults. These two faults, which limit the Sharyzhalgai block of the crystalline basement of the Siberian craton in the south, are conjugated in the territory of the Kultuk polygon (Fig. 1). Forty sets of samples taken from 27 June 2012 to 28 January 2014 were analysed, and data on 170 samples are discussed in this paper. Methods. Isotope compositions of uranium and strontium were determined by the methods described in [Chebykin et al., 2007; Pin et al., 1992], with modifications. Analyses of uranium by the ICP-MS technique were performed using an Agilent 7500ce quadrupole mass spectrometer of the Ultramicroanalysis Collective Use Centre; analyses of
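
The abstract's equilibrium figure can be checked from first principles. As a minimal sketch (not from the paper; the half-lives are commonly tabulated values assumed here), the atomic ratio at which the (234U/238U) activity ratio equals 1 is simply the ratio of the two half-lives:

```python
import math

# Half-lives in years; commonly tabulated values (assumptions, not from the paper)
T_HALF_U238 = 4.468e9
T_HALF_U234 = 2.455e5

# Decay constants: lambda = ln(2) / T_half
lam_238 = math.log(2) / T_HALF_U238
lam_234 = math.log(2) / T_HALF_U234

def activity_ratio(atomic_ratio_234_238):
    """Convert an atomic ratio 234U/238U into an activity ratio (234U/238U)."""
    return atomic_ratio_234_238 * lam_234 / lam_238

# At secular equilibrium the activity ratio is 1, so the atomic ratio is
# lam_238 / lam_234 = T_half(234U) / T_half(238U) ~ 5.5e-5, close to the
# ~5.47e-5 quoted in the abstract (which reflects slightly older half-life data).
eq_atomic = lam_238 / lam_234
print(f"equilibrium atomic ratio: {eq_atomic:.3e}")
print(f"activity ratio at equilibrium: {activity_ratio(eq_atomic):.3f}")
```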

  15. Estimated damage and loss scenarios for future major earthquakes in Luzon, Philippines

    Science.gov (United States)

    Pacchiani, F.; Wyss, M.

    2012-04-01

The northernmost and biggest island of the Philippines, Luzon, is bordered on both the west and east by active subduction zones, and, on land, the island is cut by numerous fault systems. These active systems regularly generate major earthquakes that shake the island. The largest historical earthquake was a magnitude M=8 event, and the last major catastrophic event was the July 16, 1990, Ms 7.8 earthquake, which caused over 2,400 fatalities, injured over 3,500 people and destroyed over 100,000 houses. Such catastrophic earthquakes will unfortunately repeat in the future, and attempts to predict these events have found that a time of increased probability (TIP) for such an earthquake exists for Luzon. Considering these facts, it is of interest to evaluate the destruction and losses such an event could eventually cause. We have analyzed the historical seismicity and constructed different loss scenarios. With QLARM2, a loss estimation algorithm used in real-time mode for over 10 years, we simulated the various plausible scenarios for a major earthquake (M>7) on Luzon. Results show that such an earthquake affects the majority of the island's inhabitants. As an example, for a M=7.4 earthquake, 12 km deep and 30 km from the capital city Manila, the overall maximum mean damage to be expected is 4.5 and the maximum intensity is X in the epicentral area. In terms of fatalities, QLARM2 yields first-order estimates. Preliminary results suggest a conservative mean estimate of 6,000 expected fatalities, and a maximum of 18,000; these values can vary greatly depending on the location and magnitude of the future earthquake. Due to its position along the Ring of Fire, and in light of our computations, Luzon will continue to be shaken by catastrophic earthquakes and should continue its effort to mitigate earthquake risk. This is particularly true for Manila, an agglomeration of over 11 million people directly affected by the earthquake hazard.

  16. Chest HRCT signs predict deaths in long-term follow-up among asbestos exposed workers

    Energy Technology Data Exchange (ETDEWEB)

    Vehmas, Tapio, E-mail: tapio.vehmas@ttl.fi [Health and Work Ability, Finnish Institute of Occupational Health, Topeliuksenkatu 41 a A, FI-00250 Helsinki (Finland); Oksa, Panu, E-mail: panu.oksa@ttl.fi [Health and Work Ability, Finnish Institute of Occupational Health, Uimalankatu 1, FI-33101 Tampere (Finland)

    2014-10-15

Highlights: • Much lung and pleural pathology is found in chest CT studies. • HRCT signs were screened and subsequent mortality followed up. • Several signs were related to all-cause and disease-specific deaths. • The HRCT classification system used was able to predict mortality. • Secondary preventive strategies should be developed for patients with such signs. - Abstract: Objectives: To study associations between chest HRCT signs and subsequent deaths in long-term follow-up. Methods: Lung and pleural signs of 633 asbestos-exposed workers (age 45–86, mean 65) screened with HRCT were recorded using the International Classification of Occupational and Environmental Respiratory Diseases (ICOERD) system, which contains detailed instructions for use and reference images. Subsequent mortality was checked from the national register. Cox regression adjusted for covariates (age, sex, BMI, asbestos exposure, pack-years) was used to explore the relations between HRCT signs and all-cause deaths, cardiovascular and benign respiratory deaths, and deaths from neoplasms, all according to the ICD-10 diagnostic system. Results: The follow-up totalled 5271.9 person-years (mean 8.3 y/person, range 0.04–10.3). 119 deaths were reported. Irregular/linear opacities, honeycombing, emphysema, large opacities, visceral pleural abnormalities and bronchial wall thickening were all significantly related to all-cause deaths. Most of these signs were also associated with deaths from neoplasms and benign respiratory disease. Deaths from cardiovascular disease were predicted by emphysema and visceral pleural abnormalities. Conclusions: Several HRCT signs predicted deaths. Careful attention should be paid to subjects with radiological signs predictive of death, and new secondary preventive strategies should be developed. This calls for further focused studies among different populations.

  17. Design prediction for long term stress rupture service of composite pressure vessels

    Science.gov (United States)

    Robinson, Ernest Y.

    1992-01-01

Extensive stress rupture studies on glass composites and Kevlar composites were conducted by the Lawrence Radiation Laboratory beginning in the late 1960's and extending to about 8 years in some cases. Some of the data from these studies published over the years were incomplete or were tainted by spurious failures, such as grip slippage. Updated data sets were defined for both fiberglass and Kevlar composite strand test specimens. These updated data are analyzed in this report by a convenient form of the bivariate Weibull distribution, to establish a consistent set of design prediction charts that may be used as a conservative basis for predicting the stress rupture life of composite pressure vessels. The updated glass composite data exhibit an invariant Weibull modulus with lifetime. The data are analyzed in terms of homologous service load (referenced to the observed median strength). The equations relating life, homologous load, and probability are given, and corresponding design prediction charts are presented. A similar approach is taken for Kevlar composites, where the updated strand data do show a turndown tendency at long life accompanied by a corresponding change (increase) of the Weibull modulus. The turndown characteristic is not present in stress rupture test data of Kevlar pressure vessels. A modification of the stress rupture equations is presented to incorporate a latent, but limited, strength drop, and design prediction charts are presented that incorporate such behavior. The methods presented utilize Cartesian plots of the probability distributions (which are a more natural display for the design engineer), based on median normalized data that are independent of statistical parameters and are readily defined for any set of test data.
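
Robinson's fitted equations are not reproduced in the abstract. As a hedged sketch of the general form such life-load-probability relations take, the following assumes a Weibull lifetime distribution whose characteristic life follows a power law in homologous load; BETA, RHO and T_REF are illustrative values, not the report's parameters:

```python
import math

# Illustrative parameters (assumptions, not Robinson's fitted values)
BETA = 0.5      # Weibull modulus for lifetime
RHO = 20.0      # load-life power-law exponent
T_REF = 1.0     # characteristic life (arbitrary units) at homologous load s = 1

def scale_life(s):
    """Characteristic (63.2%-failed) life at homologous load s (load / median strength)."""
    return T_REF * s ** (-RHO)

def life_at_reliability(s, reliability):
    """Life t such that survival R(t; s) = exp(-(t / t0(s))**BETA) equals `reliability`."""
    return scale_life(s) * (-math.log(reliability)) ** (1.0 / BETA)

# One row of a design prediction chart: lives at 99% reliability
# for several homologous loads (lower load -> dramatically longer life)
for s in (0.5, 0.6, 0.7):
    print(f"s = {s:.1f}: t(R=0.99) = {life_at_reliability(s, 0.99):.3e}")
```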

  18. Earthquake rate and magnitude distributions of great earthquakes for use in global forecasts

    Science.gov (United States)

    Kagan, Yan Y.; Jackson, David D.

    2016-07-01

We have obtained new results in the statistical analysis of global earthquake catalogues with special attention to the largest earthquakes, and we examined the statistical behaviour of earthquake rate variations. These results can serve as an input for updating our recent earthquake forecast, known as the 'Global Earthquake Activity Rate 1' model (GEAR1), which is based on past earthquakes and geodetic strain rates. The GEAR1 forecast is expressed as the rate density of all earthquakes above magnitude 5.8 within 70 km of sea level everywhere on Earth at 0.1 × 0.1 degree resolution, and it is currently being tested by the Collaboratory for Study of Earthquake Predictability. The seismic component of the present model is based on a smoothed version of the Global Centroid Moment Tensor (GCMT) catalogue from 1977 through 2013. The tectonic component is based on the Global Strain Rate Map, a 'General Earthquake Model' (GEM) product. The forecast was optimized to fit the GCMT data from 2005 through 2012, but it also fit well the earthquake locations from 1918 to 1976 reported in the International Seismological Centre-Global Earthquake Model (ISC-GEM) global catalogue of instrumental and pre-instrumental magnitude determinations. We have improved the recent forecast by optimizing the treatment of larger magnitudes and including a longer-duration (1918-2011) ISC-GEM catalogue of large earthquakes to estimate smoothed seismicity. We revised our estimates of upper magnitude limits, described as corner magnitudes, based on the massive earthquakes since 2004 and the seismic moment conservation principle. The new corner magnitude estimates are somewhat larger than but consistent with our previous estimates. For major subduction zones we find the best estimates of corner magnitude to be in the range 8.9 to 9.6, consistent with a uniform average of 9.35. Statistical estimates tend to grow with time as larger earthquakes occur. However, by using the moment conservation
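
The corner-magnitude idea can be illustrated with the tapered Gutenberg-Richter distribution commonly used in this line of work. The sketch below is not the paper's calibration; beta, the threshold magnitude 5.8, and the corner magnitude 9.35 are simply plugged in as illustrative values echoing the abstract:

```python
import math

def moment(m):
    """Scalar seismic moment (N*m) from moment magnitude: log10(M0) = 1.5*m + 9.1."""
    return 10 ** (1.5 * m + 9.1)

def tapered_gr_survivor(m, m_threshold=5.8, m_corner=9.35, beta=0.65):
    """Fraction of events at or above magnitude m under a tapered Gutenberg-Richter law.

    Power law in moment with an exponential taper at the corner moment;
    parameter values here are assumptions for illustration only.
    """
    M, Mt, Mc = moment(m), moment(m_threshold), moment(m_corner)
    return (Mt / M) ** beta * math.exp((Mt - M) / Mc)

# Small magnitudes follow the usual power law; the exponential taper makes
# events well above the corner magnitude vanishingly rare.
for m in (6.0, 7.0, 8.0, 9.0, 9.6):
    print(f"m = {m:.1f}: survivor fraction = {tapered_gr_survivor(m):.3e}")
```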

  19. Very-short-term wind power prediction by a hybrid model with single- and multi-step approaches

    Science.gov (United States)

    Mohammed, E.; Wang, S.; Yu, J.

    2017-05-01

Very-short-term wind power prediction (VSTWPP) plays an essential role in the operation of electric power systems. This paper aims at improving and applying a hybrid method for VSTWPP based on historical data. The hybrid method combines multiple linear regression and least squares (MLR&LS) and is intended to reduce prediction errors. The predicted values are obtained through two sub-processes: 1) transform the time-series data of actual wind power into a power ratio, and then predict the power ratio; 2) use the predicted power ratio to predict the wind power. In addition, the proposed method includes two prediction approaches: single-step prediction (SSP) and multi-step prediction (MSP). The hybrid method is tested against an auto-regressive moving average (ARMA) model using the predicted values and errors. The validity of the proposed hybrid method is confirmed in terms of error analysis using the probability density function (PDF), mean absolute percent error (MAPE) and mean square error (MSE). Meanwhile, a comparison of the correlation coefficients between the actual values and the predicted values for different prediction times and windows confirms that the MSP approach using the hybrid model is the most accurate compared to the SSP approach and ARMA. The MLR&LS method is accurate and promising for solving problems in WPP.
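
The single-step versus recursive multi-step distinction can be sketched with a toy autoregressive model fit by least squares. The synthetic "power ratio" data and the 2-lag model are assumptions for illustration; the paper's MLR&LS formulation is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic series standing in for a normalized wind power ratio (illustrative data)
n = 500
x = np.zeros(n)
for t in range(2, n):
    x[t] = 0.7 * x[t - 1] - 0.2 * x[t - 2] + 0.1 * rng.standard_normal()

# Fit a 2-lag linear model by least squares: x[t] ~ a*x[t-1] + b*x[t-2]
X = np.column_stack([x[1:-1], x[:-2]])
y = x[2:]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict_multi_step(history, steps):
    """Recursive multi-step prediction: feed each prediction back in as an input."""
    h = list(history[-2:])
    out = []
    for _ in range(steps):
        nxt = coef[0] * h[-1] + coef[1] * h[-2]
        out.append(nxt)
        h.append(nxt)
    return out

one_step = predict_multi_step(x, 1)      # single-step prediction (SSP)
three_step = predict_multi_step(x, 3)    # multi-step prediction (MSP)
print("SSP:", one_step, "MSP:", three_step)
```

Note that MSP error compounds: each later step is built on predicted rather than actual inputs, which is why the multi-step accuracy comparison in the abstract is the interesting one.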

  20. Health-related quality of life predicts long-term survival in patients with peripheral artery disease

    DEFF Research Database (Denmark)

    Issa, Samson M; Hoeks, Sanne E; Scholte op Reimer, Wilma J M

    2010-01-01

    We examined whether health-related quality of life (HRQoL) predicts long-term survival in patients with peripheral artery disease (PAD) independent of established prognostic risk factors. In 2004, data on 711 consecutive patients with PAD undergoing vascular surgery were collected from 11 hospitals...... prognostic factors. In conclusion, the study indicates that poor HRQoL predicts long-term survival in patients with PAD, and provides prognostic value above established risk factors....

  1. Prostate-specific antigen and long-term prediction of prostate cancer incidence and mortality in the general population

    DEFF Research Database (Denmark)

    Orsted, David D; Nordestgaard, Børge G; Jensen, Gorm B

    2012-01-01

It is largely unknown whether prostate-specific antigen (PSA) level at first date of testing predicts long-term risk of prostate cancer (PCa) incidence and mortality in the general population....

  2. Prediction of Long-term Cognitive Decline Following Postoperative Delirium in Older Adults.

    Science.gov (United States)

    Devore, Elizabeth E; Fong, Tamara G; Marcantonio, Edward R; Schmitt, Eva M; Travison, Thomas G; Jones, Richard N; Inouye, Sharon K

    2017-11-09

Increasing evidence suggests that postoperative delirium may result in long-term cognitive decline among older adults. Risk factors for such cognitive decline are unknown. We studied 126 older participants without delirium or dementia upon entering the Successful AGing After Elective Surgery (SAGES) study, who developed postoperative delirium and completed repeated cognitive assessments (up to 36 months of follow-up). Pre-surgical factors were assessed preoperatively and divided into nine groupings of related factors ("domains"). Delirium was evaluated at baseline and daily during hospitalization using the Confusion Assessment Method diagnostic algorithm, and cognitive function was assessed using a neuropsychological battery and the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE) at baseline and at 6-month intervals over 3 years. Linear regression was used to examine associations between potential risk factors and the rate of long-term cognitive decline over time. A domain-specific and then overall selection method based on adjusted R2 values was used to identify explanatory factors for the outcome. The General Cognitive Performance (GCP) score (combining all neuropsychological test scores), IQCODE score, and living alone were significantly associated with long-term cognitive decline. GCP score explained the most variation in the rate of cognitive decline (13%), and six additional factors (IQCODE score, cognitive independent activities of daily living impairment, living alone, cerebrovascular disease, Charlson comorbidity index score, and exhaustion level) in combination explained 32% of the variation in this outcome. Global cognitive performance was most strongly associated with long-term cognitive decline following delirium. Pre-surgical factors may substantially predict this outcome.

  3. Near and long-term load prediction using radial basis function networks

    Energy Technology Data Exchange (ETDEWEB)

    Hancock, M.F. [Rollins College, Winter Park, FL (United States)

    1995-12-31

A number of researchers have investigated the application of multi-layer perceptrons (MLPs), a variety of neural network, to the problem of short-term load forecasting for electric utilities (e.g., Rahman & Hazin, IEEE Trans. Power Systems, May 1993). "Short-term" in this context typically means "next day". These forecasts have been based upon previous-day actual loads and meteorological factors (e.g., max-min temperature, relative humidity). We describe the application of radial basis function networks (RBFs) to the "long-term" (next year) load forecasting problem. The RBF network performs a two-stage classification based upon annual average loads and meteorological data. During stage 1, discrete classification is performed using radius-limited elements. During stage 2, a multi-layer perceptron may be applied. The quantized output is used to correct a prediction template. The stage 1 classifier is trained by maximizing an objective function (the "disambiguity"). The stage 2 MLPs are trained by standard back-propagation. This work uses 12 months of hourly meteorological data, and the corresponding hourly load data for both commercial and residential feeders. At the current stage of development, the RBF machine can train on 20% of the weather/load data (selected by simple linear sampling), and estimate the hourly load for an entire year (8,760 data points) with 9.1% error (RMS, relative to daily peak load). (By comparison, monthly mean profiles perform at c. 12% error.) The best short-term load forecasters operate in the 2% error range. The current system is an engineering prototype, and development is continuing.
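
As a hedged illustration of the radial basis function idea (not the two-stage classifier described in the abstract), a minimal RBF regression with Gaussian elements and least-squares output weights looks like this; the data, centers, and width are assumed toy values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a (feature -> load) regression problem (illustrative data)
X = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(200)

centers = np.linspace(0.0, 1.0, 12)   # radius-limited elements
width = 0.1

def design_matrix(x):
    """Gaussian RBF activations of each input against each center."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

# Output weights by linear least squares (a common way to train RBF networks)
Phi = design_matrix(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)

y_hat = design_matrix(X) @ w
rms = float(np.sqrt(np.mean((y_hat - y) ** 2)))
print(f"training RMS error: {rms:.4f}")
```

The appeal over back-propagation is that, once the centers are fixed, training reduces to a single linear solve.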

  4. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazards in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel. The highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the seismicity rate evolution due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, the stress changes near the 1906 Meishan and Yangshuigang epicenters were higher than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter had a +0.75 bar stress increase. The results indicated significant interaction between the three damaging events. Considering the path and site effects using ground motion prediction equations, a probabilistic seismic hazard in the form of a hazard evolution and a hazard map was assessed. A significant elevation in hazard following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information on earthquake preparation, devastation estimates, emergency sheltering, utility restoration, and structure reconstruction.
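
The rate-and-state seismicity-rate evolution mentioned above is commonly computed with Dieterich's (1994) expression. The sketch below is generic, not the paper's calibration: the A·sigma value, background rate, and aftershock decay time are illustrative assumptions; only the +0.75 bar stress step echoes the abstract:

```python
import math

def seismicity_rate(t, stress_step, a_sigma, r_background=1.0, t_aftershock=10.0):
    """Dieterich (1994) seismicity rate after a sudden Coulomb stress step.

    t: time since the step (same units as t_aftershock)
    stress_step: Coulomb stress change (bar); a_sigma: A*sigma (bar).
    Parameter values used below are assumptions for illustration.
    """
    gamma = (math.exp(-stress_step / a_sigma) - 1.0) * math.exp(-t / t_aftershock)
    return r_background / (1.0 + gamma)

# A +0.75 bar step (as quoted for the Yangshuigang epicentral area) raises the
# rate sharply at first; the perturbation then decays back to the background rate.
for t in (0.0, 1.0, 10.0, 100.0):
    print(f"t = {t:6.1f}: rate = {seismicity_rate(t, 0.75, 0.1):.3f}")
```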

  5. Short- and long-term effects of habitat fragmentation differ but are predicted by response to the matrix.

    Science.gov (United States)

    Evans, Maldwyn J; Banks, Sam C; Driscoll, Don A; Hicks, Andrew J; Melbourne, Brett A; Davies, Kendi F

    2017-03-01

    Habitat loss and fragmentation are major threats to biodiversity and ecosystem processes. Our current understanding of the impacts of habitat loss and fragmentation is based largely on studies that focus on either short-term or long-term responses. Short-term responses are often used to predict long-term responses and make management decisions. The lack of studies comparing short- and long-term responses to fragmentation means we do not adequately understand when and how well short-term responses can be extrapolated to predict long-term responses, and when or why they cannot. To address this gap, we used data from one of the world's longest-running fragmentation experiments, The Wog Wog Habitat Fragmentation Experiment. Using data for carabid beetles, we found that responses in the long term (more than 22 yr post-fragmentation ≈22 generations) often contrasted markedly with those in the short term (5 yr post-fragmentation). The total abundance of all carabids, species richness and the occurrence of six species declined in the short term in the fragments but increased over the long term. The occurrence of three species declined initially and continued to decline, whilst another species was positively affected initially but decreased in the long term. Species' responses to the matrix that surrounds the fragments strongly predicted both the direction (increase/decline in occurrence) and magnitude of their responses to fragmentation. Additionally, species' responses to the matrix were somewhat predicted by their preferences for different types of native habitat (open vs. shaded). Our study highlights the degree of the matrix's influence in fragmented landscapes, and how this influence can change over time. We urge caution in using short-term responses to forecast long-term responses in cases where the matrix (1) impacts species' responses to fragmentation (by isolating them, creating new habitat or altering fragment habitat) and (2) is likely to change through time

  6. Performance of wire-type Rn detectors operated with gas gain in ambient air in view of its possible application to early earthquake predictions

    CERN Document Server

    Charpak, Georges; Breuil, P; Nappi, E; Martinengo, P; Peskov, V

    2010-01-01

We describe a detector of alpha particles based on wire-type counters (single-wire and multiwire) operating in ambient air at high gas gains (100-1000). The main advantages of these detectors are low cost, robustness and the ability to operate in humid air. The minimum detectable activity achieved with the multiwire detector for an integration time of 1 min is 140 Bq per m3, which is comparable to that featured by commercial devices. Owing to such features, the detector is suited for massive deployment, for example for continuous monitoring of Rn or Po contamination or, as discussed in the paper, for use in a network of Rn counters in areas affected by earthquakes in order to verify, on a solid statistical basis, the envisaged correlation between sudden Rn appearance and a forthcoming earthquake.

  7. Dosimetric Inhomogeneity Predicts for Long-Term Breast Pain After Breast-Conserving Therapy

    Energy Technology Data Exchange (ETDEWEB)

Mak, Kimberley S. [Harvard Radiation Oncology Program, Boston, Massachusetts (United States); Chen, Yu-Hui; Catalano, Paul J. [Department of Biostatistics and Computational Biology, Dana-Farber Cancer Institute, Boston, Massachusetts (United States); Punglia, Rinaa S.; Wong, Julia S.; Truong, Linh [Department of Radiation Oncology, Dana-Farber/Brigham and Women's Cancer Center, Boston, Massachusetts (United States); Bellon, Jennifer R., E-mail: jbellon@LROC.harvard.edu [Department of Radiation Oncology, Dana-Farber/Brigham and Women's Cancer Center, Boston, Massachusetts (United States)

    2015-12-01

    Purpose: The objective of this cross-sectional study was to characterize long-term breast pain in patients undergoing breast-conserving surgery and radiation (BCT) and to identify predictors of this pain. Methods and Materials: We identified 355 eligible patients with Tis-T2N0M0 breast cancer who underwent BCT in 2007 to 2011, without recurrent disease. A questionnaire derived from the Late Effects Normal Tissue Task Force (LENT) Subjective, Objective, Management, Analytic (SOMA) scale was mailed with 7 items detailing the severity, frequency, duration, and impact of ipsilateral breast pain over the previous 2 weeks. A logistic regression model identified predictors of long-term breast pain based on questionnaire responses and patient, disease, and treatment characteristics. Results: The questionnaire response rate was 80% (n=285). One hundred thirty-five patients (47%) reported pain in the treated breast, with 19 (14%) having pain constantly or at least daily; 15 (11%) had intense pain. The pain interfered with daily activities in 11 patients (8%). Six patients (4%) took analgesics for breast pain. Fourteen (10%) thought that the pain affected their quality of life. On univariable analysis, volume of breast tissue treated to ≥105% of the prescribed dose (odds ratio [OR] 1.001 per cc, 95% confidence interval [CI] 1.000-1.002; P=.045), volume treated to ≥110% (OR 1.009 per cc, 95% CI 1.002-1.016; P=.012), hormone therapy use (OR 1.95, 95% CI 1.12-3.39; P=.02), and other sites of pain (OR 1.79, 95% CI 1.05-3.07; P=.03) predicted for long-term breast pain. On multivariable analysis, volume ≥110% (OR 1.01 per cc, 95% CI 1.003-1.017; P=.007), shorter time since treatment (OR 0.98 per month, 95% CI 0.96-0.998; P=.03), and hormone therapy (OR 1.84, 95% CI 1.05-3.25; P=.03) were independent predictors of pain. Conclusion: Long-term breast pain was common after BCT. Although nearly half of patients had pain, most considered it tolerable. Dosimetric inhomogeneity

  8. Short-term and long-term thermal prediction of a walking beam furnace using neuro-fuzzy techniques

    Directory of Open Access Journals (Sweden)

    Banadaki Hamed Dehghan

    2015-01-01

The walking beam furnace (WBF) is one of the most prominent process plants, often encountered in alloy steel production factories, and is characterized by high non-linearity, strong coupling, time delay, a large time constant and time variation in its parameter set and structure. From another viewpoint, the WBF is a distributed-parameter process in which the distribution of temperature is not uniform. Hence, this process plant has complicated non-linear dynamic equations that have not yet been worked out. In this paper, we propose a one-step non-linear predictive model for a real WBF using non-linear black-box sub-system identification based on a locally linear neuro-fuzzy (LLNF) model. Furthermore, a multi-step predictive model with a long prediction horizon (i.e., ninety seconds ahead), developed through the application of the sequential one-step predictive models, is also presented for the first time. The locally linear model tree (LOLIMOT) algorithm, a progressive tree-based training method, trains these models. Comparing the performance of the one-step LLNF predictive models with their associated models obtained through the least squares error (LSE) solution proves that all operating zones of the WBF are non-linear sub-systems. The recorded data from the Iran Alloy Steel factory is utilized for identification and evaluation of the proposed neuro-fuzzy predictive models of the WBF process.

  9. Predicting the short-term risk of diabetes in HIV-positive patients

    DEFF Research Database (Denmark)

    Petoumenos, Kathy; Worm, Signe Westring; Fontas, Eric

    2012-01-01

…and other glucose-associated disorders among HIV-positive patients have been reported to range between 2 and 14%, and in an ageing HIV-positive population, the prevalence of DM is expected to continue to increase. This study aims to develop a model to predict the short-term (six-month) risk of DM in HIV-positive populations and to compare the existing models developed in the general population. Methods: All patients recruited to the Data Collection on Adverse events of Anti-HIV Drugs (D:A:D) study with follow-up data, without prior DM, myocardial infarction or other CVD events and with a complete DM risk factor profile were included. Conventional risk factors identified in the general population as well as key HIV-related factors were assessed using Poisson-regression methods. Expected probabilities of DM events were also determined based on the Framingham Offspring Study DM equation. The D:A:D and Framingham…

  10. Self-reported musculoskeletal pain predicts long-term increase in general health care use

    DEFF Research Database (Denmark)

    Hartvigsen, Jan; Davidsen, Michael; Søgaard, Karen

    2014-01-01

Aims: Musculoskeletal pain and disability is a modern epidemic and a major reason for seeking health care. The aim of this study is to determine absolute and relative rates of care seeking over 20 years for adults reporting musculoskeletal complaints. Methods: Interview data on musculoskeletal pain reported during the past two weeks from the Danish National Cohort Study were merged with data from the Danish National Health Insurance Registry and the National Patient Registry containing information on consultations in the Danish primary and secondary care sector. Absolute and relative rates for all… …to any of the outcomes. Conclusions: Self-report of musculoskeletal pain within the past two weeks predicts a statistically significant long-term increase in general use of health care services in both the primary and the secondary health care sector.

  11. Research on Short-Term Wind Power Prediction Based on Combined Forecasting Models

    Directory of Open Access Journals (Sweden)

    Zhang Chi

    2016-01-01

Short-term wind power forecasting is crucial for the power grid, since the energy generated by a wind farm fluctuates frequently. In this paper, a physical forecasting model based on NWP and a statistical forecasting model based on a BP neural network with optimized initial values are presented. In order to make full use of the advantages of the presented models and overcome their individual limitations, an equal-weight model and a minimum-variance model are established for wind power prediction. Simulation results show that the combined forecasting model is more precise than either single forecasting model, and that the minimum-variance combination model can dynamically adjust the weight of each single method, further restraining the forecasting error.
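
The minimum-variance combination has a standard closed form: given the error covariance matrix of the individual forecasters, the weights that minimize the combined error variance (subject to summing to 1) are proportional to the row sums of the inverse covariance. A minimal sketch with an assumed covariance matrix, not the paper's data:

```python
import numpy as np

def min_variance_weights(cov):
    """Weights minimizing the variance of a forecast combination,
    subject to sum(w) = 1:  w = C^-1 1 / (1' C^-1 1)."""
    cov = np.asarray(cov, dtype=float)
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

# Illustrative error covariance for a physical (NWP-based) and a statistical
# (BP neural network) forecaster; the numbers are assumptions.
cov = np.array([[4.0, 1.0],
                [1.0, 2.0]])
w = min_variance_weights(cov)
combined_var = float(w @ cov @ w)
print("weights:", w, "combined error variance:", combined_var)
```

The combined variance (1.75 here) is below that of either single model (4.0 and 2.0), which is the point of the combination.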

  12. Long-Term Predictions from Early Adolescent Attachment State of Mind to Romantic Relationship Behaviors

    Science.gov (United States)

    Tan, Joseph S.; Hessel, Elenda T.; Loeb, Emily L.; Schad, Megan M.; Allen, Joseph P.; Chango, Joanna M.

    2015-01-01

    Attachment state of mind was investigated as a long-term predictor of romantic relationship competence. A secure early adolescent attachment state of mind was hypothesized to predict more constructive dyadic behaviors during conflict discussions and support seeking interactions in late adolescence and early adulthood. Utilizing multi-method data from a community sample of 184 individuals, followed from ages 14 to 21, adolescents with a secure attachment state of mind at age 14 were found to be in relationships that displayed more constructive dyadic conflict discussion behaviors and dyadic supportive behaviors at both ages 18 and 21. Results suggest substantial links between early adolescent attachment state of mind and the adult romantic relationship atmosphere an individual creates and experiences. PMID:28154474

  13. An Optimized Prediction Intervals Approach for Short Term PV Power Forecasting

    Directory of Open Access Journals (Sweden)

    Qiang Ni

    2017-10-01

High-quality photovoltaic (PV) power prediction intervals (PIs) are essential to power system operation and planning. To improve the reliability and sharpness of PIs, a new method is proposed in this paper which accounts for both model uncertainties and noise uncertainties; PIs are constructed with a two-step formulation. In the first step, the variance of the model uncertainties is obtained by using an extreme learning machine (ELM) to make deterministic forecasts of PV power. In the second step, an innovative PI-based cost function is developed to optimize the parameters of the ELM, and noise uncertainties are quantified in terms of variance. The performance of the proposed approach is examined using PV power and meteorological data measured from a 1 kW rooftop DC micro-grid system. The validity of the proposed method is verified by comparing the experimental analysis with other benchmark methods, and the results exhibit superior performance.
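
Reliability and sharpness of PIs are conventionally scored with the PI coverage probability (PICP) and the PI normalized average width (PINAW); the sketch below uses toy numbers, not the paper's data or its cost function:

```python
import numpy as np

def picp(y, lower, upper):
    """PI coverage probability: fraction of actual values falling inside the PI."""
    return float(np.mean((y >= lower) & (y <= upper)))

def pinaw(y, lower, upper):
    """PI normalized average width: mean interval width over the range of actuals."""
    return float(np.mean(upper - lower) / (np.max(y) - np.min(y)))

# Toy PV-power values and intervals (illustrative numbers, not measured data);
# the first actual value deliberately falls outside its interval.
y = np.array([0.2, 0.5, 0.8, 0.6, 0.3])
lower = np.array([0.35, 0.4, 0.7, 0.5, 0.2])
upper = np.array([0.55, 0.6, 0.9, 0.7, 0.4])

print("PICP:", picp(y, lower, upper))    # reliability (higher is better)
print("PINAW:", pinaw(y, lower, upper))  # sharpness (lower is better)
```

A PI-based cost function of the kind the abstract describes typically trades these two off: widening intervals raises coverage but worsens sharpness.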

  14. Scaling of Seismic Memory with Earthquake Size

    CERN Document Server

    Zheng, Zeyu; Tenenbaum, Joel; Podobnik, Boris; Stanley, H Eugene

    2011-01-01

    It has been observed that earthquake events possess short-term memory, i.e., that events occurring in a particular location depend on the recent history of that location. We conduct an analysis to see whether real-time earthquake data also possess long-term memory and, if so, whether such autocorrelations depend on the size of earthquakes within close spatiotemporal proximity. We analyze the seismic waveform database recorded by 64 stations in Japan, including the 2011 "Great East Japan Earthquake", one of the five most powerful earthquakes ever recorded, which resulted in a tsunami and devastating nuclear accidents. We explore the question of seismic memory through use of mean conditional intervals and detrended fluctuation analysis (DFA). We find that the waveform sign series show long-range power-law anticorrelations, while the interval series show long-range power-law correlations. We find size dependence in earthquake autocorrelations: as earthquake size increases, both of these correlation beha...
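
    Detrended fluctuation analysis, one of the two tools named above, can be sketched compactly: integrate the mean-removed series, detrend it linearly in non-overlapping windows, and read the scaling exponent off the log-log slope of fluctuation versus window size. A minimal NumPy version (illustrative only; white noise is used as input, for which the exponent should be near 0.5, while long-range correlated series give larger exponents):

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis: RMS fluctuation F(n) per window size n."""
    profile = np.cumsum(x - np.mean(x))              # integrated (profile) series
    fluctuations = []
    for n in scales:
        n_windows = len(profile) // n
        rms = []
        for i in range(n_windows):
            seg = profile[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # linear detrend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluctuations.append(np.mean(rms))
    return np.array(fluctuations)

# Slope of log F(n) vs log n ~ 0.5 for uncorrelated white noise,
# > 0.5 for long-range correlations, < 0.5 for anticorrelations.
rng = np.random.default_rng(0)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(rng.standard_normal(4096), scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
```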

  15. Earthquake Damage - General

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — An earthquake is the motion or trembling of the ground produced by sudden displacement of rock in the Earth's crust. Earthquakes result from crustal strain,...

  16. Earthquakes in Southern California

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — There have been many earthquake occurrences in Southern California. This set of slides shows earthquake damage from the following events: Imperial Valley, 1979,...

  17. Earthquake Notification Service

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — The Earthquake Notification Service (ENS) is a free service that sends you automated notifications to your email or cell phone when earthquakes happen.

  18. A score to predict short-term risk of COPD exacerbations (SCOPEX)

    Directory of Open Access Journals (Sweden)

    Make BJ

    2015-01-01

    properties of predictive variables. Results: The best predictors of an exacerbation in the next 6 months were more COPD maintenance medications prior to the trial, higher mean daily reliever use, more exacerbations during the previous year, a lower forced expiratory volume in 1 second/forced vital capacity ratio, and female sex. Using these risk variables, we developed a score to predict short-term (6-month) risk of COPD exacerbations (SCOPEX). Budesonide/formoterol reduced future exacerbation risk more than formoterol or as-needed short-acting β2-agonist (salbutamol). Conclusion: SCOPEX incorporates easily identifiable patient characteristics and can be readily applied in clinical practice to target therapy to reduce COPD exacerbations in patients at the highest risk. Keywords: chronic obstructive pulmonary disease, exacerbation, model, predictor, inhaled corticosteroids, bronchodilators

  19. Assessment of two mammographic density related features in predicting near-term breast cancer risk

    Science.gov (United States)

    Zheng, Bin; Sumkin, Jules H.; Zuley, Margarita L.; Wang, Xingwei; Klym, Amy H.; Gur, David

    2012-02-01

    In order to establish a personalized breast cancer screening program, it is important to develop risk models that have high discriminatory power in predicting the likelihood of a woman developing an imaging-detectable breast cancer in near-term (e.g., breast cancer risk models, mammographic density is considered the second highest breast cancer risk factor, second to woman's age). In this study we explored a new feature, namely bilateral mammographic density asymmetry, and investigated the feasibility of predicting near-term screening outcome. The database consisted of 343 negative examinations, of which 187 depicted cancers that were detected during the subsequent screening examination and 155 remained negative. We computed the average pixel value of the segmented breast areas depicted on each cranio-caudal view of the initial negative examinations. We then computed the mean and difference of mammographic density for paired bilateral images. Using woman's age, subjectively rated density (BIRADS), and computed mammographic density related features, we compared classification performance in estimating the likelihood of detecting cancer during the subsequent examination using areas under the ROC curves (AUC). The AUCs were 0.63±0.03, 0.54±0.04, 0.57±0.03, and 0.68±0.03 when using woman's age, BIRADS rating, computed mean density, and difference in computed bilateral mammographic density, respectively. Performance increased to 0.62±0.03 and 0.72±0.03 when we fused mean and difference in density with woman's age. The results suggest that, in this study, bilateral mammographic tissue density asymmetry was a significantly stronger risk indicator than both woman's age and mean breast density.

  20. Saddle Pulmonary Embolism: Laboratory and Computed Tomographic Pulmonary Angiographic Findings to Predict Short-term Mortality.

    Science.gov (United States)

    Liu, Min; Miao, Ran; Guo, Xiaojuan; Zhu, Li; Zhang, Hongxia; Hou, Qing; Guo, Youmin; Yang, Yuanhua

    2017-02-01

    Saddle pulmonary embolism (SPE) is a rare type of acute pulmonary embolism, and there is debate about its treatment and prognosis. Our aim is to assess laboratory and computed tomographic pulmonary angiographic (CTPA) findings to predict short-term mortality in patients with SPE. This was a five-centre, retrospective study. The clinical information and laboratory and CTPA findings of 88 consecutive patients with SPE were collected. One-month mortality after diagnosis of SPE was the primary end-point. The correlation of laboratory and CTPA findings with one-month mortality was analysed with the area under the curve (AUC) of receiver operating characteristic (ROC) curves and logistic regression analysis. Eighteen patients with SPE died within one month. ROC curves revealed that the cutoff values of the right to left atrial diameter ratio, the right ventricular area to left ventricular area ratio (RVa/LVa ratio), Mastora score, septal angle, N-terminal pro-brain natriuretic peptide, and cardiac troponin I (cTnI) for detecting early mortality were 2.15, 2.13, 69%, 57°, 3036 pg/mL, and 0.18 ng/mL, respectively. Using logistic regression analysis of laboratory and CTPA findings with regard to one-month mortality of SPE, the RVa/LVa ratio and cTnI were shown to be independently associated with early death. A combination of cTnI and RVa/LVa ratio revealed an increase in the AUC value, but the difference did not reach significance compared with RVa/LVa ratio or cTnI alone (P>0.05). In patients with SPE, both the RVa/LVa ratio on CTPA and cTnI appear valuable for the prediction of short-term mortality. Copyright © 2016 Australian and New Zealand Society of Cardiac and Thoracic Surgeons (ANZSCTS) and the Cardiac Society of Australia and New Zealand (CSANZ). Published by Elsevier B.V. All rights reserved.
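
    Cutoff values like those reported here are commonly derived from ROC analysis by choosing the threshold that maximizes Youden's J statistic (sensitivity + specificity − 1). A small illustrative sketch with invented data — not the study's actual method or values:

```python
def best_cutoff(scores, outcomes):
    """Choose the threshold maximizing Youden's J = sensitivity + specificity - 1."""
    best_j, best_c = -1.0, None
    for c in sorted(set(scores)):
        tp = sum(1 for s, o in zip(scores, outcomes) if s >= c and o)
        fn = sum(1 for s, o in zip(scores, outcomes) if s < c and o)
        tn = sum(1 for s, o in zip(scores, outcomes) if s < c and not o)
        fp = sum(1 for s, o in zip(scores, outcomes) if s >= c and not o)
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        if sens + spec - 1 > best_j:
            best_j, best_c = sens + spec - 1, c
    return best_c, best_j

# Invented RVa/LVa ratios with a 1-month mortality indicator:
ratios = [1.2, 1.5, 1.8, 2.0, 2.2, 2.5, 2.9, 3.1]
died   = [0,   0,   0,   0,   1,   1,   1,   1]
cutoff, j = best_cutoff(ratios, died)   # here 2.2 separates the groups perfectly
```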

  1. Lesion load may predict long-term cognitive dysfunction in multiple sclerosis patients.

    Directory of Open Access Journals (Sweden)

    Francesco Patti

    Full Text Available Magnetic Resonance Imaging (MRI) techniques have provided evidence for the understanding of cognitive impairment (CIm) in Multiple Sclerosis (MS). To investigate the role of white matter (WM) and gray matter (GM) in predicting long-term CIm in a cohort of MS patients, 303 out of 597 patients participating in a previous multicenter clinical-MRI study were enrolled (49.4% were lost at follow-up). The following MRI parameters, expressed as fractions (f) of intracranial volume, were evaluated: cerebrospinal fluid (CSF-f), WM-f, GM-f, and abnormal WM (AWM-f), a measure of lesion load. Nine years later, cognitive status was assessed in 241 patients using the Symbol Digit Modalities Test (SDMT), the Semantically Related Word List Test (SRWL), the Modified Card Sorting Test (MCST), and the Paced Auditory Serial Addition Test (PASAT). In particular, SRWL being a memory test, both immediate and delayed recall were evaluated. MCST scoring was calculated based on the number of categories and the numbers of perseverative and non-perseverative errors. AWM-f was predictive of an impaired performance 9 years ahead in SDMT (OR 1.49, CI 1.12-1.97, p = 0.006), PASAT (OR 1.43, CI 1.14-1.80, p = 0.002), SRWL-immediate recall (OR 1.72, CI 1.35-2.20, p < 0.001), SRWL-delayed recall (OR 1.61, CI 1.28-2.03, p < 0.001), MCST-category (OR 1.52, CI 1.2-1.9, p < 0.001), MCST-perseverative error (OR 1.51, CI 1.2-1.9, p = 0.001), and MCST-non-perseverative error (OR 1.26, CI 1.02-1.55, p = 0.032). In our large MS cohort, focal WM damage appeared to be the most relevant predictor of the long-term cognitive outcome.

  2. Importance of early weight changes to predict long-term weight gain during psychotropic drug treatment.

    Science.gov (United States)

    Vandenberghe, Frederik; Gholam-Rezaee, Mehdi; Saigí-Morgui, Núria; Delacrétaz, Aurélie; Choong, Eva; Solida-Tozzi, Alessandra; Kolly, Stéphane; Thonney, Jacques; Gallo, Sylfa Fassassi; Hedjal, Ahmed; Ambresin, Anne-Emmanuelle; von Gunten, Armin; Conus, Philippe; Eap, Chin B

    2015-11-01

    Psychotropic drugs can induce substantial weight gain, particularly during the first 6 months of treatment. The authors aimed to determine the potential predictive power of an early weight gain after the introduction of weight gain-inducing psychotropic drugs on long-term weight gain. Data were obtained from a 1-year longitudinal study ongoing since 2007 including 351 psychiatric (ICD-10) patients, with metabolic parameters monitored (baseline and/or 1, 3, 6, 9, 12 months) and with compliance ascertained. International Diabetes Federation and World Health Organization definitions were used to define metabolic syndrome and obesity, respectively. Prevalences of metabolic syndrome and obesity were 22% and 17%, respectively, at baseline and 32% and 24% after 1 year. Receiver operating characteristic analyses indicated that an early weight gain > 5% after a period of 1 month is the best predictor for important long-term weight gain (≥ 15% after 3 months: sensitivity, 67%; specificity, 88%; ≥ 20% after 12 months: sensitivity, 47%; specificity, 89%). This analysis identified most patients (97% for 3 months, 93% for 12 months) who had weight gain ≤ 5% after 1 month as continuing to have a moderate weight gain after 3 and 12 months. Its predictive power was confirmed by fitting a longitudinal multivariate model (difference between groups in 1 year of 6.4% weight increase as compared to baseline, P = .0001). Following prescription of weight gain-inducing psychotropic drugs, a 5% threshold for weight gain after 1 month should raise clinician concerns about weight-controlling strategies. © Copyright 2015 Physicians Postgraduate Press, Inc.
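
    Sensitivity and specificity, as reported above, do not by themselves give the predictive values clinicians act on; those also depend on how common important weight gain is in the population screened. A short sketch of that standard Bayes conversion, using the reported 3-month figures and an assumed (hypothetical) 20% prevalence:

```python
def ppv_npv(sens, spec, prevalence):
    """Positive and negative predictive values via Bayes' rule."""
    tp = sens * prevalence              # true positives per unit population
    fp = (1 - spec) * (1 - prevalence)  # false positives
    fn = (1 - sens) * prevalence        # false negatives
    tn = spec * (1 - prevalence)        # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Reported 3-month sensitivity/specificity, assumed 20% prevalence:
ppv, npv = ppv_npv(sens=0.67, spec=0.88, prevalence=0.20)
```

    With these inputs the negative predictive value is high, consistent with the abstract's observation that almost all patients below the 5% one-month threshold went on to have only moderate weight gain.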

  3. Peculiarities and problems of determination of the predicted durability term of steel constructions coatings

    Directory of Open Access Journals (Sweden)

    Андрій Іванович Ковальов

    2017-07-01

    Full Text Available The article addresses the prediction of the service life of coatings for steel structures using accelerated tests, and the experimental determination of the fire resistance of coated steel structures after weathering. The benefits and drawbacks of steel structures in present-day building design and construction are introduced, and the factors that increase the fire resistance of steel structures are presented. The article shows that the most effective protection for steel structures is the use of intumescent agents, which expand under the influence of temperature to form a porous insulating layer on the surface to be protected. The steps required to obtain the necessary indices of predicted coating durability are described, as well as methods for carrying out climate tests in both heated and unheated premises. For the experimental determination of coating fire resistance, it is suggested to use a method based on measuring the temperature of a steel plate with fire-resistant coating under the high temperatures typical of a fire, and on solving the inverse and direct heat-conduction problems to determine the thermal and physical characteristics of the coating and the dependence of the minimum coating thickness on the plate thickness, the duration of fire exposure, and the critical temperature of the steel. It is concluded that methodological support needs to be developed to make it possible to estimate coating fire resistance after, or during, accelerated climate tests in comparison with control samples. The problems that arise when determining coating fire resistance after climate tests are identified, as are the aims and tasks of future research.

  4. Introduction to mathematical modeling of earthquakes

    Science.gov (United States)

    Ito, H. M.; Kuroki, H.; Yoshida, A.

    2001-06-01

    We first overview statistics and kinematics, necessities in the modeling of earthquakes. In statistics, size-frequency distributions of earthquakes and temporal changes of aftershock activities are the main subjects. We pay attention not only to power-law behaviors but to non-power-law behaviors as well. In particular, comparison of two size-frequency distributions, one by Gutenberg-Richter and the other based on the characteristic earthquake scheme, which assumes periodic generation of earthquakes similar in size, is important from the viewpoint of earthquake prediction. Kinematically, a framework is presented which treats earthquakes as generation of dislocations, discontinuities in the displacement over fault surfaces. As static models, we discuss percolation models on a tree and a two-dimensional square lattice. Here the size-frequency generally decays exponentially or stretched-exponentially as earthquakes become large, and decays algebraically (Gutenberg-Richter law) only at the critical point. The time-dependent problem is discussed using cellular automaton models. One of the main concerns here is whether power laws in the size-frequency distribution are realized at stationary states. We observe that this property, self-organized criticality, is shown only by models close to the original sandpile model by Bak et al. Physical processes are included by using elements of blocks and springs. Power laws as well as non-power laws are allowed as stationary size-frequency distributions. In order to account for the decay of aftershock activities, it is necessary to introduce some relaxation mechanisms. To take into full account the kinematics of earthquakes, the dislocation picture, we need to stack springs and blocks three-dimensionally. A continuum version is presented to study a case of a subducting plate, where earthquakes occur following a characteristic earthquake scheme.
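
    The Gutenberg-Richter size-frequency distribution mentioned above is concrete enough to sketch: log10 N(≥M) = a − bM, with the b-value commonly estimated by Aki's maximum-likelihood formula b = log10(e)/(mean(M) − Mmin). A minimal illustration with invented magnitudes (not data from the paper):

```python
import math

def gutenberg_richter_count(a, b, m):
    """Gutenberg-Richter law: expected number of events with magnitude >= m,
    where log10 N = a - b*m."""
    return 10 ** (a - b * m)

def aki_b_value(mags, m_min):
    """Maximum-likelihood b-value estimate (Aki, 1965):
    b = log10(e) / (mean(M) - Mmin)."""
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - m_min)

n = gutenberg_richter_count(a=5.0, b=1.0, m=4.0)        # 10.0 events expected
b = aki_b_value([4.1, 4.3, 4.5, 4.2, 4.4], m_min=4.0)   # ~1.45
```

    With b = 1, each unit increase in magnitude makes events ten times rarer, which is the power-law decay the paper contrasts with the exponential decay of the off-critical percolation models.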

  5. Sonographical predictive markers of failure of induction of labour in term pregnancy.

    Science.gov (United States)

    Brik, Maia; Mateos, Silvia; Fernandez-Buhigas, Irene; Garbayo, Paloma; Costa, Gloria; Santacruz, Belen

    2017-02-01

    Predictive markers of failure of induction of labour in term pregnancy were evaluated in a prospective study including 245 women attending induction of labour. The inclusion criteria were singleton pregnancy and gestational age 37-42 weeks; the main outcomes were failure of induction, induction-to-delivery interval, and mode of delivery. Women with a longer cervical length prior to induction (CLpi) had a higher rate of failure of induction (30.9 ± 6.8 vs. 23.9 ± 9.3, p < .001). BMI was higher and maternal height lower in the caesarean section group compared to vaginal delivery (33.1 ± 8 vs. 29.3 ± 4.6, 160 ± 5 vs. 164 ± 5, p < .001, respectively). A shorter CLpi correlated with a shorter induction-to-delivery interval (Pearson R = .237, p < .001). In the regression analysis, the only independent predictor of failure of induction was the CLpi. Therefore, the CLpi is an independent factor for prediction of failure of induction of labour.

  6. Dynamical prediction and pattern mapping in short-term load forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Aguirre, Luis Antonio; Rodrigues, Daniela D.; Lima, Silvio T. [Departamento de Engenharia Eletronica, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, 31270-901 Belo Horizonte, MG (Brazil); Martinez, Carlos Barreira [Departamento de Engenharia Hidraulica e Recursos Hidricos, Universidade Federal de Minas Gerais, Av. Antonio Carlos, 6627, 31270-901 Belo Horizonte, MG (Brazil)

    2008-01-15

    This work will not put forward yet another scheme for short-term load forecasting but rather provides evidence that may improve our understanding of fundamental issues which underlie load forecasting problems. In particular, load forecasting is decomposed into two main problems, namely dynamical prediction and pattern mapping. It is argued that whereas the latter is essentially static and becomes nonlinear when weekly features in the data are taken into account, the former might not be deterministic at all. In such cases there is no determinism (serial correlations) in the data apart from the average cycle, and the best a model can do is to perform pattern mapping. Moreover, when there is determinism in addition to the average cycle, the underlying dynamics are sometimes linear, in which case there is no need to resort to nonlinear models to perform dynamical prediction. Such conclusions were confirmed using real load data and surrogate data analysis. In a sense, the paper details and organizes some general beliefs found in the literature on load forecasting. This sheds some light on real model-building and forecasting problems and helps explain some apparently conflicting results reported in the literature. (author)

  7. Long-term Failure Prediction based on an ARP Model of Global Risk Network

    Science.gov (United States)

    Lin, Xin; Moussawi, Alaa; Szymanski, Boleslaw; Korniss, Gyorgy

    Risks that threaten modern societies form an intricately interconnected network. Hence, it is important to understand how risk materializations in distinct domains influence each other. In this paper, we study the global risk network defined by World Economic Forum experts in the form of a Stochastic Block Model. We model risks as Alternating Renewal Processes (ARPs) with variable intensities driven by hidden values of exogenous and endogenous failure probabilities. Based on the expert assessments and the historical status of each risk, we use Maximum Likelihood Estimation to find the optimal model parameters and demonstrate that the model considering network effects significantly outperforms the others. In the talk, we discuss how the model can be used to provide quantitative means for measuring interdependencies and materialization of risks in the network. We also present recent results of long-term predictions in the form of predicted distributions of materializations over various time periods. Finally, we show how the simulation of ARPs enables us to probe the limits of predictability of the system parameters from historical data and the ability to recover hidden variables. Supported in part by DTRA, ARL NS-CTA.

  8. Early prediction of long-term response to cabergoline in patients with macroprolactinomas.

    Science.gov (United States)

    Lee, Youngki; Ku, Cheol Ryong; Kim, Eui-Hyun; Hong, Jae Won; Lee, Eun Jig; Kim, Sun Ho

    2014-09-01

    Cabergoline is typically effective for treating prolactinomas; however, some patients display cabergoline resistance, and the early characteristics of these patients remain unclear. We analyzed early indicators predicting long-term response to cabergoline. We retrospectively reviewed the cases of 44 patients with macroprolactinomas who received cabergoline as first-line treatment; the patients were followed for a median of 16 months. The influence of various clinical parameters on outcomes was evaluated. Forty patients (90.9%) were treated medically and displayed tumor volume reduction (TVR) of 74.7%, a prolactin normalization (NP) rate of 81.8%, and a complete response (CR; TVR >50% with NP, without surgery) rate of 70.5%. Most patients (93.1%) with TVR ≥25% and NP at 3 months eventually achieved CR, whereas only 50% of patients with TVR ≥25% without NP and no patients with TVR cabergoline (β=-1.181 mg/week), and two of four patients who underwent surgery were able to discontinue cabergoline. Determining cabergoline response using TVR and NP 3 months after treatment is useful for predicting later outcomes. However, further cabergoline administration should be considered for patients with TVR >25% at 3 months without NP, particularly those with huge prolactinomas, because a delayed response may be achieved. As surgery can reduce the cabergoline dose necessary for successful disease control, it should be considered for cabergoline-resistant patients.

  9. Thyroiditis de Quervain. Are there predictive factors for long-term hormone-replacement?

    Science.gov (United States)

    Schenke, S; Klett, R; Braun, S; Zimny, M

    2013-01-01

    Subacute thyroiditis is a usually self-limiting disease of the thyroid. However, approximately 0.5-15% of patients require permanent thyroxine substitution. The aim was to determine predictive factors for the necessity of long-term hormone replacement (LTH). We retrospectively reviewed the records of 72 patients with subacute thyroiditis. Morphological and serological parameters as well as type of therapy were tested as predictive factors of consecutive hypothyroidism. Mean age was 49 ± 11 years; the f/m ratio was 4.5 : 1. Thyroid pain and signs of hyperthyroidism were the leading symptoms. Initial subclinical or overt hyperthyroidism was found in 20% and 37%, respectively. Within six months after onset, 15% and 1.3% of the patients developed subclinical or overt hypothyroidism, respectively. At latest follow-up, 26% were classified as requiring LTH. At onset the thyroid was enlarged in 64%, and at latest follow-up in 8.3%, with a significant reduction of thyroid volume after three months. At the endpoint, thyroid volume was lower in the LTH group compared with the non-LTH group (41.7% vs. 57.2% of the sex-adjusted upper norm, p = 0.041). Characteristic ultrasonographic features occurred in both lobes in 74% of the patients. Serological and morphological parameters as well as type of therapy were not related to the need for LTH. In this study the proportion of patients who received LTH was 26%. At the endpoint these patients had a lower thyroid volume compared with euthyroid patients. No predictive factors for LTH were found.

  10. Leukotriene D4 inhalation challenge for predicting short-term efficacy of montelukast: a pilot study.

    Science.gov (United States)

    Guan, Wei-jie; Shi, Xu; Zheng, Jin-ping; Gao, Yi; Jiang, Cai-yu; Xie, Yan-qing; Liu, Qing-Xia; Zhu, Zheng; Guo, E; An, Jia-ying; Yu, Xin-xin; Liu, Wen-ting; Zhong, Nan-shan

    2015-01-01

    A convenient measure to predict the efficacy of leukotriene receptor antagonists is lacking. The aim was to determine if leukotriene D4 inhalation challenge predicts the short-term efficacy of montelukast in asthma. In this open-label 28-day trial, 45 patients with asthma were allocated to leukotriene-sensitive and leukotriene-insensitive groups to receive montelukast monotherapy (10 mg, once daily) based on the positive threshold of the leukotriene D4 inhalation challenge test (4.800 nmol). Miscellaneous measurements comprised fractional exhaled nitric oxide, methacholine inhalation challenge, the Asthma Control Test, and the Asthma Quality of Life Questionnaire. Peak expiratory flow was self-monitored throughout the treatment. End-point assessments were performed 3 to 5 days after montelukast withdrawal. Twenty-three patients in the leukotriene-sensitive group and 10 in the leukotriene-insensitive group completed the study. The groups differed neither in 28-day peak expiratory flow rate nor in maximal weekly peak expiratory flow (both P > 0.05). However, minimal weekly peak expiratory flow was significantly higher in the leukotriene-insensitive group throughout the treatment course (all P < 0.05). The groups did not differ statistically in the post-treatment improvement in forced expiratory volume in 1 s (FEV1) predicted% prior to inhalation challenge, fractional exhaled nitric oxide, or the airway responsiveness to leukotriene D4 or methacholine (all P > 0.05). There was a marked increase in the Asthma Control Test score and the symptom score of the Asthma Quality of Life Questionnaire in both groups, suggesting that leukotriene D4 inhalation challenge may predict the short-term efficacy of montelukast monotherapy in patients with asthma. © 2014 John Wiley & Sons Ltd.

  11. Transcutaneous bilirubin nomogram for predicting neonatal hyperbilirubinemia in healthy term and late-preterm Chinese infants.

    Science.gov (United States)

    Yu, Zhang-Bin; Dong, Xiao-Yue; Han, Shu-Ping; Chen, Yu-Lin; Qiu, Yu-Fang; Sha, Li; Sun, Qing; Guo, Xi-Rong

    2011-02-01

    Identifying infants that will develop significant hyperbilirubinemia with the risk of kernicterus, and planning appropriate follow-up strategies, is particularly challenging. In this study, 36,921 transcutaneous bilirubin (TcB) measurements were obtained from 6,035 healthy neonates (gestational age ≥ 35 weeks and birth weight ≥ 2,000 g) between January 1 and December 31, 2009. All measurements were performed with the JM-103 bilirubinometer at designated times between 0 and 168 postnatal hours. TcB percentiles were calculated and used to develop an hour-specific nomogram. The rate of increase in TcB was higher during the first 72 h of age, after which levels declined to a plateau by 72-108 h of age. We constructed a TcB nomogram by using the 40th, 75th, and 95th percentile values of TcB for every 12 h of the studied interval. The 75th percentile curve of the nomogram may be an ideal cutoff point for intensive follow-up of the neonate for hyperbilirubinemia as it carries very high sensitivity (78.7%) and negative predictive value (98.5%). The specificity (45.7%) and positive predictive value (15.5%) decreased to reach their lowest levels at the 40th percentile. Of the neonates in the high-risk zone, 167 (48.8%) infants had persistent subsequent hyperbilirubinemia post-discharge, compared with 292 (27.0%) infants in the high-intermediate-risk zone at discharge. One-hundred and seventeen (5.5%) infants in the low-intermediate-risk zone moved into the high-risk zone during follow-up. No newborn infants in the low-risk zone became high-risk during follow-up. We provide an hour-specific TcB nomogram to predict neonatal hyperbilirubinemia in healthy term and late-preterm Chinese infants.
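
    The nomogram construction described above is essentially a binned-percentile computation: group TcB measurements into consecutive postnatal-hour bins and take the 40th, 75th, and 95th percentiles within each bin. A sketch on synthetic data (the function name and the generated data are illustrative, not the study's measurements):

```python
import numpy as np

def hour_specific_percentiles(hours, tcb, bin_width=12, pcts=(40, 75, 95)):
    """Percentile curves of TcB readings within consecutive postnatal-hour bins."""
    hours = np.asarray(hours, dtype=float)
    tcb = np.asarray(tcb, dtype=float)
    edges = np.arange(0, hours.max() + bin_width, bin_width)
    curves = {}
    for lo in edges[:-1]:
        mask = (hours >= lo) & (hours < lo + bin_width)
        if mask.any():
            curves[int(lo)] = {p: float(np.percentile(tcb[mask], p)) for p in pcts}
    return curves

# Synthetic data only: TcB rising roughly linearly over the first 72 h.
rng = np.random.default_rng(1)
h = rng.uniform(0, 72, 2000)
vals = 2.0 + 0.1 * h + rng.normal(0, 1, 2000)
nomogram = hour_specific_percentiles(h, vals)
```

    A newborn's reading is then classified by which percentile curve it falls above in its hour bin, which is how the study's risk zones (low, intermediate, high) are defined.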

  12. Early Postimplant Speech Perception and Language Skills Predict Long-Term Language and Neurocognitive Outcomes Following Pediatric Cochlear Implantation

    Science.gov (United States)

    Hunter, Cynthia R.; Kronenberger, William G.; Castellanos, Irina; Pisoni, David B.

    2017-01-01

    Purpose: We sought to determine whether speech perception and language skills measured early after cochlear implantation in children who are deaf, and early postimplant growth in speech perception and language skills, predict long-term speech perception, language, and neurocognitive outcomes. Method: Thirty-six long-term users of cochlear…

  13. Metamemory ratings predict long-term changes in reactivated episodic memories

    Directory of Open Access Journals (Sweden)

    Amnon eYacoby

    2015-02-01

    Full Text Available Reactivation of a long-term memory can render the memory item temporarily labile, offering an opportunity to modify it via behavioral or pharmacological intervention. Declarative memory reactivation is accompanied by a metamemory ability to subjectively assess the knowledge available concerning the target item (feeling of knowing, FOK). We set out to examine whether FOK can predict the extent of change of long-term episodic memories by post-retrieval manipulations. To this end, participants watched a short movie and immediately thereafter were tested on their memory for it. A day later, they were reminded of that movie and, either immediately or one day later, were presented with a second movie. The reminder phase consisted of memory cues for which participants were asked to judge their FOK regarding the original movie. The memory performance of participants to whom new information was presented immediately after reactivating the original episode corresponded to the degree of FOK ratings upon reactivation, such that the lower their FOK, the less their memory declined. In contrast, no relation was found between FOK and memory strength for those who learned new information one day after the reminder phase. Our findings suggest that the subjective accessibility of reactivated memories may determine the extent to which new information might modify those memories.

  14. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  15. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  16. Retinal Vessel Calibers in Predicting Long-Term Cardiovascular Outcomes: The Atherosclerosis Risk in Communities Study.

    Science.gov (United States)

    Seidelmann, Sara B; Claggett, Brian; Bravo, Paco E; Gupta, Ankur; Farhad, Hoshang; Klein, Barbara E; Klein, Ronald; Di Carli, Marcelo; Solomon, Scott D

    2016-11-01

    Narrower retinal arterioles and wider retinal venules have been associated with negative cardiovascular outcomes. We investigated whether retinal vessel calibers are associated with cardiovascular outcomes in long-term follow-up and provide incremental value over the 2013 American College of Cardiology/American Heart Association Pooled Cohort Equations in predicting atherosclerotic cardiovascular disease events. A total of 10 470 men and women without prior atherosclerotic cardiovascular disease events or heart failure in the ARIC Study (Atherosclerosis Risk in Communities) underwent retinal photography at visit 3 (1993-1995). During a mean follow-up of 16 years, there were 1779 incident coronary heart disease events, 548 ischemic strokes, 1395 heart failure events, and 2793 deaths. Rates of all outcomes were higher in those with wider retinal venules and narrower retinal arterioles. Subjects with wider retinal venules (hazard ratio [HR], 1.13; 95% confidence interval [CI], 1.08-1.18; HR, 1.18; 95% CI, 1.07-1.31; and HR, 1.10; 95% CI, 1.00-1.20 per 1-SD increase) and narrower retinal arterioles (HR, 1.06; 95% CI, 1.01-1.11; HR, 1.14; 95% CI, 1.03-1.26; and HR, 1.13; 95% CI, 1.03-1.24 per 1-SD decrease) had a higher risk of death and stroke in both sexes and incident coronary heart disease in women but not men (interaction P=0.02) after adjustment for the Pooled Cohort Equations risk score variables. The association between retinal vessel caliber and heart failure was nonsignificant after adjustment for systolic blood pressure. Among women with Pooled Cohort Equations-predicted 10-year atherosclerotic cardiovascular disease event risk 5%). Narrower retinal arterioles and wider retinal venules conferred long-term risk of mortality and ischemic stroke in both sexes and coronary heart disease in women. These measures serve as an inexpensive, reproducible biomarker that added incremental value to current practice guidelines in atherosclerotic cardiovascular disease

  17. Withdrawal-Related Changes in Delay Discounting Predict Short-Term Smoking Abstinence.

    Science.gov (United States)

    Miglin, Rickie; Kable, Joseph W; Bowers, Maureen E; Ashare, Rebecca L

    2017-06-01

    Impulsive decision making is associated with smoking behavior and reflects preferences for smaller, immediate rewards and intolerance of temporal delays. Nicotine withdrawal may alter impulsive decision making and time perception. However, little is known about whether withdrawal-related changes in decision making and time perception predict smoking relapse. Forty-five smokers (14 female) completed two laboratory sessions, one following 24-hour abstinence and one smoking-as-usual (order counterbalanced; biochemically verified abstinence). During each visit, participants completed measures of time perception, decision making (ie, discount rates), craving, and withdrawal. Following the second laboratory session, subjects underwent a well-validated model of short-term abstinence (quit week) with small monetary incentives for each day of biochemically confirmed abstinence. Smokers significantly overestimated time during abstinence, compared to smoking-as-usual (p = .021), but there were no abstinence effects on discount rates (p = .6). During the quit week, subjects were abstinent for 3.5 days (SD = 2.15) and smoked a total of 12.9 cigarettes (SD = 15.8). Importantly, higher discount rates (ie, preferences for immediate rewards) during abstinence (abstinence minus smoking difference score) predicted greater number of days abstinent (p = .01) and fewer cigarettes smoked during the quit week (p = .02). Withdrawal-related change in time reproduction did not predict relapse (p = .2). These data suggest that individuals who have a greater preference for immediate rewards during abstinence (vs. smoking-as-usual) may be more successful at maintaining short-term abstinence when provided with frequent (eg, daily) versus less frequent incentive schedules (eg, 1 month). Abstinence-induced changes in decision making may be important for identifying smokers who may benefit from interventions that incentivize abstinence such as contingency management (CM). The present results

  18. Future scenarios for earthquake and flood risk in Eastern Europe and Central Asia

    NARCIS (Netherlands)

    Murnane, Rick; Daniell, James E.; Schafer, A.M.; Ward, P.J.; Winsemius, H.C.; Simpson, A.; Tijssen, A.; Toro, Joaquin

    2017-01-01

    We report on a regional flood and earthquake risk assessment for 33 countries in Eastern Europe and Central Asia. Flood and earthquake risk were defined in terms of affected population and affected gross domestic product (GDP). Earthquake risk was also quantified in terms of fatalities and capital

  19. Preschool speech intelligibility and vocabulary skills predict long-term speech and language outcomes following cochlear implantation in early childhood.

    Science.gov (United States)

    Castellanos, Irina; Kronenberger, William G; Beer, Jessica; Henning, Shirley C; Colson, Bethany G; Pisoni, David B

    2014-07-01

    Speech and language measures during grade school predict adolescent speech-language outcomes in children who receive cochlear implants (CIs), but no research has examined whether speech and language functioning at even younger ages is predictive of long-term outcomes in this population. The purpose of this study was to examine whether early preschool measures of speech and language performance predict speech-language functioning in long-term users of CIs. Early measures of speech intelligibility and receptive vocabulary (obtained during preschool ages of 3-6 years) in a sample of 35 prelingually deaf, early-implanted children predicted speech perception, language, and verbal working memory skills up to 18 years later. Age of onset of deafness and age at implantation added additional variance to preschool speech intelligibility in predicting some long-term outcome scores, but the relationship between preschool speech-language skills and later speech-language outcomes was not significantly attenuated by the addition of these hearing history variables. These findings suggest that speech and language development during the preschool years is predictive of long-term speech and language functioning in early-implanted, prelingually deaf children. As a result, measures of speech-language functioning at preschool ages can be used to identify and adjust interventions for very young CI users who may be at long-term risk for suboptimal speech and language outcomes.

  20. A seismological model for earthquakes induced by fluid extraction from a subsurface reservoir

    Science.gov (United States)

    Bourne, S. J.; Oates, S. J.; van Elk, J.; Doornhof, D.

    2014-12-01

    A seismological model is developed for earthquakes induced by subsurface reservoir volume changes. The approach is based on the work of Kostrov () and McGarr () linking total strain to the summed seismic moment in an earthquake catalog. We refer to the fraction of the total strain expressed as seismic moment as the strain partitioning function, α. A probability distribution for total seismic moment as a function of time is derived from an evolving earthquake catalog. The moment distribution is taken to be a Pareto Sum Distribution with confidence bounds estimated using approximations given by Zaliapin et al. (). In this way available seismic moment is expressed in terms of reservoir volume change and hence compaction in the case of a depleting reservoir. The Pareto Sum Distribution for moment and the Pareto Distribution underpinning the Gutenberg-Richter Law are sampled using Monte Carlo methods to simulate synthetic earthquake catalogs for subsequent estimation of seismic ground motion hazard. We demonstrate the method by applying it to the Groningen gas field. A compaction model for the field calibrated using various geodetic data allows reservoir strain due to gas extraction to be expressed as a function of both spatial position and time since the start of production. Fitting with a generalized logistic function gives an empirical expression for the dependence of α on reservoir compaction. Probability density maps for earthquake event locations can then be calculated from the compaction maps. Predicted seismic moment is shown to be strongly dependent on planned gas production.

  1. Possibility of coupling the magnetosphere-ionosphere during the time of earthquakes

    Science.gov (United States)

    Rabeh, Taha; Cataldi, Gabriele; Straser, Valentino

    2014-05-01

In this work we attempt to quantify and investigate the causes of earthquakes using magnetic signals, and hence to predict them. We carried out several trials to quantify forces using Sq-variation currents in the Earth's lithosphere and the electromagnetic induction prevailing in the ionosphere at the time of earthquakes. The deep sources of the magnetic field in the lithosphere were investigated using magnetic jerks. The relationship between applied stress and the corresponding variation in remanent magnetization was also investigated for rock samples collected along active tectonic zones, while the electromagnetic variations in the ionosphere were studied using the Kp index with respect to earthquake occurrences. The results confirm a correlation between variations in the magnetic field and tectonic activity in both the diurnal and long-term variations. The cross-correlation coefficients (PCC) between the correlated data sets range between 0.813 and 0.94, indicating a strong linear relationship. We conclude that a noticeable magnetic signal can be traced during the 24 hours before earthquake events. We determined the occurrence times of geomagnetic impulses (jerks) at the times of earthquakes. We show a direct relation between stress and remanent magnetization, confirming the additional magnetic component (ΔH) that is added to the main magnetic field. Analysis of the Kp index and the variations of the geomagnetic background (perturbations) also shows the possibility of a coupling interaction between the magnetosphere and ionosphere at the time of an earthquake. In fact, by analyzing the modulation of solar activity, taking the change in solar wind density as a reference, it was verified that M6+ global seismic activity is influenced by variations in the density of the solar wind. Key words: Sq variations, earthquakes, magnetic jerks, Seismic Geomagnetic Precursor (SGP), Interplanetary Seismic
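The reported coefficients (0.813–0.94) are linear correlation measures; as background, a minimal sketch of computing a Pearson correlation between two aligned time series, using made-up data rather than the study's measurements:

```python
import numpy as np

# Hypothetical stand-ins for two aligned daily time series, e.g. a
# geomagnetic index and a local magnetic-field residual (NOT the study's data).
rng = np.random.default_rng(0)
x = np.cumsum(rng.normal(size=200))             # series 1: a random walk
y = 0.9 * x + rng.normal(scale=1.0, size=200)   # series 2: linearly related plus noise

# Pearson correlation coefficient: cov(x, y) / (std(x) * std(y))
r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))
```

A coefficient near 1 indicates the strong linear relationship the abstract describes; by itself it says nothing about causation or lead time.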

  2. Monitoring perturbations of earth surface process after the 2015 Gorkha earthquake in Nepal

    Science.gov (United States)

Andermann, Christoff; Hovius, Niels; Cook, Kristen; Turowski, Jens; Illien, Luc; Sense-Schönfelder, Christoph; Rössner, Sigrid; Parajuli, Binod; Bajracharya, Krishna; Adhikari, Basanta

    2017-04-01

Large earthquakes can substantially perturb a wide range of Earth surface processes. The strong shaking caused by large earthquakes weakens rock mass, causes extensive landsliding, and alters the hydrological conductivity of the near surface. This leads to subsequent responses that include sediment loading of rivers and changes in subsurface water flow paths. These long-term perturbations often last several years and may even outstrip the immediate co-seismic impact in magnitude. Over time the system returns to background conditions, and the recovery process and transient timescales of the different systems provide particularly valuable insights for predicting natural risks associated with the aftermath of earthquakes. Here we present results from the first two years of monitoring surface processes in the epicentral area of the 2015 Gorkha earthquake. The observations started immediately after the event and are planned to continue for a total of four monsoon seasons, in order to capture the full recovery of the system to pre-earthquake conditions. We have installed a comprehensive network of twelve river sampling stations for daily water and sediment sampling, covering all major rivers draining the earthquake-affected areas. Nested within this regional network, we have installed an array of 16 seismometers and 6 weather stations in the upper Bhotekoshi catchment. The field measurements are accompanied by repeated mapping of landslide activity using satellite imagery. Our results show pronounced changes in the hydrological regime, underpinned by a marked change in seismic noise velocities, both indications of significant changes in subsurface rock properties. In parallel, our landslide mapping documents roughly ten times greater landslide activity during the 2015 monsoon season than would typically be expected. Very preliminary estimates for the exceptionally strong 2016 monsoon season are also elevated.

  3. Seismic activity preceding the 2016 Kumamoto earthquakes: Multiple approaches to recognizing possible precursors

    Science.gov (United States)

    Nanjo, K.; Izutsu, J.; Orihara, Y.; Furuse, N.; Togo, S.; Nitta, H.; Okada, T.; Tanaka, R.; Kamogawa, M.; Nagao, T.

    2016-12-01

We show the first results of recognizing seismic patterns as possible precursory episodes to the 2016 Kumamoto earthquakes, using four existing methods: the b-value method (e.g., Schorlemmer and Wiemer, 2005; Nanjo et al., 2012), two seismic quiescence evaluation methods (the RTM algorithm, Nagao et al., 2011, and the Z-value method, Wiemer and Wyss, 1994), and foreshock seismic density analysis based on Lippiello et al. (2012). We used the earthquake catalog maintained by the Japan Meteorological Agency (JMA). To ensure data quality, we performed a catalog completeness check as a pre-processing step for the individual analyses. Our findings indicate that the methods we adopted would not have allowed the Kumamoto earthquakes to be predicted exactly. However, we found that the spatial extent of the possible precursory patterns differs from one method to the other, ranging from local scales (typically asperity size) to regional scales (e.g., 2° × 3° around the source zone). The earthquakes were preceded by periods of pronounced anomalies lasting from decadal scales (e.g., 20 years or longer) down to yearly scales (e.g., 1–2 years). Our results demonstrate that a combination of multiple methods detects different signals prior to the Kumamoto earthquakes with considerably greater reliability than any single method. This strongly suggests great potential for narrowing down the possible future sites of earthquakes relative to long-term seismic hazard assessment. This study was partly supported by MEXT under its Earthquake and Volcano Hazards Observation and Research Program and Grant-in-Aid for Scientific Research (C), No. 26350483, 2014-2017, by Chubu University under the Collaboration Research Program of IDEAS, IDEAS201614, and by Tokai University under Project Research of IORD. A part of this presentation is given in Nanjo et al. (2016, submitted).
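The b-value method referenced first rests on the Gutenberg–Richter frequency-magnitude law; a minimal sketch of the standard Aki–Utsu maximum-likelihood b-value estimate on a synthetic catalog (not the JMA catalog analyzed in the study):

```python
import numpy as np

# Aki-Utsu maximum-likelihood b-value for continuous magnitudes above the
# completeness magnitude Mc: b = log10(e) / (mean(M) - Mc).
# Synthetic catalog with a known b-value, NOT real seismicity data.
rng = np.random.default_rng(5)
b_true, mc = 1.0, 1.5
# Gutenberg-Richter implies magnitudes above Mc are exponentially distributed
# with rate b * ln(10).
mags = mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=20000)

b_hat = np.log10(np.e) / (mags.mean() - mc)
print(round(b_hat, 3))
```

Precursor studies of this kind track spatio-temporal changes in b_hat, since drops in b are often interpreted as rising differential stress.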

  4. Machine Learning-Based Short-Term Prediction of Air-Conditioning Load through Smart Meter Analytics

    Directory of Open Access Journals (Sweden)

    Manoj Manivannan

    2017-11-01

    Full Text Available The present paper is focused on short-term prediction of air-conditioning (AC load of residential buildings using the data obtained from a conventional smart meter. The AC load, at each time step, is separated from smart meter’s aggregate consumption through energy disaggregation methodology. The obtained air-conditioning load and the corresponding historical weather data are then employed as input features for the prediction procedure. In the prediction step, different machine learning algorithms, including Artificial Neural Networks, Support Vector Machines, and Random Forests, are used in order to conduct hour-ahead and day-ahead predictions. The predictions obtained using Random Forests have been demonstrated to be the most accurate ones leading to hour-ahead and day-ahead prediction with R2 scores of 87.3% and 83.2%, respectively. The main advantage of the present methodology is separating the AC consumption from the consumptions of other residential appliances, which can then be predicted employing short-term weather forecasts. The other devices’ consumptions are largely dependent upon the occupant’s behaviour and are thus more difficult to predict. Therefore, the harsh alterations in the consumption of AC equipment, due to variations in the weather conditions, can be predicted with a higher accuracy; which in turn enhances the overall load prediction accuracy.
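As an illustration of the prediction step, the following sketch trains a Random Forest for hour-ahead load prediction on synthetic load and temperature series; the feature choice (lagged load, temperature, hour of day) is an assumption for illustration, not the paper's exact feature set:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for 60 days of hourly disaggregated AC load (kW) and
# outdoor temperature; the real pipeline would use the smart-meter
# disaggregation output plus weather records.
rng = np.random.default_rng(1)
hours = np.arange(24 * 60)
temp = 25 + 8 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 1, hours.size)
load = np.clip(0.5 * (temp - 24), 0, None) + rng.normal(0, 0.2, hours.size)

# Hour-ahead setup: predict next hour's AC load from the previous hour's
# load, the (forecast) temperature, and the hour of day.
X = np.column_stack([load[:-1], temp[1:], hours[1:] % 24])
y = load[1:]
split = int(0.8 * len(y))

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X[:split], y[:split])
r2 = model.score(X[split:], y[split:])
print(f"hour-ahead R^2: {r2:.2f}")
```

Tree ensembles capture the nonlinear temperature-load relationship without feature scaling, which is one plausible reason Random Forests performed best in the paper's comparison.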

  5. Anomalous decrease in groundwater radon before 2016 Mw6.4 Meinong earthquake and its application in Taiwan.

    Science.gov (United States)

    Kuo, T; Chen, W; Ho, C

    2018-02-16

Recurrent groundwater radon anomalies were observed at the Paihe spring (P1) in southwestern Taiwan prior to the Mw 6.3 Jiasian and Mw 6.4 Meinong earthquakes that occurred on March 4, 2010 and February 5, 2016, respectively. Specifically, the concentration of groundwater radon decreased from background levels of 144 ± 7 and 137 ± 8 pCi/L to minima of 104 ± 8 and 97 ± 9 pCi/L prior to the 2010 Jiasian and 2016 Meinong earthquakes, respectively. The Paihe spring (P1) is located 46 km and 45 km, respectively, from the epicenters of the 2010 Mw 6.3 Jiasian and 2016 Mw 6.4 Meinong earthquakes. The radon anomalies observed at the Paihe limestone spring corroborate that a small fractured aquifer can be used as an effective natural strain meter, by applying radon as a tracer, for earthquake warning in southwestern Taiwan. There are scientific difficulties and uncertainties in earthquake prediction. Nonetheless, long-term monitoring of precursory declines in groundwater radon can provide useful data for forecasting local disastrous earthquakes.

  6. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations, and (c) a more ethically responsible control over how seismic hazard and seismic risk assessments are used to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to available geological, geomorphologic, seismic, and tectonic evidence and data, combined with deterministic pattern-recognition methodologies; specifically, when intending to PREDICT THE PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks-and-faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. It proves that Contemporary Science can do a better job of disclosing Natural Hazards, assessing Risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift in the community's mind from pessimistic disbelief to the optimistic challenge of neo-deterministic Hazard Predictability.

  7. Ground Motions Due to Earthquakes on Creeping Faults

    Science.gov (United States)

    Harris, R.; Abrahamson, N. A.

    2014-12-01

    We investigate the peak ground motions from the largest well-recorded earthquakes on creeping strike-slip faults in active-tectonic continental regions. Our goal is to evaluate if the strong ground motions from earthquakes on creeping faults are smaller than the strong ground motions from earthquakes on locked faults. Smaller ground motions might be expected from earthquakes on creeping faults if the fault sections that strongly radiate energy are surrounded by patches of fault that predominantly absorb energy. For our study we used the ground motion data available in the PEER NGA-West2 database, and the ground motion prediction equations that were developed from the PEER NGA-West2 dataset. We analyzed data for the eleven largest well-recorded creeping-fault earthquakes, that ranged in magnitude from M5.0-6.5. Our findings are that these earthquakes produced peak ground motions that are statistically indistinguishable from the peak ground motions produced by similar-magnitude earthquakes on locked faults. These findings may be implemented in earthquake hazard estimates for moderate-size earthquakes in creeping-fault regions. Further investigation is necessary to determine if this result will also apply to larger earthquakes on creeping faults. Please also see: Harris, R.A., and N.A. Abrahamson (2014), Strong ground motions generated by earthquakes on creeping faults, Geophysical Research Letters, vol. 41, doi:10.1002/2014GL060228.

  8. Influence of seat geometry and seating posture on NIC(max) long-term AIS 1 neck injury predictability.

    Science.gov (United States)

    Eriksson, Linda; Kullgren, Anders

    2006-03-01

    Validated injury criteria are essential when developing restraints for AIS 1 neck injuries, which should protect occupants in a variety of crash situations. Such criteria have been proposed and attempts have been made to validate or disprove these. However, no criterion has yet been fully validated. The objective of this study is to evaluate the influence of seat geometry and seating posture on the NIC(max) long-term AIS 1 neck injury predictability by making parameter analyses on reconstructed real-life rear-end crashes with known injury outcomes. Mathematical models of the BioRID II and three car seats were used to reconstruct 79 rear-end crashes involving 110 occupants with known injury outcomes. Correlations between the NIC(max) values and the duration of AIS 1 neck injuries were evaluated for variations in seat geometry and seating posture. Sensitivities, specificities, positive predictive values, and negative predictive values were also calculated to evaluate the NIC(max) predictability. Correlations between the NIC(max) values and the duration of AIS 1 neck injuries were found and these relations were used to establish injury risk curves for variations in seat geometry and seating posture. Sensitivities, specificities, positive predictive values, and negative predictive values showed that the NIC(max) predicts long-term AIS 1 neck injuries also for variations in seat geometry and seating postures. The NIC(max) can be used to predict long-term AIS 1 neck injuries.
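The four screening statistics quoted are simple ratios over a 2×2 confusion matrix; a sketch with made-up counts (not the study's reconstruction results):

```python
# Illustrative confusion-matrix counts for an injury criterion that flags
# occupants as "long-term AIS 1 neck injury" vs. not (hypothetical numbers).
tp, fn = 18, 4    # injured occupants: correctly / incorrectly classified
fp, tn = 12, 76   # uninjured occupants: incorrectly / correctly classified

sensitivity = tp / (tp + fn)   # fraction of injured cases the criterion catches
specificity = tn / (tn + fp)   # fraction of uninjured cases correctly cleared
ppv = tp / (tp + fp)           # positive predictive value
npv = tn / (tn + fn)           # negative predictive value

print(sensitivity, specificity, ppv, npv)
```

For a criterion like NIC(max), these four numbers are recomputed for each variation in seat geometry and seating posture to test whether predictability holds across conditions.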

  9. Short-Term Power Load Point Prediction Based on the Sharp Degree and Chaotic RBF Neural Network

    Directory of Open Access Journals (Sweden)

    Dongxiao Niu

    2015-01-01

In order to predict and position short-term load inflection points, this paper draws on related research in the field of computer image recognition. A load sharp-degree sequence is obtained by transforming the original load sequence with a sharp-degree algorithm. A forecasting model based on chaos theory and an RBF neural network is then designed, and the load sharp-degree sequence is predicted with this model in order to position the short-term load inflection point. Finally, in an empirical analysis, the daily load points of a region are predicted using the region's actual load data to verify the effectiveness and applicability of the method. Prediction results showed that most of the test-sample load points could be accurately predicted.

  10. High Attenuation Rate for Shallow, Small Earthquakes in Japan

    Science.gov (United States)

    Si, Hongjun; Koketsu, Kazuki; Miyake, Hiroe

    2017-09-01

We compared the attenuation characteristics of peak ground accelerations (PGAs) and velocities (PGVs) of strong motion from shallow, small earthquakes that occurred in Japan with those predicted by the equations of Si and Midorikawa (J Struct Constr Eng 523:63-70, 1999). The observed PGAs and PGVs at stations far from the seismic source decayed more rapidly than the predicted ones. The same tendencies have been reported for deep, moderate, and large earthquakes, but not for shallow, moderate, and large earthquakes. This indicates that the peak values of ground motion from shallow, small earthquakes attenuate more steeply than those from shallow, moderate or large earthquakes. To investigate the reason for this difference, we numerically simulated strong ground motion for point sources of Mw 4 and 6 earthquakes using a 2D finite difference method. The analyses of the synthetic waveforms suggested that the above differences are caused by surface waves, which are predominant at stations far from the seismic source for shallow, moderate earthquakes but not for shallow, small earthquakes. Thus, although loss due to reflection at the boundaries of the discontinuous Earth structure occurs in all shallow earthquakes, the apparent attenuation rate for a moderate or large earthquake is essentially the same as that of body waves propagating in a homogeneous medium due to the dominance of surface waves.
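For context, ground-motion prediction equations such as Si and Midorikawa (1999) typically take a log-linear form in magnitude and distance; the sketch below uses that generic functional form with illustrative placeholder coefficients, not the published ones:

```python
import numpy as np

# Generic attenuation-relation sketch: log10(PGA) = a + b*M - c*log10(R + d).
# Coefficients are illustrative placeholders, NOT those of Si & Midorikawa (1999).
a, b, c, d = -2.0, 0.5, 1.3, 10.0

def pga_g(magnitude, distance_km):
    """Median PGA (in g) from a toy GMPE of the common functional form."""
    return 10 ** (a + b * magnitude - c * np.log10(distance_km + d))

# Peak motion grows with magnitude and decays with distance; a larger decay
# coefficient c would mimic the steeper attenuation the abstract reports for
# shallow, small events.
print(pga_g(6.0, 10.0), pga_g(6.0, 100.0))
```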

  11. Dynamics of Earthquake Faults

    CERN Document Server

    Carlson, J M; Shaw, B E

    1993-01-01

    We present an overview of our ongoing studies of the rich dynamical behavior of the uniform, deterministic Burridge--Knopoff model of an earthquake fault. We discuss the behavior of the model in the context of current questions in seismology. Some of the topics considered include: (1) basic properties of the model, such as the magnitude vs. frequency distribution and the distinction between small and large events; (2) dynamics of individual events, including dynamical selection of rupture propagation speeds; (3) generalizations of the model to more realistic, higher dimensional models; (4) studies of predictability, in which artificial catalogs generated by the model are used to test and determine the limitations of pattern recognition algorithms used in seismology.
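As a rough illustration of the stick-slip dynamics, the following toy simulation uses an Olami–Feder–Christensen-style quasi-static simplification of a spring-block fault rather than the full deterministic Burridge–Knopoff equations of motion; all parameters are illustrative:

```python
import numpy as np

# Toy quasi-static spring-block fault: blocks load slowly until one reaches
# a friction threshold (here 1.0), then slip cascades to neighbours.
rng = np.random.default_rng(2)
n_blocks = 100
f = rng.uniform(0.0, 1.0, n_blocks)  # elastic force on each block
alpha = 0.2                          # fraction passed to each neighbour on slip

events = []
for _ in range(2000):
    f += 1.0 - f.max() + 1e-9        # uniform driving up to the threshold
    size = 0
    unstable = np.flatnonzero(f >= 1.0)
    while unstable.size:             # avalanche: relax over-threshold blocks
        for i in unstable:
            size += 1
            load, f[i] = f[i], 0.0
            if i > 0:
                f[i - 1] += alpha * load
            if i < n_blocks - 1:
                f[i + 1] += alpha * load
        unstable = np.flatnonzero(f >= 1.0)
    events.append(size)

events = np.array(events)
print(events.min(), int(np.median(events)), events.max())
```

The full BK model integrates Newtonian block dynamics with velocity-weakening friction; the cellular-automaton shortcut above only reproduces the qualitative avalanche statistics (many small events, few large ones) that make such models useful test beds for predictability studies.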

  12. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

Crustal earthquakes with magnitude 6.0>M≥5.5 observed in Italy for the period 1979–2009, including the most recent at L'Aquila on 6 April 2009, were considered in order to check whether the relationships obtained earlier for ionospheric precursors of strong Japanese earthquakes are valid for moderate Italian earthquakes. The ionospheric precursors are based on observed variations of the sporadic E-layer parameters (h'Es, fbEs) and of foF2 at the ionospheric station Rome. Empirical dependencies relating the seismo-ionospheric disturbances to earthquake magnitude and epicenter distance are obtained, and they are shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate that the disturbance spreads from the epicenter towards the periphery during the earthquake preparation process. The large lead times of precursor occurrence (up to 34 days for M=5.8–5.9) point to a prolonged preparation period. The possibility of using the obtained relationships for earthquake prediction is discussed.
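Precursor studies relating spatial reach to magnitude and epicenter distance are often compared against the Dobrovolsky strain-radius estimate, R ≈ 10^(0.43·M) km; this formula is general background, not taken from the abstract itself:

```python
def dobrovolsky_radius_km(magnitude: float) -> float:
    """Strain-radius estimate R = 10**(0.43 * M) km (Dobrovolsky et al., 1979),
    commonly used to judge whether a station lies within a precursor's reach."""
    return 10 ** (0.43 * magnitude)

for m in (5.5, 5.9, 6.0):
    print(m, round(dobrovolsky_radius_km(m)))
```

For the M=5.5–5.9 events considered here, the estimate gives radii of a few hundred kilometres, so a single station such as Rome can plausibly sit inside the preparation zone of moderate Italian earthquakes.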

  13. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  14. First Results of the Regional Earthquake Likelihood Models Experiment

    Science.gov (United States)

    Schorlemmer, Danijel; Zechar, J. Douglas; Werner, Maximilian J.; Field, Edward H.; Jackson, David D.; Jordan, Thomas H.

    2010-08-01

    The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment—a truly prospective earthquake prediction effort—is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary—the forecasts were meant for an application of 5 years—we find interesting results: most of the models are consistent with the observation and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one.

  15. Development of Short-term Molecular Thresholds to Predict Long-term Mouse Liver Tumor Outcomes: Phthalate Case Study

    Science.gov (United States)

    Short-term molecular profiles are a central component of strategies to model health effects of environmental chemicals. In this study, a 7 day mouse assay was used to evaluate transcriptomic and proliferative responses in the liver for a hepatocarcinogenic phthalate, di (2-ethylh...

  16. The Nankai Trough earthquake tsunamis in Korea: numerical studies of the 1707 Hoei earthquake and physics-based scenarios

    Science.gov (United States)

    Kim, SatByul; Saito, Tatsuhiko; Fukuyama, Eiichi; Kang, Tae-Seob

    2016-04-01

    Historical documents in Korea and China report abnormal waves in the sea and rivers close to the date of the 1707 Hoei earthquake, which occurred in the Nankai Trough, off southwestern Japan. This indicates that the tsunami caused by the Hoei earthquake might have reached Korea and China, which suggests a potential hazard in Korea from large earthquakes in the Nankai Trough. We conducted tsunami simulations to study the details of tsunamis in Korea caused by large earthquakes. Our results showed that the Hoei earthquake (Mw 8.8) tsunami reached the Korean Peninsula about 200 min after the earthquake occurred. The maximum tsunami height was ~0.5 m along the Korean coast. The model of the Hoei earthquake predicted a long-lasting tsunami whose highest peak arrived 600 min later after the first arrival near the coastline of Jeju Island. In addition, we conducted tsunami simulations using physics-based scenarios of anticipated earthquakes in the Nankai subduction zone. The maximum tsunami height in the scenarios (Mw 8.5-8.6) was ~0.4 m along the Korean coast. As a simple evaluation of larger possible tsunamis, we increased the amount of stress released by the earthquake by a factor of two and three, resulting in scenarios for Mw 8.8 and 8.9 earthquakes, respectively. The tsunami height increased by 0.1-0.4 m compared to that estimated by the Hoei earthquake.
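The long travel time is consistent with the long-wave (shallow-water) approximation, in which tsunami speed depends only on water depth, c = sqrt(g·h); a back-of-the-envelope sketch with assumed depths (not values from the study):

```python
import math

g = 9.81  # gravitational acceleration, m/s^2

def tsunami_speed_ms(depth_m: float) -> float:
    """Long-wave phase speed c = sqrt(g * h)."""
    return math.sqrt(g * depth_m)

# Assumed average depths for a deep-ocean path vs. progressively shallower
# shelf seas (illustrative, not bathymetry from the simulations).
for depth in (4000.0, 1000.0, 100.0):
    c = tsunami_speed_ms(depth)
    print(f"h = {depth:6.0f} m  ->  c = {c:5.1f} m/s  ({c * 3.6:5.0f} km/h)")
```

At an assumed 4000 m depth the wave moves at roughly 700 km/h, while the shallow shelf seas between Japan and Korea slow it sharply, qualitatively consistent with the ~200-min arrival reported above.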

  17. Risk factors and prediction of very short term versus short/intermediate term post-stroke mortality: a data mining approach.

    Science.gov (United States)

    Easton, Jonathan F; Stephens, Christopher R; Angelova, Maia

    2014-11-01

Data mining and knowledge discovery as an approach to examining medical data can limit some of the inherent bias in the hypothesis assumptions that can be found in traditional clinical data analysis. In this paper we illustrate the benefits of a data-mining-inspired approach to statistically analysing a bespoke data set, the academic multicentre randomised control trial U.K. Glucose Insulin in Stroke Trial (GIST-UK), with a view to discovering new insights distinct from the original hypotheses of the trial. We consider post-stroke mortality prediction as a function of days since stroke onset, showing that the time scales that best characterise changes in mortality risk are most naturally defined by examination of the mortality curve. We show that certain risk factors differentiate between very short term and intermediate term mortality. In particular, we show that age is highly relevant for intermediate term risk but not for very short or short term mortality. We suggest that this is due to the concept of frailty. Other risk factors are highlighted across a range of variable types including socio-demographics, past medical histories and admission medication. Using the most statistically significant risk factors we build predictive classification models for very short term and short/intermediate term mortality.
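The idea of letting the mortality curve itself define the time scales can be sketched as an empirical daily hazard computed from days-since-onset outcomes; the data below are synthetic, not GIST-UK:

```python
import numpy as np

# Synthetic days-to-death data, censored at 90 days (90 = survived follow-up).
rng = np.random.default_rng(3)
n = 1000
death_day = np.where(rng.random(n) < 0.3,        # ~30% die within follow-up
                     rng.integers(0, 90, n), 90)

# Empirical daily hazard: deaths on day t / patients still at risk on day t.
days = np.arange(90)
at_risk = np.array([(death_day >= t).sum() for t in days])
deaths = np.array([(death_day == t).sum() for t in days])
hazard = deaths / at_risk

# "Very short term" vs. "short/intermediate term" windows would be chosen
# where the hazard curve changes regime, e.g. after its early segment.
print(hazard[:7].mean(), hazard[30:60].mean())
```

In the paper's approach, risk factors are then tested separately within each window, which is how age emerged as relevant for intermediate-term but not very-short-term mortality.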

  18. Strong motion duration and earthquake magnitude relationships

    Energy Technology Data Exchange (ETDEWEB)

    Salmon, M.W.; Short, S.A. [EQE International, Inc., San Francisco, CA (United States); Kennedy, R.P. [RPK Structural Mechanics Consulting, Yorba Linda, CA (United States)

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions.
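
    One widely used energy-based definition is the 5-95% significant duration: the interval over which the cumulative squared acceleration (proportional to Arias intensity) rises from 5% to 95% of its final value. A minimal sketch of that idea (illustrative, not the report's own procedure):

```python
def significant_duration(acc, dt, lo=0.05, hi=0.95):
    """5-95% significant duration: the window over which cumulative
    squared acceleration (proportional to Arias intensity) grows from
    5% to 95% of its final value (a Trifunac-Brady-style definition)."""
    cum, energy = 0.0, []
    for a in acc:
        cum += a * a * dt
        energy.append(cum)
    total = energy[-1]
    t_lo = next(i for i, e in enumerate(energy) if e >= lo * total) * dt
    t_hi = next(i for i, e in enumerate(energy) if e >= hi * total) * dt
    return t_hi - t_lo

# toy record: low-level shaking, a 0.5 s strong burst, low-level coda
acc = [0.01] * 100 + [1.0] * 50 + [0.01] * 100
print(significant_duration(acc, dt=0.01))  # ~0.45 s: the strong burst dominates
```

    The long low-amplitude head and tail of the record contribute almost nothing to the cumulative energy, so the measured duration collapses to the strong-shaking window, which is exactly the behaviour the definitions above are designed to capture.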

  19. Modelling the elements of country vulnerability to earthquake disasters.

    Science.gov (United States)

    Asef, M R

    2008-09-01

    Earthquakes have probably been the most deadly form of natural disaster in the past century. Diversity of earthquake specifications in terms of magnitude, intensity and frequency at the semicontinental scale has initiated various kinds of disasters at a regional scale. Additionally, diverse characteristics of countries in terms of population size, disaster preparedness, economic strength and building construction development often cause an earthquake of a certain characteristic to have different impacts on the affected region. This research focuses on the appropriate criteria for identifying the severity of major earthquake disasters based on some key observed symptoms. Accordingly, the article presents a methodology for identification and relative quantification of the severity of earthquake disasters. This has led to an earthquake disaster vulnerability model at the country scale. Data analysis based on this model suggested a quantitative, comparative and meaningful interpretation of the vulnerability of the countries concerned, and successfully explained which countries are more vulnerable to major disasters.

  20. Predicting the denitrification capacity of sandy aquifers from shorter-term incubation experiments and sediment properties

    Directory of Open Access Journals (Sweden)

    W. Eschenbach

    2013-02-01

    protect groundwater from anthropogenic NO3 input. Calculation of Dcum(365) from initial denitrification rates was only successful for samples from the NO3-bearing zone, whereas a lag-phase of denitrification in samples from deeper zones of NO3-free groundwater caused imprecise predictions.

    In our study, Dcum(365) of two sandy Pleistocene aquifers was predictable using a combination of short-term incubations and analysis of sediment parameters. Moreover, the protective lifetime of denitrification sufficient to remove NO3 from groundwater in the investigated aquifers is limited, which demonstrates the need to minimise anthropogenic NO3 input.

  1. Risk factors for long-term post-traumatic stress disorder among medical rescue workers appointed to the 2008 Wenchuan earthquake response in China.

    Science.gov (United States)

    Schenk, Ellen J; Yuan, Jun; Martel, Lise D; Shi, Guo-Qing; Han, Ke; Gao, Xing

    2017-10-01

    This study aims to determine the risk factors for clinically-significant post-traumatic stress disorder (PTSD) among Chinese medical rescue workers one year after the response to the Wenchuan earthquake on 12 May 2008. A sample of 337 medical workers who performed response work within the first three months of the event completed an online questionnaire, which included information on demographics, social support, the management and organisation of the disaster response, and an assessment of PTSD. Symptoms consistent with PTSD were prevalent in 17 per cent of the rescue workers. Those who developed PTSD symptoms were more likely to have been injured, experienced a water shortage, been disconnected from family and friends during the response, and have passive coping styles and neurotic personalities. Factors that cannot be changed easily, such as personality traits, should be evaluated prior to deployment to ensure that rescue workers at higher risk of PTSD are provided with adequate support before and during deployment. © 2017 The Author(s). Disasters © Overseas Development Institute, 2017.

  2. Adaptively Smoothed Seismicity Earthquake Forecasts for Italy

    CERN Document Server

    Werner, M J; Jackson, D D; Kagan, Y Y; Wiemer, S

    2010-01-01

    We present a model for estimating the probabilities of future earthquakes of magnitudes m > 4.95 in Italy. The model, a slightly modified version of the one proposed for California by Helmstetter et al. (2007) and Werner et al. (2010), approximates seismicity by a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog and a longer instrumental and historical catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and trustworthy, we used small earthquakes m>2.95 to illuminate active fault structur...
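
    The adaptive kernel idea can be sketched as follows: each past epicenter contributes a Gaussian whose bandwidth is its distance to its k-th nearest neighbour, so smoothing is tight inside dense clusters and broad in sparse regions. A toy illustration of the principle, not the published model code:

```python
import math

def adaptive_density(events, query, k=2):
    """Spatial density at `query` from past epicenters, each smoothed by a
    Gaussian whose bandwidth is that event's distance to its k-th nearest
    neighbour: narrow kernels in clusters, wide kernels in sparse areas."""
    dens = 0.0
    for (x, y) in events:
        dists = sorted(math.hypot(x - ex, y - ey)
                       for (ex, ey) in events if (ex, ey) != (x, y))
        h = max(dists[k - 1], 1e-6)          # adaptive bandwidth
        r2 = (query[0] - x) ** 2 + (query[1] - y) ** 2
        dens += math.exp(-r2 / (2 * h * h)) / (2 * math.pi * h * h)
    return dens

# density is high inside the toy cluster, low away from it
events = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (10.0, 10.0)]
print(adaptive_density(events, (0.0, 0.0)) > adaptive_density(events, (5.0, 5.0)))  # True
```

    In the forecasting model itself the bandwidth is chosen by optimizing retrospective predictive power rather than fixing k by hand; this sketch only shows why the estimate adapts to the local density of epicenters.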

  3. A Multi-parametric Climatological Approach to Study the 2016 Amatrice-Norcia (Central Italy) Earthquake Preparatory Phase

    Science.gov (United States)

    Piscini, Alessandro; De Santis, Angelo; Marchetti, Dedalo; Cianchini, Gianfranco

    2017-10-01

    Based on observations prior to earthquakes, recent theoretical considerations suggest that some geophysical quantities reveal abnormal changes that anticipate moderate and strong earthquakes within a defined spatial area (the so-called Dobrovolsky area), according to a lithosphere-atmosphere-ionosphere coupling model. One of the possible pre-earthquake effects could be the appearance of climatological anomalies in the epicentral region weeks to months before a major earthquake. In this paper, the 2 months preceding the Amatrice-Norcia (Central Italy) earthquake sequence, which started on 24 August 2016 with an M6 earthquake and a few months later produced two other major shocks (an M5.9 on 26 October and an M6.5 on 30 October), were analyzed in terms of skin temperature, total column water vapour and total column ozone, compared with the past 37-year trend. The novelty of the method lies in the way the complete time series is reduced, with the possible effect of global warming properly removed. The simultaneous analysis showed the presence of persistent contemporary anomalies in all of the analysed parameters. To validate the technique, a confutation/confirmation analysis was undertaken in which the same parameters were analyzed for the same months of a seismically "calm" year without significant seismicity. We also extended the analysis to all available years to construct a confusion matrix comparing the occurrence of climatological data anomalies with real seismicity. This work confirms the potential of multiple parameters to anticipate the occurrence of large earthquakes in Central Italy, reinforcing the idea of considering such behaviour an effective tool for an integrated system of future earthquake prediction.
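
    The confusion-matrix step can be sketched directly: flag each year as anomalous or not and as seismically active or not, then tabulate hits and misses. The flag lists below are purely hypothetical, for illustration only:

```python
def confusion(anomaly, quakes):
    """2x2 confusion matrix comparing yearly climatological-anomaly
    detections with observed seismicity (True = anomaly detected /
    major earthquake occurred in that year)."""
    pairs = list(zip(anomaly, quakes))
    return {
        "TP": sum(1 for a, q in pairs if a and q),
        "FP": sum(1 for a, q in pairs if a and not q),
        "FN": sum(1 for a, q in pairs if not a and q),
        "TN": sum(1 for a, q in pairs if not a and not q),
    }

# hypothetical 8-year record of anomaly flags vs. real seismicity
print(confusion([True, False, True, False, False, True, False, False],
                [True, False, False, False, False, True, True, False]))
# -> {'TP': 2, 'FP': 1, 'FN': 1, 'TN': 4}
```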

  4. Predicting long-term weight loss maintenance in previously overweight women: a signal detection approach.

    Science.gov (United States)

    Santos, Inês; Mata, Jutta; Silva, Marlene N; Sardinha, Luís B; Teixeira, Pedro J

    2015-05-01

    Examine psychological and behavioral predictors of 3-year weight loss maintenance in women. Participants were 154 women in a 1-year randomized controlled trial on weight management with a 2-year follow-up. Signal detection analyses identified behavioral and psychological variables that best predicted 5% and 10% weight loss at 3 years. Women with better body image were significantly more likely to have lost ≥5% weight at 3 years; among women with poor body image, higher motivation distinguished those who maintained weight loss from those who did not. Women with high exercise autonomous motivation were three times more likely to have lost ≥10% weight than were those with lower autonomous motivation. Among women with lower autonomous motivation, perceiving fewer exercise barriers was somewhat compensatory: these women were more likely to maintain weight loss than women with lower autonomy but more perceived barriers. For women, improving body image and increasing autonomous and intrinsic motivation for exercise likely promotes clinically significant long-term weight loss maintenance. Decreasing perceived exercise barriers is another promising intervention target. © 2015 The Obesity Society.

  5. 80-120 yr Long-term solar induced effects on the earth, past and predictions

    Science.gov (United States)

    Yousef, Shahinaz Moustafa

    The 80-120 year solar Wolf-Gleissberg cycles have wide effects on the Earth's environment. Studying past effects can throw light on future predictions of solar-terrestrial relations at similar solar activity levels. Solar-induced climate changes happen at the turning points of such cycles, when changes in solar spin rate occur. Reversals of the North Atlantic Oscillation can be interpreted in terms of solar stimuli. The sudden abrupt rises of lake levels and closed seas are solar forced. It is anticipated that the Aral and the Dead Sea will recover in the near future. Following drought conditions in the African Equatorial lakes by the end of cycle 23 around 2008 ± 2 yr, cyclic rises and falls of lake levels are expected to be coherent with the weak cycles 24 to perhaps 26, when solar forcings will reverse or cease to exist. The Atlanto-Canadian fish disappearance dilemma is a natural Wolf-Gleissberg cycle induced effect and is expected to recover in due time.

  6. Deviations in Energy Sensing Predict Long-term Weight Change in Overweight Native Americans.

    Science.gov (United States)

    Basolo, Alessio; Votruba, Susanne B; Heinitz, Sascha; Krakoff, Jonathan; Piaggi, Paolo

    2018-01-03

    Energy expenditure (EE), as reflective of body energy demand, has been proposed to be the key driver of food intake, possibly influencing weight change in humans. Variation in this energy-sensing link (overeating relative to weight-maintaining energy requirements) may lead to weight gain over time. Sixty-one overweight, otherwise healthy Native Americans (age: 34.0 ± 7.9 years, body fat: 39.7 ± 9.5%, 36 males) were admitted to our clinical research unit for measurements of body composition by dual-energy X-ray absorptiometry, and of 24-h EE and respiratory quotient (RQ) in a whole-room indirect calorimeter during energy balance and weight stability. Following this, ad libitum food intake was assessed for three days using computerized vending machines. Body weight change under unrestricted free-living conditions was assessed at an outpatient follow-up visit (median follow-up time = 1.7 years). Total ad libitum food intake (3-day average) was positively associated with 24-h EE (r = 0.44). Thus overeating relative to energy requirements can be assessed and predicts long-term weight gain, suggesting that variation in energy sensing may influence appetite by favoring overeating, thus promoting obesity development. Copyright © 2018. Published by Elsevier Inc.

  7. Relative performance of different numerical weather prediction models for short-term prediction of wind energy

    Energy Technology Data Exchange (ETDEWEB)

    Giebel, G.; Landberg, L. [Risoe National Lab., Wind Energy and Atmospheric Physics Dept., Roskilde (Denmark); Moennich, K.; Waldl, H.P. [Carl von Ossietzky Univ., Faculty of Physics, Dept. of Energy and Semiconductor, Oldenburg (Germany)

    1999-03-01

    In several approaches presented in other papers at this conference, short-term forecasting of wind power for a time horizon covering the next two days is done on the basis of Numerical Weather Prediction (NWP) models. This paper explores the relative merits of HIRLAM, the model used by the Danish Meteorological Institute, the Deutschlandmodell from the German Weather Service, and the Nested Grid Model used in the US. The performance comparison is mainly done for a site in Germany which is in the forecasting area of both the Deutschlandmodell and HIRLAM. In addition, a comparison of measured data with the forecasts made for one site in Iowa is included, which allows conclusions on the merits of all three models. Differences in the relative performances could be due to a better tailoring of one model to its country, or to a tighter grid, or could be a function of the distance between the grid points and the measuring site. Also, the amount by which the performance can be enhanced by the use of model output statistics (the topic of other papers at this conference) could give insights into the performance of the models. (au)

  8. Latent profiles of nonresidential father engagement six years after divorce predict long-term offspring outcomes.

    Science.gov (United States)

    Modecki, Kathryn Lynn; Hagan, Melissa J; Sandler, Irwin; Wolchik, Sharlene A

    2015-01-01

    This study examined profiles of nonresidential father engagement (i.e., support to the adolescent, contact frequency, remarriage, relocation, and interparental conflict) with their adolescent children (N = 156) 6 to 8 years following divorce and the prospective relation between these profiles and the psychosocial functioning of their offspring, 9 years later. Parental divorce occurred during late childhood to early adolescence; indicators of nonresidential father engagement were assessed during adolescence, and mental health problems and academic achievement of offspring were assessed 9 years later in young adulthood. Three profiles of father engagement were identified in our sample of mainly White, non-Hispanic divorced fathers: Moderate Involvement/Low Conflict, Low Involvement/Moderate Conflict, and High Involvement/High Conflict. Profiles differentially predicted offspring outcomes 9 years later when they were young adults, controlling for quality of the mother-adolescent relationship, mother's remarriage, mother's income, and gender, age, and offspring mental health problems in adolescence. Offspring of fathers characterized as Moderate Involvement/Low Conflict had the highest academic achievement and the lowest number of externalizing problems 9 years later compared to offspring whose fathers had profiles indicating either the highest or lowest levels of involvement but higher levels of conflict. Results indicate that greater paternal psychosocial support and more frequent father-adolescent contact do not outweigh the negative impact of interparental conflict on youth outcomes in the long term. Implications of findings for policy and intervention are discussed.

  9. Short-term test for predicting the potential of xenobiotics to impair reproductive success in fish

    Energy Technology Data Exchange (ETDEWEB)

    Landner, L.; Neilson, A.H.; Soerensen, L.T.; Taernholm, A.V.; Viktor, T.

    1985-06-01

    Short-term screening tests with the zebra fish (Brachydanio rerio) have been developed for predicting the potential of xenobiotics to impair reproductive success in fish. The aim was to find simple and sensitive test parameters and to simulate exposure situations typical for anadromous fish species (salmonids), which generally cross heavily polluted coastal areas or estuaries before they reach uncontaminated upstream spawning areas. Therefore, particular attention was directed to tests designed to assess adverse effects induced during gametogenesis in adult fish. The test protocol involves exposure of adults prior to, but not during, spawning and the effects are measured in the offspring as alterations in hatching frequency and hatching rate of eggs, and survival and stress tolerance of embryos and larvae. Some representative examples of the application of these tests are given, and it is shown that impairment of reproductive success can be induced by exposure of parent fish prior to spawning at concentrations of xenobiotics at least five times lower than those yielding effects during direct exposure of embryos and larvae. It is suggested that, in hazard assessment programs, tests of the effect of xenobiotics on the offspring of preexposed adults be routinely incorporated.

  10. Prediction of long-term erosion from landfill covers in the southwest

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, C.E.; Stormont, J.C. [Univ. of New Mexico, Albuquerque, NM (United States)

    1997-12-31

    Erosion is a primary stressor of landfill covers, especially in climates with high-intensity storms and low native plant density. Rills and gullies formed by discrete events can damage barrier layers and induce failure. Geomorphologic, empirical and physical modeling procedures are available to provide estimates of surface erosion, but numerical modeling requires accurate representation of the severe rainfall events that generate erosion. The National Weather Service precipitation frequency data and estimates of 5, 10, 15, 30 and 60-minute intensity can be statistically combined in a numerical model to obtain long-term erosion estimates. Physically based numerical models using the KINEROS and AHYMO programs have been utilized to predict the erosion from a southwestern landfill or waste containment site with 0.03, 0.05 and 0.08 meter per meter surface slopes. Results of AHYMO modeling were within 15 percent of average annual values computed with the empirical Universal Soil Loss Equation. However, the estimation of the rill and gully formation that primarily degrades cover systems requires quantifying single events. For southwestern conditions, a single 10-year storm can produce erosion quantities equal to three times the average annual erosion, and a 100-year storm can produce five times the average annual erosion.
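
    For reference, the empirical Universal Soil Loss Equation mentioned above is a simple product of five factors, A = R · K · LS · C · P. A minimal sketch with illustrative, not site-specific, factor values:

```python
def usle(R, K, LS, C, P):
    """Average annual soil loss A from the empirical Universal Soil Loss
    Equation: rainfall erosivity R, soil erodibility K, slope length and
    steepness LS, cover management C, and support practice P."""
    return R * K * LS * C * P

# illustrative factor values for a sparsely vegetated arid cover slope
print(round(usle(R=50, K=0.3, LS=0.8, C=0.45, P=1.0), 2))  # -> 5.4
```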

  11. Could infarct location predict the long-term functional outcome in childhood arterial ischemic stroke?

    Directory of Open Access Journals (Sweden)

    Mauricio López-Espejo

    ABSTRACT Objective: To explore the influence of infarct location on long-term functional outcome following a first-ever arterial ischemic stroke (AIS) in non-neonate children. Method: The MRIs of 39 children with AIS (median age 5.38 years; 36% girls; mean follow-up time 5.87 years) were prospectively evaluated. Infarct location was classified as the absence or presence of subcortical involvement. Functional outcome was measured using the modified Rankin scale (mRS) for children after the follow-up assessment. We utilized multivariate logistic regression models to estimate the odds ratios (ORs) for the outcome while adjusting for age, sex, infarct size and middle cerebral artery territory involvement (significance < 0.05). Results: Both infarcts ≥ 4% of total brain volume (OR 9.92; CI 1.76-55.9; p = 0.009) and the presence of subcortical involvement (OR 8.36; CI 1.76-53.6; p = 0.025) independently increased the risk of marked functional impairment (mRS 3 to 5). Conclusion: Infarct extension and location can help predict the extent of disability after childhood AIS.
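
    The odds ratios in this record come from logistic regression, where the OR for a predictor is simply the exponential of its fitted coefficient. A one-line reminder (the coefficient here is back-computed from the reported OR, not taken from the paper):

```python
import math

def odds_ratio(coef):
    # In logistic regression, OR = exp(fitted coefficient).
    return math.exp(coef)

# a coefficient of ~2.123 on "subcortical involvement" corresponds to
# the OR of 8.36 reported in the abstract
print(round(odds_ratio(math.log(8.36)), 2))  # -> 8.36
```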

  12. Early Prediction of Long-Term Response to Cabergoline in Patients with Macroprolactinomas

    Directory of Open Access Journals (Sweden)

    Youngki Lee

    2014-09-01

    Background: Cabergoline is typically effective for treating prolactinomas; however, some patients display cabergoline resistance, and the early characteristics of these patients remain unclear. We analyzed early indicators predicting long-term response to cabergoline. Methods: We retrospectively reviewed the cases of 44 patients with macroprolactinomas who received cabergoline as first-line treatment; the patients were followed for a median of 16 months. The influence of various clinical parameters on outcomes was evaluated. Results: Forty patients (90.9%) were treated medically and displayed tumor volume reduction (TVR) of 74.7%, a prolactin normalization (NP) rate of 81.8%, and a complete response (CR; TVR >50% with NP, without surgery) rate of 70.5%. Most patients (93.1%) with TVR ≥25% and NP at 3 months eventually achieved CR, whereas only 50% of patients with TVR ≥25% without NP, and no patients with TVR <25%, did so. Conclusion: Continued cabergoline treatment may be warranted in patients with TVR ≥25% at 3 months without NP, particularly those with huge prolactinomas, because a delayed response may be achieved. As surgery can reduce the cabergoline dose necessary for successful disease control, it should be considered for cabergoline-resistant patients.

  13. Operational source term estimation and ensemble prediction for the Grimsvoetn 2011 event

    Science.gov (United States)

    Maurer, Christian; Arnold, Delia; Klonner, Robert; Wotawa, Gerhard

    2014-05-01

    The ESA-funded international project VAST (Volcanic Ash Strategic Initiative Team) includes a focus on realistic source term estimation for volcanic eruptions as well as on estimating the forecast uncertainty in the resulting atmospheric dispersion calculations, which partly derives from the forecast uncertainty in the meteorological input data. SEVIRI earth observation data serve as a basis for the source term estimation, from which the total atmospheric column ash content can be estimated. In an operational environment, the already available EUMETCAST VOLE product may be used. Further, an a priori source term is needed, which can be coarsely estimated from information on previous eruptions and/or constrained with observations of the eruption column. The link between observations and the a priori source is established by runs of the atmospheric transport model FLEXPART for individual emission periods and a predefined number of vertical levels. By minimizing the differences between observations and model results, the so-called a posteriori source term can be obtained for a certain time interval as a function of height. Such a result is shown for a first test case, the eruption of the Grimsvoetn volcano on Iceland in May 2011. Once the dispersion calculations are as optimized as possible with regard to the source term, the uncertainty stemming from the forecast uncertainty of the numerical weather prediction model used is still present, in addition to the unavoidable model errors. Since it is impossible to perform FLEXPART runs for all 50 members of the Integrated Forecasting System (IFS) of ECMWF due to computational (time and storage) constraints, the number of members is restricted to five (maximum seven) representative runs via cluster analysis. The approach follows Klonner (2012), where it was demonstrated that exclusive consideration of the wind components on a pressure level (e.g. 400 hPa) makes it possible to find clusters and
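
    The inversion described ("minimizing the differences between observations and model results") can be sketched as a non-negative least-squares fit of per-level emission rates. Below is a toy projected-gradient version with a made-up 3-observation, 2-level system, not the operational code:

```python
def estimate_source(M, obs, iters=4000, lr=0.01):
    """Fit non-negative emission rates e (one per vertical level) so that
    the modelled column loads M e match the observations: a projected-
    gradient toy version of minimising ||M e - obs||^2 subject to e >= 0."""
    n = len(M[0])
    e = [0.0] * n
    for _ in range(iters):
        resid = [sum(M[i][j] * e[j] for j in range(n)) - obs[i]
                 for i in range(len(obs))]
        for j in range(n):
            grad = 2.0 * sum(M[i][j] * resid[i] for i in range(len(obs)))
            e[j] = max(0.0, e[j] - lr * grad)   # gradient step, then project
    return e

M = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]  # modelled load per unit emission
obs = [2.0, 3.0, 5.0]                     # "observed" column loads
print([round(v, 2) for v in estimate_source(M, obs)])  # -> [2.0, 3.0]
```

    In the real system, M comes from one FLEXPART run per emission period and vertical level, and obs from the SEVIRI-derived column ash content; the principle of recovering the height-resolved source by fitting the observations is the same.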

  14. Integrated study of geophysical and biological anomalies before earthquakes (seismic and non-seismic), in Austria and Indonesia

    Science.gov (United States)

    Straka, Wolfgang; Assef, Rizkita; Faber, Robert; Ferasyi, Reza

    2015-04-01

    Earthquakes are commonly seen as unpredictable. Even when scientists believe an earthquake is likely, it is still hard to understand the indications observed, as well as their theoretical and practical implications. There is some controversy surrounding the concept of using animals as precursors of earthquakes. Nonetheless, several institutes at the University of Natural Resources and Life Sciences and the Vienna University of Technology, both Vienna, Austria, and Syiah Kuala University, Banda Aceh, as well as Terramath Indonesia, Buleleng, both Indonesia, cooperate in a long-term project, funded by Red Bull Media House, Salzburg, Austria, which aims at taking a decisive step from anecdotal to scientific evidence of those interdependencies, and at showing their possible use in forecasting seismic hazard on a short-term basis. Though no conclusive research has yet been published, one idea in this study is that even if animals do not respond to specific geophysical precursors with enough notice to enable earthquake forecasting on that basis, they may at least enhance, in conjunction with other indications, the degree of certainty of a prediction of an impending earthquake. In Indonesia, indeed, before the great earthquakes of 2004 and 2005, ominous geophysical as well as biological phenomena occurred (but were recognized as precursors only in retrospect). Numerous comparable stories can be told from other times and regions. Nearly 2000 perceptible earthquakes (> M3.5) occur each year in Indonesia. Also, in 2007, the government launched a program, focused on West Sumatra, for investigating earthquake precursors. Therefore, Indonesia is an excellent target area for a study of possible interconnections between geophysical and biological earthquake precursors.
Geophysical and atmospheric measurements and behavioral observation of several animal species (elephant, domestic cattle, water buffalo, chicken, rat, catfish) are conducted in three areas

  15. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  17. The impact of the Wenchuan earthquake on birth outcomes.

    Directory of Open Access Journals (Sweden)

    Cong E Tan

    BACKGROUND: Earthquakes and other catastrophic events occur frequently worldwide, creating a growing and urgent need to improve our understanding of the negative effects imposed by such disasters. Earthquakes can strongly affect birth outcomes and the psychological and morphological development of unborn children, although the detailed characteristics remain obscure. METHODS AND FINDINGS: We utilized the birth records at Du Jiang Yan and Peng Zhou counties to investigate birth outcomes following the major earthquake that occurred in Wenchuan, China on May 12, 2008. In total, 13,003 neonates were recorded, 6638 pre- and 6365 post-earthquake. The post-earthquake group showed significantly lower birthweight, a higher ratio of low birthweight, and lower Apgar scores. In contrast, the sex ratio at birth, birth length and length of gestation did not show statistical differences. The overall birth-defect ratio post-earthquake (1.18%) was statistically higher than that pre-earthquake (0.99%), especially for pregnancies in the first trimester on the earthquake day (1.47%). The birth-defect spectrum was dramatically altered after the earthquake, with markedly increased occurrences of ear malformations. The ratio of preterm birth post-earthquake (7.41%) was significantly higher than that pre-earthquake (5.63%). For the birth outcomes of twins, significant differences in the ratio of twins, birth weight, ratio of low birthweight and birth-defect rate were observed after the earthquake. CONCLUSION: This hospital-based study shows that the Wenchuan earthquake was associated with significant effects on birth outcomes, warranting long-term monitoring of pregnancy outcomes.
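
    The pre- vs post-earthquake rate contrasts in this record are the kind of comparison a pooled two-proportion z-test formalizes. A sketch below, with counts back-computed from the reported percentages and group sizes (approximate, not the authors' raw data):

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test statistic for comparing the event
    rate x2/n2 against x1/n1 (z > 1.96 is significant at alpha = 0.05)."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# preterm births: approximate counts reconstructed from the reported
# 5.63% of 6638 pre-earthquake vs 7.41% of 6365 post-earthquake deliveries
z = two_proportion_z(374, 6638, 472, 6365)
print(round(z, 2))  # z ≈ 4.1, well beyond the 1.96 threshold
```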

  18. Predicting long-term sickness absence and early retirement pension from self-reported work ability

    DEFF Research Database (Denmark)

    Sell, Lea; Bültmann, Ute; Rugulies, Reiner Ernst

    2009-01-01

    The aim of this paper is to examine the relationship between self-reported work ability and long-term sickness absence or early retirement from the labour market.

  19. Earthquake Prognosis With Applied Microseism.

    Science.gov (United States)

    Ahmedov, N.; Nagiyev, A.

    Earthquakes are the most dangerous natural catastrophe in terms of casualties, damage, areal coverage, and the difficulty of providing security measures. Our inability to forecast these events makes the situation worse: their buried focuses are invisible in the subsurface, they strike as suddenly as thunder, and some tens of seconds later they leave devastated areas and casualties numbering in the tens of thousands. Current earthquake forecasting is essentially ineffective. Microseisms offer one possible way to forecast earthquakes. These small, irregular, low-amplitude oscillations observed on seismograms are referred to as microseisms. Unlike earthquakes themselves, they are continuous; that is, they have no origin coordinate on the time axis. They are generated by breakers along shorelines, strong winds, hurricanes, and similar processes. J. J. Linch discovered that microseisms recorded at ad hoc stations can be used to monitor the track of hurricanes over the sea. By analogy with these observations, it became possible to monitor the formation of earthquake focuses from the correlation between the low-frequency horizontal channels' N-S and E-W components. Monitoring of the microseism field and its preceding anomalous variations with the "Cherepaha" 3M and 6/12 instruments reveals a systematic trend in the amplitude/frequency domain. This relationship, observed in a certain frequency range, made it possible to locate forming earthquake focuses relative to the monitoring station. The trend was observed during the Turkish and Iranian events of 1990, 1992, and 1997. It would be useful to verify these effects in other regions in order to correlate the available data and work out common forecasting criteria.

  20. Forecasting the Rupture Directivity of Large Earthquakes

    Science.gov (United States)

    Donovan, J. R.; Jordan, T. H.

    2013-12-01

    Forecasting the rupture directivity of large earthquakes is an important problem in probabilistic seismic hazard analysis (PSHA), because directivity strongly influences ground motions. We cast this forecasting problem in terms of the conditional hypocenter distribution (CHD), defined to be the probability distribution of a hypocenter given the spatial distribution of fault slip (moment release). The simplest CHD is a uniform distribution for which the hypocenter probability density equals the moment-release probability density. We have compiled samples of CHDs from a global distribution of large earthquakes using three estimation methods: (a) location of hypocenters within the slip distribution from finite-fault inversions, (b) location of hypocenters within early aftershock distributions, and (c) direct inversion for the directivity parameter D, defined in terms of the degree-two polynomial moments of the source space-time function. The data from method (a) are statistically inconsistent with the uniform CHD suggested by McGuire et al. (2002) using method (c). Instead, the data indicate a 'centroid-biased' CHD, in which the expected distance between the hypocenter and the hypocentroid is less than that of a uniform CHD; i.e., the directivities inferred from finite-fault models appear to be closer to bilateral than predicted by the uniform CHD. One source of this discrepancy may be centroid bias in the second-order moments owing to poor localization of the slip in finite-fault inversions. We compare these observational results with CHDs computed from a large set of theoretical ruptures in the Southern California fault system produced by the Rate-State Quake simulator (RSQSim) of Dieterich and Richards-Dinger (2010) and discuss the implications for rupture dynamics and fault-zone heterogeneities.
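The uniform CHD described above says the hypocenter probability density equals the moment-release probability density. A minimal sketch of that idea (toy slip values on hypothetical fault patches, not data from the study):

```python
import random

def sample_hypocenter_uniform_chd(slip, rng):
    """Sample a hypocenter patch index under the uniform CHD:
    hypocenter probability is proportional to moment release (slip)."""
    total = sum(slip)
    weights = [s / total for s in slip]
    return rng.choices(range(len(slip)), weights=weights, k=1)[0]

# Toy slip distribution on 5 fault patches (arbitrary units).
slip = [0.1, 0.5, 2.0, 1.0, 0.4]
rng = random.Random(0)
counts = [0] * len(slip)
for _ in range(10000):
    counts[sample_hypocenter_uniform_chd(slip, rng)] += 1
# Patch 2 carries half the total moment, so about half the
# sampled hypocenters should land there.
```

A centroid-biased CHD, as the finite-fault data suggest, would concentrate these weights more strongly toward the high-slip (centroid) region than the proportional weighting used here.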

  1. Brain atrophy and lesion load predict long term disability in multiple sclerosis

    DEFF Research Database (Denmark)

    Popescu, Veronica; Agosta, Federica; Hulst, Hanneke E

    2013-01-01

    To determine whether brain atrophy and lesion volumes predict subsequent 10 year clinical evolution in multiple sclerosis (MS).

  2. Stress drops of induced and tectonic earthquakes in the central United States are indistinguishable.

    Science.gov (United States)

    Huang, Yihe; Ellsworth, William L; Beroza, Gregory C

    2017-08-01

    Induced earthquakes currently pose a significant hazard in the central United States, but there is considerable uncertainty about the severity of their ground motions. We measure stress drops of 39 moderate-magnitude induced and tectonic earthquakes in the central United States and eastern North America. Induced earthquakes, more than half of which are shallower than 5 km, show a median stress drop comparable to that of tectonic earthquakes in the central United States, which are dominantly strike-slip, but lower than that of tectonic earthquakes in eastern North America, which are dominantly reverse-faulting. This suggests that ground motion prediction equations developed for tectonic earthquakes can be applied to induced earthquakes if the effects of depth and faulting style are properly considered. Our observation leads to the notion that, like tectonic earthquakes, induced earthquakes are driven by tectonic stresses.

  3. Predicting successful long-term weight loss from short-term weight-loss outcomes: new insights from a dynamic energy balance model (the POUNDS Lost study).

    Science.gov (United States)

    Thomas, Diana M; Ivanescu, Andrada E; Martin, Corby K; Heymsfield, Steven B; Marshall, Kaitlyn; Bodrato, Victoria E; Williamson, Donald A; Anton, Stephen D; Sacks, Frank M; Ryan, Donna; Bray, George A

    2015-03-01

    Currently, early weight-loss predictions of long-term weight-loss success rely on fixed percent-weight-loss thresholds. The objective was to develop thresholds during the first 3 mo of intervention that include the influence of age, sex, baseline weight, percent weight loss, and deviations from expected weight to predict whether a participant is likely to lose 5% or more body weight by year 1. Data consisting of month 1, 2, 3, and 12 treatment weights were obtained from the 2-y Preventing Obesity Using Novel Dietary Strategies (POUNDS Lost) intervention. Logistic regression models that included covariates of age, height, sex, baseline weight, target energy intake, percent weight loss, and deviation of actual weight from expected were developed for months 1, 2, and 3 to predict the probability of losing 5% or more body weight by year 1 across treatment approaches during early intervention. © 2015 American Society for Nutrition.
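The prediction scheme above is an ordinary logistic regression on early-intervention covariates. A minimal sketch of the model form; the coefficients below are illustrative placeholders, not the fitted POUNDS Lost values:

```python
import math

def p_success(pct_wl_month3, age, sex_male, baseline_kg,
              b0=-2.0, b_wl=0.9, b_age=0.01, b_sex=0.2, b_wt=-0.005):
    """Probability of achieving >= 5% body-weight loss by year 1,
    from month-3 covariates, via a logistic model:
        p = 1 / (1 + exp(-(b0 + b_wl*wl + b_age*age + ...)))
    All coefficient values here are hypothetical."""
    z = (b0 + b_wl * pct_wl_month3 + b_age * age
         + b_sex * sex_male + b_wt * baseline_kg)
    return 1.0 / (1.0 + math.exp(-z))

# Same hypothetical participant, fast vs. slow early weight loss.
p_fast = p_success(pct_wl_month3=6.0, age=45, sex_male=1, baseline_kg=95)
p_slow = p_success(pct_wl_month3=1.0, age=45, sex_male=1, baseline_kg=95)
```

The advantage over a fixed percent-weight-loss threshold is that the same early weight loss can yield different success probabilities depending on age, sex, and baseline weight.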

  4. Soluble Co-Signaling Molecules Predict Long-Term Graft Outcome in Kidney-Transplanted Patients

    Science.gov (United States)

    Melendreras, Susana G.; Martínez-Camblor, Pablo; Menéndez, Aurora; Bravo-Mendoza, Cristina; González-Vidal, Ana; Coto, Eliecer; Díaz-Corte, Carmen; Ruiz-Ortega, Marta; López-Larrea, Carlos; Suárez-Álvarez, Beatriz

    2014-01-01

    Co-signaling molecules are responsible for full T-cell activation after solid organ transplantation. Their increased expression can lead to the release of a soluble form that can modulate the immune response post-transplantation. We analyzed the presence of co-signaling molecules (sCD30, sCD40, sCD137, sCTLA-4, sCD80, sCD28, sCD40L, sPD-1, and sPD-L1) in serum from kidney-transplanted patients (n = 59) obtained at different times (before transplantation, and 15 days, 3 months and 1 year post-transplantation), and their contribution to graft outcome was evaluated using principal component analysis. Before transplantation, high levels of soluble co-signaling molecules (mainly sCD30, sCD137 and sCD40) were detected in all patients. These molecules were modulated soon after receiving an allograft but never attained levels similar to those of healthy controls. A signature based on the determination of six soluble co-stimulatory (sCD30, sCD40, sCD137 and sCD40L) and co-inhibitory (sPD-1 and sPD-L1) molecules at 3 months post-transplantation allowed a group of patients (27.12%) with a worse long-term graft outcome to be identified. Patients with high levels of soluble molecules showed a progressive and gradual deterioration of kidney function (increased creatinine and proteinuria levels and decreased estimated glomerular filtration rate) over time, and a higher risk of graft loss at 6 years post-transplantation than patients with low levels of these molecules (62.55% versus 5.14%). We identified a profile of soluble co-signaling molecules in kidney-transplanted patients whose quantification at 3 months post-transplantation might be a useful biomarker of immune status and help to predict long-term graft evolution. PMID:25478957

  5. HAZGRIDX: earthquake forecasting model for ML≥ 5.0 earthquakes in Italy based on spatially smoothed seismicity

    Directory of Open Access Journals (Sweden)

    Aybige Akinci

    2010-11-01

    Full Text Available We present a five-year, time-independent earthquake-forecast model for earthquakes of magnitude 5.0 and greater in Italy, built from spatially smoothed seismicity data. The model, called HAZGRIDX, was developed on the assumption that future earthquakes will occur near the locations of historical earthquakes; it does not take into account any information from tectonic, geological, or geodetic data. HAZGRIDX is thus based on observed earthquake occurrence in seismicity data, without any physical model. In the present study we calculate earthquake rates on a spatial grid using two declustered catalogs: (1) the parametric catalog of Italian earthquakes (Catalogo Parametrico dei Terremoti Italiani, CPTI04), which contains the larger earthquakes, up to MW 7.0, since 1100; and (2) the Italian seismicity catalog (Catalogo della Sismicità Italiana, CSI 1.1), which contains the smaller earthquakes, down to ML 1.0 with a maximum of ML 5.9, over the past 22 years (1981-2003). The model assumes that earthquake magnitudes follow the Gutenberg-Richter law with a uniform b-value. The forecast rates are presented as the expected numbers of ML ≥ 5.0 events per year in each grid cell of about 10 km × 10 km. The final map is derived by averaging the earthquake potentials obtained from the two catalogs, CPTI04 and CSI 1.1. We also describe earthquake occurrence in terms of the probability of one event within a specified magnitude bin, ΔM = 0.1, in a five-year period. HAZGRIDX is one of several forecasting models, scaled to five and ten years, that have been submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) forecasting center at ETH Zurich to be tested for Italy.
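The smoothed-seismicity recipe behind models of this kind can be sketched in a few lines: spread each cataloged epicenter over the grid with a Gaussian kernel, then extrapolate the smoothed counts to the target magnitude with the Gutenberg-Richter law. This is an illustrative toy on a flat Cartesian grid with made-up events, not HAZGRIDX itself, and the kernel width, b-value, and catalog span are assumptions:

```python
import math

def smoothed_rates(events, grid, sigma_km=15.0, b=1.0,
                   m_min=2.0, m_target=5.0, years=22.0):
    """Annual expected number of M >= m_target events per grid cell
    from a catalog of (x_km, y_km, mag) epicenters.
    Each event spreads one count over the grid with a Gaussian kernel,
    normalized so the total count is preserved; rates are then scaled
    with the Gutenberg-Richter law:
        N(>= m_target) = N(>= m_min) * 10**(-b * (m_target - m_min))."""
    counts = [0.0] * len(grid)
    for (ex, ey, mag) in events:
        if mag < m_min:
            continue
        kernel = [math.exp(-((gx - ex) ** 2 + (gy - ey) ** 2)
                           / (2.0 * sigma_km ** 2)) for (gx, gy) in grid]
        total = sum(kernel)
        for i, k in enumerate(kernel):
            counts[i] += k / total
    gr = 10.0 ** (-b * (m_target - m_min))
    return [c * gr / years for c in counts]

# Toy 50 km x 50 km grid and a tiny synthetic catalog.
grid = [(x, y) for x in range(0, 50, 10) for y in range(0, 50, 10)]
events = [(12.0, 11.0, 3.1), (14.0, 9.0, 2.5), (40.0, 41.0, 2.2)]
rates = smoothed_rates(events, grid)
```

Because two of the three toy events cluster near (12, 11), the highest forecast rate falls in the nearby cell, which is the behavior the assumption "future earthquakes occur near past earthquakes" encodes.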

  6. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli