WorldWideScience

Sample records for short-term earthquake probability

  1. Short-term and long-term earthquake occurrence models for Italy: ETES, ERS and LTST

    Directory of Open Access Journals (Sweden)

    Maura Murru

    2010-11-01

    Full Text Available This study describes three earthquake occurrence models as applied to the whole Italian territory, to assess the occurrence probabilities of future (M ≥ 5.0) earthquakes: two short-term (24-hour) models, and one long-term (5 and 10 years) model. The first model for short-term forecasts is a purely stochastic epidemic-type earthquake sequence (ETES) model. The second short-term model is an epidemic rate-state (ERS) forecast based on a model that is physically constrained by applying the Dieterich rate-state constitutive law to earthquake clustering. The third forecast is based on a long-term stress transfer (LTST) model that considers the perturbations of earthquake probability for interacting faults by static Coulomb stress changes. These models have been submitted to the Collaboratory for the Study of Earthquake Predictability (CSEP) for forecast testing for Italy (ETH Zurich), and they were locked down to test their validity on real data in a future setting starting from August 1, 2009.

  2. Prospective testing of Coulomb short-term earthquake forecasts

    Science.gov (United States)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, lots can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be generated by computer without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of

  3. Long-term earthquake forecasts based on the epidemic-type aftershock sequence (ETAS) model for short-term clustering

    Directory of Open Access Journals (Sweden)

    Jiancang Zhuang

    2012-07-01

    Full Text Available Based on the ETAS (epidemic-type aftershock sequence) model, which is used for describing the features of short-term clustering of earthquake occurrence, this paper presents some theories and techniques related to evaluating the probability distribution of the maximum magnitude in a given space-time window, where the Gutenberg-Richter law for earthquake magnitude distribution cannot be directly applied. It is seen that the distribution of the maximum magnitude in a given space-time volume is determined in the long term by the background seismicity rate and the magnitude distribution of the largest events in each earthquake cluster. The techniques introduced were applied to the seismicity in the Japan region in the period from 1926 to 2009. It was found that the regions most likely to have big earthquakes are along the Tohoku (northeastern Japan) Arc and the Kuril Arc, both with much higher probabilities than the offshore Nankai and Tokai regions.
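
    The record above combines background seismicity with Gutenberg-Richter magnitudes to bound the largest event expected in a space-time window. Below is a minimal sketch of that core calculation; it assumes a plain Poisson background rather than the full ETAS machinery of the paper, and all parameter values are illustrative.

    ```python
    import numpy as np

    def prob_max_magnitude_below(m, rate_mc, b, m_c, duration_yr):
        """P(M_max < m) in a window, assuming a Poisson process with rate
        `rate_mc` (events/yr above m_c) and Gutenberg-Richter magnitudes.
        This is a simplification of the ETAS-based treatment in the paper."""
        # Expected number of events with magnitude >= m in the window
        n_ge_m = rate_mc * duration_yr * 10.0 ** (-b * (m - m_c))
        return np.exp(-n_ge_m)

    # Example: 5 events/yr above M4, b = 1, 10-year window (illustrative numbers)
    for m in (6.0, 7.0, 8.0):
        p_exceed = 1.0 - prob_max_magnitude_below(m, rate_mc=5.0, b=1.0,
                                                  m_c=4.0, duration_yr=10.0)
        print(f"P(at least one M>={m}) = {p_exceed:.3f}")
    ```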

  4. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated

  5. Impact of Short-term Changes In Earthquake Hazard on Risk In Christchurch, New Zealand

    Science.gov (United States)

    Nyst, M.

    2012-12-01

    The recent Mw 7.1, 4 September 2010 Darfield, and Mw 6.2, 22 February 2011 Christchurch, New Zealand earthquakes and the following aftershock activity completely changed the existing view on earthquake hazard of the Christchurch area. Not only have several faults been added to the New Zealand fault database, the main shocks were also followed by significant increases in seismicity due to high aftershock activity throughout the Christchurch region that is still ongoing. Probabilistic seismic hazard assessment (PSHA) models take into account a stochastic event set, the full range of possible events that can cause damage or loss at a particular location. This allows insurance companies to look at their risk profiles via average annual losses (AAL) and loss-exceedance curves. The loss-exceedance curve is derived from the full suite of seismic events that could impact the insured exposure and plots the probability of exceeding a particular loss level over a certain period. Insurers manage their risk by focusing on a certain return period exceedance benchmark, typically between the 100- and 250-year return period loss levels, and then reserve the amount of money needed to account for that return period loss level, their so-called capacity. This component of risk management is not too sensitive to short-term changes in risk due to aftershock seismicity, as it is mostly dominated by longer-return-period, larger-magnitude, more damaging events. However, because the secondary uncertainties are taken into account when calculating the exceedance probability, even the longer return period losses can still experience significant impact from the inclusion of time-dependent earthquake behavior. AAL is calculated by summing the product of the expected loss level and the annual rate for all events in the event set that cause damage or loss at a particular location. This relatively simple metric is an important factor in setting the annual premiums. By annualizing the expected losses
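
    To make the two risk metrics in this abstract concrete, here is a small sketch that computes the average annual loss (AAL) and a loss-exceedance curve from an event-loss table. The event set and loss values are invented for illustration and are not the vendor model discussed above.

    ```python
    import numpy as np

    # Hypothetical event-loss table: (annual rate, expected loss in millions)
    events = [
        (0.20,    5.0),
        (0.05,   50.0),
        (0.01,  400.0),
        (0.002, 2000.0),
    ]

    # Average annual loss: sum of annual rate times expected loss over all events
    aal = sum(rate * loss for rate, loss in events)
    print(f"AAL = {aal:.2f} M")

    # Loss-exceedance curve: annual probability of exceeding a loss threshold,
    # treating event occurrences as independent Poisson processes
    for threshold in (10.0, 100.0, 1000.0):
        rate_exceed = sum(rate for rate, loss in events if loss > threshold)
        prob_exceed = 1.0 - np.exp(-rate_exceed)  # per-year exceedance probability
        print(f"P(annual loss > {threshold:6.0f} M) = {prob_exceed:.4f}")
    ```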

  6. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
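
    A hedged sketch of the reasoning above: with Omori-decaying aftershocks of a previous mainshock added to the background, the chance that a new event is a foreshock drops, then recovers as the aftershocks decay. The rates and Omori parameters below are notional, not the values used by Jones (1994).

    ```python
    def omori_rate(t_days, k=50.0, c=0.05, p=1.1):
        """Aftershock rate (events/day) from a previous mainshock, Omori-Utsu law."""
        return k / (t_days + c) ** p

    def prob_is_foreshock(t_days, foreshock_rate=0.01, background_rate=0.1):
        """Probability that an earthquake occurring near the fault at time t
        (days after a previous mainshock) is a foreshock, illustrated as the
        foreshock rate divided by the total rate. Rates are notional."""
        total = foreshock_rate + background_rate + omori_rate(t_days)
        return foreshock_rate / total

    for t in (1, 10, 100, 1000):
        print(f"t = {t:4d} days: P(foreshock) = {prob_is_foreshock(t):.5f}")
    ```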

  7. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance, and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  8. The USGS plan for short-term prediction of the anticipated Parkfield earthquake

    Science.gov (United States)

    Bakun, W.H.

    1988-01-01

    Aside from the goal of better understanding the Parkfield earthquake cycle, it is the intention of the U.S. Geological Survey to attempt to issue a warning shortly before the anticipated earthquake. Although short-term earthquake warnings are not yet generally feasible, the wealth of information available for the previous significant Parkfield earthquakes suggests that if the next earthquake follows the pattern of "characteristic" Parkfield shocks, such a warning might be possible. Focusing on earthquake precursors reported for the previous "characteristic" shocks, particularly the 1934 and 1966 events, the USGS developed a plan in late 1985 on which to base earthquake warnings for Parkfield and has assisted State, county, and local officials in the Parkfield area to prepare a coordinated, reasonable response to a warning, should one be issued.

  9. Four Examples of Short-Term and Imminent Prediction of Earthquakes

    Science.gov (United States)

    Zeng, Zuoxun; Liu, Genshen; Wu, Dabin; Sibgatulin, Victor

    2014-05-01

    We show here 4 examples of short-term and imminent prediction of earthquakes in China last year. They are the Nima Earthquake (Ms5.2), Minxian Earthquake (Ms6.6), Nantou Earthquake (Ms6.7) and Dujiangyan Earthquake (Ms4.1). Imminent prediction of the Nima Earthquake (Ms5.2): based on a comprehensive analysis of the prediction of Victor Sibgatulin using natural electromagnetic pulse anomalies, the prediction of Song Song and Song Kefu using observation of a precursory halo, and an observation of the locations of a degasification of the earth in Naqu, Tibet, by Zeng Zuoxun himself, the first author made a prediction for an earthquake around Ms 6 within 10 days in the area of the degasification point (31.5N, 89.0E) at 0:54 on May 8th, 2013. He supplied another degasification point (31N, 86E) for the epicenter prediction at 8:34 of the same day. At 18:54:30 on May 15th, 2013, an earthquake of Ms5.2 occurred in Nima County, Naqu, China. Imminent prediction of the Minxian Earthquake (Ms6.6): at 7:45 on July 22nd, 2013, an earthquake of magnitude Ms6.6 occurred at the border between Minxian and Zhangxian of Dingxi City (34.5N, 104.2E), Gansu Province. We review the imminent prediction process and basis for the earthquake using the fingerprint method. Nine-channel or 15-channel anomalous component-time curves can be output from the SW monitor for earthquake precursors. These components include geomagnetism, geoelectricity, crustal stresses, resonance, and crustal inclination. When we compress the time axis, the output curves become different geometric images. The precursor images differ for earthquakes in different regions; alike or similar images correspond to earthquakes in a certain region. According to the 7-year observation of the precursor images and their corresponding earthquakes, we usually get the fingerprint 6 days before the corresponding earthquakes. The magnitude prediction needs the comparison between the amplitudes of the fingerprints from the same

  10. Fundamental questions of earthquake statistics, source behavior, and the estimation of earthquake probabilities from possible foreshocks

    Science.gov (United States)

    Michael, Andrew J.

    2012-01-01

    Estimates of the probability that an ML 4.8 earthquake, which occurred near the southern end of the San Andreas fault on 24 March 2009, would be followed by an M 7 mainshock over the following three days vary from 0.0009 using a Gutenberg–Richter model of aftershock statistics (Reasenberg and Jones, 1989) to 0.04 using a statistical model of foreshock behavior and long‐term estimates of large earthquake probabilities, including characteristic earthquakes (Agnew and Jones, 1991). I demonstrate that the disparity between the existing approaches depends on whether or not they conform to Gutenberg–Richter behavior. While Gutenberg–Richter behavior is well established over large regions, it could be violated on individual faults if they have characteristic earthquakes or over small areas if the spatial distribution of large‐event nucleations is disproportional to the rate of smaller events. I develop a new form of the aftershock model that includes characteristic behavior and combines the features of both models. This new model and the older foreshock model yield the same results when given the same inputs, but the new model has the advantage of producing probabilities for events of all magnitudes, rather than just for events larger than the initial one. Compared with the aftershock model, the new model has the advantage of taking into account long‐term earthquake probability models. Using consistent parameters, the probability of an M 7 mainshock on the southernmost San Andreas fault is 0.0001 for three days from long‐term models and the clustering probabilities following the ML 4.8 event are 0.00035 for a Gutenberg–Richter distribution and 0.013 for a characteristic‐earthquake magnitude–frequency distribution. Our decisions about the existence of characteristic earthquakes and how large earthquakes nucleate have a first‐order effect on the probabilities obtained from short‐term clustering models for these large events.
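
    For the Gutenberg-Richter branch of this comparison, a minimal sketch in the spirit of the Reasenberg and Jones (1989) aftershock model is shown below. The generic parameter values are assumptions, not those used in the paper, but they yield a probability of the same order as the 0.0009 quoted above for an M ≥ 7 mainshock within three days of an M 4.8 event.

    ```python
    import numpy as np
    from scipy.integrate import quad

    def expected_triggered(m_main, m_target, t1, t2,
                           a=-1.67, b=0.91, c=0.05, p=1.08):
        """Expected number of triggered events with M >= m_target in [t1, t2] days
        after an event of magnitude m_main, Reasenberg & Jones (1989) form.
        The generic parameter values are illustrative assumptions."""
        rate = lambda t: 10.0 ** (a + b * (m_main - m_target)) / (t + c) ** p
        n, _ = quad(rate, t1, t2)
        return n

    # Probability of an M>=7 mainshock within 3 days of an M4.8 candidate foreshock
    n = expected_triggered(m_main=4.8, m_target=7.0, t1=0.0, t2=3.0)
    print(f"P(M>=7 within 3 days) ~ {1.0 - np.exp(-n):.5f}")
    ```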

  11. Short-term volcano-tectonic earthquake forecasts based on a moving mean recurrence time algorithm: the El Hierro seismo-volcanic crisis experience

    Science.gov (United States)

    García, Alicia; De la Cruz-Reyna, Servando; Marrero, José M.; Ortiz, Ramón

    2016-05-01

    Under certain conditions, volcano-tectonic (VT) earthquakes may pose significant hazards to people living in or near active volcanic regions, especially on volcanic islands; however, hazard arising from VT activity caused by localized volcanic sources is rarely addressed in the literature. The evolution of VT earthquakes resulting from a magmatic intrusion shows some orderly behaviour that may allow the occurrence and magnitude of major events to be forecast. Thus governmental decision makers can be supplied with warnings of the increased probability of larger-magnitude earthquakes on the short-term timescale. We present here a methodology for forecasting the occurrence of large-magnitude VT events during volcanic crises; it is based on a mean recurrence time (MRT) algorithm that translates the Gutenberg-Richter distribution parameter fluctuations into time windows of increased probability of a major VT earthquake. The MRT forecasting algorithm was developed after observing a repetitive pattern in the seismic swarm episodes occurring between July and November 2011 at El Hierro (Canary Islands). From then on, this methodology has been applied to the consecutive seismic crises registered at El Hierro, achieving a high success rate in the real-time forecasting, within 10-day time windows, of volcano-tectonic earthquakes.
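
    One plausible reading of the MRT idea is sketched below under stated assumptions: a moving-window Aki maximum-likelihood b-value estimate, extrapolation of the rate of M ≥ m_target events, and a simple alert threshold on the implied mean recurrence time. This is not the authors' published algorithm; the function, parameter names and values are hypothetical.

    ```python
    import numpy as np

    def mrt_forecast(mags, times, m_c=1.5, m_target=4.0, window=200, mrt_alert_days=10.0):
        """Moving-window sketch loosely inspired by the MRT idea in the abstract.
        `mags` and `times` (days) are a chronologically ordered catalogue above m_c.
        Returns the times at which the implied mean recurrence time of M >= m_target
        events drops below the alert threshold. All constants are illustrative."""
        alerts = []
        for i in range(window, len(mags)):
            m_win = mags[i - window:i]
            t_win = times[i - window:i]
            b = np.log10(np.e) / (np.mean(m_win) - m_c)      # Aki (1965) b-value estimator
            span_days = t_win[-1] - t_win[0]
            rate_mc = window / max(span_days, 1e-6)           # events/day above m_c
            rate_target = rate_mc * 10.0 ** (-b * (m_target - m_c))
            mrt_days = 1.0 / rate_target
            if mrt_days < mrt_alert_days:
                alerts.append(times[i])
        return alerts

    # Usage: alerts = mrt_forecast(catalog_mags, catalog_times_in_days)
    ```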

  12. Assigning probability gain for precursors of four large Chinese earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Cao, T.; Aki, K.

    1983-03-10

    We extend the concept of probability gain associated with a precursor (Aki, 1981) to a set of precursors which may be mutually dependent. Making use of a new formula, we derive a criterion for selecting precursors from a given data set in order to calculate the probability gain. The probabilities per unit time immediately before four large Chinese earthquakes are calculated. They are approximately 0.09, 0.09, 0.07 and 0.08 per day for the 1975 Haicheng (M = 7.3), 1976 Tangshan (M = 7.8), 1976 Longling (M = 7.6), and 1976 Songpan (M = 7.2) earthquakes, respectively. These results are encouraging because they suggest that the investigated precursory phenomena may have included the complete information for earthquake prediction, at least for the above earthquakes. With this method, the step-by-step approach to prediction used in China may be quantified in terms of the probability of earthquake occurrence. The ln P versus t curve (where P is the probability of earthquake occurrence at time t) shows that ln P does not increase with t linearly but more rapidly as the time of the earthquake approaches.
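
    The single-precursor probability-gain idea of Aki (1981), which this paper generalizes to mutually dependent precursors, reduces to multiplying the unconditional rate by the individual gains when precursors are treated as independent. The numbers in the sketch below are invented for illustration; they merely show how modest gains compound to probabilities per day of the order reported above.

    ```python
    # Independent-precursor special case of the probability-gain approach.
    # The rate and gain values below are made up for illustration only.
    p0 = 1.0e-4                 # unconditional probability of a large earthquake per day
    gains = [30.0, 10.0, 3.0]   # probability gains of three (assumed independent) precursors

    p = p0
    for g in gains:
        p *= g
    print(f"Probability per day given all precursors ~ {p:.3f}")   # ~0.09, cf. the paper
    ```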

  13. Feasibility study of short-term earthquake prediction using ionospheric anomalies immediately before large earthquakes

    Science.gov (United States)

    Heki, K.; He, L.

    2017-12-01

    We showed that positive and negative electron density anomalies emerge above faults immediately before they rupture, 40/20/10 minutes before Mw9/8/7 earthquakes (Heki, 2011 GRL; Heki and Enomoto, 2013 JGR; He and Heki 2017 JGR). These signals are stronger for earthquakes with larger Mw and under higher background vertical TEC (total electron content) (Heki and Enomoto, 2015 JGR). The epicenter, the positive and the negative anomalies align along the local geomagnetic field (He and Heki, 2016 GRL), suggesting electric fields within the ionosphere are responsible for making the anomalies (Kuo et al., 2014 JGR; Kelley et al., 2017 JGR). Here we consider the next Nankai Trough earthquake, which may occur within a few tens of years in Southwest Japan, and discuss whether we can recognize its preseismic signatures in TEC by real-time observations with GNSS. During high geomagnetic activity, large-scale traveling ionospheric disturbances (LSTID) often propagate from auroral ovals toward mid-latitude regions, and leave signatures similar to preseismic anomalies. This is a main obstacle to using preseismic TEC changes for practical short-term earthquake prediction. In this presentation, we show that the same anomalies appeared 40 minutes before the mainshock above northern Australia, the geomagnetically conjugate point of the 2011 Tohoku-oki earthquake epicenter. This not only demonstrates that electric fields play a role in making the preseismic TEC anomalies, but also offers a possibility to discriminate preseismic anomalies from those caused by LSTID. By monitoring TEC in the conjugate areas in the two hemispheres, we can recognize anomalies with simultaneous onset as those caused by within-ionosphere electric fields (e.g. preseismic anomalies, night-time MSTID) and anomalies without simultaneous onset as gravity-wave origin disturbances (e.g. LSTID, daytime MSTID).

  14. VAN method of short-term earthquake prediction shows promise

    Science.gov (United States)

    Uyeda, Seiya

    Although optimism prevailed in the 1970s, the present consensus on earthquake prediction appears to be quite pessimistic. However, short-term prediction based on geoelectric potential monitoring has stood the test of time in Greece for more than a decade [Varotsos and Kulhanek, 1993; Lighthill, 1996]. The method used is called the VAN method. The geoelectric potential changes constantly due to causes such as magnetotelluric effects, lightning, rainfall, leakage from manmade sources, and electrochemical instabilities of electrodes. All of this noise must be eliminated before preseismic signals are identified, if they exist at all. The VAN group apparently accomplished this task for the first time. They installed multiple short (100-200 m) dipoles with different lengths in both north-south and east-west directions and long (1-10 km) dipoles in appropriate orientations at their stations (one of their mega-stations, Ioannina, for example, now has 137 dipoles in operation) and found that practically all of the noise could be eliminated by applying a set of criteria to the data.

  15. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    Science.gov (United States)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    The epidemic-type aftershock sequence (ETAS) models have been experimentally used to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake, the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region provided typical features of the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.

  16. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    Science.gov (United States)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue", "in the next 50 years, there is a 1-in-10 chance a "really big one" will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago, than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record, or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
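
    The spreadsheet exercise described above amounts to a conditional-probability calculation under a chosen recurrence model. A short sketch under assumed parameters (Gaussian recurrence times; the means and standard deviations are illustrative, not the paper's values) shows how the in-cluster and full-record assumptions give very different 50-year probabilities.

    ```python
    from scipy.stats import norm

    def cond_prob_next(mu, sigma, elapsed, horizon=50.0):
        """P(event within `horizon` yr | none in the `elapsed` yr since the last one),
        for a Gaussian recurrence-time model."""
        f = norm(mu, sigma).cdf
        return (f(elapsed + horizon) - f(elapsed)) / (1.0 - f(elapsed))

    # Illustrative recurrence parameters (assumed, not the paper's values):
    # within-cluster behaviour vs. the full 10,000-year record
    elapsed = 320.0   # approximate years since the 1700 Cascadia earthquake
    print("in-cluster :", cond_prob_next(mu=330.0, sigma=120.0, elapsed=elapsed))
    print("full record:", cond_prob_next(mu=530.0, sigma=270.0, elapsed=elapsed))
    ```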

  17. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and

  18. Multiparameter monitoring of short-term earthquake precursors and its physical basis. Implementation in the Kamchatka region

    Directory of Open Access Journals (Sweden)

    Pulinets Sergey

    2016-01-01

    Full Text Available We apply an experimental approach to the multiparameter monitoring of short-term earthquake precursors whose reliability was confirmed by the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model created recently [1]. A key element of the model is the process of ion-induced nucleation (IIN) and the formation of cluster ions, occurring as a result of the ionization of the near-surface air layer by radon emanating from the Earth's crust within the earthquake preparation zone. This process is similar to the formation of droplet embryos for cloud formation under the action of galactic cosmic rays. The consequence of this process is the generation of a number of precursors that can be divided into two groups: (a) thermal and meteorological, and (b) electromagnetic and ionospheric. We demonstrate elements of prospective monitoring of some strong earthquakes in the Kamchatka region and statistical results for the chemical potential correction parameter for more than 10 years of observations for earthquakes with M ≥ 6. As an experimental attempt, data from the monitoring of Kamchatka volcanoes will also be demonstrated.

  19. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran Kumar; Mai, Paul Martin

    2016-01-01

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.

  20. Evidence for Truncated Exponential Probability Distribution of Earthquake Slip

    KAUST Repository

    Thingbaijam, Kiran K. S.

    2016-07-13

    Earthquake ruptures comprise spatially varying slip on the fault surface, where slip represents the displacement discontinuity between the two sides of the rupture plane. In this study, we analyze the probability distribution of coseismic slip, which provides important information to better understand earthquake source physics. Although the probability distribution of slip is crucial for generating realistic rupture scenarios for simulation-based seismic and tsunami-hazard analysis, the statistical properties of earthquake slip have received limited attention so far. Here, we use the online database of earthquake source models (SRCMOD) to show that the probability distribution of slip follows the truncated exponential law. This law agrees with rupture-specific physical constraints limiting the maximum possible slip on the fault, similar to physical constraints on maximum earthquake magnitudes. We show that the parameters of the best-fitting truncated exponential distribution scale with average coseismic slip. This scaling property reflects the control of the underlying stress distribution and fault strength on the rupture dimensions, which determines the average slip. Thus, the scale-dependent behavior of slip heterogeneity is captured by the probability distribution of slip. We conclude that the truncated exponential law accurately quantifies coseismic slip distribution and therefore allows for more realistic modeling of rupture scenarios. © 2016, Seismological Society of America. All rights reserved.
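
    As an illustration of the distribution named in these two records, the sketch below draws coseismic slip values from a truncated exponential law by inverse-transform sampling. The scale and truncation values are illustrative, not fits to the SRCMOD database.

    ```python
    import numpy as np

    def sample_truncated_exponential(mean_free, max_slip, size, rng=None):
        """Draw slip values from an exponential law truncated at `max_slip`
        (inverse-transform sampling). `mean_free` is the scale of the
        untruncated exponential; both values here are illustrative."""
        rng = np.random.default_rng(rng)
        u = rng.uniform(size=size)
        # CDF of the truncated exponential: F(x) = (1 - exp(-x/s)) / (1 - exp(-xmax/s))
        s, xmax = mean_free, max_slip
        return -s * np.log(1.0 - u * (1.0 - np.exp(-xmax / s)))

    slips = sample_truncated_exponential(mean_free=1.5, max_slip=8.0, size=100000, rng=42)
    print(f"mean slip = {slips.mean():.2f} m, max slip = {slips.max():.2f} m")
    ```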

  1. Long‐term time‐dependent probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3)

    Science.gov (United States)

    Field, Edward; Biasi, Glenn P.; Bird, Peter; Dawson, Timothy E.; Felzer, Karen R.; Jackson, David A.; Johnson, Kaj M.; Jordan, Thomas H.; Madden, Christopher; Michael, Andrew J.; Milner, Kevin; Page, Morgan T.; Parsons, Thomas E.; Powers, Peter; Shaw, Bruce E.; Thatcher, Wayne R.; Weldon, Ray J.; Zeng, Yuehua

    2015-01-01

    The 2014 Working Group on California Earthquake Probabilities (WGCEP 2014) presents time-dependent earthquake probabilities for the third Uniform California Earthquake Rupture Forecast (UCERF3). Building on the UCERF3 time-independent model, published previously, renewal models are utilized to represent elastic-rebound-implied probabilities. A new methodology has been developed that solves applicability issues in the previous approach for un-segmented models. The new methodology also supports magnitude-dependent aperiodicity and accounts for the historic open interval on faults that lack a date-of-last-event constraint. Epistemic uncertainties are represented with a logic tree, producing 5,760 different forecasts. Results for a variety of evaluation metrics are presented, including logic-tree sensitivity analyses and comparisons to the previous model (UCERF2). For 30-year M≥6.7 probabilities, the most significant changes from UCERF2 are a threefold increase on the Calaveras fault and a threefold decrease on the San Jacinto fault. Such changes are due mostly to differences in the time-independent models (e.g., fault slip rates), with relaxation of segmentation and inclusion of multi-fault ruptures being particularly influential. In fact, some UCERF2 faults were simply too long to produce M 6.7 sized events given the segmentation assumptions in that study. Probability model differences are also influential, with the implied gains (relative to a Poisson model) being generally higher in UCERF3. Accounting for the historic open interval is one reason. Another is an effective 27% increase in the total elastic-rebound-model weight. The exact factors influencing differences between UCERF2 and UCERF3, as well as the relative importance of logic-tree branches, vary throughout the region, and depend on the evaluation metric of interest. For example, M≥6.7 probabilities may not be a good proxy for other hazard or loss measures. This sensitivity, coupled with the
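
    One common renewal choice in WGCEP-style forecasts is the Brownian passage time (inverse Gaussian) distribution. The sketch below computes a 30-year conditional rupture probability from such a model; the mean recurrence interval, aperiodicity, and open interval are illustrative values, not UCERF3 parameters.

    ```python
    from scipy.stats import invgauss

    def bpt_conditional_prob(mean_ri, aperiodicity, open_interval, horizon=30.0):
        """30-yr conditional rupture probability from a Brownian passage time
        (inverse Gaussian) renewal model. Example values are illustrative."""
        # scipy's invgauss(mu, scale): mean = mu*scale, shape lambda = scale.
        # BPT with mean T and aperiodicity a  <=>  mu = a**2, scale = T / a**2.
        dist = invgauss(mu=aperiodicity ** 2, scale=mean_ri / aperiodicity ** 2)
        f = dist.cdf
        t = open_interval
        return (f(t + horizon) - f(t)) / (1.0 - f(t))

    print(bpt_conditional_prob(mean_ri=200.0, aperiodicity=0.4, open_interval=150.0))
    ```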

  2. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    Science.gov (United States)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. In some portions of the simulated earthquake history, events would
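
    A toy realization of the LTFM idea described above: hazard grows with accumulated strain, and each event releases only part of that strain, so the probability does not reset to zero. All constants below are illustrative assumptions, not calibrated values.

    ```python
    import numpy as np

    def simulate_ltfm(n_years=20000, load_rate=1.0, release_fraction=0.7,
                      prob_scale=2.0e-4, seed=0):
        """Toy Long-Term Fault Memory simulation: earthquake probability grows
        with accumulated strain, and each event releases only part of that
        strain, so the post-event probability does not drop to zero."""
        rng = np.random.default_rng(seed)
        strain, event_years = 0.0, []
        for year in range(n_years):
            strain += load_rate
            if rng.random() < prob_scale * strain:      # hazard increases with strain
                event_years.append(year)
                strain *= (1.0 - release_fraction)      # partial release: "memory" remains
        return np.diff(event_years)                     # inter-event times

    gaps = simulate_ltfm()
    print(f"{len(gaps) + 1} events, mean recurrence ~ {gaps.mean():.0f} yr")
    ```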

  3. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should

  4. Earthquake Probability Assessment for the Active Faults in Central Taiwan: A Case Study

    Directory of Open Access Journals (Sweden)

    Yi-Rui Lee

    2016-06-01

    Full Text Available Frequent high seismic activity occurs in Taiwan due to fast plate motions. According to the historical records, the most destructive earthquakes in Taiwan were caused mainly by inland active faults. The Central Geological Survey (CGS) of Taiwan has published active fault maps of Taiwan since 1998; 33 active faults are noted in the 2012 active fault map. After the Chi-Chi earthquake, the CGS launched a series of projects to investigate each active fault in Taiwan in detail. This article collected these data to develop active fault parameters and drew on experience from Japan and the United States to establish a methodology for earthquake probability assessment via active faults. We consider the active faults in Central Taiwan as a good example to present the earthquake probability assessment process and results. An appropriate "probability model" was used to estimate the conditional probability of M ≥ 6.5 and M ≥ 7.0 earthquakes. Our results show that the fault with the highest probability of an M ≥ 6.5 earthquake occurring in 30, 50, and 100 years in Central Taiwan is the Tachia-Changhua fault system, whereas the lowest probability is for the Chelungpu fault. The goal of our research is to calculate the earthquake probability of the 33 active faults in Taiwan. The active fault parameters are important information that can be applied in subsequent seismic hazard analysis and seismic simulation.

  5. Foreshock sequences and short-term earthquake predictability on East Pacific Rise transform faults.

    Science.gov (United States)

    McGuire, Jeffrey J; Boettcher, Margaret S; Jordan, Thomas H

    2005-03-24

    East Pacific Rise transform faults are characterized by high slip rates (more than ten centimetres a year), predominantly aseismic slip and maximum earthquake magnitudes of about 6.5. Using recordings from a hydroacoustic array deployed by the National Oceanic and Atmospheric Administration, we show here that East Pacific Rise transform faults also have a low number of aftershocks and high foreshock rates compared to continental strike-slip faults. The high ratio of foreshocks to aftershocks implies that such transform-fault seismicity cannot be explained by seismic triggering models in which there is no fundamental distinction between foreshocks, mainshocks and aftershocks. The foreshock sequences on East Pacific Rise transform faults can be used to predict (retrospectively) earthquakes of magnitude 5.4 or greater, in narrow spatial and temporal windows and with a high probability gain. The predictability of such transform earthquakes is consistent with a model in which slow slip transients trigger earthquakes, enrich their low-frequency radiation and accommodate much of the aseismic plate motion.

  6. Time-dependent earthquake probability calculations for southern Kanto after the 2011 M9.0 Tohoku earthquake

    Science.gov (United States)

    Nanjo, K. Z.; Sakai, S.; Kato, A.; Tsuruoka, H.; Hirata, N.

    2013-05-01

    Seismicity in southern Kanto was activated by the 2011 March 11 Tohoku earthquake of magnitude M9.0, but does this cause a significant difference in the probability of more earthquakes at present or in the future? To answer this question, we examine the effect of a change in the seismicity rate on the probability of earthquakes. Our data set is from the Japan Meteorological Agency earthquake catalogue, downloaded on 2012 May 30. Our approach is based on time-dependent earthquake probability calculations, often used for aftershock hazard assessment, which rely on two statistical laws: the Gutenberg-Richter (GR) frequency-magnitude law and the Omori-Utsu (OU) aftershock-decay law. We first confirm that the seismicity following a quake of M4 or larger is well modelled by the GR law with b ˜ 1. There is also good agreement with the OU law with p ˜ 0.5, which indicates notably slow decay. Based on these results, we then calculate the most probable estimates of future M6-7-class events for various periods, all with a starting date of 2012 May 30. The estimates are higher than pre-quake levels if we consider a period of 3-yr duration or shorter. However, for statistics-based forecasting such as this, errors that arise from parameter estimation must be considered. Taking into account the contribution of these errors to the probability calculations, we conclude that any increase in the probability of earthquakes is insignificant. Although we try to avoid overstating the change in probability, our observations combined with results from previous studies support the likelihood that afterslip (fault creep) in southern Kanto will slowly relax a stress step caused by the Tohoku earthquake. This afterslip in turn reminds us of the potential for stress redistribution to the surrounding regions. We note the importance of varying hazards not only in time but also in space to improve the probabilistic seismic hazard assessment for southern Kanto.
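
    The probability calculation described above combines an Omori-Utsu rate for events above a reference magnitude with Gutenberg-Richter scaling to the target magnitude, then converts the expected count to a probability. A sketch under placeholder parameters is shown below; the paper reports b ˜ 1 and p ˜ 0.5, but the productivity constant and time window are invented for illustration.

    ```python
    import numpy as np

    def prob_m_ge(m_target, t_start, t_end, k=10.0, c=0.1, p=0.5, b=1.0, m_ref=4.0):
        """Probability of at least one M >= m_target event in [t_start, t_end] (days),
        combining an Omori-Utsu rate for M >= m_ref with Gutenberg-Richter scaling.
        Parameter values are placeholders, not those estimated in the paper."""
        # Expected number of M >= m_ref events: integral of k / (t + c)**p
        if p == 1.0:
            n_ref = k * (np.log(t_end + c) - np.log(t_start + c))
        else:
            n_ref = k * ((t_end + c) ** (1 - p) - (t_start + c) ** (1 - p)) / (1 - p)
        n_target = n_ref * 10.0 ** (-b * (m_target - m_ref))
        return 1.0 - np.exp(-n_target)

    # Probability of an M >= 7 event in the following year, starting roughly
    # 450 days after the mainshock (illustrative window)
    print(f"{prob_m_ge(7.0, 450.0, 815.0):.3f}")
    ```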

  7. Development of damage probability matrices based on Greek earthquake damage data

    Science.gov (United States)

    Eleftheriadou, Anastasia K.; Karabinis, Athanasios I.

    2011-03-01

    A comprehensive study is presented for empirical seismic vulnerability assessment of typical structural types, representative of the building stock of Southern Europe, based on a large set of damage statistics. The observational database was obtained from post-earthquake surveys carried out in the area struck by the September 7, 1999 Athens earthquake. After analysis of the collected observational data, a unified damage database was created which comprises 180,945 damaged buildings from the near-field area of the earthquake. The damaged buildings are classified into specific structural types, according to the materials, seismic codes and construction techniques in Southern Europe. The seismic demand is described in terms of both the regional macroseismic intensity and the ratio αg/ao, where αg is the maximum peak ground acceleration (PGA) of the earthquake event and ao is the unique PGA value that characterizes each municipality shown on the Greek hazard map. The relative and cumulative frequencies of the different damage states for each structural type and each intensity level are computed in terms of damage ratio. Damage probability matrices (DPMs) and vulnerability curves are obtained for specific structural types. A comparative analysis is performed between the produced and the existing vulnerability models.
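
    A damage probability matrix of the kind described above is, at its core, a table of relative and cumulative damage-state frequencies per intensity level. The sketch below builds one from invented survey counts; the damage states and numbers are illustrative only.

    ```python
    import numpy as np

    # Hypothetical survey counts: rows are damage states D0..D3, columns are intensity levels
    counts = np.array([
        [500, 300, 120],   # D0: no damage
        [350, 380, 260],   # D1: slight
        [120, 240, 350],   # D2: moderate
        [ 30,  80, 270],   # D3: heavy / collapse
    ], dtype=float)

    relative = counts / counts.sum(axis=0)                 # P(damage state | intensity)
    cumulative = np.cumsum(relative[::-1], axis=0)[::-1]   # P(damage >= state | intensity)

    np.set_printoptions(precision=2, suppress=True)
    print("DPM (relative frequencies):\n", relative)
    print("P(damage >= state):\n", cumulative)
    ```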

  8. Conditional Probabilities of Large Earthquake Sequences in California from the Physics-based Rupture Simulator RSQSim

    Science.gov (United States)

    Gilchrist, J. J.; Jordan, T. H.; Shaw, B. E.; Milner, K. R.; Richards-Dinger, K. B.; Dieterich, J. H.

    2017-12-01

    Within the SCEC Collaboratory for Interseismic Simulation and Modeling (CISM), we are developing physics-based forecasting models for earthquake ruptures in California. We employ the 3D boundary element code RSQSim (Rate-State Earthquake Simulator of Dieterich & Richards-Dinger, 2010) to generate synthetic catalogs with tens of millions of events that span up to a million years each. This code models rupture nucleation by rate- and state-dependent friction and Coulomb stress transfer in complex, fully interacting fault systems. The Uniform California Earthquake Rupture Forecast Version 3 (UCERF3) fault and deformation models are used to specify the fault geometry and long-term slip rates. We have employed the Blue Waters supercomputer to generate long catalogs of simulated California seismicity from which we calculate the forecasting statistics for large events. We have performed probabilistic seismic hazard analysis with RSQSim catalogs that were calibrated with system-wide parameters and found a remarkably good agreement with UCERF3 (Milner et al., this meeting). We build on this analysis, comparing the conditional probabilities of sequences of large events from RSQSim and UCERF3. In making these comparisons, we consider the epistemic uncertainties associated with the RSQSim parameters (e.g., rate- and state-frictional parameters), as well as the effects of model-tuning (e.g., adjusting the RSQSim parameters to match UCERF3 recurrence rates). The comparisons illustrate how physics-based rupture simulators might assist forecasters in understanding the short-term hazards of large aftershocks and multi-event sequences associated with complex, multi-fault ruptures.

  9. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for investigating these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in the northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore the effects of topographic loading and of the viscosity of the middle-lower crust and upper mantle on the model results. Model results show that earthquakes and fault interactions increase Coulomb stress on neighboring faults or segments, accelerating future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults that are far apart may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model's synthetic seismic catalog and paleoseismic data, we analyze the probability of earthquake migration between major faults in the northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M ≥ 7) in the northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.

  10. Growth of verbal short-term memory of nonwords varying in phonotactic probability : A longitudinal study with monolingual and bilingual children

    NARCIS (Netherlands)

    Messer, Marielle H.|info:eu-repo/dai/nl/304835226; Verhagen, Josje|info:eu-repo/dai/nl/277955882; Boom, Jan|info:eu-repo/dai/nl/07472732X; Mayo, Aziza Y.|info:eu-repo/dai/nl/271313404; Leseman, Paul P M|info:eu-repo/dai/nl/070760810

    2015-01-01

    This study investigates the hypothesis that verbal short-term memory growth in young children can be explained by increases in long-term linguistic knowledge. To this aim, we compare children's recall of nonwords varying in phonotactic probability. If our assumption holds, there should be growth in

  11. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    Energy Technology Data Exchange (ETDEWEB)

    Lu, S; Streit, R D; Chou, C K

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)

  12. Large LOCA-earthquake combination probability assessment - Load combination program. Project 1 summary report

    International Nuclear Information System (INIS)

    Lu, S.; Streit, R.D.; Chou, C.K.

    1980-01-01

    This report summarizes work performed for the U.S. Nuclear Regulatory Commission (NRC) by the Load Combination Program at the Lawrence Livermore National Laboratory to establish a technical basis for the NRC to use in reassessing its requirement that earthquake and large loss-of-coolant accident (LOCA) loads be combined in the design of nuclear power plants. A systematic probabilistic approach is used to treat the random nature of earthquake and transient loading to estimate the probability of large LOCAs that are directly and indirectly induced by earthquakes. A large LOCA is defined in this report as a double-ended guillotine break of the primary reactor coolant loop piping (the hot leg, cold leg, and crossover) of a pressurized water reactor (PWR). Unit 1 of the Zion Nuclear Power Plant, a four-loop PWR-1, is used for this study. To estimate the probability of a large LOCA directly induced by earthquakes, only fatigue crack growth resulting from the combined effects of thermal, pressure, seismic, and other cyclic loads is considered. Fatigue crack growth is simulated with a deterministic fracture mechanics model that incorporates stochastic inputs of initial crack size distribution, material properties, stress histories, and leak detection probability. Results of the simulation indicate that the probability of a double-ended guillotine break, either with or without an earthquake, is very small (on the order of 10⁻¹²). The probability of a leak was found to be several orders of magnitude greater than that of a complete pipe rupture. A limited investigation involving engineering judgment of a double-ended guillotine break indirectly induced by an earthquake is also reported. (author)

  13. Short-Term Robustness of Production Management Systems : New Methodology

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Gaury, E.G.A.

    2000-01-01

    This paper investigates the short-term robustness of production planning and control systems. This robustness is defined here as the system's ability to maintain short-term service probabilities (i.e., the probability that the fill rate remains within a prespecified range), in a variety of

  14. Statistical short-term earthquake prediction.

    Science.gov (United States)

    Kagan, Y Y; Knopoff, L

    1987-06-19

    A statistical procedure, derived from a theoretical model of fracture growth, is used to identify a foreshock sequence while it is in progress. As a predictor, the procedure reduces the average uncertainty in the rate of occurrence for a future strong earthquake by a factor of more than 1000 when compared with the Poisson rate of occurrence. About one-third of all main shocks with local magnitude greater than or equal to 4.0 in central California can be predicted in this way, starting from a 7-year database that has a lower magnitude cut off of 1.5. The time scale of such predictions is of the order of a few hours to a few days for foreshocks in the magnitude range from 2.0 to 5.0.

  15. Diagnosis of time of increased probability of volcanic earthquakes at Mt. Vesuvius zone

    CERN Document Server

    Rotwain, I; Kuznetsov, I V; Panza, G F; Peresan, A

    2003-01-01

    The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the algorithm CN is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M0, within a region a priori delimited. Here the algorithm CN is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius, during the period from February 1972 to October 2002, are considered and the magnitude threshold M0, selecting the events to be predicted, is varied within the range: 3.0 - 3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor 2.5 - 3, with respect to the standard version of CN algorithm, more than 90% of the events with M ≥ M0 occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The co...

  16. Framing of decision problem in short and long term and probability perception

    Directory of Open Access Journals (Sweden)

    Anna Wielicka-Regulska

    2010-01-01

    Consumer preferences depend on problem framing and time perspective. For the experiment's participants, avoiding losses was less probable in a distant time perspective than in the near term. Conversely, achieving gains in the near future was less probable than in the remote future. Different reactions can therefore be expected when a problem is presented in terms of gains rather than in terms of losses. This can be exploited in the promotion of highly desired social behaviours such as saving for retirement, keeping a good diet, investing in learning, and other advantageous activities that are usually put forward by consumers.

  17. An Ensemble Approach for Improved Short-to-Intermediate-Term Seismic Potential Evaluation

    Science.gov (United States)

    Yu, Huaizhong; Zhu, Qingyong; Zhou, Faren; Tian, Lei; Zhang, Yongxian

    2017-06-01

    Pattern informatics (PI), load/unload response ratio (LURR), state vector (SV), and accelerating moment release (AMR) are four previously unrelated subjects, which are sensitive, in varying ways, to the earthquake's source. Previous studies have indicated that the spatial extent of the stress perturbation caused by an earthquake scales with the moment of the event, allowing us to combine these methods for seismic hazard evaluation. The long-range earthquake forecasting method PI is applied to search for seismic hotspots and identify the areas where large earthquakes could be expected. The LURR and SV methods are then adopted to assess short-to-intermediate-term seismic potential in each of the critical regions derived from the PI hotspots, while the AMR method is used to provide asymptotic estimates of the time and magnitude of the potential earthquakes. This new approach, which combines the LURR, SV and AMR methods within the areas identified as PI hotspots, is devised to augment current techniques for seismic hazard estimation. Using the approach, we tested the strong earthquakes that occurred in the Yunnan-Sichuan region, China, between January 1, 2013 and December 31, 2014. We found that most of the large earthquakes, especially those with magnitude greater than 6.0, occurred in the predicted seismic hazard regions. Similar results have been obtained in the prediction of the annual earthquake tendency in the Chinese mainland in 2014 and 2015. These studies indicate that the ensemble approach could be a useful tool for detecting short-to-intermediate-term precursory information on future large earthquakes.

  18. Diagnosis of time of increased probability of volcanic earthquakes at Mt. Vesuvius zone

    International Nuclear Information System (INIS)

    Rotwain, I.; Kuznetsov, I.; De Natale, G.; Peresan, A.; Panza, G.F.

    2003-06-01

    The possibility of intermediate-term earthquake prediction at Mt. Vesuvius by means of the algorithm CN is explored. CN was originally designed to identify the Times of Increased Probability (TIPs) for the occurrence of strong tectonic earthquakes, with magnitude M ≥ M0, within a region a priori delimited. Here the algorithm CN is applied, for the first time, to the analysis of volcanic seismicity. The earthquakes recorded at Mt. Vesuvius, during the period from February 1972 to October 2002, are considered and the magnitude threshold M0, selecting the events to be predicted, is varied within the range: 3.0 - 3.3. Satisfactory prediction results are obtained, by retrospective analysis, when a time scaling is introduced. In particular, when the length of the time windows is reduced by a factor 2.5 - 3, with respect to the standard version of CN algorithm, more than 90% of the events with M ≥ M0 occur within the TIP intervals, with TIPs occupying about 30% of the total time considered. The control experiment 'Seismic History' demonstrates the stability of the obtained results and indicates that the algorithm CN can be applied to monitor the preparation of impending earthquakes with M ≥ 3.0 at Mt. Vesuvius. (author)

  19. First-passage Probability Estimation of an Earthquake Response of Seismically Isolated Containment Buildings

    International Nuclear Information System (INIS)

    Hahm, Dae-Gi; Park, Kwan-Soon; Koh, Hyun-Moo

    2008-01-01

    Awareness of seismic hazard and risk is increasing rapidly owing to the frequent occurrence of huge earthquakes such as the 2008 Sichuan earthquake, which caused about 70,000 confirmed casualties and an economic loss of about 20 billion U.S. dollars. Since an earthquake load naturally contains various uncertainties, the safety of a structural system under earthquake excitation has been assessed by probabilistic approaches. In many structural applications of probabilistic safety assessment, the failure of a system is regarded as occurring when the response of the structure first crosses a limit barrier within a specified interval of time. The determination of such a failure probability is usually called the 'first-passage problem' and has been studied extensively during the last few decades. However, especially for structures that show significant nonlinear dynamic behavior, an effective and accurate method for estimating such a failure probability is not yet fully established. In this study, we present a new approach to evaluate the first-passage probability of the earthquake response of seismically isolated structures. The proposed method is applied to the seismic isolation system of the containment buildings of a nuclear power plant. The numerical example verifies that the proposed method gives accurate results with less computational effort than the conventional approaches.
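
    As a point of reference for the 'first-passage problem' described above, the sketch below estimates the first-passage (barrier-crossing) probability of a linear single-degree-of-freedom oscillator under white-noise base excitation by brute-force Monte Carlo. It is only a toy stand-in for the isolated containment model: all parameter values are illustrative assumptions, and the abstract's point is precisely that more efficient estimators than this are needed for strongly nonlinear systems.

```python
import numpy as np

def first_passage_probability(n_sims=500, duration=20.0, dt=0.005,
                              omega=2 * np.pi * 0.5, zeta=0.15,
                              accel_std=2.0, barrier=0.1, seed=3):
    """Monte Carlo estimate of P(max |x(t)| > barrier) for a linear SDOF
    oscillator  x'' + 2*zeta*omega*x' + omega^2*x = -a_g(t)  driven by
    discrete Gaussian white-noise ground acceleration a_g.
    All parameter values are illustrative placeholders, not plant data."""
    rng = np.random.default_rng(seed)
    n_steps = int(duration / dt)
    failures = 0
    for _ in range(n_sims):
        x, v = 0.0, 0.0
        ag = rng.normal(0.0, accel_std, size=n_steps)
        crossed = False
        for k in range(n_steps):
            a = -ag[k] - 2 * zeta * omega * v - omega**2 * x  # equation of motion
            v += a * dt                                       # semi-implicit Euler step
            x += v * dt
            if abs(x) > barrier:                              # first passage detected
                crossed = True
                break
        failures += crossed
    return failures / n_sims

print(first_passage_probability())
```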

  20. Permanently enhanced dynamic triggering probabilities as evidenced by two M ≥ 7.5 earthquakes

    Science.gov (United States)

    Gomberg, Joan S.

    2013-01-01

    The 2012 M7.7 Haida Gwaii earthquake radiated waves that likely dynamically triggered the 2013 M7.5 Craig earthquake, setting two precedents. First, the triggered earthquake is the largest dynamically triggered shear failure event documented to date. Second, the events highlight a connection between geologic structure, sedimentary troughs that act as waveguides, and triggering probability. The Haida Gwaii earthquake excited extraordinarily large waves within and beyond the Queen Charlotte Trough, which propagated well into mainland Alaska and likely triggered the Craig earthquake along the way. Previously, focusing and associated dynamic triggering have been attributed to unpredictable source effects. This case suggests that elevated dynamic triggering probabilities may exist along the many structures where sedimentary troughs overlie major faults, such as subduction zones' accretionary prisms and transform faults' axial valleys. Although data are sparse, I find no evidence of accelerating seismic activity in the vicinity of the Craig rupture between it and the Haida Gwaii earthquake.

  1. How long-term dynamics of sediment subduction controls short-term dynamics of seismicity

    Science.gov (United States)

    Brizzi, S.; van Zelst, I.; van Dinther, Y.; Funiciello, F.; Corbi, F.

    2017-12-01

    Most of the world's greatest earthquakes occur along the subduction megathrust. Weak and porous sediments have been suggested to homogenize the plate interface and thereby promote lateral rupture propagation and great earthquakes. However, the importance of sediment thickness, let alone their physical role, is not yet unequivocally established. Based on a multivariate statistical analysis of a global database of 62 subduction segments, we confirm that sediment thickness is one of the key parameters controlling the maximum magnitude a megathrust can generate. Moreover, Monte Carlo simulations highlighted that the occurrence of great earthquakes on sediment-rich subduction segments is very unlikely (p-value ≪ 0.05) to be related to pure chance. To understand how sediments in the subduction channel regulate earthquake size, this study extends and demystifies multivariate, spatiotemporally limited data through numerical modeling. We use the 2D Seismo-Thermo-Mechanical modeling approach to simulate both the long- and short-term dynamics of subduction and related seismogenesis (van Dinther et al., JGR, 2013). These models solve for the conservation of mass, momentum and energy using a visco-elasto-plastic rheology with rate-dependent friction. Results show that subducted sediments have a strong influence on the long-term evolution of the convergent margin. Increasing the sediment thickness on the incoming plate from 0 to 6 km causes a decrease of slab dip from 23° to 10°. This, in addition to increased radiogenic heating, extends isotherms, thereby widening the seismogenic portion of the megathrust from 80 to 150 km. Consequently, over tens of thousands of years, we observe that the maximum moment magnitude of megathrust earthquakes increases from 8.2 to 9.2 for these shallower and warmer interfaces. In addition, we observe more and larger splay faults, which could enhance vertical seafloor displacements. These results highlight the primary role of subducted sediments in

  2. Long‐term creep rates on the Hayward Fault: evidence for controls on the size and frequency of large earthquakes

    Science.gov (United States)

    Lienkaemper, James J.; McFarland, Forrest S.; Simpson, Robert W.; Bilham, Roger; Ponce, David A.; Boatwright, John; Caskey, S. John

    2012-01-01

    The Hayward fault (HF) in California exhibits large (Mw 6.5–7.1) earthquakes with short recurrence times (161±65 yr), probably kept short by a 26%–78% aseismic release rate (including postseismic). Its interseismic release rate varies locally over time, as we infer from many decades of surface creep data. Earliest estimates of creep rate, primarily from infrequent surveys of offset cultural features, revealed distinct spatial variation in rates along the fault, but no detectable temporal variation. Since the 1989 Mw 6.9 Loma Prieta earthquake (LPE), monitoring on 32 alinement arrays and 5 creepmeters has greatly improved the spatial and temporal resolution of creep rate. We now identify significant temporal variations, mostly associated with local and regional earthquakes. The largest rate change was a 6‐yr cessation of creep along a 5‐km length near the south end of the HF, attributed to a regional stress drop from the LPE, ending in 1996 with a 2‐cm creep event. North of there near Union City starting in 1991, rates apparently increased by 25% above pre‐LPE levels on a 16‐km‐long reach of the fault. Near Oakland in 2007 an Mw 4.2 earthquake initiated a 1–2 cm creep event extending 10–15 km along the fault. Using new better‐constrained long‐term creep rates, we updated earlier estimates of depth to locking along the HF. The locking depths outline a single, ∼50‐km‐long locked or retarded patch with the potential for an Mw∼6.8 event equaling the 1868 HF earthquake. We propose that this inferred patch regulates the size and frequency of large earthquakes on HF.

  3. Stress transferred by the 1995 Mw = 6.9 Kobe, Japan, shock: Effect on aftershocks and future earthquake probabilities

    Science.gov (United States)

    Toda, S.; Stein, R.S.; Reasenberg, P.A.; Dieterich, J.H.; Yoshida, A.

    1998-01-01

    The Kobe earthquake struck at the edge of the densely populated Osaka-Kyoto corridor in southwest Japan. We investigate how the earthquake transferred stress to nearby faults, altering their proximity to failure and thus changing earthquake probabilities. We find that relative to the pre-Kobe seismicity, Kobe aftershocks were concentrated in regions of calculated Coulomb stress increase and less common in regions of stress decrease. We quantify this relationship by forming the spatial correlation between the seismicity rate change and the Coulomb stress change. The correlation is significant for stress changes greater than 0.2-1.0 bars (0.02-0.1 MPa), and the nonlinear dependence of seismicity rate change on stress change is compatible with a state- and rate-dependent formulation for earthquake occurrence. We extend this analysis to future mainshocks by resolving the stress changes on major faults within 100 km of Kobe and calculating the change in probability caused by these stress changes. Transient effects of the stress changes are incorporated by the state-dependent constitutive relation, which amplifies the permanent stress changes during the aftershock period. Earthquake probability framed in this manner is highly time-dependent, much more so than is assumed in current practice. Because the probabilities depend on several poorly known parameters of the major faults, we estimate uncertainties of the probabilities by Monte Carlo simulation. This enables us to include uncertainties on the elapsed time since the last earthquake, the repeat time and its variability, and the period of aftershock decay. We estimate that a calculated 3-bar (0.3-MPa) stress increase on the eastern section of the Arima-Takatsuki Tectonic Line (ATTL) near Kyoto causes a fivefold increase in the 30-year probability of a subsequent large earthquake near Kyoto; a 2-bar (0.2-MPa) stress decrease on the western section of the ATTL results in a reduction in probability by a factor of 140 to

  4. Surface rupturing earthquakes repeated in the 300 years along the ISTL active fault system, central Japan

    Science.gov (United States)

    Katsube, Aya; Kondo, Hisao; Kurosawa, Hideki

    2017-06-01

    Surface rupturing earthquakes produced by intraplate active faults generally have long recurrence intervals of a few thousands to tens of thousands of years. We here report the first evidence for an extremely short recurrence interval of 300 years for surface rupturing earthquakes on an intraplate system in Japan. The Kamishiro fault of the Itoigawa-Shizuoka Tectonic Line (ISTL) active fault system generated a Mw 6.2 earthquake in 2014. A paleoseismic trench excavation across the 2014 surface rupture showed the evidence for the 2014 event and two prior paleoearthquakes. The slip of the penultimate earthquake was similar to that of 2014 earthquake, and its timing was constrained to be after A.D. 1645. Judging from the timing, the damaged area, and the amount of slip, the penultimate earthquake most probably corresponds to a historical earthquake in A.D. 1714. The recurrence interval of the two most recent earthquakes is thus extremely short compared with intervals on other active faults known globally. Furthermore, the slip repetition during the last three earthquakes is in accordance with the time-predictable recurrence model rather than the characteristic earthquake model. In addition, the spatial extent of the 2014 surface rupture accords with the distribution of a serpentinite block, suggesting that the relatively low coefficient of friction may account for the unusually frequent earthquakes. These findings would affect long-term forecast of earthquake probability and seismic hazard assessment on active faults.

  5. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
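
    The detection procedure described above is essentially a short-term-average/long-term-average (STA/LTA) trigger applied to a tweet-frequency time series rather than to seismic waveforms. The sketch below illustrates the idea on synthetic one-minute tweet counts; the window lengths and threshold are illustrative assumptions, not the tuning used by the USGS.

```python
import numpy as np

def sta_lta_detect(counts, sta_len=2, lta_len=60, threshold=5.0):
    """Flag candidate events where the short-term average of tweet counts
    exceeds `threshold` times the long-term average.

    counts    : 1-D array of tweets-per-minute containing the word "earthquake"
    sta_len   : short-term window length in minutes (illustrative value)
    lta_len   : long-term window length in minutes (illustrative value)
    threshold : STA/LTA ratio that triggers a detection (illustrative value)
    """
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = counts[i - sta_len + 1 : i + 1].mean()
        lta = counts[i - lta_len + 1 : i + 1].mean()
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)  # minute index at which the detector fires
    return triggers

# Synthetic example: quiet background chatter with a burst of "earthquake" tweets.
rng = np.random.default_rng(0)
series = rng.poisson(2, size=240)            # background rate ~2 tweets/min
series[120:125] += [80, 150, 120, 60, 30]    # burst following a widely felt event
print(sta_lta_detect(series))
```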

  6. Re‐estimated effects of deep episodic slip on the occurrence and probability of great earthquakes in Cascadia

    Science.gov (United States)

    Beeler, Nicholas M.; Roeloffs, Evelyn A.; McCausland, Wendy

    2013-01-01

    Mazzotti and Adams (2004) estimated that rapid deep slip during typically two-week-long episodes beneath northern Washington and southern British Columbia increases the probability of a great Cascadia earthquake by 30-100 times relative to the probability during the ~58 weeks between slip events. Because the corresponding absolute probability remains very low at ~0.03% per week, their conclusion is that though it is more likely that a great earthquake will occur during a rapid slip event than during other times, a great earthquake is unlikely to occur during any particular rapid slip event. This previous estimate used a failure model in which great earthquakes initiate instantaneously at a stress threshold. We refine the estimate, assuming a delayed failure model that is based on laboratory-observed earthquake initiation. Laboratory tests show that failure of intact rock in shear and the onset of rapid slip on pre-existing faults do not occur at a threshold stress. Instead, slip onset is gradual and shows a damped response to stress and loading rate changes. The characteristic time of failure depends on loading rate and effective normal stress. Using this model, the probability enhancement during the period of rapid slip in Cascadia is negligible for effective normal stresses of 10 MPa or more and increases by only a factor of 1.5 for an effective normal stress of 1 MPa. We present arguments that the hypocentral effective normal stress exceeds 1 MPa. In addition, the probability enhancement due to rapid slip extends into the interevent period. With this delayed failure model for effective normal stresses greater than or equal to 50 kPa, it is more likely that a great earthquake will occur between the periods of rapid deep slip than during them. Our conclusion is that great earthquake occurrence is not significantly enhanced by episodic deep slip events.

  7. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks

    International Nuclear Information System (INIS)

    Zhuang Jiancang; Ogata, Yosihiko

    2006-01-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases when the process is subcritical, critical, and supercritical. One of the direct uses of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases when the magnitude of the background event increases. These results, obtained from a complicated clustering model, where the characteristics of background events and triggered events are different, are consistent with the results obtained in [Ogata et al., Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method

  8. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    Science.gov (United States)

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases when the process is subcritical, critical, and supercritical. One of the direct uses of these probability distributions is to evaluate the probability that an earthquake is a foreshock, and the magnitude distributions of foreshocks and nonforeshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases when the magnitude of the background event increases. These results, obtained from a complicated clustering model, where the characteristics of background events and triggered events are different, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.
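
    Loosely following the branching-process picture in the abstract, the sketch below simulates a toy ETAS-style cascade and estimates the fraction of background events that end up with at least one larger descendant, i.e. that would be counted as foreshocks. The b-value, productivity parameters, and minimum magnitude are illustrative assumptions, and the cascade ignores the space-time kernels of the full model.

```python
import numpy as np

rng = np.random.default_rng(1)

B = 1.0      # Gutenberg-Richter b-value (illustrative)
ALPHA = 0.8  # productivity exponent (illustrative, < B so the cascade stays finite)
K = 0.08     # productivity constant, chosen so the cascade is subcritical
M0 = 2.0     # minimum (threshold) magnitude

def gr_magnitudes(n):
    """Draw n magnitudes from a Gutenberg-Richter (exponential) law above M0."""
    return M0 + rng.exponential(1.0 / (B * np.log(10)), size=n)

def max_descendant(m):
    """Largest magnitude among all descendants of an event of magnitude m
    (-inf if the event triggers nothing)."""
    n_children = rng.poisson(K * 10 ** (ALPHA * (m - M0)))
    best = -np.inf
    for child in gr_magnitudes(n_children):
        best = max(best, child, max_descendant(child))
    return best

# Fraction of background events that have at least one larger descendant,
# i.e. that would be labelled foreshocks in the sense of the abstract.
n_background = 20000
background = gr_magnitudes(n_background)
foreshock = np.array([max_descendant(m) > m for m in background])
print("estimated foreshock probability:", foreshock.mean())
```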

  9. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  10. The Effects of a Short-term Cognitive Behavioral Group Intervention on Bam Earthquake Related PTSD Symptoms in Adolescents

    Directory of Open Access Journals (Sweden)

    Fatemeh Naderi

    2009-04-01

    Objective: Post traumatic stress disorder (PTSD) may be the first reaction after disasters. Many studies have shown the efficacy of cognitive-behavioral therapy in the treatment of post traumatic stress disorder. The main objective of this study is to evaluate the effect of group CBT in adolescent survivors of a large scale disaster (the Bam earthquake). Methods: In a controlled trial, we evaluated the efficacy of a short-term method of group cognitive-behavioral therapy in adolescent survivors of the Bam earthquake who had PTSD symptoms and compared it with a control group. The adolescents who had severe PTSD or other psychiatric disorders that needed pharmacological interventions were excluded. We evaluated PTSD symptoms using the Post traumatic Stress Scale (PSS) pre- and post-intervention and compared them with a control group. Results: 100 adolescents were included in the study and 15 were excluded during the intervention. The mean age of the participants was 14.6±2.1 years. The mean score of total PTSD symptoms and of the avoidance symptoms was reduced after the intervention, and the reduction was statistically significant. The mean changes in the re-experience and hyper-arousal symptoms of PTSD were not significant. Conclusion: Psychological debriefing and group cognitive behavioral therapy may be effective in reducing some of the PTSD symptoms.

  11. Short-term robustness of production management systems

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Gaury, E.G.A.

    1998-01-01

    Short-term performance of a production management system for make-to-stock factories may be quantified through the service rate per shift; long-term performance through the average monthly work in process (WIP). This may yield, for example, that WIP is minimized, while the probability of the service

  12. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    and failures-to-predict. The best way to achieve this separation is to use probabilistic rather than deterministic statements in characterizing short-term changes in seismic hazards. The ICEF recommended establishing OEF systems that can provide the public with open, authoritative, and timely information about the short-term probabilities of future earthquakes. Because the public needs to be educated into the scientific conversation through repeated communication of probabilistic forecasts, this information should be made available at regular intervals, during periods of normal seismicity as well as during seismic crises. In an age of nearly instant information and high-bandwidth communication, public expectations regarding the availability of authoritative short-term forecasts are rapidly evolving, and there is a greater danger that information vacuums will spawn informal predictions and misinformation. L'Aquila demonstrates why the development of OEF capabilities is a requirement, not an option.

  13. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Science.gov (United States)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and is undertaken with insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of the seismic risk. The earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for the cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event that occurred in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  14. Physical bases of the generation of short-term earthquake precursors: A complex model of ionization-induced geophysical processes in the lithosphere-atmosphere-ionosphere-magnetosphere system

    Science.gov (United States)

    Pulinets, S. A.; Ouzounov, D. P.; Karelin, A. V.; Davidenko, D. V.

    2015-07-01

    This paper describes the current understanding of the interaction between geospheres from a complex set of physical and chemical processes under the influence of ionization. The sources of ionization involve the Earth's natural radioactivity and its intensification before earthquakes in seismically active regions, anthropogenic radioactivity caused by nuclear weapon testing and accidents in nuclear power plants and radioactive waste storage, the impact of galactic and solar cosmic rays, and active geophysical experiments using artificial ionization equipment. This approach treats the environment as an open complex system with dissipation, where inherent processes can be considered in the framework of the synergistic approach. We demonstrate the synergy between the evolution of thermal and electromagnetic anomalies in the Earth's atmosphere, ionosphere, and magnetosphere. This makes it possible to determine the direction of the interaction process, which is especially important in applications related to short-term earthquake prediction. That is why the emphasis in this study is on the processes proceeding during the final stage of earthquake preparation; the effects of other ionization sources are used to demonstrate that the model is versatile and broadly applicable in geophysics.

  15. Long-term predictability of regions and dates of strong earthquakes

    Science.gov (United States)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    parameters and seismic events. Further development of the H-104 method is the plotting of H-104 trajectories in two-dimensional time coordinates. The method provides the dates of future earthquakes for several (3-4) sequential time intervals multiple of 104 days. The H-104 method could be used together with the empirical scheme for short-term earthquake prediction reducing the date uncertainty. Using the H-104 method, it is developed the following long-term forecast of seismic activity. 1. The total number of M6+ earthquakes expected in the time frames: - 10.01-07.02: 14; - 08.02-08.03: 17; - 09.03-06.04: 9. 3. The potential days of M6+ earthquakes expected in the period of 10.01.2016-06.04.2016 are the following: - in January: 17, 18, 23, 24, 26, 28, 31; - in February: 01, 02, 05, 12, 15, 18, 20, 23; - in March: 02, 04, 05, 07 (M7+ is possible), 09, 10, 17 (M7+ is possible), 19, 20 (M7+ is possible), 23 (M7+ is possible), 30; - in April: 02, 06. The work was financially supported by the Ministry of Education and Science of the Russian Federation (contract No. 14.577.21.0109, project UID RFMEFI57714X0109)

  16. Probable Maximum Earthquake Magnitudes for the Cascadia Subduction Zone

    Science.gov (United States)

    Rong, Y.; Jackson, D. D.; Magistrale, H.; Goldfinger, C.

    2013-12-01

    The concept of maximum earthquake magnitude (mx) is widely used in seismic hazard and risk analysis. However, absolute mx lacks a precise definition and cannot be determined from a finite earthquake history. The surprising magnitudes of the 2004 Sumatra and the 2011 Tohoku earthquakes showed that most methods for estimating mx underestimate the true maximum if it exists. Thus, we introduced the alternate concept of mp(T), probable maximum magnitude within a time interval T. The mp(T) can be solved using theoretical magnitude-frequency distributions such as the Tapered Gutenberg-Richter (TGR) distribution. The two TGR parameters, β-value (which equals 2/3 b-value in the GR distribution) and corner magnitude (mc), can be obtained by applying the maximum likelihood method to earthquake catalogs with an additional constraint from the tectonic moment rate. Here, we integrate the paleoseismic data in the Cascadia subduction zone to estimate mp. The Cascadia subduction zone has been seismically quiescent since at least 1900. Fortunately, turbidite studies have unearthed a 10,000 year record of great earthquakes along the subduction zone. We thoroughly investigate the earthquake magnitude-frequency distribution of the region by combining instrumental and paleoseismic data, and using the tectonic moment rate information. To use the paleoseismic data, we first estimate event magnitudes, which we achieve by using the time interval between events, rupture extent of the events, and turbidite thickness. We estimate three sets of TGR parameters: for the first two sets, we consider a geographically large Cascadia region that includes the subduction zone, and the Explorer, Juan de Fuca, and Gorda plates; for the third set, we consider a narrow geographic region straddling the subduction zone. In the first set, the β-value is derived using the GCMT catalog. In the second and third sets, the β-value is derived using both the GCMT and paleoseismic data. Next, we calculate the corresponding mc
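
    One way to make the quantity mp(T) concrete is to ask for the magnitude whose expected number of exceedances under a tapered Gutenberg-Richter (TGR) law is one within the interval T; the sketch below uses that convention with illustrative parameter values (the paper's own estimates and exact definition may differ).

```python
import numpy as np
from scipy.optimize import brentq

def moment(m):
    """Seismic moment in N*m from moment magnitude (standard conversion)."""
    return 10 ** (1.5 * m + 9.1)

def tgr_survival(m, m_t, beta, m_corner):
    """Tapered Gutenberg-Richter survival function P(M >= m | M >= m_t),
    written in magnitudes; beta is the moment-domain exponent (~2/3 of b)."""
    M, Mt, Mc = moment(m), moment(m_t), moment(m_corner)
    return (Mt / M) ** beta * np.exp((Mt - M) / Mc)

def probable_max_magnitude(T, rate_t, m_t, beta, m_corner):
    """Magnitude whose expected number of exceedances in T years equals one
    (one common way of defining mp(T))."""
    f = lambda m: rate_t * T * tgr_survival(m, m_t, beta, m_corner) - 1.0
    return brentq(f, m_t, 11.0)  # root-find between the threshold and an upper bound

# Illustrative numbers: 0.2 events/yr with m >= 7, beta = 0.65, corner magnitude 8.8.
print(probable_max_magnitude(T=500, rate_t=0.2, m_t=7.0, beta=0.65, m_corner=8.8))
```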

  17. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    Energy Technology Data Exchange (ETDEWEB)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com [Karadeniz Technical University, Trabzon (Turkey); Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr [Ağrı İbrahim Çeçen University, Ağrı (Turkey)

    2016-04-18

    In this study we examined and compared three different probability distributions to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue covering the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution, and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was more suitable than the other distributions in this region.
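
    A minimal sketch of the kind of comparison described, fitting a two-parameter Weibull, a Frechet (inverse Weibull), and a three-parameter Weibull to inter-event times and ranking them with the Kolmogorov-Smirnov test; the data below are synthetic, and SciPy stands in for the Easyfit/Matlab workflow used by the authors.

```python
import numpy as np
from scipy import stats

# Illustrative inter-event times (years) between large earthquakes;
# synthetic numbers, not the catalogue analysed in the abstract.
rng = np.random.default_rng(2)
interevent_times = stats.weibull_min.rvs(1.3, scale=8.0, size=60, random_state=rng)

candidates = {
    "Weibull (2-p)": stats.weibull_min,   # location fixed at zero when fitting
    "Frechet": stats.invweibull,          # inverse Weibull = Frechet
    "Weibull (3-p)": stats.weibull_min,   # location allowed to float
}

for name, dist in candidates.items():
    if name == "Weibull (2-p)":
        params = dist.fit(interevent_times, floc=0)
    else:
        params = dist.fit(interevent_times)
    ks_stat, p_value = stats.kstest(interevent_times, dist.cdf, args=params)
    print(f"{name:14s}  K-S D = {ks_stat:.3f}  p = {p_value:.3f}")
```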

  18. Prediction of accident sequence probabilities in a nuclear power plant due to earthquake events

    International Nuclear Information System (INIS)

    Hudson, J.M.; Collins, J.D.

    1980-01-01

    This paper presents a methodology to predict accident probabilities in nuclear power plants subject to earthquakes. The resulting computer program accesses response data to compute component failure probabilities using fragility functions. Using logical failure definitions for systems, and the calculated component failure probabilities, initiating event and safety system failure probabilities are synthesized. The incorporation of accident sequence expressions allows the calculation of terminal event probabilities. Accident sequences, with their occurrence probabilities, are finally coupled to a specific release category. A unique aspect of the methodology is an analytical procedure for calculating top event probabilities based on the correlated failure of primary events
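
    The fragility-plus-Boolean-logic step described above can be sketched as follows; the lognormal fragility form, the component list, the parameter values, and the assumption of independent failures are all illustrative simplifications (the paper's distinctive contribution, an analytical treatment of correlated primary-event failures, is not reproduced here).

```python
import math

def lognormal_fragility(a, median_capacity, beta):
    """Probability of component failure at ground-motion level `a` for a
    lognormal fragility curve with median capacity and log-standard deviation beta."""
    z = math.log(a / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Illustrative component fragilities (median capacity in g, log-std);
# placeholder values, not plant-specific data.
components = {
    "offsite_power":    (0.30, 0.45),
    "diesel_generator": (0.60, 0.40),
    "service_water":    (0.80, 0.35),
}

pga = 0.5  # peak ground acceleration of interest, in g
p = {name: lognormal_fragility(pga, am, b) for name, (am, b) in components.items()}

# Toy accident-sequence logic, assuming independent failures:
# loss of offsite power AND failure of (diesel generator OR service water).
p_sequence = p["offsite_power"] * (
    1 - (1 - p["diesel_generator"]) * (1 - p["service_water"])
)
print(p)
print("sequence probability at this ground-motion level:", p_sequence)
```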

  19. Experimental evidence on formation of imminent and short-term hydrochemical precursors for earthquakes

    International Nuclear Information System (INIS)

    Du Jianguo; Amita, Kazuhiro; Ohsawa, Shinji; Zhang Youlian; Kang Chunli; Yamada, Makoto

    2010-01-01

    The formation of imminent hydrochemical precursors of earthquakes is investigated by the simulation for water-rock reaction in a brittle aquifer. Sixty-one soaking experiments were carried out with granodiorite and trachyandesite grains of different sizes and three chemically-distinct waters for 6 to 168 h. The experimental data demonstrate that water-rock reaction can result in both measurable increases and decreases of ion concentrations in short times and that the extents of hydrochemical variations are controlled by the grain size, dissolution and secondary mineral precipitation, as well as the chemistry of the rock and groundwater. The results indicate that water-rock reactions in brittle aquifers and aquitards may be an important genetic mechanism of hydrochemical seismic precursors when the aquifers and aquitards are fractured in response to tectonic stress.

  20. Scenario for a Short-Term Probabilistic Seismic Hazard Assessment (PSHA) in Chiayi, Taiwan

    Directory of Open Access Journals (Sweden)

    Chung-Han Chan

    2013-01-01

    Using seismic activity and the Meishan earthquake sequence that occurred from 1904 to 1906, a scenario for short-term probabilistic seismic hazards in the Chiayi region of Taiwan is assessed. The long-term earthquake occurrence rate in Taiwan was evaluated using a smoothing kernel. The highest seismicity rate was calculated around the Chiayi region. To consider earthquake interactions, the rate-and-state friction model was introduced to estimate the seismicity rate evolution due to the Coulomb stress change. As imparted by the 1904 Touliu earthquake, stress changes near the 1906 Meishan and Yangshuigang epicenters were higher than the magnitude of tidal triggering. With regard to the impact of the Meishan earthquake, the region close to the Yangshuigang earthquake epicenter had a +0.75 bar stress increase. The results indicated significant interaction between the three damaging events. Considering path and site effects using ground motion prediction equations, the probabilistic seismic hazard was assessed in the form of a hazard evolution and a hazard map. A significant elevation in hazard following the three earthquakes in the sequence was determined. The results illustrate a possible scenario for seismic hazards in the Chiayi region which may take place repeatedly in the future. Such a scenario provides essential information on earthquake preparation, devastation estimates, emergency sheltering, utility restoration, and structure reconstruction.
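
    The rate-and-state friction model mentioned above is commonly applied through Dieterich's (1994) expression for the seismicity-rate evolution after a Coulomb stress step. A minimal sketch, using the +0.75 bar step quoted in the abstract but otherwise illustrative parameter values:

```python
import numpy as np

def dieterich_rate(t, r0, dcfs, a_sigma, t_a):
    """Seismicity-rate evolution after a Coulomb stress step (Dieterich, 1994):
    R(t) = r0 / (1 + (exp(-dCFS/(A*sigma)) - 1) * exp(-t/t_a)).

    r0      : background seismicity rate (events per year)
    dcfs    : Coulomb stress change in bars (positive = rate increase)
    a_sigma : constitutive parameter A*sigma in bars
    t_a     : aftershock relaxation time in years
    """
    return r0 / (1.0 + (np.exp(-dcfs / a_sigma) - 1.0) * np.exp(-t / t_a))

# Illustrative values (not those calibrated in the study): background rate of
# 2 events/yr, A*sigma = 0.25 bar, 10-yr relaxation time, +0.75 bar stress step.
t = np.array([0.01, 0.1, 1.0, 5.0, 20.0])  # years after the perturbing earthquake
print(dieterich_rate(t, r0=2.0, dcfs=0.75, a_sigma=0.25, t_a=10.0))
```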

  1. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
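
    The RELM/CSEP evaluation ranks gridded forecasts with several likelihood-based tests; the core ingredient is the joint Poisson log-likelihood of the observed cell counts given the forecast rates, sketched below on a toy four-cell grid (the full test suite, with its N-, L-, and R-tests and simulation-based significance, is not reproduced here).

```python
import numpy as np
from math import lgamma

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint Poisson log-likelihood of a gridded earthquake forecast.

    forecast_rates  : expected number of target earthquakes in each cell
    observed_counts : number of target earthquakes observed in each cell
    """
    lam = np.asarray(forecast_rates, dtype=float)
    n = np.asarray(observed_counts, dtype=float)
    log_fact = np.array([lgamma(k + 1.0) for k in n])  # ln(n!)
    return np.sum(n * np.log(lam) - lam - log_fact)

# Toy example: two competing forecasts with the same total rate, one catalogue.
obs = [0, 2, 0, 1]
forecast_a = [0.10, 1.50, 0.20, 0.80]     # concentrates rate where events occurred
forecast_b = [0.65, 0.65, 0.65, 0.65]     # spatially uniform
print(poisson_log_likelihood(forecast_a, obs))
print(poisson_log_likelihood(forecast_b, obs))
```

    The forecast with the higher joint log-likelihood is the one that would be judged more "successful" for this particular catalogue, which is the sense of ranking used informally in the abstract.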

  2. Initiation process of earthquakes and its implications for seismic hazard reduction strategy.

    Science.gov (United States)

    Kanamori, H

    1996-04-30

    For the average citizen and the public, "earthquake prediction" means "short-term prediction," a prediction of a specific earthquake on a relatively short time scale. Such prediction must specify the time, place, and magnitude of the earthquake in question with sufficiently high reliability. For this type of prediction, one must rely on some short-term precursors. Examinations of strain changes just before large earthquakes suggest that consistent detection of such precursory strain changes cannot be expected. Other precursory phenomena such as foreshocks and nonseismological anomalies do not occur consistently either. Thus, reliable short-term prediction would be very difficult. Although short-term predictions with large uncertainties could be useful for some areas if their social and economic environments can tolerate false alarms, such predictions would be impractical for most modern industrialized cities. A strategy for effective seismic hazard reduction is to take full advantage of the recent technical advancements in seismology, computers, and communication. In highly industrialized communities, rapid earthquake information is critically important for emergency services agencies, utilities, communications, financial companies, and media to make quick reports and damage estimates and to determine where emergency response is most needed. Long-term forecast, or prognosis, of earthquakes is important for development of realistic building codes, retrofitting existing structures, and land-use planning, but the distinction between short-term and long-term predictions needs to be clearly communicated to the public to avoid misunderstanding.

  3. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467

  4. Saving and Re-building Lives: Determinants of Short-term and Long-term Disaster Relief

    Directory of Open Access Journals (Sweden)

    Geethanjali SELVARETNAM

    2014-11-01

    We analyse, both theoretically and empirically, the factors that influence the amount of humanitarian aid received by countries which are struck by natural disasters, particularly distinguishing between immediate disaster relief and long-term humanitarian aid. The theoretical model is able to make predictions as well as explain some of the peculiarities in the empirical results. We show that both short- and long-term humanitarian aid increase with the number of people killed, financial loss and level of corruption, while GDP per capita has no effect. More populated countries receive more humanitarian aid. Earthquakes, tsunamis and droughts attract more aid.

  5. Distribution of incremental static stress caused by earthquakes

    Directory of Open Access Journals (Sweden)

    Y. Y. Kagan

    1994-01-01

    Theoretical calculations, simulations and measurements of rotation of earthquake focal mechanisms suggest that the stress in earthquake focal zones follows the Cauchy distribution, which is one of the stable probability distributions (with the value of the exponent α equal to 1). We review the properties of the stable distributions and show that the Cauchy distribution is expected to approximate the stress caused by earthquakes occurring over geologically long intervals of a fault zone development. However, the stress caused by recent earthquakes recorded in instrumental catalogues should follow symmetric stable distributions with the value of α significantly less than one. This is explained by a fractal distribution of earthquake hypocentres: the dimension of a hypocentre set, δ, is close to zero for short-term earthquake catalogues and asymptotically approaches 2¼ for long time intervals. We use the Harvard catalogue of seismic moment tensor solutions to investigate the distribution of incremental static stress caused by earthquakes. The stress measured in the focal zone of each event is approximated by stable distributions. In agreement with theoretical considerations, the exponent value of the distribution approaches zero as the time span of an earthquake catalogue (ΔT) decreases. For large stress values α increases. We surmise that it is caused by the δ increase for small inter-earthquake distances due to location errors.
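
    For reference, the symmetric stable laws invoked above are most compactly written through their characteristic function; the Cauchy law is the α = 1 member (standard notation with scale parameter σ, not taken from the paper itself):

```latex
\varphi(t) \;=\; \mathbb{E}\!\left[e^{\,i t X}\right] \;=\; \exp\!\left(-\sigma^{\alpha}\,|t|^{\alpha}\right),
\qquad 0 < \alpha \le 2,
\qquad\text{and for } \alpha = 1:\quad
f(x) \;=\; \frac{\sigma}{\pi\,\left(x^{2}+\sigma^{2}\right)} .
```

    The heavy |x|^(-1-α) tails of these laws are what make the mean of the earthquake-induced stress undefined for α ≤ 1, which is why the distribution must be characterized by its exponent and scale rather than by ordinary moments.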

  6. Seismic-electromagnetic precursors of Romania's Vrancea earthquakes

    International Nuclear Information System (INIS)

    Enescu, B.D.; Enescu, C.; Constantin, A. P.

    1999-01-01

    Diagrams were plotted from electromagnetic data that were recorded at Muntele Rosu Observatory during December 1996 to January 1997, and December 1997 to September 1998. The times when Vrancea earthquakes of magnitudes M ≥ 3.9 occurred within these periods are marked on the diagrams. The parameters of the earthquakes are given in a table which also includes information on the magnetic and electric anomalies (perturbations) preceding these earthquakes. The magnetic data prove that Vrancea earthquakes are preceded by magnetic perturbations that may be regarded as their short-term precursors. Perturbations, which could likewise be seen as short-term precursors of Vrancea earthquakes, are also noticed in the electric records. Still, a number of the electric data cast doubt on their forerunning nature. Some suggestions are made at the end of the paper on how electromagnetic research should go ahead to be of use for Vrancea earthquake prediction. (authors)

  7. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. A guiding principle of the revision required a new evaluation method of earthquake vibration using a fault model and an evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, the scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  8. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  9. The 2011 M = 9.0 Tohoku oki earthquake more than doubled the probability of large shocks beneath Tokyo

    Science.gov (United States)

    Toda, Shinji; Stein, Ross S.

    2013-01-01

    The Kanto seismic corridor surrounding Tokyo has hosted four to five M ≥ 7 earthquakes in the past 400 years. Immediately after the Tohoku earthquake, the seismicity rate in the corridor jumped 10-fold, while the rate of normal focal mechanisms dropped in half. The seismicity rate decayed for 6–12 months, after which it steadied at three times the pre-Tohoku rate. The seismicity rate jump and decay to a new rate, as well as the focal mechanism change, can be explained by the static stress imparted by the Tohoku rupture and postseismic creep to Kanto faults. We therefore fit the seismicity observations to a rate/state Coulomb model, which we use to forecast the time-dependent probability of large earthquakes in the Kanto seismic corridor. We estimate a 17% probability of a M ≥ 7.0 shock over the 5 year prospective period 11 March 2013 to 10 March 2018, two-and-a-half times the probability had the Tohoku earthquake not struck.
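
    For orientation only: if the quoted 17% five-year probability were treated as coming from a constant (Poisson) rate, the implied annual rate of M ≥ 7.0 shocks in the corridor would follow from

```latex
P \;=\; 1 - e^{-\lambda T}
\;\;\Longrightarrow\;\;
\lambda \;=\; -\frac{\ln(1-P)}{T} \;=\; -\frac{\ln(1-0.17)}{5\ \text{yr}} \;\approx\; 0.037\ \text{yr}^{-1},
```

    i.e. roughly one such event per 27 years on average. The authors' rate/state model is explicitly time-dependent, so this back-of-envelope conversion is only an approximation of what the 17% figure implies.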

  10. Short-term mechanisms influencing volumetric brain dynamics

    Directory of Open Access Journals (Sweden)

    Nikki Dieleman

    2017-01-01

    With the use of magnetic resonance imaging (MRI) and brain analysis tools, it has become possible to measure brain volume changes down to around 0.5%. Besides long-term brain changes caused by atrophy in aging or neurodegenerative disease, short-term mechanisms that influence brain volume may exist. When we focus on short-term changes of the brain, changes may be either physiological or pathological. As such, determining the cause of the volumetric dynamics of the brain is essential. Additionally, for an accurate interpretation of longitudinal brain volume measures in terms of neurodegeneration, knowledge about short-term changes is needed. Therefore, in this review, we discuss the possible mechanisms influencing brain volumes on a short-term basis and set out a framework of MRI techniques to be used for volumetric changes, as well as the analysis tools used. 3D T1-weighted images are the images of choice when it comes to MRI of brain volume. These images are excellent for determining brain volume and can be used together with an analysis tool to determine the degree of volume change. Mechanisms that decrease global brain volume are: fluid restriction, evening MRI measurements, corticosteroids, antipsychotics and short-term effects of pathological processes like Alzheimer's disease, hypertension and diabetes mellitus type II. Mechanisms increasing the brain volume include fluid intake, morning MRI measurements, surgical revascularization and probably medications like anti-inflammatory drugs and anti-hypertensive medication. Exercise was found to have no effect on brain volume on a short-term basis, which may imply that dehydration caused by exercise differs from dehydration by fluid restriction. In the upcoming years, attention should be directed towards studies investigating physiological short-term changes in the light of long-term pathological changes. Ultimately this may lead to a better understanding of the physiological short-term effects of

  11. Earthquake Prediction Research In Iceland, Applications For Hazard Assessments and Warnings

    Science.gov (United States)

    Stefansson, R.

    Earthquake prediction research in Iceland, applications for hazard assessments and warnings. The first multinational earthquake prediction research project in Iceland was the European Council encouraged SIL project of the Nordic countries, 1988-1995. The path selected for this research was to study the physics of crustal processes leading to earthquakes. It was considered that small earthquakes, down to magnitude zero, were the most significant for this purpose, because of the detailed information which they provide both in time and space. The test area for the project was the earthquake prone region of the South Iceland seismic zone (SISZ). The PRENLAB and PRENLAB-2 projects, 1996-2000, supported by the European Union, were a direct continuation of the SIL project, but with a more multidisciplinary approach. PRENLAB stands for "Earthquake prediction research in a natural laboratory". The basic objective was to advance our understanding in general of where, when and how dangerous earthquake motion might strike. Methods were developed to study crustal processes and conditions, using microearthquake information, continuous GPS, InSAR, theoretical modelling, fault mapping and paleoseismology. New algorithms were developed for short-term warnings. A very useful short-term warning was issued twice in the year 2000, one for the sudden start of an eruption of the volcano Hekla on February 26, and the other 25 hours before the second (in a sequence of two) magnitude 6.6 (Ms) earthquake in the South Iceland seismic zone on June 21, with the correct location and approximate size. A formal short-term warning, although not released to the public, was also issued before a magnitude 5 earthquake in November 1998. The presentation will briefly describe what these warnings were based on. A general hazard assessment was presented in scientific journals 10-15 years ago, assessing within a few kilometers the location of the faults of the two 2000 earthquakes and suggesting

  12. VLF/LF Radio Sounding of Ionospheric Perturbations Associated with Earthquakes

    Directory of Open Access Journals (Sweden)

    Masashi Hayakawa

    2007-07-01

    Full Text Available It has recently been recognized that the ionosphere is very sensitive to seismic effects, and the detection of ionospheric perturbations associated with earthquakes seems to be very promising for short-term earthquake prediction. We have proposed a possible use of VLF/LF (very low frequency (3-30 kHz) / low frequency (30-300 kHz)) radio sounding of the seismo-ionospheric perturbations. A brief history of the use of subionospheric VLF/LF propagation for short-term earthquake prediction is given, followed by a significant finding of an ionospheric perturbation for the Kobe earthquake in 1995. After reviewing previous VLF/LF results, we present the latest VLF/LF findings: one is the statistical correlation of ionospheric perturbations with earthquakes, and the second is a case study of the Sumatra earthquake in December 2004, indicating the spatial scale and dynamics of the ionospheric perturbation for this earthquake.

  13. A method for the estimation of the probability of damage due to earthquakes

    International Nuclear Information System (INIS)

    Alderson, M.A.H.G.

    1979-07-01

    The available information on seismicity within the United Kingdom has been combined with building damage data from the United States to produce a method of estimating the probability of damage to structures due to the occurrence of earthquakes. The analysis has been based on the use of site intensity as the major damage producing parameter. Data for structural, pipework and equipment items have been assumed and the overall probability of damage calculated as a function of the design level. Due account is taken of the uncertainties of the seismic data. (author)

  14. Constraining the Long-Term Average of Earthquake Recurrence Intervals From Paleo- and Historic Earthquakes by Assimilating Information From Instrumental Seismicity

    Science.gov (United States)

    Zoeller, G.

    2017-12-01

    Paleo- and historic earthquakes are the most important source of information for the estimation of long-term recurrence intervals in fault zones, because sequences of paleoearthquakes cover more than one seismic cycle. On the other hand, these events are often rare, dating uncertainties are enormous, and the problem of missing or misinterpreted events creates additional difficulties. Taking these shortcomings into account, long-term recurrence intervals are usually unstable as long as no additional information is included. In the present study, we assume that the time to the next major earthquake depends on the rate of small and intermediate events between the large ones in terms of a "clock-change" model that leads to a Brownian Passage Time distribution for recurrence intervals. We take advantage of an earlier finding that the aperiodicity of this distribution can be related to the Gutenberg-Richter b-value, which is usually around one and can be estimated easily from instrumental seismicity in the region under consideration. This makes it possible to reduce the uncertainties in the estimation of the mean recurrence interval significantly, especially for short paleoearthquake sequences and high dating uncertainties. We present illustrative case studies from Southern California and compare the method with the commonly used approach of exponentially distributed recurrence times assuming a stationary Poisson process.
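    For readers unfamiliar with the Brownian Passage Time (inverse Gaussian) distribution used here, the sketch below computes the conditional probability of the next event within a given window, given the time already elapsed, for an assumed mean recurrence interval and aperiodicity. The link between aperiodicity and the b-value is the paper's contribution and is not reproduced; the aperiodicity is simply an input parameter, and the numerical values are illustrative.

```python
import math

def norm_cdf(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mu, alpha):
    """CDF of a Brownian Passage Time (inverse Gaussian) distribution with
    mean recurrence interval mu and aperiodicity alpha."""
    if t <= 0:
        return 0.0
    a = math.sqrt(mu / (alpha**2 * t))
    return norm_cdf(a * (t / mu - 1.0)) + math.exp(2.0 / alpha**2) * norm_cdf(-a * (t / mu + 1.0))

def conditional_prob(elapsed, window, mu, alpha):
    """P(next event within `window` years | `elapsed` years have passed)."""
    f_t = bpt_cdf(elapsed, mu, alpha)
    f_tw = bpt_cdf(elapsed + window, mu, alpha)
    return (f_tw - f_t) / (1.0 - f_t)

# Illustrative values: mean interval 250 yr, aperiodicity 0.5, 200 yr elapsed,
# probability of the next event within the coming 30 years.
print(conditional_prob(200.0, 30.0, 250.0, 0.5))
```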

  15. Response probability and response time: a straight line, the Tagging/Retagging interpretation of short term memory, an operational definition of meaningfulness and short term memory time decay and search time.

    Science.gov (United States)

    Tarnow, Eugen

    2008-12-01

    The functional relationship between correct response probability and response time is investigated in data sets from Rubin, Hinton and Wenzel, J Exp Psychol Learn Mem Cogn 25:1161-1176, 1999 and Anderson, J Exp Psychol [Hum Learn] 7:326-343, 1981. The two measures are linearly related through stimulus presentation lags from 0 to 594 s in the former experiment and for repeated learning of words in the latter. The Tagging/Retagging interpretation of short term memory is introduced to explain this linear relationship. At stimulus presentation the words are tagged. This tagging level drops slowly with time. When a probe word is reintroduced the tagging level has to increase for the word to be properly identified leading to a delay in response time. The tagging time is related to the meaningfulness of the words used-the more meaningful the word the longer the tagging time. After stimulus presentation the tagging level drops in a logarithmic fashion to 50% after 10 s and to 20% after 240 s. The incorrect recall and recognition times saturate in the Rubin et al. data set (they are not linear for large time lags), suggesting a limited time to search the short term memory structure: the search time for recall of unusual words is 1.7 s. For recognition of nonsense words the corresponding time is about 0.4 s, similar to the 0.243 s found in Cavanagh (1972).

  16. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    The probability distribution of the maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for the energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced, from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of the probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
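    The abstract does not give the distribution in closed form. As a hedged illustration of the general construction, the sketch below evaluates the maximum-magnitude CDF under the common simplifying assumptions of Poisson occurrence at rate λ and a Gutenberg-Richter magnitude CDF F(m) (a stand-in for the energy-release-based distribution used in the paper), in which case P(M_max ≤ m over T years) = exp(-λT[1 - F(m)]).

```python
import math

def gr_cdf(m, m_min=5.0, b=1.0):
    """Gutenberg-Richter magnitude CDF above m_min (an illustrative stand-in
    for the paper's energy-release-based magnitude distribution)."""
    return 1.0 - 10.0 ** (-b * (m - m_min))

def max_magnitude_cdf(m, rate, years, m_min=5.0, b=1.0):
    """P(maximum magnitude over `years` <= m), assuming Poisson occurrence of
    events with magnitude >= m_min at `rate` events per year."""
    return math.exp(-rate * years * (1.0 - gr_cdf(m, m_min, b)))

# Probability that the largest event in 50 years does not exceed M 7,
# given 10 events/yr above M 5 (illustrative numbers only).
print(max_magnitude_cdf(7.0, rate=10.0, years=50.0))
```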

  17. Earthquake-induced water-level fluctuations at Yucca Mountain, Nevada, June 1992

    International Nuclear Information System (INIS)

    O'Brien, G.M.

    1993-01-01

    This report presents earthquake-induced water-level and fluid-pressure data for wells in the Yucca Mountain area, Nevada, during June 1992. Three earthquakes occurred which caused significant water-level and fluid-pressure responses in wells. Wells USW H-5 and USW H-6 are continuously monitored to detect short-term responses caused by earthquakes. Two wells, monitored hourly, had significant, longer-term responses in water level following the earthquakes. On June 28, 1992, a 7.5-magnitude earthquake occurred near Landers, California, causing an estimated maximum water-level change of 90 centimeters in well USW H-5. Three hours later a 6.6-magnitude earthquake occurred near Big Bear Lake, California; the maximum water-level fluctuation was 20 centimeters in well USW H-5. A 5.6-magnitude earthquake occurred at Little Skull Mountain, Nevada, on June 29, approximately 23 kilometers from Yucca Mountain. The maximum estimated short-term water-level fluctuation from the Little Skull Mountain earthquake was 40 centimeters in well USW H-5. The water level in well UE-25p #1, monitored hourly, decreased approximately 50 centimeters over 3 days following the Little Skull Mountain earthquake. The water level in UE-25p #1 returned to pre-earthquake levels in approximately 6 months. The water level in the lower interval of well USW H-3 increased 28 centimeters following the Little Skull Mountain earthquake. The Landers and Little Skull Mountain earthquakes caused responses in 17 intervals of 14 hourly monitored wells; however, most responses were small and of short duration. For several days following the major earthquakes, many smaller-magnitude aftershocks occurred, causing measurable responses in the continuously monitored wells

  18. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    NARCIS (Netherlands)

    Cheong, S.A.; Tan, T.L.; Chen, C.-C.; Chang, W.-L.; Liu, Z.; Chew, L.Y.; Sloot, P.M.A.; Johnson, N.F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting

  19. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses, and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model the losses would not be indemnified, but would be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  20. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

    Full Text Available We present a model for estimation of the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
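    A minimal sketch of the kind of adaptive kernel smoothing described here (not the authors' exact implementation): each past epicenter contributes a Gaussian kernel whose bandwidth is its distance to the k-th nearest neighbouring epicenter, and the smoothed density is the normalized sum of kernels. In the real model the kernel form and the neighbour number k are tuned by retrospective likelihood testing; here they are assumptions.

```python
import numpy as np

def adaptive_kernel_density(epicenters, grid_points, k=3):
    """Smoothed spatial earthquake density: each past epicenter gets a Gaussian
    kernel whose bandwidth is the distance to its k-th nearest neighbour.
    Coordinates are assumed to be in km (e.g. projected lon/lat)."""
    epicenters = np.asarray(epicenters, dtype=float)
    # pairwise distances between epicenters (row i, column j)
    d = np.linalg.norm(epicenters[:, None, :] - epicenters[None, :, :], axis=-1)
    d_sorted = np.sort(d, axis=1)
    bandwidths = np.maximum(d_sorted[:, k], 0.5)  # 0.5 km floor avoids zero bandwidth

    density = np.zeros(len(grid_points))
    for x, h in zip(epicenters, bandwidths):
        r2 = np.sum((np.asarray(grid_points, dtype=float) - x) ** 2, axis=1)
        density += np.exp(-r2 / (2.0 * h**2)) / (2.0 * np.pi * h**2)
    return density / len(epicenters)

# toy usage: 4 past epicenters, density evaluated at 2 grid points
eq = [[0, 0], [10, 5], [12, 4], [50, 40]]
grid = [[5, 2], [45, 38]]
print(adaptive_kernel_density(eq, grid, k=2))
```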

  1. Credible occurrence probabilities for extreme geophysical events: earthquakes, volcanic eruptions, magnetic storms

    Science.gov (United States)

    Love, Jeffrey J.

    2012-01-01

    Statistical analysis is made of rare, extreme geophysical events recorded in historical data -- counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2},(\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
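    The quoted interval formula is straightforward to evaluate; the sketch below does so for an assumed event count and observation span (the numbers are illustrative, not from the paper), flooring the lower endpoint at zero when sqrt(k) < z/2.

```python
import math

def poisson_rate_interval(k, tau, z=2.0):
    """Approximate frequentist-confidence / Bayesian (Jeffreys) credibility
    interval for a Poisson occurrence rate, following the formula quoted in
    the abstract: (1/tau) * [(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2]."""
    lo = max(math.sqrt(k) - z / 2.0, 0.0) ** 2 / tau
    hi = (math.sqrt(k) + z / 2.0) ** 2 / tau
    return k / tau, (lo, hi)

# e.g. 4 extreme events observed in 100 years, ~95% (z = 2) interval
rate, (lo, hi) = poisson_rate_interval(k=4, tau=100.0, z=2.0)
print(f"rate = {rate:.3f}/yr, interval = [{lo:.3f}, {hi:.3f}]/yr")
```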

  2. Statistical validation of earthquake related observations

    Science.gov (United States)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control, a suggested "precursor/signal" remains a "candidate" whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions, and therefore the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must demonstrate statistical significance by rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the suggestion of the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
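    Under this null hypothesis, each target event falls inside the alarm area with probability equal to the fraction of catalog locations (wheel sectors) covered by the alarm, so a set of predictions can be scored with a binomial tail probability. The sketch below is one simple way to do this; the counts are illustrative assumptions, not data from the paper.

```python
from math import comb

def seismic_roulette_pvalue(catalog_in_alarm, catalog_total, hits, targets):
    """One-sided p-value for the 'Seismic Roulette' null hypothesis: under random
    chance, each target earthquake falls inside the alarm area with probability
    p = (catalog locations inside alarm) / (all catalog locations). The p-value
    is the binomial probability of at least `hits` successes in `targets` trials."""
    p = catalog_in_alarm / catalog_total
    return sum(comb(targets, n) * p**n * (1 - p)**(targets - n)
               for n in range(hits, targets + 1))

# Illustrative: the alarm covers 20% of catalog locations; 7 of 8 targets were hit.
print(seismic_roulette_pvalue(catalog_in_alarm=200, catalog_total=1000, hits=7, targets=8))
```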

  3. Short-term capture of the Earth-Moon system

    Science.gov (United States)

    Qi, Yi; de Ruiter, Anton

    2018-06-01

    In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose a STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.

  4. USGS Online Short-term Hazard Maps: Experiences in the First Year of Implementation

    Science.gov (United States)

    Gerstenberger, M. C.; Jones, L. M.

    2005-12-01

    In May of 2005, following review by the California Earthquake Prediction Evaluation Council, the USGS launched a website that displays the probability of experiencing Modified Mercalli Intensity VI in the next 24 hours. With a forecast based on a relatively simple application of the Gutenberg-Richter relationship and the modified Omori law, the maps are primarily aimed at providing information related to aftershock hazard. Initial response to the system has been mostly positive but has required an effort toward public education. In particular, it has been difficult to communicate the important difference between a probabilistic forecast and a binary earthquake "prediction". Even with the familiar use of probabilities in weather maps and the recent use of terms such as Modified Mercalli Intensity, these and other terms are often misunderstood by the media and the public. Additionally, the fact that our methodology is not targeted at large independent events has sometimes been difficult to convey to scientists as well as the public. Initial interest in the webpages has been high, with greater than 700,000 individual visits between going live in late May 2005 and the end of June 2005. This accounts for more than one third of the visits to the USGS-Pasadena webpages in that period. Visits have declined through July and August, but individual daily visits average around 3,000/day.
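    A minimal sketch of the kind of calculation behind such a 24-hour map, in the spirit of a Reasenberg-Jones model rather than the USGS implementation itself: the Gutenberg-Richter relation sets the productivity of aftershocks above a magnitude threshold, the modified Omori law sets their decay in time, and the expected count in the next day is converted to a probability under a Poisson assumption. The generic parameter values (a, b, c, p) are illustrative defaults, not the operational ones.

```python
import math

def expected_aftershocks(mainshock_mag, m_min, t1, t2, a=-1.67, b=0.91, c=0.05, p=1.08):
    """Expected number of aftershocks with magnitude >= m_min in the window
    [t1, t2] days after a mainshock, from a Reasenberg-Jones style combination
    of the Gutenberg-Richter relation and the modified Omori law."""
    productivity = 10.0 ** (a + b * (mainshock_mag - m_min))
    if abs(p - 1.0) < 1e-9:
        time_integral = math.log((t2 + c) / (t1 + c))
    else:
        time_integral = ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p)) / (1 - p)
    return productivity * time_integral

def prob_at_least_one(n_expected):
    """Poisson probability of one or more events given the expected count."""
    return 1.0 - math.exp(-n_expected)

# Probability of an M >= 5 aftershock in the 24 hours starting one day
# after an M 6.5 mainshock (illustrative parameters only).
n = expected_aftershocks(mainshock_mag=6.5, m_min=5.0, t1=1.0, t2=2.0)
print(prob_at_least_one(n))
```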

  5. Memory effect in M ≥ 7 earthquakes of Taiwan

    Science.gov (United States)

    Wang, Jeen-Hwa

    2014-07-01

    The M ≥ 7 earthquakes that occurred in the Taiwan region during 1906-2006 are taken to study the possibility of a memory effect existing in the sequence of those large earthquakes. Those events are all mainshocks. The fluctuation analysis technique is applied to analyze two sequences, in terms of earthquake magnitude and inter-event time, represented in the natural time domain. For both magnitude and inter-event time, the calculations are made for three data sets, i.e., the original-order data, the reverse-order data, and that of the mean values. Calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of both magnitude and inter-event time data. In addition, the phase portraits of two sequent magnitudes and two sequent inter-event times are also applied to explore whether large (or small) earthquakes are followed by large (or small) events. The results lead to a negative answer. Together with all of this information, we conclude that the earthquake sequence in this study is short-term correlated and thus a short-term memory effect would be operative.
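    A simple version of fluctuation analysis (a hedged sketch, not necessarily the exact variant used in the paper) integrates the mean-removed sequence, measures the RMS spread of the resulting profile in windows of increasing length, and reads the exponent off the log-log slope; values near 0.5 indicate an uncorrelated sequence, while departures from 0.5 indicate correlation of the kind reported.

```python
import numpy as np

def fluctuation_exponent(series):
    """Plain fluctuation analysis: integrate the mean-removed series, measure the
    RMS spread of the profile in non-overlapping windows of increasing length,
    and fit log(F) vs log(window length) for the scaling exponent."""
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())
    windows = [w for w in (4, 8, 16, 32, 64) if w <= len(x) // 4]
    fluctuations = []
    for w in windows:
        n_seg = len(profile) // w
        segs = profile[:n_seg * w].reshape(n_seg, w)
        # RMS deviation of the profile from its window mean, averaged over windows
        fluctuations.append(np.sqrt(np.mean((segs - segs.mean(axis=1, keepdims=True)) ** 2)))
    slope, _ = np.polyfit(np.log(windows), np.log(fluctuations), 1)
    return slope

# an uncorrelated (white noise) sequence should give an exponent near 0.5
rng = np.random.default_rng(0)
print(fluctuation_exponent(rng.normal(size=1000)))
```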

  6. M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey

    Science.gov (United States)

    Murru, Maura; Akinci, Aybige; Falcone, Guiseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.

    2016-01-01

    We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment located at the eastern end of the Marmara region shows the highest 30-yr probability, with a Poisson value of 29%, and a time-dependent interaction probability of 48%. We find an aggregated 30-yr Poisson probability of M >7.3 earthquakes at Istanbul of 35%, which increases to 47% if time dependence and stress transfer are considered. We calculate a 2-fold probability gain (ratio time-dependent to time-independent) on the southern strands of the North Anatolian Fault Zone.

  7. Short-term droughts forecast using Markov chain model in Victoria, Australia

    Science.gov (United States)

    Rahmat, Siti Nazahiyah; Jayasuriya, Niranjali; Bhuiyan, Muhammed A.

    2017-07-01

    A comprehensive risk management strategy for dealing with drought should include both short-term and long-term planning. The objective of this paper is to present an early warning method to forecast drought using the Standardised Precipitation Index (SPI) and a non-homogeneous Markov chain model. A model such as this is useful for short-term planning. The developed method has been used to forecast droughts at a number of meteorological monitoring stations that have been regionalised into six homogeneous clusters with similar drought characteristics based on the SPI. The non-homogeneous Markov chain model was used to estimate drought probabilities and drought predictions up to 3 months ahead. The drought severity classes defined using the SPI were computed at a 12-month time scale. The drought probabilities and the predictions were computed for the six clusters that depict similar drought characteristics in Victoria, Australia. Overall, the drought severity class predicted was quite similar for all the clusters, with the non-drought class probabilities ranging from 49 to 57%. For all clusters, the near-normal class had a probability of occurrence varying from 27 to 38%. For the more moderate and severe classes, the probabilities ranged from 2 to 13% and 3 to 1%, respectively. The developed model predicted drought situations 1 month ahead reasonably well. However, 2- and 3-month-ahead predictions should be used with caution until the models are developed further.
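    As a hedged, homogeneous simplification of such a Markov chain forecast (the paper's chain is non-homogeneous, i.e. its transition probabilities vary in time), the sketch below estimates a transition matrix between SPI drought classes from a monthly class sequence and propagates the class probabilities up to three months ahead. The toy sequence and class labels are assumptions for illustration.

```python
import numpy as np

def transition_matrix(states, n_classes):
    """Maximum-likelihood transition matrix from a sequence of class labels
    (0..n_classes-1). A homogeneous simplification of the paper's
    non-homogeneous Markov chain."""
    counts = np.zeros((n_classes, n_classes))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    row_sums[row_sums == 0] = 1.0  # avoid division by zero for unvisited states
    return counts / row_sums

def forecast(current_state, P, steps):
    """Class probabilities `steps` months ahead from the current SPI class."""
    probs = np.zeros(P.shape[0])
    probs[current_state] = 1.0
    return probs @ np.linalg.matrix_power(P, steps)

# toy monthly SPI classes: 0=non-drought, 1=near normal, 2=moderate, 3=severe
seq = [0, 0, 1, 1, 2, 1, 0, 0, 0, 1, 2, 3, 2, 1, 0, 0]
P = transition_matrix(seq, n_classes=4)
print(forecast(current_state=2, P=P, steps=3))   # 3-month-ahead class probabilities
```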

  8. Tokyo Metropolitan Earthquake Preparedness Project - A Progress Report

    Science.gov (United States)

    Hayashi, H.

    2010-12-01

    Munich Re once ranked the Tokyo metropolitan region, the capital of Japan, as the most vulnerable area for earthquake disasters, followed by the San Francisco Bay Area, US, and Osaka, Japan. Seismologists also predict that the Tokyo metropolitan region may experience at least one near-field earthquake with a probability of 70% within the next 30 years. Given this prediction, the Japanese Government conducted damage estimations and found that, in the worst-case scenario of a 7.3-magnitude earthquake striking under heavy winds (as shown in Fig. 1), it would kill a total of 11,000 people, and total direct and indirect losses would amount to 112,000,000,000,000 yen (1,300,000,000,000 at an exchange rate of 85 yen). In addition to mortality and financial losses, a total of 25 million people in four prefectures would be severely impacted by this earthquake. If this earthquake occurs, 300,000 elevators will stop suddenly, and 12,500 people would be confined in them for a long time. Seven million people would come to use the over 20,000 public shelters spread over the impacted area. Over one million temporary housing units would have to be built to accommodate the 4.6 million people who would lose their dwellings. A further 2.5 million people would relocate outside the damaged area. In short, an earthquake disaster of unprecedented scale is expected, and we must prepare for it. Even though disaster mitigation is undoubtedly the best solution, it is more realistic to assume that the expected earthquake will hit before this work is complete. In other words, we must also pursue another solution: making the people and the assets in this region more resilient to the Tokyo metropolitan earthquake. This is the question we have been tackling for the last four years. To increase societal resilience to the Tokyo metropolitan earthquake, we adopted a holistic approach that integrates both emergency response and long-term recovery. There are three goals for long-term recovery, which consist of Physical recovery, Economic

  9. Experimental Study of Thermal Field Evolution in the Short-Impending Stage Before Earthquakes

    Science.gov (United States)

    Ren, Yaqiong; Ma, Jin; Liu, Peixun; Chen, Shunyun

    2017-08-01

    Phenomena at critical points are vital for identifying the short-impending stage prior to earthquakes. The peak stress is a critical point at which stress is converted from predominantly accumulation to predominantly release. We call the duration between the peak stress and instability "the meta-instability stage", which refers to the short-impending stage of earthquakes. The meta-instability stage consists of a steadily releasing quasi-static stage and an accelerated releasing quasi-dynamic stage. The turning point between these two stages is the remaining critical point. To identify the two critical points in the field, it is necessary to study the characteristic phenomena of various physical fields in the meta-instability stage in the laboratory, and the strain and displacement variations were studied. Considering that stress and relative displacement can be detected through thermal variations and peculiarities in full-field observations, we employed a cooled thermal infrared imaging system to record thermal variations in the meta-instability stage of stick-slip events generated along a simulated, precut planar strike-slip fault in a granodiorite block on a horizontally bilateral servo-controlled press machine. The experimental results demonstrate the following: (1) a large area of decreasing temperatures in the wall rocks and increasing temperatures in sporadic sections of the fault indicates entrance into the meta-instability stage. (2) The rapid expansion of regions of increasing temperature on the fault and the enhancement of the temperature-increase amplitude correspond to the turning point from the quasi-static stage to the quasi-dynamic stage. Our results reveal thermal indicators for the critical points prior to earthquakes that provide clues for identifying the short-impending stage of earthquakes.

  10. Audit of long-term and short-term liabilities

    Directory of Open Access Journals (Sweden)

    Korinko M.D.

    2017-03-01

    Full Text Available The article establishes the importance of long-term and short-term liabilities for the management of the financial and material resources of an enterprise. It reviews the aim, objects and sources of information for the audit of short-term and long-term liabilities. The organizational and methodological support for the audit of long-term and short-term liabilities of an enterprise is generalized. The authors distinguish the stages of the audit of long-term and short-term liabilities, state the aim of the audit at each of these stages, and recommend methodical techniques. It is established that the auditor should assess the enterprise's systems of internal control and record-keeping by performing audit procedures in order to determine the scope and content of sampling. After assessing these systems, the auditor determines the methodology for the audit of long-term and short-term liabilities. The analytical procedures that auditors should use in the audit of short-term and long-term liabilities are determined. The authors propose a classification of the deficiencies identified as a result of the audit of short-term and long-term liabilities.

  11. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake and huge tsunami devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither in the short term nor in the long term. Seismologists were shocked because such an event was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake. In fact, throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important issue, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way to pursue short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their lack of success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which in effect meant obtaining "more funding for no-prediction research". The public were not, and are not, informed about this change. Obviously, earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted that this is most likely to be achieved through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake is an adverse result for us, although its epicenter was far offshore, outside the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, from ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at χ. In the case that Seismic Electric Signals
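    A minimal sketch of the order parameter described here, under the common assumption that each event's energy is proportional to 10^(1.5M): with χ_k = k/N for the k-th of N events and p_k its normalised energy, κ1 is the energy-weighted variance of natural time.

```python
import numpy as np

def kappa1(magnitudes):
    """Order parameter kappa_1 of natural time analysis: the variance of natural
    time chi_k = k/N weighted by the normalised energy release p_k of each event
    (energy taken proportional to 10^(1.5*M), a common assumption)."""
    m = np.asarray(magnitudes, dtype=float)
    n = len(m)
    chi = np.arange(1, n + 1) / n
    energy = 10.0 ** (1.5 * m)
    p = energy / energy.sum()
    return np.sum(p * chi**2) - np.sum(p * chi) ** 2

# toy sequence of event magnitudes in order of occurrence
print(kappa1([4.1, 4.5, 3.9, 5.2, 4.0, 4.8, 4.2]))
```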

  12. Probabilistic tsunami hazard assessment based on the long-term evaluation of subduction-zone earthquakes along the Sagami Trough, Japan

    Science.gov (United States)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Ohsumi, T.; Morikawa, N.; Kawai, S.; Maeda, T.; Matsuyama, H.; Toyama, N.; Kito, T.; Murata, Y.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.; Hakamata, T.

    2017-12-01

    For the forthcoming large earthquakes along the Sagami Trough, where the Philippine Sea Plate is subducting beneath the northeast Japan arc, the Earthquake Research Committee (ERC)/Headquarters for Earthquake Research Promotion, Japanese government (2014a) assessed that M7 and M8 class earthquakes will occur there and defined the possible extent of the earthquake source areas. They assessed occurrence probabilities within the next 30 years (from January 1, 2014) of 70% for the M7 class and 0-5% for the M8 class earthquakes. First, we set 10 possible earthquake source areas (ESAs) for M8 class earthquakes and 920 ESAs for M7 class earthquakes. Next, we constructed 125 characterized earthquake fault models (CEFMs) for M8 class earthquakes and 938 CEFMs for M7 class earthquakes, based on the "tsunami receipt" of ERC (2017) (Kitoh et al., 2016, JpGU). All the CEFMs are allowed to have a large slip area to express fault slip heterogeneity. For all the CEFMs, we calculate tsunamis by solving a nonlinear long wave equation using FDM, including runup calculation, over a nesting grid system with a minimum grid size of 50 meters. Finally, we re-distributed the occurrence probability over all CEFMs (Abe et al., 2014, JpGU) and gathered the exceedance probabilities for variable tsunami heights, calculated from all the CEFMs, at every observation point along the Pacific coast to obtain the PTHA. We incorporated the aleatory uncertainties inherent in tsunami calculation and earthquake fault slip heterogeneity. We considered two kinds of probabilistic hazard model: a "present-time hazard model", under the assumption that earthquake occurrence basically follows a renewal process based on the BPT distribution where the latest faulting time is known, and a "long-time averaged hazard model", under the assumption that earthquake occurrence follows a stationary Poisson process. We fixed our viewpoint, for example, on the probability that the tsunami height will exceed 3 meters at coastal points in next
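    As a hedged illustration of how per-source results can be combined at a single coastal point under the stationary-Poisson ("long-time averaged") variant, the sketch below aggregates the annual rates of CEFMs whose modelled height exceeds a threshold into a 30-year exceedance probability; the aleatory scatter around each modelled height, which the study does include, is ignored here for brevity, and the rates and heights are illustrative.

```python
import math

def exceedance_probability(sources, threshold, years=30.0):
    """30-year probability that tsunami height at one coastal point exceeds
    `threshold`, combining characterized earthquake fault models (CEFMs) under a
    stationary Poisson assumption. Each source is (annual_rate, modelled_height)."""
    total_rate = sum(rate for rate, height in sources if height > threshold)
    return 1.0 - math.exp(-total_rate * years)

# illustrative CEFM list: (annual occurrence rate, modelled height in metres)
cefms = [(0.002, 4.1), (0.004, 2.0), (0.0005, 6.5), (0.01, 1.2)]
print(exceedance_probability(cefms, threshold=3.0))
```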

  13. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  14. Spatial Distribution of the Coefficient of Variation for the Paleo-Earthquakes in Japan

    Science.gov (United States)

    Nomura, S.; Ogata, Y.

    2015-12-01

    Renewal processes, point processes in which the intervals between consecutive events are independently and identically distributed, are frequently used to describe the repeating earthquake mechanism and to forecast the next earthquakes. However, one of the difficulties in applying recurrent earthquake models is the scarcity of historical data. Most studied fault segments have only a few, or even a single, observed earthquake, often with poorly constrained historic and/or radiocarbon ages. The maximum likelihood estimate from such a small data set can have a large bias and error, which tends to yield a high probability for the next event in a very short time span when the recurrence intervals have similar lengths. On the other hand, recurrence intervals at a fault depend, on average, on the long-term slip rate driven by tectonic motion. In addition, recurrence times are also perturbed by nearby earthquakes or fault activity that encourages or discourages the surrounding seismicity. These factors have spatial trends due to the heterogeneity of tectonic motion and seismicity. Thus, this paper introduces a spatial structure on the key parameters of renewal processes for recurrent earthquakes and estimates it using spatial statistics. The spatial variation of the mean and variance parameters of recurrence times is estimated in a Bayesian framework, and the next earthquakes are forecasted by Bayesian predictive distributions. The proposed model is applied to a catalog of recurrent earthquakes in Japan and its results are compared with the current forecast adopted by the Earthquake Research Committee of Japan.

  15. Can Vrancea earthquakes be accurately predicted from unusual bio-system behavior and seismic-electromagnetic records?

    International Nuclear Information System (INIS)

    Enescu, D.; Chitaru, C.; Enescu, B.D.

    1999-01-01

    The relevance of bio-seismic research for the short-term prediction of strong Vrancea earthquakes is underscored. Unusual animal behavior before and during Vrancea earthquakes is described and illustrated for the particular case of the major earthquake of March 4, 1977. Several hypotheses to account for the uncommon behavior of bio-systems in relation to earthquakes in general, and strong Vrancea earthquakes in particular, are discussed in the second section. We recall that promising preliminary results concerning the identification of seismic-electromagnetic precursor signals have been obtained in the Vrancea seismogenic area using special, highly sensitive equipment. The need to correlate bio-seismic and seismic-electromagnetic research is evident. Further investigations are suggested and urgent steps are proposed in order to achieve a successful short-term prediction of strong Vrancea earthquakes. (authors)

  16. Probable variations of a passive safety containment for a 1700 MWe class PWR with passive safety systems

    International Nuclear Information System (INIS)

    Sato, Takashi; Fujiki, Yasunobu; Oikawa, Hirohide; Ofstun, Richard P.

    2009-01-01

    The paper presents probable variations of a passive safety containment for a PWR. The passive safety containment is tentatively named the Mark P containment. It is a pressure suppression type containment for a large-scale PWR with a BWR-type passive containment cooling system (PCCS). A grace period of more than 3 days can be achieved even for a 1700 MWe class large-scale PWR owing to the PCCS. The containment is a reinforced concrete containment vessel (RCCV). The design pressure of the RCCV can be low owing to the suppression pool (S/P), and no prestressed tendon is necessary. It is a single-barrier CV that can withstand a large airplane crash by itself. This simple configuration results in good economy and a short construction period. The BWR-type passive safety systems also include the Passive Cooling and Depressurization System (PCDS). The PCDS provides a 3-day grace period for a station blackout (SBO) induced by a giant earthquake and can practically eliminate the residual risk of a giant earthquake beyond the design basis earthquake of Ss. It also has a safety function to automatically depressurize the primary system in accidents such as SGTR, eliminating the need for operator actions. The result is a large 1700 MWe passive safety PWR that has a grace period of more than 3 days for extremely severe natural disasters, including a giant earthquake, a mega-hurricane, a tsunami and so on; no containment failure in a severe accident, making the plant one that requires no evacuation; protection against a large airplane crash by the single-barrier RCCV; and good economy and a short construction period. (author)

  17. Short term depression unmasks the ghost frequency.

    Directory of Open Access Journals (Sweden)

    Tjeerd V Olde Scheper

    Full Text Available Short Term Plasticity (STP) has been shown to exist extensively in synapses throughout the brain. Its function is more or less clear in the sense that it alters the probability of synaptic transmission at short time scales. However, it is still unclear what effect STP has on the dynamics of neural networks. We show, using a novel dynamic STP model, that Short Term Depression (STD) can affect the phase of frequency-coded input such that small networks can perform temporal signal summation and determination with high accuracy. We show that this property of STD can readily solve the problem of the ghost frequency, the perceived pitch of a harmonic complex in the absence of the base frequency. Additionally, we demonstrate that this property can explain dynamics in larger networks. By means of two models, one of chopper neurons in the Ventral Cochlear Nucleus and one of a cortical microcircuit with inhibitory Martinotti neurons, it is shown that the dynamics in these microcircuits can reliably be reproduced using STP. Our model of STP gives important insights into the potential roles of STP in the self-regulation of cortical activity and long-range afferent input in neuronal microcircuits.

  18. Auditory short-term memory activation during score reading.

    Science.gov (United States)

    Simoens, Veerle L; Tervaniemi, Mari

    2013-01-01

    Performing music on the basis of reading a score requires reading ahead of what is being played in order to anticipate the actions necessary to produce the notes. Score reading thus not only involves the decoding of a visual score and its comparison to the auditory feedback, but also the short-term storage of the musical information, owing to the delay of the auditory feedback while reading ahead. This study investigates the mechanisms of encoding of musical information in short-term memory during such a complicated procedure. There were three parts to this study. First, professional musicians participated in an electroencephalographic (EEG) experiment to study the slow wave potentials during a time interval of short-term memory storage in a situation that requires cross-modal translation and short-term storage of visual material to be compared with delayed auditory material, as is the case in music score reading. This delayed visual-to-auditory matching task was compared with delayed visual-visual and auditory-auditory matching tasks in terms of EEG topography and voltage amplitudes. Second, an additional behavioural experiment was performed to determine which type of distractor would be the most interfering with the score reading-like task. Third, the self-reported strategies of the participants were also analyzed. All three parts of this study point towards the same conclusion: during music score reading, the musician most likely first translates the visual score into an auditory cue, probably starting around 700 or 1300 ms, ready for storage and delayed comparison with the auditory feedback.

  19. Memory effect in M ≥ 6 earthquakes of South-North Seismic Belt, Mainland China

    Science.gov (United States)

    Wang, Jeen-Hwa

    2013-07-01

    The M ≥ 6 earthquakes that occurred in the South-North Seismic Belt, Mainland China, during 1901-2008 are taken to study the possible existence of a memory effect in large earthquakes. The fluctuation analysis technique is applied to analyze the sequences of earthquake magnitude and inter-event time represented in the natural time domain. Calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of both earthquake magnitude and inter-event time. The migration of the earthquakes in study is used to discuss the possible correlation between events. The phase portraits of two sequent magnitudes and two sequent inter-event times are also applied to explore whether large (or small) earthquakes are followed by large (or small) events. Together with all of the available information, we conclude that the earthquake sequence in this study is short-term correlated and thus a short-term memory effect would be operative.

  20. Interevent times in a new alarm-based earthquake forecasting model

    Science.gov (United States)

    Talbi, Abdelhak; Nanjo, Kazuyoshi; Zhuang, Jiancang; Satake, Kenji; Hamdache, Mohamed

    2013-09-01

    This study introduces a new earthquake forecasting model that uses the moment ratio (MR) of the first to second order moments of earthquake interevent times as a precursory alarm index to forecast large earthquake events. This MR model is based on the idea that the MR is associated with anomalous long-term changes in background seismicity prior to large earthquake events. In a given region, the MR statistic is defined as the inverse of the index of dispersion, or Fano factor, with MR values (or scores) providing a biased estimate of the relative regional frequency of background events, here termed the background fraction. To test the forecasting performance of this proposed MR model, a composite Japan-wide earthquake catalogue for the years between 679 and 2012 was compiled using the Japan Meteorological Agency catalogue for the period between 1923 and 2012, and the Utsu historical seismicity records between 679 and 1922. MR values were estimated by sampling interevent times from events with magnitude M ≥ 6 using an earthquake random sampling (ERS) algorithm developed during previous research. Three retrospective tests of M ≥ 7 target earthquakes were undertaken to evaluate the long-, intermediate- and short-term performance of MR forecasting, using mainly Molchan diagrams and optimal spatial maps obtained by minimizing a forecasting error defined as the sum of the miss and alarm rates. This testing indicates that the MR forecasting technique performs well at long, intermediate and short terms. The MR maps produced during long-term testing indicate significant alarm levels before 15 of the 18 shallow earthquakes within the testing region during the past two decades, with an alarm region covering about 20 per cent (alarm rate) of the testing region. The number of shallow events missed by the forecasting was reduced by about 60 per cent when using the MR method instead of the relative intensity (RI) forecasting method. At short term, our model succeeded in forecasting the
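    A hedged sketch of the alarm index as defined here, i.e. the inverse of the index of dispersion (mean over variance of interevent times), computed for one region from a set of event occurrence times; the paper's ERS sampling and spatial mapping are not reproduced, and the toy event times are assumptions for illustration.

```python
import numpy as np

def moment_ratio(event_times):
    """MR alarm index for one region: the inverse of the index of dispersion
    (Fano factor) of the interevent times, i.e. mean / variance.
    `event_times` are occurrence times of M >= 6 events, in years."""
    times = np.sort(np.asarray(event_times, dtype=float))
    dt = np.diff(times)
    if len(dt) < 2 or dt.var() == 0:
        return np.nan
    return dt.mean() / dt.var()

# toy example: quasi-regular interevent times give a higher MR score
# than strongly clustered ones
regular = [0, 9, 21, 30, 41, 50]
clustered = [0, 1, 2, 30, 31, 60]
print(moment_ratio(regular), moment_ratio(clustered))
```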

  1. Short-term Mobility and Increased Partnership Concurrency among Men in Zimbabwe.

    Directory of Open Access Journals (Sweden)

    Susan Cassels

    Full Text Available Migration has long been understood as an underlying factor for HIV transmission, and sexual partner concurrency has been increasingly studied as an important component of HIV transmission dynamics. However, less work has examined the role of short-term mobility in sexual partner concurrency using a network approach. Short-term mobility may be a risk for HIV for the migrant's partner as well, either through the partner's risk behaviors while the migrant is away, such as the partner having additional partners, or via exposure to the return migrant. Using data from the 2010-11 Zimbabwe Demographic and Health Survey, weighted generalized linear regression models were used to investigate the associations between short-term mobility and partnership concurrency at the individual and partnership levels. At the individual level, we find strong evidence of an association between short-term mobility and concurrency. Men who traveled were more likely to have concurrent partnerships compared to men who did not travel, and the relationship was non-linear: each trip was associated with a 2% higher probability of concurrency, with a diminishing risk at 60 trips (p<0.001). At the partnership level, short-term mobility by the male only or by both partners was associated with male concurrency. Couples in which only the female traveled exhibited less male concurrency. Short-term mobility has the ability to impact population-level transmission dynamics by facilitating partnership concurrency and thus onward HIV transmission. Short-term migrants may be an important population to target for HIV testing, treatment, or social and behavioral interventions to prevent the spread of HIV.

  2. Fault Branching and Long-Term Earthquake Rupture Scenario for Strike-Slip Earthquake

    Science.gov (United States)

    Klinger, Y.; CHOI, J. H.; Vallage, A.

    2017-12-01

    Careful examination of surface rupture for large continental strike-slip earthquakes reveals that for the majority of earthquakes, at least one major branch is involved in the rupture pattern. Often, branching might be either related to the location of the epicenter or located toward the end of the rupture, and possibly related to the stopping of the rupture. In this work, we examine large continental earthquakes that show significant branches at different scales and for which ground surface rupture has been mapped in great details. In each case, rupture conditions are described, including dynamic parameters, past earthquakes history, and regional stress orientation, to see if the dynamic stress field would a priori favor branching. In one case we show that rupture propagation and branching are directly impacted by preexisting geological structures. These structures serve as pathways for the rupture attempting to propagate out of its shear plane. At larger scale, we show that in some cases, rupturing a branch might be systematic, hampering possibilities for the development of a larger seismic rupture. Long-term geomorphology hints at the existence of a strong asperity in the zone where the rupture branched off the main fault. There, no evidence of throughgoing rupture could be seen along the main fault, while the branch is well connected to the main fault. This set of observations suggests that for specific configurations, some rupture scenarios involving systematic branching are more likely than others.

  3. Modelling the elements of country vulnerability to earthquake disasters.

    Science.gov (United States)

    Asef, M R

    2008-09-01

    Earthquakes have probably been the most deadly form of natural disaster in the past century. Diversity of earthquake specifications in terms of magnitude, intensity and frequency at the semicontinental scale has initiated various kinds of disasters at a regional scale. Additionally, diverse characteristics of countries in terms of population size, disaster preparedness, economic strength and building construction development often causes an earthquake of a certain characteristic to have different impacts on the affected region. This research focuses on the appropriate criteria for identifying the severity of major earthquake disasters based on some key observed symptoms. Accordingly, the article presents a methodology for identification and relative quantification of severity of earthquake disasters. This has led to an earthquake disaster vulnerability model at the country scale. Data analysis based on this model suggested a quantitative, comparative and meaningful interpretation of the vulnerability of concerned countries, and successfully explained which countries are more vulnerable to major disasters.

  4. Earthquake prediction theory and its relation to precursors

    International Nuclear Information System (INIS)

    Negarestani, A.; Setayeshi, S.; Ghannadi-Maragheh, M.; Akasheh, B.

    2001-01-01

    Since we do not yet have enough knowledge about the physics of earthquakes, the study of seismic precursors plays an important role in earthquake prediction. Earthquake prediction is a science which deals with precursory phenomena during the seismogenic process, investigates the correlations and associations among them and the intrinsic relation between precursors and the seismogenic process, judges the seismic status comprehensively, and finally makes an earthquake prediction. There are two approaches to earthquake prediction. The first is to study the physics of the seismogenic process and to determine the parameters of the process based on source theories; the second is to use seismic precursors. In this paper the theory of earthquakes is reviewed. We also study the theory of earthquakes using models of earthquake origin, and the relation between the seismogenic process and various accompanying precursory phenomena. Earthquake prediction is divided into three categories: long-term, medium-term and short-term. We study anomalous seismic behavior, the electric field, crustal deformation, gravity, the magnetism of the Earth, changes in groundwater, groundwater geochemistry and changes in radon gas emission. Finally, it is concluded that there is a correlation between radon gas emission and earthquake phenomena. Meanwhile, some examples from actual data processing in this area are presented

  5. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release

  6. My Road to Transform Faulting 1963; Long-Term Precursors to Recent Great Earthquakes

    Science.gov (United States)

    Sykes, L. R.

    2017-12-01

    My road to plate tectonics started serendipitously in 1963 in a remote area of the southeast Pacific when I was studying the propagation of short-period seismic surface waves for my PhD. The earthquakes I used as sources were poorly located. I discovered that my relocated epicenters followed the crest of the East Pacific Rise but then suddenly took a sharp turn to the east at what I interpreted to be a major fracture zone 1000 km long before turning again to the north near 55 degrees south. I noted that earthquakes along that zone only occurred between the two ridge crests, an observation Tuzo Wilson used to develop his hypothesis of transform faulting. Finding a great, unknown fracture zone led me to conclude that work on similar faults that intersect the Mid-Oceanic Ridge System was more important than my study of surface waves. I found similar great faults over the next two years and obtained refined locations of earthquakes along several island arcs. When I was in Fiji and Tonga during 1965 studying deep earthquakes, James Dorman wrote to me about Wilson's paper and I thought about testing his hypothesis. I started work on it the spring of 1966 immediately after I learned about the symmetrical "magic magnetic anomaly profile" across the East Pacific Rise of Pitman and Heirtzler. I quickly obtained earthquake mechanisms that verified the transform hypothesis and its related concepts of seafloor spreading and continental drift. As an undergraduate in the late 1950s, my mentor told me that respectable young earth scientists should not work on vague and false mobilistic concepts like continental drift since continents cannot plow through strong oceanic crust. Hence, until spring 1966, I did not take continental drift seriously. The second part of my presentation involves new evidence from seismology and GPS of what appear to be long-term precursors to a number of great earthquakes of the past decade.

  7. The impact of the Canterbury earthquakes on prescribing for mental health.

    Science.gov (United States)

    Beaglehole, Ben; Bell, Caroline; Frampton, Christopher; Hamilton, Greg; McKean, Andrew

    2015-08-01

    The aim of this study is to evaluate the impact of the Canterbury earthquakes on the mental health of the local population by examining prescribing patterns of psychotropic medication. Dispensing data from community pharmacies for antidepressants, antipsychotics, anxiolytics and sedatives/hypnotics are routinely recorded in a national database. The close relationship between prescribing and dispensing provides the opportunity to assess prescribing trends for Canterbury compared to national data and therefore examines the longitudinal impact of the earthquakes on prescribing patterns. Short-term increases in the use of anxiolytics and sedatives/hypnotics were observed after the most devastating February 2011 earthquake, but this effect was not sustained. There were no observable effects of the earthquakes on antidepressant or antipsychotic dispensing. Short-term increases in dispensing were only observed for the classes of anxiolytics and sedatives/hypnotics. No sustained changes in dispensing occurred. These findings suggest that long-term detrimental effects on the mental health of the Canterbury population were either not present or have not resulted in increased prescribing of psychotropic medication. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  8. Study of short term memory status in adult bipolar disorder patients in south Indian population.

    Science.gov (United States)

    Aslam, Mohammed; Siddiq, Mohamed; Dhundasi, Salim A; Das, Kusal K; Kulkarni, B R

    2011-01-01

    The present study was undertaken to establish short-term memory status in bipolar disorder cases as compared with a normal age- and sex-matched control group in Bijapur (Karnataka). Results showed a significant decrease in short-term memory status in bipolar disorder cases as compared to their control group. Loss of attention, decreased processing speed and impaired executive function patterns are the probable causes of these observations.

  9. Slip weakening, strain and short-term preseismic disturbances

    Directory of Open Access Journals (Sweden)

    V. A. Morgounov

    2004-06-01

    The problem of short-term earthquake precursors is discussed. In contrast to the increasing number of reports on short-lived precursors of various types, direct strain measurements cannot detect clearly expressed preseismic anomalies, as follows from the aseismic nucleation mechanism. Based on previously published data and the assumption that the attenuation of the stress-strain field is proportional to r⁻³, a possible scenario of the final stage of the earthquake nucleation process is proposed on the basis of the slip-weakening mechanism in the source and the associated mosaic pattern of precursors on the Earth's surface. The formulas for estimating the maximum distance of precursor detection and the minimum duration of the final stage of inelastic deformation preceding brittle failure of rocks are derived. The data on electromagnetic precursors are interpreted in terms of a skin-layer model. A considerable increase in strain rates at the final stage of earthquake nucleation provides an opportunity to explain teleseismic effects before strong earthquakes in terms of normalized epicentral distance. The modeling results are compared with in situ observations.
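
    A minimal numerical sketch of the detection-distance argument above, assuming only the stated r⁻³ attenuation: if an anomaly has amplitude a_ref at a reference distance r_ref from the nucleation zone and the instrument noise floor is a_noise, the maximum detection distance follows directly. The numerical values below are illustrative placeholders, not values from the paper.

        # Sketch of the maximum-detection-distance estimate under the assumed
        # r**-3 attenuation law; all numerical values are illustrative placeholders.
        def max_detection_distance(a_ref: float, r_ref: float, a_noise: float) -> float:
            """Distance at which A(r) = a_ref * (r_ref / r)**3 falls to the noise floor."""
            return r_ref * (a_ref / a_noise) ** (1.0 / 3.0)

        # e.g. strain anomaly of 1e-6 at 10 km from the source, noise floor 1e-8
        print(f"r_max ~ {max_detection_distance(1e-6, 10.0, 1e-8):.0f} km")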

  10. Pediatric polytrauma: Short-term and long-term outcomes

    NARCIS (Netherlands)

    vanderSluis, CK; Kingma, J; Eisma, WH; tenDuis, HJ

    Objective: To assess the short-term and long-term outcomes of pediatric polytrauma patients and to analyze the extent to which short-term outcomes can predict long-term outcomes. Materials and Methods: Ail pediatric polytrauma patients (Injury Severity Score of greater than or equal to 16, less than

  11. Stress Regime in the Nepalese Himalaya from Recent Earthquakes.

    Science.gov (United States)

    Pant, M.; Karplus, M. S.; Velasco, A. A.; Nabelek, J.; Kuna, V. M.; Ghosh, A.; Mendoza, M.; Adhikari, L. B.; Sapkota, S. N.; Klemperer, S. L.; Patlan, E.

    2017-12-01

    The two recent earthquakes at the Indo-Eurasian plate margin, the April 25, 2015 Mw 7.8 Gorkha earthquake and the May 12, 2015 Mw 7.2 event, killed thousands of people and caused billions of dollars in property losses. In response to these events, we deployed a dense array of seismometers to record the aftershocks along the Gorkha earthquake rupture area. Our network NAMASTE (Nepal Array Measuring Aftershock Seismicity Trailing Earthquake) included 45 seismic stations (16 short-period, 25 broadband, and 4 strong-motion sensors) covering a large area from north-central Nepal to south of the Main Frontal Thrust at a spacing of 20 km. The instruments recorded aftershocks from June 2015 to May 2016. We used time-domain short-term-average/long-term-average (STA/LTA) detection, with window pairs of 1 s/10 s and 4 s/40 s, to pick arrivals and then developed an earthquake catalog containing 9,300 aftershocks. We are manually picking the P-wave first-motion polarity to develop a catalog of focal mechanisms for the larger-magnitude (>M3.0) events with adequate (>10) arrivals. We hope to characterize the seismicity and stress mechanisms of the complex fault geometries in the Nepalese Himalaya and to address the geophysical processes controlling seismic cycles in the Indo-Eurasian plate margin.
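
    The catalog-building step above rests on a standard STA/LTA trigger. The following sketch shows one plain sliding-window implementation of that ratio in Python; the sampling rate, synthetic trace, and trigger threshold are assumptions for illustration only, not the NAMASTE processing parameters beyond the quoted 1 s/10 s window pair. For real waveform data, equivalent routines exist in ObsPy's trigger module.

        # Sliding-window STA/LTA ratio, the detection statistic named above.
        # Sampling rate, synthetic data, and threshold are illustrative assumptions.
        import numpy as np

        def sta_lta(trace, fs, sta_win, lta_win):
            """Ratio of short-term to long-term average of the squared signal, aligned at window ends."""
            nsta, nlta = int(sta_win * fs), int(lta_win * fs)
            csum = np.concatenate(([0.0], np.cumsum(trace.astype(float) ** 2)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta
            lta = (csum[nlta:] - csum[:-nlta]) / nlta
            n = min(sta.size, lta.size)
            return sta[-n:] / np.maximum(lta[-n:], 1e-20)

        fs = 100.0                                    # samples per second
        rng = np.random.default_rng(0)
        trace = 0.1 * rng.standard_normal(6000)       # one minute of synthetic noise
        trace[3000:3200] += rng.standard_normal(200)  # synthetic arrival near t = 30 s
        ratio = sta_lta(trace, fs, sta_win=1.0, lta_win=10.0)
        print("first trigger sample:", int(np.flatnonzero(ratio > 5.0)[0]))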

  12. Long-term associative learning predicts verbal short-term memory performance

    OpenAIRE

    Jones, Gary; Macken, Bill

    2017-01-01

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term...

  13. Long-term associative learning predicts verbal short-term memory performance.

    Science.gov (United States)

    Jones, Gary; Macken, Bill

    2018-02-01

    Studies using tests such as digit span and nonword repetition have implicated short-term memory across a range of developmental domains. Such tests ostensibly assess specialized processes for the short-term manipulation and maintenance of information that are often argued to enable long-term learning. However, there is considerable evidence for an influence of long-term linguistic learning on performance in short-term memory tasks that brings into question the role of a specialized short-term memory system separate from long-term knowledge. Using natural language corpora, we show experimentally and computationally that performance on three widely used measures of short-term memory (digit span, nonword repetition, and sentence recall) can be predicted from simple associative learning operating on the linguistic environment to which a typical child may have been exposed. The findings support the broad view that short-term verbal memory performance reflects the application of long-term language knowledge to the experimental setting.

  14. Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2010-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short-circuit hazard in electronic components, simulations have been developed to quantify the risk of such short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross-section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.

  15. Effects of short term and long term Extremely Low Frequency Magnetic Field on depressive disorder in mice: Involvement of nitric oxide pathway.

    Science.gov (United States)

    Madjid Ansari, Alireza; Farzampour, Shahrokh; Sadr, Ali; Shekarchi, Babak; Majidzadeh-A, Keivan

    2016-02-01

    Previous reports on the possible effects of Extremely Low Frequency Magnetic Fields (ELF MF) on mood have been contradictory across different settings, while no study has yet been conducted on animal behavior. In addition, it was shown that ELF MF exposure increases the brain nitric oxide level. Therefore, in the current study, we aimed to assess the possible effect(s) of ELF MF exposure on the mouse Forced Swimming Test (FST) and evaluate the probable role of the increased level of nitric oxide in the observed behavior. Adult male NMRI mice were used to investigate short-term and long-term ELF MF exposure (0.5 mT, 50 Hz; a single 2-h exposure and 2 h per day for 2 weeks). Locomotor behavior was assessed using the open-field test (OFT), followed by the FST to evaluate immobility time. NΩ-nitro-L-arginine methyl ester (L-NAME, 30 mg/kg) was used to exert an antidepressant-like effect. According to the results, short-term exposure did not alter immobility time, whereas long-term exposure significantly reduced immobility time in mice. Also, the reversal of the antidepressant-like activity of L-NAME indicates a probable increase in brain nitric oxide. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. On the reliability of the geomagnetic quake as a short time earthquake's precursor for the Sofia region

    Directory of Open Access Journals (Sweden)

    S. Cht. Mavrodiev

    2004-01-01

    The local 'when' for earthquake prediction is based on the connection between geomagnetic 'quakes' and the next incoming minimum or maximum of the tidal gravitational potential. The probability time window for the predicted earthquake is approximately ±1 day around the tidal minimum and ±2 days around the maximum. A preliminary statistical estimate, based on the distribution of the time differences between observed and predicted earthquakes for the Sofia region over the period 2002-2003, is given. The possibility of creating a local 'when, where' earthquake research and prediction NETWORK is based on accurate monitoring of the electromagnetic field, with suitable space and time scales, under, on and over the Earth's surface. The periodically upgraded information from seismic hazard maps and other standard geodetic information, as well as other precursory information, is essential.

  17. Is It Possible to Predict Strong Earthquakes?

    Science.gov (United States)

    Polyakov, Y. S.; Ryabinin, G. V.; Solovyeva, A. B.; Timashev, S. F.

    2015-07-01

    The possibility of earthquake prediction is one of the key open questions in modern geophysics. We propose an approach based on the analysis of common short-term candidate precursors (2 weeks to 3 months prior to a strong earthquake) with the subsequent processing of brain activity signals generated in specific types of rats (kept in laboratory settings) that reportedly sense an impending earthquake a few days prior to the event. We illustrate the identification of short-term precursors using groundwater sodium-ion concentration data in the time frame from 2010 to 2014 (a major earthquake occurred on 28 February 2013) recorded at two different sites in the southeastern part of the Kamchatka Peninsula, Russia. The candidate precursors are observed as synchronized peaks in the nonstationarity factors, introduced within the flicker-noise spectroscopy framework for signal processing, for the high-frequency component of both time series. These peaks correspond to the local reorganizations of the underlying geophysical system that are believed to precede strong earthquakes. The rodent brain activity signals are selected as potential "immediate" (up to 2 weeks) deterministic precursors because of the recent scientific reports confirming that rodents sense imminent earthquakes and the population-genetic model of Kirschvink (Bull Seismol Soc Am 90, 312-323, 2000) showing how a reliable genetic seismic escape response system may have developed over the period of several hundred million years in certain animals. The use of brain activity signals, such as electroencephalograms, in contrast to conventional abnormal animal behavior observations, enables one to apply the standard "input-sensor-response" approach to determine what input signals trigger specific seismic escape brain activity responses.

  18. Application of geochemical methods in earthquake prediction in China

    Energy Technology Data Exchange (ETDEWEB)

    Fong-liang, J.; Gui-ru, L.

    1981-05-01

    Several geochemical anomalies were observed before the Haicheng, Longling, Tangshan, and Songpan earthquakes and their strong aftershocks. They included changes in groundwater radon levels; chemical composition of the groundwater (concentration of Ca²⁺, Mg²⁺, Cl⁻, SO₄²⁻, and HCO₃⁻ ions); conductivity; and dissolved gases such as H₂, CO₂, etc. In addition, anomalous changes in water color and quality were observed before these large earthquakes. Before some events gases escaped from the surface, and there were reports of "ground odors" being smelled by local residents. The large amount of radon data can be grouped into long-term and short-term anomalies. The long-term anomalies have a radon emission build-up time of from a few months to more than a year. The short-term anomalies have durations from a few hours or less to a few months.

  19. Earthquake forecast for the Wasatch Front region of the Intermountain West

    Science.gov (United States)

    DuRoss, Christopher B.

    2016-04-18

    The Working Group on Utah Earthquake Probabilities has assessed the probability of large earthquakes in the Wasatch Front region. There is a 43 percent probability of one or more magnitude 6.75 or greater earthquakes and a 57 percent probability of one or more magnitude 6.0 or greater earthquakes in the region in the next 50 years. These results highlight the threat of large earthquakes in the region.

  20. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  1. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres, releasing strains accumulated over decades to centuries of plate motion. Assuming a simple 'characteristic earthquake' model in which similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes, including geological traces from giant (M∼9) earthquakes, indicate a variety of sizes and recurrence intervals of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that the average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence intervals of a few hundred years and a few thousand years had been recognized, but studies show that the three most recent Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several approaches, such as the use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone, are being pursued by government committees. © 2015 The Author(s).

  2. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This is also reasonable, considering the well-known relationship between stress and the b-value. It suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
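
    The b-value comparison described above can be illustrated with the standard Aki (1965) maximum-likelihood estimator applied within bins of tidal stress amplitude. The synthetic catalog, completeness magnitude, and bin edges below are assumptions for illustration; they are not the catalogs or thresholds used in the study.

        # Aki (1965) maximum-likelihood b-value in bins of tidal stress amplitude.
        # Synthetic catalog, completeness magnitude, and bins are assumptions.
        import numpy as np

        LOG10_E = np.log10(np.e)

        def b_value(mags, m_c):
            """b = log10(e) / (mean magnitude above completeness - m_c)."""
            m = mags[mags >= m_c]
            return LOG10_E / (m.mean() - m_c)

        rng = np.random.default_rng(2)
        m_c = 2.0
        stress = rng.uniform(0.0, 1.0, 50000)          # normalized tidal stress amplitude
        b_true = 1.1 - 0.3 * stress                    # built-in decrease of b with stress
        mags = m_c + rng.exponential(LOG10_E / b_true)
        for lo, hi in [(0.0, 0.33), (0.33, 0.66), (0.66, 1.0)]:
            sel = (stress >= lo) & (stress < hi)
            print(f"stress in [{lo:.2f}, {hi:.2f}): estimated b = {b_value(mags[sel], m_c):.2f}")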

  3. Similarity as an organising principle in short-term memory.

    Science.gov (United States)

    LeCompte, D C; Watkins, M J

    1993-03-01

    The role of stimulus similarity as an organising principle in short-term memory was explored in a series of seven experiments. Each experiment involved the presentation of a short sequence of items that were drawn from two distinct physical classes and arranged such that item class changed after every second item. Following presentation, one item was re-presented as a probe for the 'target' item that had directly followed it in the sequence. Memory for the sequence was considered organised by class if probability of recall was higher when the probe and target were from the same class than when they were from different classes. Such organisation was found when one class was auditory and the other was visual (spoken vs. written words, and sounds vs. pictures). It was also found when both classes were auditory (words spoken in a male voice vs. words spoken in a female voice) and when both classes were visual (digits shown in one location vs. digits shown in another). It is concluded that short-term memory can be organised on the basis of sensory modality and on the basis of certain features within both the auditory and visual modalities.

  4. Tsunami hazard assessments with consideration of uncertain earthquakes characteristics

    Science.gov (United States)

    Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.

    2017-12-01

    The uncertainty quantification of tsunami assessments due to uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, the assessment must adopt an uncertainty propagation method that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology, which improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics, the slip distribution and the location. First, the methodology considers the generation of consistent earthquake slip samples by means of a Karhunen-Loève (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We have extended the methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike this reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a Stochastic Reduced Order Model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case. We study tsunamis generated at the site of the 2014 Chilean earthquake. We generate earthquake samples with expected magnitude Mw 8. We first demonstrate that the stochastic approach of our study generates consistent earthquake samples with respect to the target probability laws. We also show that the results obtained from SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for
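
    A minimal sketch of the slip-sampling step named above: a truncated Karhunen-Loève expansion of a Gaussian random field on a one-dimensional fault discretization. The exponential covariance kernel, correlation length, and Gaussian marginals are assumptions for illustration; the study additionally maps the Gaussian field through a translation process to impose non-Gaussian marginals, which this sketch omits.

        # Truncated Karhunen-Loeve sampling of a correlated slip field.
        # Kernel, correlation length, and Gaussian marginals are illustrative assumptions.
        import numpy as np

        def kl_slip_samples(x, mean_slip, sigma, corr_len, n_modes, n_samples, rng):
            """slip = mean + sum_k sqrt(lambda_k) z_k phi_k(x), with z_k ~ N(0, 1)."""
            cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
            eigval, eigvec = np.linalg.eigh(cov)                    # ascending order
            eigval = np.clip(eigval[::-1][:n_modes], 0.0, None)
            eigvec = eigvec[:, ::-1][:, :n_modes]
            z = rng.standard_normal((n_samples, n_modes))
            return mean_slip + z @ (np.sqrt(eigval)[:, None] * eigvec.T)

        rng = np.random.default_rng(3)
        x = np.linspace(0.0, 100.0, 200)                            # along-strike position, km
        slips = kl_slip_samples(x, mean_slip=5.0, sigma=2.0, corr_len=25.0,
                                n_modes=20, n_samples=500, rng=rng)
        print("sample std:", round(float(slips.std()), 2),
              "(target 2.0; truncation lowers it slightly)")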

  5. A three-step Maximum-A-Posterior probability method for InSAR data inversion of coseismic rupture with application to four recent large earthquakes in Asia

    Science.gov (United States)

    Sun, J.; Shen, Z.; Burgmann, R.; Liang, F.

    2012-12-01

    We develop a three-step Maximum-A-Posterior probability (MAP) method for coseismic rupture inversion, which aims at maximizing the a posteriori probability density function (PDF) of elastic solutions of earthquake rupture. The method originates from the Fully Bayesian Inversion (FBI) and the Mixed linear-nonlinear Bayesian inversion (MBI) methods, shares the same a posteriori PDF with them and keeps most of their merits, while overcoming their convergence difficulty when large numbers of low-quality data are used and greatly improving the convergence rate using optimization procedures. A highly efficient global optimization algorithm, Adaptive Simulated Annealing (ASA), is used to search for the maximum posterior probability in the first step. The non-slip parameters are determined by the global optimization method, and the slip parameters are inverted for using the least squares method, initially without a positivity constraint and then damped to a physically reasonable range. This first-step MAP inversion brings the solution close to the 'true' solution quickly and jumps over local maximum regions in the high-dimensional parameter space. The second-step inversion approaches the 'true' solution further, with positivity constraints subsequently applied on the slip parameters using the Monte Carlo Inversion (MCI) technique and with all parameters obtained from step one as the initial solution. Then the slip artifacts are eliminated from the slip models in the third-step MAP inversion with the fault geometry parameters fixed. We first used a designed model with a 45-degree dipping angle and oblique slip, and corresponding synthetic InSAR data sets, to validate the efficiency and accuracy of the method. We then applied the method to four recent large earthquakes in Asia, namely the 2010 Yushu, China earthquake, the 2011 Burma earthquake, the 2011 New Zealand earthquake and the 2008 Qinghai, China earthquake, and compared our results with those from other groups. Our results show the effectiveness of
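
    The alternation described above, a global search over the nonlinear geometry parameters with the slip solved linearly inside the objective, can be sketched on a toy problem. The Green's-function model, parameter bounds, and the use of a non-negative least-squares solver for the slip are illustrative assumptions; they stand in for the elastic dislocation modeling and the staged positivity handling of the actual method.

        # Toy version of the nested inversion: global annealing over one geometry
        # parameter, non-negative least squares for slip inside the objective.
        # The forward model below is a placeholder, not an elastic dislocation code.
        import numpy as np
        from scipy.optimize import dual_annealing, nnls

        def greens(theta, n_obs=60, n_patch=10):
            """Toy Green's-function matrix whose smoothing depends on one geometry parameter."""
            obs = np.linspace(0.0, 1.0, n_obs)[:, None]
            patch = np.linspace(0.0, 1.0, n_patch)[None, :]
            return np.exp(-((obs - patch) ** 2) / (0.01 + 0.1 * theta))

        rng = np.random.default_rng(4)
        theta_true = 0.4
        slip_true = np.abs(rng.normal(1.0, 0.5, 10))
        data = greens(theta_true) @ slip_true + 0.01 * rng.standard_normal(60)

        def misfit(params):
            G = greens(params[0])
            slip, _ = nnls(G, data)        # positivity-constrained slip for this geometry
            return float(np.sum((G @ slip - data) ** 2))

        result = dual_annealing(misfit, bounds=[(0.05, 1.0)], maxiter=100)
        print("recovered geometry parameter:", round(float(result.x[0]), 2), "true:", theta_true)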

  6. Chest injuries associated with earthquakes: an analysis of injuries sustained during the 2008 Wen-Chuan earthquake in China.

    Science.gov (United States)

    Hu, Jia; Guo, Ying-Qiang; Zhang, Er-Yong; Tan, Jin; Shi, Ying-Kang

    2010-08-01

    The goal of this study was to analyze the patterns, therapeutic modalities, and short-term outcomes of patients with chest injuries in the aftermath of the Wen-Chuan earthquake, which occurred on May 12, 2008 and registered 8.0 on the Richter scale. Of the 1522 patients who were referred to the West China Hospital of Sichuan University from May 12 to May 27, 169 patients (11.1%) had suffered major chest injuries. The type of injury, the presence of infection, Abbreviated Injury Score (AIS 2005), New Injury Severity Score (NISS), treatment, and short-term outcome were all documented for each case. Isolated chest injuries were diagnosed in 129 patients (76.3%), while multiple injuries with a major chest trauma were diagnosed in 40 patients (23.7%). The mean AIS and the median NISS of the hospitalized patients with chest injuries were 2.5 and 13, respectively. The mortality rate was 3.0% (5 patients). Most of the chest injuries were classified as minor to moderate trauma; however, coexistent multiple injuries and subsequent infection should be carefully considered in medical response strategies. Coordinated efforts among emergency medical support groups and prior training in earthquake preparedness and rescue in earthquake-prone areas are therefore necessary for efficient evacuation and treatment of catastrophic casualties.

  7. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of

  8. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

    This paper addresses three aspects of the conceptual framework for a doctoral dissertation research in progress in the field of Mathematics Education, in particular in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of the basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like the Parcae and the Moirae, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger's "Paideia". The second aspect treats the idea of hazard from two different approaches: the first approach deals with hazard, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out. The second approach deals with hazard from a perspective that could be called "phenomenological", from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression "spontaneous generation"), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of some other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  9. The suitability of short-term measurements of radon in the built environment

    International Nuclear Information System (INIS)

    Denman, A.R.; Groves-Kirkby, C.J.; Phillips, P.S.; Crockett, R.G.M.; Woolridge, A.C.

    2008-01-01

    Although domestic and workplace radon concentration levels often show marked diurnal/short-term variation, overall health risk is determined by the long-term average level, and many national protocols advocate the use of long exposure periods, usually three months, to assess long-term risk. Simple passive measurement techniques, e.g. track-etch, activated charcoal and electret, can, however, provide reasonably accurate determinations with exposures as short as one week, and there is pressure from users and stakeholders for assessments within this time period. We report an evaluation of the effectiveness of one-week, one-month and three-month exposures over a period of one year in a designated Radon Affected Area in the United Kingdom (UK). Although short-term exposures did not compromise measurement accuracy, short-term radon variability rendered one-week measurements less reliable in predicting annual average radon levels via the conventional methodology. Analysis permitted estimation of the maximum and minimum short-term measured domestic radon concentrations at which there was a 95% probability of the predicted annual average being below or above the UK Action Level of 200 Bq·m⁻³, respectively. Between these limits, the short-term result is equivocal, requiring repetition, and the 'equivocal range' for one-week measurements is significantly wider than for three-month exposures. In any geographical area, domestic radon concentrations are distributed log-normally, with many properties having low average levels; a small number exhibit excessive levels, and this distribution must be considered when defining exposures for a radon measurement programme. In low-radon areas, where 1% of houses might exceed the Action Level, a one-week assessment will find that fewer outcomes are equivocal. For high-radon areas, with 20% or more houses over the Action Level, more than 50% of one-week outcomes will be equivocal, requiring repeats. The results of this work will be presented
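
    The 'equivocal range' logic described above can be sketched under a simple distributional assumption: treat the annual mean as log-normally distributed about the short-term result with some geometric standard deviation (GSD). The GSD value below is a placeholder, not the variability measured in the study.

        # Equivocal-range sketch: for which short-term results can neither
        # "annual mean < 200 Bq m^-3" nor "annual mean > 200 Bq m^-3" be asserted at 95%?
        # The geometric standard deviation is an illustrative assumption.
        import numpy as np
        from scipy.stats import lognorm

        ACTION_LEVEL = 200.0     # Bq m^-3
        GSD = 2.0                # assumed short-term-to-annual variability

        def prob_annual_above(short_term):
            """P(annual mean > Action Level), annual mean ~ lognormal with median = short-term result."""
            return float(lognorm(s=np.log(GSD), scale=short_term).sf(ACTION_LEVEL))

        levels = np.linspace(10.0, 2000.0, 2000)
        p_above = np.array([prob_annual_above(c) for c in levels])
        lower = levels[p_above < 0.05].max()   # below this: 95% confident annual < 200
        upper = levels[p_above > 0.95].min()   # above this: 95% confident annual > 200
        print(f"equivocal range for a one-week result: about {lower:.0f} to {upper:.0f} Bq m^-3")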

  10. Short-term memory loss associated with rosuvastatin.

    Science.gov (United States)

    Galatti, Laura; Polimeni, Giovanni; Salvo, Francesco; Romani, Marcello; Sessa, Aurelio; Spina, Edoardo

    2006-08-01

    Memory loss and cognitive impairment have been reported in the literature in association with several 3-hydroxy-3-methylglutaryl coenzyme A reductase inhibitors (statins), but we found no published case reports associated with rosuvastatin. To our knowledge, this is the first reported case of rosuvastatin-related short-term memory loss. A 53-year-old Caucasian man with hypercholesterolemia experienced memory loss after being treated with rosuvastatin 10 mg/day. He had no other concomitant conditions or drug therapies. After discontinuation of rosuvastatin, the neuropsychiatric adverse reaction resolved gradually, suggesting a probable drug association. During the following year, the patient remained free from neuropsychiatric disturbances. Clinicians should be aware of possible adverse cognitive reactions during statin therapy, including rosuvastatin.

  11. Biochemical and hematologic changes after short-term space flight

    Science.gov (United States)

    Leach, C. S.

    1992-01-01

    Clinical laboratory data from blood samples obtained from astronauts before and after 28 flights (average duration = 6 days) of the Space Shuttle were analyzed by the paired t-test and the Wilcoxon signed-rank test and compared with data from the Skylab flights (duration approximately 28, 59, and 84 days). Angiotensin I and aldosterone were elevated immediately after short-term space flights, but the response of angiotensin I was delayed after Skylab flights. Serum calcium was not elevated after Shuttle flights, but magnesium and uric acid decreased after both Shuttle and Skylab. Creatine phosphokinase in serum was reduced after Shuttle but not Skylab flights, probably because exercises to prevent deconditioning were not performed on the Shuttle. Total cholesterol was unchanged after Shuttle flights, but low density lipoprotein cholesterol increased and high density lipoprotein cholesterol decreased. The concentration of red blood cells was elevated after Shuttle flights and reduced after Skylab flights. Reticulocyte count was decreased after both short- and long-term flights, indicating that a reduction in red blood cell mass is probably more closely related to suppression of red cell production than to an increase in destruction of erythrocytes. Serum ferritin and number of platelets were also elevated after Shuttle flights. In determining the reasons for postflight differences between the shorter and longer flights, it is important to consider not only duration but also countermeasures, differences between spacecraft, and procedures for landing and egress.

  12. Gender differences in success at quitting smoking: Short- and long-term outcomes.

    Science.gov (United States)

    Marqueta, Adriana; Nerín, Isabel; Gargallo, Pilar; Beamonte, Asunción

    2016-06-14

    Smoking cessation treatments are effective in men and women. However, possible sex-related differences in the outcome of these treatments remain a controversial topic. This study evaluated whether there were differences between men and women in the success of smoking cessation treatment, including gender-tailored components, in the short and long term (> 1 year). A telephone survey was carried out between September 2008 and June 2009 among smokers attended at a Smoking Cessation Clinic. All patients who had successfully completed treatment (3 months) were surveyed by telephone to determine their long-term abstinence. Those who remained abstinent were requested to attend the Smoking Cessation Clinic for biochemical validation (expired CO ≤10 ppm). The probability of remaining abstinent in the long term was calculated using a Kaplan-Meier survival analysis. The treatment success rate at 3 months was 41.3% (538/1302), with no differences by sex. Of these, 89% (479/538) were located in the telephone follow-up study, and 47.6% (256/479) were abstinent, with no differences by sex (p = .519); abstinence was validated with CO less than 10 ppm in 191 of the 256 (53.9% men and 46.1% women). In the survival analysis, the difference between men and women in the probability of remaining abstinent in the long term was not significant. There are no differences by sex in the outcome of smoking cessation treatment that includes gender-tailored components in the short and long term (> 1 year).

  13. A dynamic model of liquid containers (tanks) with legs and probability analysis of response to simulated earthquake

    International Nuclear Information System (INIS)

    Fujita, Takafumi; Shimosaka, Haruo

    1980-01-01

    This paper describes the results of an analysis of the response of liquid containers (tanks) with legs to earthquakes. Sine-wave oscillation was applied experimentally to model tanks with legs. A model with one degree of freedom is good enough for the analysis. To investigate the reason for this, the response multiplication factor of tank displacement was analysed. The shapes of the model tanks were rectangular and cylindrical. Analyses were made using potential theory. The experimental studies showed that the attenuation of oscillation was non-linear. A model analysis of this non-linear attenuation was also performed. Good agreement between the experimental and the analytical results was recognized. A probability analysis of the response to earthquakes with simulated shock waves was performed using the above-mentioned model, and good agreement between the experiment and the analysis was obtained. (Kato, T.)
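
    A linear single-degree-of-freedom sketch of the tank-on-legs idealization found adequate above, driven by a base-acceleration record. The natural frequency, damping ratio, and synthetic input are illustrative assumptions, and the damping here is linear viscous rather than the non-linear attenuation reported in the paper.

        # One-degree-of-freedom tank model under base excitation:
        #   x'' + 2*zeta*w*x' + w^2*x = -a_ground(t)   (x = relative displacement)
        # Natural frequency, damping, and the synthetic record are assumptions.
        import numpy as np
        from scipy.integrate import solve_ivp

        def sdof_response(t, a_ground, freq_hz=4.0, zeta=0.02):
            w = 2.0 * np.pi * freq_hz
            ag = lambda tau: np.interp(tau, t, a_ground)
            rhs = lambda tau, y: [y[1], -ag(tau) - 2.0 * zeta * w * y[1] - w**2 * y[0]]
            sol = solve_ivp(rhs, (t[0], t[-1]), [0.0, 0.0], t_eval=t, max_step=t[1] - t[0])
            return sol.y[0]

        t = np.linspace(0.0, 20.0, 4000)
        rng = np.random.default_rng(5)
        accel = rng.standard_normal(t.size) * np.exp(-0.2 * t)   # crude simulated shock, m/s^2
        x = sdof_response(t, accel)
        print("peak relative displacement:", float(np.abs(x).max()), "m")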

  14. Radon anomalies preceding earthquakes which occurred in the UK, in summer and autumn 2002

    International Nuclear Information System (INIS)

    Crockett, R.G.M.; Gillmore, G.K.; Phillips, P.S.; Denman, A.R.; Groves-Kirkby, C.J.

    2006-01-01

    During the course of an investigation into domestic radon levels in Northamptonshire, two hourly-sampling real-time radon detectors were operated simultaneously in separate locations 2.25 km apart in Northampton, in the English East Midlands, for a 25-week period. This period of operation encompassed the period in September 2002 during which the Dudley earthquake (magnitude ~5.0) and smaller aftershocks occurred in the English West Midlands, UK. We report herein our observations regarding the occurrence of simultaneous short-period radon anomalies and their timing in relation to the Dudley, and other, earthquakes which occurred during the monitoring period. Analysis of the radon time-series reveals a short period when the two time-series displayed simultaneous in-phase short-term (6-9 h) radon anomalies prior to the main Dudley earthquake. Subsequent investigation revealed that a similar period occurred prior to another smaller but recorded earthquake in the English Channel.

  15. The case of escape probability as linear in short time

    Science.gov (United States)

    Marchewka, A.; Schuss, Z.

    2018-02-01

    We derive rigorously the short-time escape probability of a quantum particle from its compactly supported initial state, which has a discontinuous derivative at the boundary of the support. We show that this probability is linear in time, which seems to be a new result. The novelty of our calculation is the inclusion of the boundary layer of the propagated wave function formed outside the initial support. This result has applications to the decay law of the particle, to the Zeno behaviour, quantum absorption, time of arrival, quantum measurements, and more.

  16. Factors Influencing Short-term Synaptic Plasticity in the Avian Cochlear Nucleus Magnocellularis

    Directory of Open Access Journals (Sweden)

    Jason Tait Sanchez Quinones

    2015-01-01

    Defined as reduced neural responses during high rates of activity, synaptic depression is a form of short-term plasticity important for the temporal filtering of sound. In the avian cochlear nucleus magnocellularis (NM), an auditory brainstem structure, mechanisms regulating short-term synaptic depression include pre-, post-, and extrasynaptic factors. Using varied paired-pulse stimulus intervals, we found that the time course of synaptic depression lasts up to four seconds at late-developing NM synapses. Synaptic depression was largely reliant on exogenous Ca²⁺-dependent probability of presynaptic neurotransmitter release and, to a lesser extent, on the desensitization of postsynaptic α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid-type glutamate receptors (AMPA-Rs). Interestingly, although extrasynaptic glutamate clearance did not play a significant role in regulating synaptic depression, blocking glutamate clearance at early-developing synapses altered synaptic dynamics, changing responses from depression to facilitation. These results suggest a developmental shift in the relative reliance on pre-, post-, and extrasynaptic factors in regulating short-term synaptic plasticity in NM.

  17. What are the differences between long-term, short-term, and working memory?

    Science.gov (United States)

    Cowan, Nelson

    2008-01-01

    In the recent literature there has been considerable confusion about the three types of memory: long-term, short-term, and working memory. This chapter strives to reduce that confusion and makes up-to-date assessments of these types of memory. Long- and short-term memory could differ in two fundamental ways, with only short-term memory demonstrating (1) temporal decay and (2) chunk capacity limits. Both properties of short-term memory are still controversial but the current literature is rather encouraging regarding the existence of both decay and capacity limits. Working memory has been conceived and defined in three different, slightly discrepant ways: as short-term memory applied to cognitive tasks, as a multi-component system that holds and manipulates information in short-term memory, and as the use of attention to manage short-term memory. Regardless of the definition, there are some measures of memory in the short term that seem routine and do not correlate well with cognitive aptitudes and other measures (those usually identified with the term "working memory") that seem more attention demanding and do correlate well with these aptitudes. The evidence is evaluated and placed within a theoretical framework depicted in Fig. 1.

  18. Readiness for change and short-term outcomes of female adolescents in residential treatment for anorexia nervosa.

    Science.gov (United States)

    McHugh, Matthew D

    2007-11-01

    To determine if readiness for change (RFC) at admission predicted length of stay (LOS) and short-term outcomes among female adolescents in residential treatment for anorexia nervosa (AN). Using a prospective cohort design to collect data from participants (N = 65) at admission and discharge, Kaplan-Meier survival analysis and Cox regression tested whether RFC on admission predicted time in LOS to a favorable short-term outcome--a composite endpoint based on minimum criteria for weight gain, drive for thinness, depression, anxiety, and health-related quality of life (HRQOL). Participants with low RFC had a mean survival time to a favorable short-term outcome of 59.4 days compared to 34.1 days for those with high RFC (log rank = 8.44, df = 1, p = .003). The probability of a favorable short-term outcome was 5.30 times greater for participants with high RFC. Readiness for change is a useful predictor of a favorable short-term outcome and should be considered in the assessment profile of patients with AN. (c) 2007 by Wiley Periodicals, Inc.

  19. Stabilizing intermediate-term medium-range earthquake predictions

    International Nuclear Information System (INIS)

    Kossobokov, V.G.; Romashkova, L.L.; Panza, G.F.; Peresan, A.

    2001-12-01

    A new scheme for the application of the intermediate-term medium-range earthquake prediction algorithm M8 is proposed. The scheme accounts for the natural distribution of seismic activity, eliminates the subjectivity in the positioning of the areas of investigation and provides additional stability of the predictions with respect to the original variant. According to the retroactive testing in Italy and adjacent regions, this improvement is achieved without any significant change of the alarm volume in comparison with the results published so far. (author)

  20. Long-term perspectives on giant earthquakes and tsunamis at subduction zones

    Science.gov (United States)

    Satake, K.; Atwater, B.F.; ,

    2007-01-01

    Histories of earthquakes and tsunamis, inferred from geological evidence, aid in anticipating future catastrophes. This natural warning system now influences building codes and tsunami planning in the United States, Canada, and Japan, particularly where geology demonstrates the past occurrence of earthquakes and tsunamis larger than those known from written and instrumental records. Under favorable circumstances, paleoseismology can thus provide long-term advisories of unusually large tsunamis. The extraordinary Indian Ocean tsunami of 2004 resulted from a fault rupture more than 1000 km in length that included and dwarfed fault patches that had broken historically during lesser shocks. Such variation in rupture mode, known from written history at a few subduction zones, is also characteristic of earthquake histories inferred from geology on the Pacific Rim. Copyright ?? 2007 by Annual Reviews. All rights reserved.

  1. Earthquake outlook for the San Francisco Bay region 2014–2043

    Science.gov (United States)

    Aagaard, Brad T.; Blair, James Luke; Boatwright, John; Garcia, Susan H.; Harris, Ruth A.; Michael, Andrew J.; Schwartz, David P.; DiLeo, Jeanne S.; Jacques, Kate; Donlin, Carolyn

    2016-06-13

    Using information from recent earthquakes, improved mapping of active faults, and a new model for estimating earthquake probabilities, the 2014 Working Group on California Earthquake Probabilities updated the 30-year earthquake forecast for California. They concluded that there is a 72 percent probability (or likelihood) of at least one earthquake of magnitude 6.7 or greater striking somewhere in the San Francisco Bay region before 2043. Earthquakes this large are capable of causing widespread damage; therefore, communities in the region should take simple steps to help reduce injuries, damage, and disruption, as well as accelerate recovery from these earthquakes.
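
    For readers who want the arithmetic behind such statements, a 30-year probability of at least one event converts to an equivalent annual rate under a simple Poisson assumption. The forecast itself comes from a more elaborate time-dependent model, so this is only a rough consistency check.

        # Rough Poisson-equivalent of "72 percent chance of at least one M>=6.7 in 30 years":
        #   P = 1 - exp(-rate * T)  =>  rate = -ln(1 - P) / T
        import math

        p30, horizon = 0.72, 30.0
        rate = -math.log(1.0 - p30) / horizon
        print(f"equivalent annual rate ~ {rate:.3f}/yr, mean recurrence ~ {1.0 / rate:.0f} years")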

  2. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a "prediction" refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a "forecast" describes the probability of an earthquake occurring, which may range from >0 to <100 percent. The report goes on to discuss the processes or conditions that might prompt such forewarnings, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region.

  3. Amplitude of foreshocks as a possible seismic precursor to earthquakes

    Science.gov (United States)

    Lindh, A.G.

    1978-01-01

    In recent years, we have made significant progress in being able to recognize the long-range pattern of events that precede large earthquakes. For example, in a recent issue of the Earthquake Information Bulletin, we saw how the pioneering work of S.A. Fedotov of the U.S.S.R in the Kamchatka-Kurile Islands region has been applied worldwide to forecast where large, shallow earthquakes might occur in the next decades. Indeed, such a "seismic gap" off the coast of Alaska was filled by the 1972 Sitka earthquake. Promising results are slowly accumulating from other techniques that suggest that intermediate-term precursors might also be seen: among these are tilt and geomagnetic anomalies and anomalous land uplift. But the crucial point remains that short-term precursors (days to hours) will be needed in many cases if there is to be a significant saving of lives. 

  4. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction is defined as prospective prediction with a time scale of about one week, which is considered to be one of the most important and urgent topics for human beings. If this short-term prediction is realized, casualties will be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs in such prediction, and an extensive amount of progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the myth of the impossibility of EQ prediction based on seismometers, the reasons why we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbations as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  5. Implication of conjugate faulting in the earthquake brewing and originating process

    Energy Technology Data Exchange (ETDEWEB)

    Jones, L.M. (Massachusetts Inst. of Tech., Cambridge); Deng, Q.; Jiang, P.

    1980-03-01

    The earthquake sequences, precursors, and geologic-structural background of the Haicheng, Tangshan, and Songpan-Pingwu earthquakes are discussed in this article. All of these earthquakes occurred in a seismic zone controlled by the main boundary faults of an intraplate fault block. However, the fault plane of a main earthquake does not coincide with these boundary faults, but is rather a related secondary fault. Together they formed a conjugate shear rupture zone under the action of the regional tectonic stress field. As to the earthquake sequences, the foreshocks and aftershocks might occur on the conjugate fault planes within an epicentral region rather than be limited to the fault plane of the main earthquake, as illustrated by the distribution of foreshocks and aftershocks of the Haicheng earthquake. The characteristics of the long-, medium-, and imminent-term earthquake precursory anomalies of the three mentioned earthquakes, especially the character of well-studied anomalous phenomena in electrical resistivity, radon emission, groundwater and animal behavior, have been investigated. The studies of these earthquake precursors show that they were distributed over an area rather more extensive than the epicentral region. Some fault zones in the conjugate fault network usually appeared as distributed belts or concentrated zones of earthquake precursory anomalies; they can be traced in the medium- to long-term precursory field, but seem more distinct in the short-term to imminent precursory anomaly field. These characteristics can be explained by rupture and sliding originating along the conjugate shear network and by the concentration of stress in the regional stress field.

  6. The Demonstration of Short-Term Consolidation.

    Science.gov (United States)

    Jolicoeur, Pierre; Dell'Acqua, Roberto

    1998-01-01

    Results of seven experiments involving 112 college students or staff using a dual-task approach provide evidence that encoding information into short-term memory involves a distinct process termed short-term consolidation (STC). Results suggest that STC has limited capacity and that it requires central processing mechanisms. (SLD)

  7. Earthquake precursory events around epicenters and local active faults

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S. B.; Haydari Azad, F.

    2013-05-01

    The chain of underground events which are triggered by seismic activity and physical/chemical interactions prior to an earthquake may produce surface and above-surface phenomena. During the past decades many researchers have sought to establish the possibility of short-term earthquake prediction using remote sensing data. Currently, there are several theories about the preparation stages of earthquakes, most of which stress rises in heat and seismic activity as the main signs of an impending earthquake. Their differences only lie in the secondary phenomena which are triggered by these events. In any case, with the recent advances in remote sensing sensors and techniques, we are now able to provide wider, more accurate monitoring of land, ocean and atmosphere. Among all theoretical factors, changes in Surface Latent Heat Flux (SLHF), Sea & Land Surface Temperature (SST & LST) and surface chlorophyll-a are easier to record from earth-observing satellites. SLHF is the amount of energy exchange in the form of water vapor between the earth's surface and the atmosphere. Abnormal variations in this factor have frequently been reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shocks. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. In the case of oceanic earthquakes, higher temperatures at the ocean bed may lead to higher amounts of Chl-a at the sea surface. On the other hand, it has also been said that the leakage of radon gas, which occurs as rocks break during earthquake preparation, causes the formation of airborne ions and higher Air Temperature (AT). We have chosen to perform a statistical, long-term, and short-term approach by considering the recurrence intervals of past

  8. Fixed recurrence and slip models better predict earthquake behavior than the time- and slip-predictable models 1: repeating earthquakes

    Science.gov (United States)

    Rubinstein, Justin L.; Ellsworth, William L.; Chen, Kate Huihsuan; Uchida, Naoki

    2012-01-01

    The behavior of individual events in repeating earthquake sequences in California, Taiwan and Japan is better predicted by a model with fixed inter-event time or fixed slip than it is by the time- and slip-predictable models for earthquake occurrence. Given that repeating earthquakes are highly regular in both inter-event time and seismic moment, the time- and slip-predictable models seem ideally suited to explain their behavior. Taken together with evidence from the companion manuscript that shows similar results for laboratory experiments, we conclude that the short-term predictions of the time- and slip-predictable models should be rejected in favor of earthquake models that assume either fixed slip or fixed recurrence interval. This implies that the elastic rebound model underlying the time- and slip-predictable models offers no additional value in describing earthquake behavior in an event-to-event sense, but its value in a long-term sense cannot be determined. These models likely fail because they rely on assumptions that oversimplify the earthquake cycle. We note that the time and slip of these events are predicted quite well by fixed slip and fixed recurrence models, so in some sense they are time- and slip-predictable. While fixed recurrence and slip models better predict repeating earthquake behavior than the time- and slip-predictable models, we observe a correlation between slip and the preceding recurrence time for many repeating earthquake sequences in Parkfield, California. This correlation is not found in other regions, and the sequences with the correlative slip-predictable behavior are not distinguishable from nearby earthquake sequences that do not exhibit this behavior.

  9. What are the differences between long-term, short-term, and working memory?

    OpenAIRE

    Cowan, Nelson

    2008-01-01

    In the recent literature there has been considerable confusion about the three types of memory: long-term, short-term, and working memory. This chapter strives to reduce that confusion and makes up-to-date assessments of these types of memory. Long- and short-term memory could differ in two fundamental ways, with only short-term memory demonstrating (1) temporal decay and (2) chunk capacity limits. Both properties of short-term memory are still controversial but the current literature is rath...

  10. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  11. Short-term memory across eye blinks.

    Science.gov (United States)

    Irwin, David E

    2014-01-01

    The effect of eye blinks on short-term memory was examined in two experiments. On each trial, participants viewed an initial display of coloured, oriented lines, then after a retention interval they viewed a test display that was either identical or different by one feature. Participants kept their eyes open throughout the retention interval on some blocks of trials, whereas on others they made a single eye blink. Accuracy was measured as a function of the number of items in the display to determine the capacity of short-term memory on blink and no-blink trials. In separate blocks of trials participants were instructed to remember colour only, orientation only, or both colour and orientation. Eye blinks reduced short-term memory capacity by approximately 0.6-0.8 items for both feature and conjunction stimuli. A third, control, experiment showed that a button press during the retention interval had no effect on short-term memory capacity, indicating that the effect of an eye blink was not due to general motoric dual-task interference. Eye blinks might instead reduce short-term memory capacity by interfering with attention-based rehearsal processes.

  12. Short-term memory and dual task performance

    Science.gov (United States)

    Regan, J. E.

    1982-01-01

    Two hypotheses concerning the way in which short-term memory interacts with another task in a dual task situation are considered. It is noted that when two tasks are combined, the activity of controlling and organizing performance on both tasks simultaneously may compete with either task for a resource; this resource may be space in a central mechanism or general processing capacity, or it may be some task-specific resource. If a special relationship exists between short-term memory and control, especially if there is an identity relationship between short-term memory and a central controlling mechanism, then short-term memory performance should show a decrement in a dual task situation. Even if short-term memory does not have any particular identity with a controlling mechanism, but both tasks draw on some common resource or resources, then a tradeoff between the two tasks in allocating resources is possible and could be reflected in performance. The persistent concurrence cost in memory performance in these experiments suggests that short-term memory may have a unique status in the information processing system.

  13. The influence of short-term concentration peaks on exposure risks in the vicinity of an episodic release of hydrogen sulphide

    International Nuclear Information System (INIS)

    1981-01-01

    In order to propose a methodology by which the influence of short-term concentration peaks on exposure risks could be estimated in the vicinity of an atmospheric release of hydrogen sulphide (H2S), an extensive and up-to-date review of H2S toxicity was conducted, with emphasis on acute and sub-acute poisoning. The literature on animal studies and cases of human exposure was used to derive a lethal dose relationship (concentration-exposure time) appropriate for the general population. A statistical model was developed which calculates the probability of short-term concentrations exceeding the lethal level given the downwind range from the release, the short-term averaging time of interest, the long-term average concentration, and meteorological and terrain conditions. Results were obtained for passive releases of H2S under a range of hypothetical conditions. Interpretation of these results is given in terms of the overall probability of lethal exposure during a 30-minute episode. The likely influence of heavy water plant gas dispersion systems is also addressed. (author)

  14. Long-term versus short-term deformation of the meizoseismal area of the 2008 Achaia-Elia (MW 6.4) earthquake in NW Peloponnese, Greece: Evidence from historical triangulation and morphotectonic data

    Science.gov (United States)

    Stiros, Stathis; Moschas, Fanis; Feng, Lujia; Newman, Andrew

    2013-04-01

    The deformation of the meizoseismal area of the 2008 Achaia-Elia (MW 6.4) earthquake in NW Peloponnese, the first significant strike-slip earthquake in continental Greece, was examined on two time scales: 10^2 years, based on the analysis of high-accuracy historical triangulation data describing shear, and 10^5-10^6 years, based on the analysis of the hydrographic network of the area for signs of streams offset by faulting. Our study revealed pre-seismic accumulation of shear strain of the order of 0.2 μrad/year in the study area, consistent with recent GPS evidence, but no signs of significant strike slip-induced offsets in the hydrographic network. These results confirm the hypothesis that the 2008 fault, which did not reach the surface and was not associated with significant seismic ground deformation, probably because a surface flysch layer filters high-strain events, corresponds to an immature or a dormant, recently activated fault. This fault, about 150 km long and discordant to the morphotectonic trends of the area, seems, first, to contain segments which have progressively reactivated in a specific direction in the last 20 years, reminiscent of the North Anatolian Fault, and second, to limit a 150 km wide (recent?) shear zone in the internal part of the arc, in a region mostly dominated by thrust faulting and strong destructive earthquakes. Highlights: deformation of the first main strike-slip fault in continental Greece is analyzed; triangulation data show preseismic shear, while the hydrographic network shows no previous faulting; surface shear deformation occurs only at low strain rates; the fault is an immature or reactivated dormant strike-slip fault, with gradual oriented rupturing; there is an interplay between shear and thrusting along the arc.

  15. Estimation of failure probability of a real structure utilizing earthquake observation data

    International Nuclear Information System (INIS)

    Matsubara, Masayoshi

    1995-01-01

    The objective of this report is to propose a procedure that estimates the structural response of a real structure by utilizing earthquake observation data and a neural network system. We apply the neural network system to estimate the ground motion at the site from the large volume of earthquake data published by the Japan Meteorological Agency. The proposed procedure shows some potential for adequately estimating the correlation between earthquake input and structural response. (author)

  16. Myrmecochory and short-term seed fate in Rhamnus alaternus: Ant species and seed characteristics

    Science.gov (United States)

    Bas, J. M.; Oliveras, J.; Gómez, C.

    2009-05-01

    Benefits conferred on plants in ant-mediated seed dispersal mutualisms (myrmecochory) depend on the fate of transported seeds. We studied the effects of elaiosome presence, seed size and seed treatment (with and without passage through a bird's digestive tract) on short-term seed fate in Rhamnus alaternus. In our study, we define short-term, or initial, seed fate as the location where ants release the seeds after coming into contact with them. The elaiosomes had the most influence on short-term fate, i.e. whether or not seeds were transported to the nest. The workers usually transported big seeds more often than small ones, but small ants did not transport large seeds. The effect of seed size on transport depended on the ant species and on the treatment of the seed (manual extraction simulating a direct fall from the parent plant vs. bird deposition corresponding to preliminary primary dispersal). The probability of removal of elaiosome-bearing seeds to the nest by Aphaenogaster senilis increased with increasing seed weight.

  17. Earthquake risk assessment of building structures

    International Nuclear Information System (INIS)

    Ellingwood, Bruce R.

    2001-01-01

    During the past two decades, probabilistic risk analysis tools have been applied to assess the performance of new and existing building structural systems. Structural design and evaluation of buildings and other facilities with regard to their ability to withstand the effects of earthquakes requires special considerations that are not normally a part of such evaluations for other occupancy, service and environmental loads. This paper reviews some of these special considerations, specifically as they pertain to probability-based codified design and reliability-based condition assessment of existing buildings. Difficulties experienced in implementing probability-based limit states design criteria for earthquakes are summarized. Comparisons of predicted and observed building damage highlight the limitations of using current deterministic approaches for post-earthquake building condition assessment. The importance of inherent randomness and modeling uncertainty in forecasting building performance is examined through a building fragility assessment of a steel frame with welded connections that was damaged during the Northridge Earthquake of 1994. The prospects for future improvements in earthquake-resistant design procedures based on a more rational probability-based treatment of uncertainty are examined.
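
    Fragility assessments of the kind described above are commonly parameterized as lognormal curves that give the probability of reaching or exceeding a damage state as a function of a ground-motion measure. The snippet below is a generic sketch of that form; the median capacity and dispersion are illustrative values, not those used in the study.

```python
from math import erf, log, sqrt

def lognormal_fragility(pga, theta, beta):
    """P(damage state reached | PGA) = Phi(ln(pga/theta) / beta), the
    common lognormal fragility form. theta is the median capacity (g),
    beta the logarithmic standard deviation."""
    z = log(pga / theta) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative (made-up) parameters for a welded steel moment frame:
for pga in (0.1, 0.3, 0.5, 0.8):
    print(pga, round(lognormal_fragility(pga, theta=0.6, beta=0.5), 3))
```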

  18. Short-term LNG-markets

    International Nuclear Information System (INIS)

    Eldegard, Tom; Lund, Arne-Christian; Miltersen, Kristian; Rud, Linda

    2005-01-01

    The global Liquefied Natural Gas (LNG) industry has experienced substantial growth in the past decades. In the traditional trade patterns of LNG, the product has typically been handled within a dedicated chain of plants and vessels fully committed by long-term contracts or common ownership, providing risk sharing of large investments in a non-liquid market. Increasing gas prices and substantial cost reductions in all parts of the LNG chain have made LNG projects viable even if only part of the capacity is secured by long-term contracts, opening the way for more flexible trade of the remainder. Increasing gas demand, especially in power generation, combined with reductions in the cost of LNG terminals, opens new markets for LNG. For the LNG supplier, the flexibility of shifting volumes between regions represents an additional value. International trade in LNG has been increasing, now accounting for more than one fifth of the world's cross-border gas trade. Despite traditional vertical chain bonds, increased flexibility has in fact contributed to a growing LNG spot trade, representing 8% of global trade in 2002. The focus of this paper is on the development of global short-term LNG markets, and their role with respect to efficiency and security of supply in European gas markets. Arbitrage opportunities arising from price differences between regional markets (such as North America versus Europe) are important impetuses for flexible short-term trade. However, the short-term LNG trade may suffer from problems related to market access, e.g. limited access to terminals and regulatory issues, as well as rigidities connected to vertical binding within the LNG chain. Important issues related to the role of short-term LNG trade in the European gas market are: competition, flexibility in meeting peak demand, security of supply, and consequences of differences in pricing policies (oil-linked prices in Europe and spot market prices in North America). (Author)

  19. Subduction zone earthquake probably triggered submarine hydrocarbon seepage offshore Pakistan

    Science.gov (United States)

    Fischer, David; Mogollón, José M.; Strasser, Michael; Pape, Thomas; Bohrmann, Gerhard; Fekete, Noemi; Spiess, Volkhard; Kasten, Sabine

    2014-05-01

    Seepage of methane-dominated hydrocarbons is heterogeneous in space and time, and trigger mechanisms of episodic seep events are not well constrained. It is generally found that free hydrocarbon gas entering the local gas hydrate stability field in marine sediments is sequestered in gas hydrates. In this manner, gas hydrates can act as a buffer for carbon transport from the sediment into the ocean. However, the efficiency of gas hydrate-bearing sediments for retaining hydrocarbons may be corrupted: Hypothesized mechanisms include critical gas/fluid pressures beneath gas hydrate-bearing sediments, implying that these are susceptible to mechanical failure and subsequent gas release. Although gas hydrates often occur in seismically active regions, e.g., subduction zones, the role of earthquakes as potential triggers of hydrocarbon transport through gas hydrate-bearing sediments has hardly been explored. Based on a recent publication (Fischer et al., 2013), we present geochemical and transport/reaction-modelling data suggesting a substantial increase in upward gas flux and hydrocarbon emission into the water column following a major earthquake that occurred near the study sites in 1945. Calculating the formation time of authigenic barite enrichments identified in two sediment cores obtained from an anticlinal structure called "Nascent Ridge", we find they formed 38-91 years before sampling, which corresponds well to the time elapsed since the earthquake (62 years). Furthermore, applying a numerical model, we show that the local sulfate/methane transition zone shifted upward by several meters due to the increased methane flux and simulated sulfate profiles very closely match measured ones in a comparable time frame of 50-70 years. We thus propose a causal relation between the earthquake and the amplified gas flux and present reflection seismic data supporting our hypothesis that co-seismic ground shaking induced mechanical fracturing of gas hydrate-bearing sediments

  20. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
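
    The link between return periods and ground-motion exceedance probabilities in a hazard analysis of this kind is usually expressed through a Poisson occurrence model; a minimal worked example (generic, not tied to the Swiss catalog) follows.

```python
import math

def prob_exceedance(return_period_years, exposure_years):
    """Poisson probability of at least one exceedance of a ground-motion
    level with the given return period during the exposure time."""
    rate = 1.0 / return_period_years
    return 1.0 - math.exp(-rate * exposure_years)

# A 475-year return period corresponds to roughly a 10% chance in 50 years:
print(round(prob_exceedance(475, 50), 3))
```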

  1. Health education and promotion at the site of an emergency: experience from the Chinese Wenchuan earthquake response.

    Science.gov (United States)

    Tian, Xiangyang; Zhao, Genming; Cao, Dequan; Wang, Duoquan; Wang, Liang

    2016-03-01

    Theories and strategies of social mobilization, capacity building, mass and interpersonal communication, as well as risk communication and behavioral change were used to develop health education and promotion campaigns to decrease and prevent injuries and infectious diseases among the survivors of the Wenchuan earthquake in May 2008. We evaluated the effectiveness of the campaigns and short-term interventions using mixed methods. The earthquake survivors' health knowledge, skills, and practice improved significantly with respect to injury protection, food and water safety, environmental and personal hygiene, and disease prevention. No infectious disease outbreaks were reported after the earthquake, and the epidemic level was lower than before the earthquake. After a short-term intervention among the students of Leigu Township Primary and Junior School, the proportion of students practicing good personal hygiene increased from 59.7% to 98.3%. Health education and promotion campaigns after earthquakes play an important role in preventing injuries and infectious diseases among survivors. © The Author(s) 2014.

  2. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth's surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of
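
    For a rough sense of scale, the quoted 33-percent likelihood over three decades can be converted to an equivalent annual rate under a simple Poisson assumption (the Working Group forecast itself is time-dependent, so this is only an approximation):

```python
import math

p30 = 0.33                               # P(M >= 6.7 Hayward Fault earthquake in 30 years)
rate = -math.log(1.0 - p30) / 30.0       # equivalent Poisson annual rate
print(round(rate, 4))                    # about 0.013 events per year
print(round(1.0 / rate))                 # about a 75-year equivalent return period
```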

  3. Short-term incentive schemes for hospital managers

    Directory of Open Access Journals (Sweden)

    Lucas Malambe

    2013-10-01

    Full Text Available Orientation: Short-term incentives, considered to be an extrinsic motivation, are commonly used to motivate performance. This study explored hospital managers' perceptions of short-term incentives in maximising performance and retention. Research purpose: The study explored the experiences, views and perceptions of private hospital managers in South Africa regarding the use of short-term incentives to maximise performance and retention, as well as the applicability of the findings to public hospitals. Motivation for the study: Whilst there is an established link between performance reward schemes and organisational performance, there is little understanding of the effects of short-term incentives on the performance and retention of hospital managers within the South African context. Research design, approach, and method: The study used a qualitative research design: interviews were conducted with a purposive sample of 19 hospital managers, and a thematic content analysis was performed. Main findings: Short-term incentives may not be the primary motivator for hospital managers, but they do play a critical role in sustaining motivation. Participants indicated that these schemes could also be applicable to public hospitals. Practical/managerial implications: Hospital managers are inclined to be more motivated by intrinsic than extrinsic factors. However, hospital managers (as middle managers) also seem to be motivated by short-term incentives. A combination of intrinsic and extrinsic motivators should thus be used to maximise performance and retention. Contribution/value-add: Whilst the study sought to explore hospital managers' perceptions of short-term incentives, it also found that an adequate balance between internal and external motivators is key to implementing an effective short-term incentive scheme.

  4. Short-Term Intercultural Psychotherapy: Ethnographic Inquiry

    Science.gov (United States)

    Seeley, Karen M.

    2004-01-01

    This article examines the challenges specific to short-term intercultural treatments and recently developed approaches to intercultural treatments based on notions of cultural knowledge and cultural competence. The article introduces alternative approaches to short-term intercultural treatments based on ethnographic inquiry adapted for clinical…

  5. Long Aftershock Sequences within Continents and Implications for Earthquake Hazard Assessment

    Science.gov (United States)

    Stein, S. A.; Liu, M.

    2014-12-01

    Recent seismicity in the Tangshan region in North China has prompted concern about a repetition of the 1976 M7.8 earthquake that destroyed the city, killing more than 242,000 people. However, the decay of seismicity there implies that the recent earthquakes are probably aftershocks of the 1976 event. This 37-year sequence is an example of the phenomenon that aftershock sequences within continents are often significantly longer than the typical 10 years at plate boundaries. The long sequence of aftershocks in continents is consistent with a simple friction-based model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Hence the slowly-deforming continents tend to have aftershock sequences significantly longer than at rapidly-loaded plate boundaries. This effect has two consequences for hazard assessment. First, within the heavily populated continents that are typically within plate interiors, assessments of earthquake hazards rely significantly on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. This assumption would lead to overestimation of the hazard in presently active areas and underestimation elsewhere, if some of these small events are aftershocks. Second, successful attempts to remove aftershocks from catalogs used for hazard assessment would underestimate the hazard, because much of the hazard is due to the aftershocks, and the declustering algorithms implicitly assume short aftershock sequences and thus do not remove long-duration ones.

  6. Why do short term workers have high mortality?

    DEFF Research Database (Denmark)

    Kolstad, Henrik; Olsen, Jørn

    1999-01-01

    Increased mortality is often reported among workers in short-term employment. This may indicate either a health-related selection process or the presence of different lifestyle or social conditions among short-term workers. The authors studied these two aspects of short-term employment among 16... ... or violence, the rate ratios for short-term employment were 2.30 (95% CI 1.74-3.06) and 1.86 (95% CI 1.35-2.56), respectively. An unhealthy lifestyle may also be a determinant of short-term employment. While it is possible in principle to adjust for lifestyle factors if proper data are collected, the health...

  7. An accident diagnosis algorithm using long short-term memory

    Directory of Open Access Journals (Sweden)

    Jaemin Yang

    2018-05-01

    Full Text Available Accident diagnosis is one of the complex tasks for nuclear power plant (NPP) operators. In abnormal or emergency situations, the diagnostic activity of the NPP states is burdensome though necessary. Numerous computer-based methods and operator support systems have been suggested to address this problem. Among them, the recurrent neural network (RNN) has performed well at analyzing time series data. This study proposes an algorithm for accident diagnosis using long short-term memory (LSTM), a kind of RNN that mitigates the limitations of standard RNNs in capturing long-range dependences in time. The algorithm consists of preprocessing, the LSTM network, and postprocessing. In the LSTM-based algorithm, preprocessed input variables are processed to produce the accident diagnosis results. The outputs are postprocessed using softmax to determine the ranking of accident diagnosis results with probabilities. This algorithm was trained using a compact nuclear simulator for several accidents: a loss of coolant accident, a steam generator tube rupture, and a main steam line break. The trained algorithm was also tested to demonstrate the feasibility of diagnosing NPP accidents. Keywords: Accident Diagnosis, Long Short-term Memory, Recurrent Neural Network, Softmax
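
    A minimal sketch of the pipeline described above, written here in PyTorch, is shown below: an LSTM reads a sequence of preprocessed plant parameters and a softmax layer ranks candidate accident classes with probabilities. Layer sizes, input counts and the class set are placeholders, not the configuration used in the paper.

```python
import torch
import torch.nn as nn

class AccidentDiagnosisLSTM(nn.Module):
    """Sketch of an LSTM classifier: a sequence of preprocessed plant
    parameters in, a probability distribution over accident classes out."""
    def __init__(self, n_inputs=20, n_hidden=64, n_classes=3):
        super().__init__()
        self.lstm = nn.LSTM(n_inputs, n_hidden, batch_first=True)
        self.head = nn.Linear(n_hidden, n_classes)

    def forward(self, x):                  # x: (batch, time, n_inputs)
        out, _ = self.lstm(x)
        logits = self.head(out[:, -1, :])  # use the final time step
        return torch.softmax(logits, dim=-1)

# Placeholder classes following the abstract: LOCA, SGTR, MSLB.
model = AccidentDiagnosisLSTM()
batch = torch.randn(8, 50, 20)             # 8 sequences, 50 time steps, 20 variables
probs = model(batch)
print(probs.shape, probs[0].argmax().item())
```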

  8. Controls on the long term earthquake behavior of an intraplate fault revealed by U-Th and stable isotope analyses of syntectonic calcite veins

    Science.gov (United States)

    Williams, Randolph; Goodwin, Laurel; Sharp, Warren; Mozley, Peter

    2017-04-01

    U-Th dates on calcite precipitated in coseismic extension fractures in the Loma Blanca normal fault zone, Rio Grande rift, NM, USA, constrain earthquake recurrence intervals from 150-565 ka. This is the longest direct record of seismicity documented for a fault in any tectonic environment. Combined U-Th and stable isotope analyses of these calcite veins define 13 distinct earthquake events. These data show that for more than 400 ka the Loma Blanca fault produced earthquakes with a mean recurrence interval of 40 ± 7 ka. The coefficient of variation for these events is 0.40, indicating strongly periodic seismicity consistent with a time-dependent model of earthquake recurrence. Stochastic statistical analyses further validate the inference that earthquake behavior on the Loma Blanca was time-dependent. The time-dependent nature of these earthquakes suggests that the seismic cycle was fundamentally controlled by a stress renewal process. However, this periodic cycle was punctuated by an episode of clustered seismicity at 430 ka. Recurrence intervals within the earthquake cluster were as low as 5-11 ka. Breccia veins formed during this episode exhibit carbon isotope signatures consistent with having formed through pronounced degassing of a CO2 charged brine during post-failure, fault-localized fluid migration. The 40 ka periodicity of the long-term earthquake record of the Loma Blanca fault is similar in magnitude to recurrence intervals documented through paleoseismic studies of other normal faults in the Rio Grande rift and Basin and Range Province. We propose that it represents a background rate of failure in intraplate extension. The short-term, clustered seismicity that occurred on the fault records an interruption of the stress renewal process, likely by elevated fluid pressure in deeper structural levels of the fault, consistent with fault-valve behavior. The relationship between recurrence interval and inferred fluid degassing suggests that pore fluid pressure
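
    The recurrence statistics quoted above (mean interval and coefficient of variation) are straightforward to compute from a list of dated events; the event ages below are invented for illustration and are not the published Loma Blanca dates.

```python
import numpy as np

# Hypothetical event ages in ka, oldest to youngest (not the real U-Th dates).
event_ages_ka = np.array([565., 540., 480., 450., 400., 370., 330., 275., 245., 200., 150.])
intervals = -np.diff(event_ages_ka)         # recurrence intervals, ka
mean_ri = intervals.mean()
cov = intervals.std(ddof=1) / mean_ri       # coefficient of variation
print(round(mean_ri, 1), round(cov, 2))     # a CoV well below 1 indicates quasi-periodic recurrence
```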

  9. A Short-Term Outage Model of Wind Turbines with Doubly Fed Induction Generators Based on Supervisory Control and Data Acquisition Data

    Directory of Open Access Journals (Sweden)

    Peng Sun

    2016-10-01

    Full Text Available This paper presents a short-term wind turbine (WT) outage model based on the data collected from a wind farm supervisory control and data acquisition (SCADA) system. Neural networks (NNs) are used to establish prediction models of the WT condition parameters that are dependent on environmental conditions such as ambient temperature and wind speed. The prediction error distributions are discussed and used to calculate probabilities of the operation of protection relays (POPRs) caused by threshold exceedance of the environmentally sensitive parameters. The POPRs for other condition parameters are based on the setting time of the operation of protection relays. The union probability method is used to integrate the probabilities of operation of each protection relay to predict the WT short-term outage probability. The proposed method has been applied to real 1.5 MW WTs with doubly fed induction generators (DFIGs). The results show that the proposed method is more effective in WT outage probability prediction than traditional methods.
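
    The union-probability step described above combines the per-relay operation probabilities, typically under an independence assumption; a minimal sketch with invented numbers:

```python
def union_probability(p_relays):
    """Probability that at least one protection relay operates (turbine
    outage), assuming independent relays: P = 1 - prod(1 - p_i)."""
    q = 1.0
    for p in p_relays:
        q *= (1.0 - p)
    return 1.0 - q

# Illustrative per-relay operation probabilities for the look-ahead period:
print(round(union_probability([0.02, 0.01, 0.005]), 4))   # ~0.035
```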

  10. Short- and long-term cognitive effects of chronic cannabinoids administration in late-adolescence rats.

    Directory of Open Access Journals (Sweden)

    Hila Abush

    Full Text Available The use of cannabis can impair cognitive function, especially short-term memory. A controversial question is whether long-term cannabis use during the late-adolescence period can cause irreversible deficits in higher brain function that persist after drug use stops. In order to examine the short- and long-term effects of chronic exposure to cannabinoids, rats were administered chronic i.p. treatment with the CB1/CB2 receptor agonist WIN55,212-2 (WIN; 1.2 mg/kg) for two weeks during the late adolescence period (post-natal days 45-60) and tested for behavioral and electrophysiological measures of cognitive performance 24 hrs, 10 and 30 days after the last drug injection. The impairing effects of chronic WIN on short-term memory in the water maze and the object recognition tasks, as well as on long-term potentiation (LTP) in the ventral subiculum (vSub)-nucleus accumbens (NAc) pathway, were temporary, as they lasted only 24 h or 10 d after withdrawal. However, chronic WIN significantly impaired hippocampal-dependent short-term memory measured in the object location task 24 hrs, 10, 30, and 75 days after the last drug injection. Our findings suggest that some forms of hippocampal-dependent short-term memory are sensitive to chronic cannabinoid administration but other cognitive impairments are temporary and probably result from a residue of cannabinoids in the brain or acute withdrawal effects from cannabinoids. Understanding the effects of cannabinoids on cognitive function may provide us with tools to overcome these impairments and for cannabinoids to be more favorably considered for clinical use.

  11. Short term memory may be the depletion of the readily releasable pool of presynaptic neurotransmitter vesicles of a metastable long term memory trace pattern.

    Science.gov (United States)

    Tarnow, Eugen

    2009-09-01

    The Tagging/Retagging model of short term memory was introduced earlier (Tarnow in Cogn Neurodyn 2(4):347-353, 2008) to explain the linear relationship between response time and correct response probability for word recall and recognition: At the initial stimulus presentation the words displayed tag the corresponding long term memory locations. The tagging process is linear in time and takes about one second to reach a tagging level of 100%. After stimulus presentation the tagging level decays logarithmically with time to 50% after 14 s and to 20% after 220 s. If a probe word is reintroduced the tagging level has to return to 100% for the word to be properly identified, which leads to a delay in response time. This delay is proportional to the tagging loss. The tagging level is directly related to the probability of correct word recall and recognition. Evidence presented suggests that the tagging level is the level of depletion of the Readily Releasable Pool (RRP) of neurotransmitter vesicles at presynaptic terminals. The evidence includes the initial linear relationship between tagging level and time as well as the subsequent logarithmic decay of the tagging level. The activation of a short term memory may thus be the depletion of RRP (exocytosis) and short term memory decay may be the ensuing recycling of the neurotransmitter vesicles (endocytosis). The pattern of depleted presynaptic terminals corresponds to the long term memory trace.

  12. Thermal Radiation Anomalies Associated with Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulinets, Sergey; Kafatos, Menas C.; Taylor, Patrick

    2017-01-01

    Recent developments of remote sensing methods for Earth satellite data analysis contribute to our understanding of earthquake-related thermal anomalies. It was realized that the thermal heat fluxes over areas of earthquake preparation are a result of air ionization by radon (and other gases) and consequent water vapor condensation on newly formed ions. Latent heat (LH) is released as a result of this process and leads to the formation of local thermal radiation anomalies (TRA) known as OLR (outgoing longwave radiation; Ouzounov et al., 2007). We compare the LH energy, obtained by integrating surface latent heat flux (SLHF) over the area and time, with the released energies associated with these events. Extended studies of the TRA using the data from the most recent major earthquakes allowed the main morphological features to be established. It was also established that the TRA are part of a more complex chain of short-term pre-earthquake signals, which is explained within the framework of lithosphere-atmosphere coupling processes.

  13. Reliability analysis of service water system under earthquake

    International Nuclear Information System (INIS)

    Yu Yu; Qian Xiaoming; Lu Xuefeng; Wang Shengfei; Niu Fenglei

    2013-01-01

    The service water system is one of the important safety systems in a nuclear power plant, and its failure probability is usually obtained by system reliability analysis. The probability of equipment failure under an earthquake is a function of the peak acceleration of the earthquake motion, while the occurrence of earthquakes is random; thus the traditional fault tree method used in current probabilistic safety assessment is not powerful enough to deal with such a conditional probability problem. An analysis framework for system reliability evaluation under seismic conditions is put forward in this paper, in which Monte Carlo simulation is used to deal with the conditional probability problem. The annual failure probability of the service water system was calculated, and a failure probability of 1.46x10^-4 per year was obtained. The analysis result is in accordance with data on equipment seismic resistance capability, and the rationality of the model is validated. (authors)
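
    The Monte Carlo treatment of the conditional probability problem can be sketched as follows: sample the number of earthquakes in a year, sample a peak ground acceleration for each, and compare it with a sampled component capacity. All rates and fragility parameters below are illustrative, not those of the referenced analysis.

```python
import numpy as np

rng = np.random.default_rng(42)

def annual_failure_probability(n_trials=100_000,
                               eq_rate=0.1,        # earthquakes per year affecting the site
                               pga_median=0.15,    # median site PGA given an event, g
                               pga_beta=0.6,
                               cap_median=0.5,     # median component capacity, g
                               cap_beta=0.4):
    """Monte Carlo estimate of the annual probability that an earthquake
    occurs AND a single illustrative component fails."""
    failures = 0
    for _ in range(n_trials):
        for _ in range(rng.poisson(eq_rate)):       # earthquakes this simulated year
            pga = rng.lognormal(np.log(pga_median), pga_beta)
            capacity = rng.lognormal(np.log(cap_median), cap_beta)
            if pga > capacity:
                failures += 1
                break
    return failures / n_trials

print(annual_failure_probability())
```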

  14. Verbal Short-Term Memory Span in Speech-Disordered Children: Implications for Articulatory Coding in Short-Term Memory.

    Science.gov (United States)

    Raine, Adrian; And Others

    1991-01-01

    Children with speech disorders had lower short-term memory capacity and smaller word length effect than control children. Children with speech disorders also had reduced speech-motor activity during rehearsal. Results suggest that speech rate may be a causal determinant of verbal short-term memory capacity. (BC)

  15. Real-time energy resources scheduling considering short-term and very short-term wind forecast

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Marco; Sousa, Tiago; Morais, Hugo; Vale, Zita [Polytechnic of Porto (Portugal). GECAD - Knowledge Engineering and Decision Support Research Center

    2012-07-01

    This paper proposes an energy resources management methodology based on three distinct time horizons: day-ahead scheduling, hour-ahead scheduling, and real-time scheduling. Each scheduling process uses updated generation and consumption operating conditions and the current status of storage and electric vehicle storage. Besides the new operating conditions, the most accurate forecast values of wind generation and consumption, obtained from short-term and very short-term forecasting methods, are used. A case study considering a distribution network with intensive use of distributed generation and electric vehicles is presented. (orig.)

  16. Swedish National Seismic Network (SNSN). A short report on recorded earthquakes during the fourth quarter of the year 2010

    Energy Technology Data Exchange (ETDEWEB)

    Boedvarsson, Reynir (Uppsala Univ. (Sweden), Dept. of Earth Sciences)

    2011-01-15

    According to an agreement between the Swedish Nuclear Fuel and Waste Management Company (SKB) and Uppsala University, the Department of Earth Sciences has continued to carry out observations of seismic events at seismic stations within the Swedish National Seismic Network (SNSN). This short report gives brief information about the recorded seismicity during October through December 2010. The Swedish National Seismic Network consists of 62 stations. During October through December, 2,241 events were located, of which 158 are estimated to be real earthquakes, 1,457 are estimated to be explosions, 444 are induced earthquakes in the vicinity of the mines in Kiruna and Malmberget, and 182 events are still considered uncertain but are most likely explosions and are mainly located outside the network. One earthquake had a magnitude above ML = 2.0 during the period. In November one earthquake was located 13 km SW of Haernoesand with a magnitude of ML = 2.1. The largest earthquake in October had a magnitude of ML = 1.7 and was located 12 km NE of Eksjoe, and in December an earthquake with a magnitude of ML = 1.8 was located 19 km north of Motala.

  17. Long-term impact of earthquake stress on fasting glucose control and diabetes prevalence among Chinese adults of Tangshan.

    Science.gov (United States)

    An, Cuixia; Zhang, Yun; Yu, Lulu; Li, Na; Song, Mei; Wang, Lan; Zhao, Xiaochuan; Gao, Yuanyuan; Wang, Xueyi

    2014-01-01

    To investigate the long-term influence of stress from the 1976 Tangshan earthquake on blood glucose control and the incidence of diabetes mellitus in Chinese people of Tangshan, 1,551 adults ≥ 37 years of age were recruited in Tangshan city, China, where one of the deadliest earthquakes occurred in 1976. All subjects completed a questionnaire. The 1,030 who had experienced that earthquake were assigned to the exposure group, while 521 who had not been exposed to any earthquake formed the control group. Subjects who were first identified with diabetes, or who had normal FBG but a history of diabetes, were included in the calculation of diabetes prevalence. Statistical analysis was applied to the baseline data and to the incidences of IFG and diabetes in all groups. The comparisons indicate no significant difference in average fasting glucose levels between the control group and the exposure group. However, the prevalence of IFG and diabetes in the exposure group differs significantly from that in the control group, and the prevalence of diabetes in the exposure group is significantly higher. Women are more likely than men to develop diabetes after experiencing earthquake stress. The earthquake stress was linked to higher diabetes incidence as an independent factor and has long-term impacts on diabetes incidence as an independent risk factor. Emergency and long-term management of IFG and diabetes in populations exposed to earthquake stress deserves attention.

  18. The Mind and Brain of Short-Term Memory

    OpenAIRE

    Jonides, John; Lewis, Richard L.; Nee, Derek Evan; Lustig, Cindy A.; Berman, Marc G.; Moore, Katherine Sledge

    2008-01-01

    The past 10 years have brought near-revolutionary changes in psychological theories about short-term memory, with similarly great advances in the neurosciences. Here, we critically examine the major psychological theories (the “mind”) of short-term memory and how they relate to evidence about underlying brain mechanisms. We focus on three features that must be addressed by any satisfactory theory of short-term memory. First, we examine the evidence for the architecture of short-term memory, w...

  19. Very-long-term and short-term chromatic adaptation: are their influences cumulative?

    Science.gov (United States)

    Belmore, Suzanne C; Shevell, Steven K

    2011-02-09

    Very-long-term (VLT) chromatic adaptation results from exposure to an altered chromatic environment for days or weeks. Color shifts from VLT adaptation are observed hours or days after leaving the altered environment. Short-term chromatic adaptation, on the other hand, results from exposure for a few minutes or less, with color shifts measured within seconds or a few minutes after the adapting light is extinguished; recovery to the pre-adapted state is complete in less than an hour. Here, both types of adaptation were combined. All adaptation was to reddish-appearing long-wavelength light. Shifts in unique yellow were measured following adaptation. Previous studies demonstrate shifts in unique yellow due to VLT chromatic adaptation, but shifts from short-term chromatic adaptation to comparable adapting light can be far greater than from VLT adaptation. The question considered here is whether the color shifts from VLT adaptation are cumulative with large shifts from short-term adaptation or, alternatively, does simultaneous short-term adaptation eliminate color shifts caused by VLT adaptation. The results show the color shifts from VLT and short-term adaptation together are cumulative, which indicates that both short-term and very-long-term chromatic adaptation affect color perception during natural viewing. Copyright © 2010 Elsevier Ltd. All rights reserved.

  20. [Medium- and long-term health effects of the L'Aquila earthquake (Central Italy, 2009) and of other earthquakes in high-income Countries: a systematic review].

    Science.gov (United States)

    Ripoll Gallardo, Alba; Alesina, Marta; Pacelli, Barbara; Serrone, Dario; Iacutone, Giovanni; Faggiano, Fabrizio; Della Corte, Francesco; Allara, Elias

    2016-01-01

    The aim was to compare the methodological characteristics of the studies investigating the medium- and long-term health effects of the L'Aquila earthquake with the features of studies conducted after other earthquakes that occurred in high-income countries. A systematic comparison was carried out between the studies which evaluated the health effects of the L'Aquila earthquake (Central Italy, 6th April 2009) and those conducted after other earthquakes that occurred in comparable settings. Medline, Scopus, and 6 sources of grey literature were systematically searched. Inclusion criteria comprised measurement of health outcomes at least one month after the earthquake, investigation of earthquakes that occurred in high-income countries, and presence of at least one temporal or geographical control group. Out of 2,976 titles, 13 studies regarding the L'Aquila earthquake and 51 studies concerning other earthquakes were included. The L'Aquila and the Kobe/Hanshin-Awaji (Japan, 17th January 1995) earthquakes were the most investigated. Studies on the L'Aquila earthquake had a median sample size of 1,240 subjects, a median duration of 24 months, and most frequently used a cross-sectional design (7/13). Studies on other earthquakes had a median sample size of 320 subjects, a median duration of 15 months, and most frequently used a time series design (19/51). The L'Aquila studies often focused on mental health, while the earthquake effects on mortality, cardiovascular outcomes, and health systems were less frequently evaluated. A more intensive use of routine data could benefit future epidemiological surveillance in the aftermath of earthquakes.

  1. The role of short-term memory impairment in nonword repetition, real word repetition, and nonword decoding: A case study.

    Science.gov (United States)

    Peter, Beate

    2018-01-01

    In a companion study, adults with dyslexia and adults with a probable history of childhood apraxia of speech showed evidence of difficulty with processing sequential information during nonword repetition, multisyllabic real word repetition and nonword decoding. Results suggested that some errors arose in visual encoding during nonword reading, all levels of processing but especially short-term memory storage/retrieval during nonword repetition, and motor planning and programming during complex real word repetition. To further investigate the role of short-term memory, a participant with short-term memory impairment (MI) was recruited. MI was confirmed with poor performance during a sentence repetition and three nonword repetition tasks, all of which have a high short-term memory load, whereas typical performance was observed during tests of reading, spelling, and static verbal knowledge, all with low short-term memory loads. Experimental results show error-free performance during multisyllabic real word repetition but high counts of sequence errors, especially migrations and assimilations, during nonword repetition, supporting short-term memory as a locus of sequential processing deficit during nonword repetition. Results are also consistent with the hypothesis that during complex real word repetition, short-term memory is bypassed as the word is recognized and retrieved from long-term memory prior to producing the word.

  2. Long-term effects of earthquake experience of young persons on cardiovascular disease risk factors

    Science.gov (United States)

    Li, Na; Wang, Yumei; Yu, Lulu; Song, Mei; Wang, Lan; Ji, Chunpeng

    2016-01-01

    Introduction: The aim of the study was to examine the long-term effect on cardiovascular disease risk factors of stress from direct experience of an earthquake as a young person. Material and methods: We selected workers born between July 1, 1958 and July 1, 1976 who were examined at Kailuan General Hospital between May and October of 2013. Data on cardiovascular events were taken from the workers' annual health examination conducted between 2006 and 2007. All subjects were divided into three groups according to their experience of the Tangshan earthquake of July 28, 1976, as follows: control group, exposed group 1, and exposed group 2. We compared cardiovascular disease risk factors between the three groups as well as by gender and age. Results: One thousand one hundred and ninety-six workers were included in the final statistical analysis. Among all subjects, resting heart rate (p = 0.003), total cholesterol, and fasting plasma glucose differed significantly in workers exposed to the earthquake compared with unexposed controls, but were unrelated to loss of relatives. No significant difference in triglyceride levels was observed between the three groups (p = 0.900). Further refinement showed that the effects were restricted to males 40 years of age or older at the time of analysis, but were due primarily to age at the time of earthquake exposure (p = 0.002). Conclusions: Earthquake experience in the early years of life has long-term effects on adult resting heart rate, total cholesterol, and fasting plasma glucose, especially among men. PMID:28144258

  3. Earthquake Activities Along the Strike-Slip Fault System on the Thailand-Myanmar Border

    Directory of Open Access Journals (Sweden)

    Santi Pailoplee

    2014-01-01

    Full Text Available This study investigates the present-day seismicity along the strike-slip fault system on the Thailand-Myanmar border. Using the earthquake catalogue, the earthquake parameters representing seismic activity were evaluated in terms of the possible maximum magnitude, return period and earthquake occurrence probabilities. Three different hazardous areas could be distinguished from the obtained results. The most seismic-prone area is located along the northern segment of the fault system and can generate earthquakes of magnitude 5.0, 5.8, and 6.8 mb in the next 5, 10, and 50 years, respectively. The second most-prone area is the southern segment, where earthquakes of magnitude 5.0, 6.0, and 7.0 mb might be generated every 18, 60, and 300 years, respectively. For the central segment, there is less than 30% and 10% probability that 6.0- and 7.0-mb earthquakes, respectively, will be generated in the next 50 years. With regard to the significant infrastructure (dams) in the vicinity, the operational Wachiralongkorn dam is situated in a low seismic hazard area with a return period of around 30 - 3000 years for a 5.0 - 7.0 mb earthquake. In contrast, the Hut Gyi, Srinakarin and Tha Thung Na dams are seismically at risk of earthquakes of mb 6.4 - 6.5 being generated in the next 50 years. Plans for seismic retrofit should therefore be completed and implemented, while seismic monitoring in this region is indispensable.
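
    Return periods and occurrence probabilities of the kind quoted above are commonly derived from Gutenberg-Richter recurrence parameters under a Poisson assumption; a generic sketch follows (the a- and b-values are placeholders, not the values estimated for these fault segments).

```python
import math

def gr_annual_rate(m, a, b):
    """Annual rate of earthquakes with magnitude >= m from the
    Gutenberg-Richter relation log10 N(m) = a - b*m."""
    return 10.0 ** (a - b * m)

def occurrence_probability(m, a, b, years):
    """Poisson probability of at least one M >= m event within 'years'."""
    return 1.0 - math.exp(-gr_annual_rate(m, a, b) * years)

a, b = 4.0, 1.0                      # placeholder recurrence parameters
for m in (5.0, 6.0, 7.0):
    print(m,
          round(1.0 / gr_annual_rate(m, a, b), 1),            # return period, years
          round(occurrence_probability(m, a, b, 50), 2))       # probability in 50 years
```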

  4. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), the statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) changes appreciably after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction.
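
    The test described above can be sketched as follows: compute the distribution of waiting times between successive events above a magnitude threshold, randomly re-pair magnitudes with occurrence times, and compare the two distributions. The catalog below is synthetic and its magnitudes are independent of the occurrence times, so (unlike the Southern California result reported above) the original and shuffled statistics should agree.

```python
import numpy as np

def first_return_times(times, mags, m_min):
    """Waiting times between successive events with magnitude >= m_min."""
    t = np.sort(times[mags >= m_min])
    return np.diff(t)

rng = np.random.default_rng(1)
n = 5000
times = np.cumsum(rng.pareto(1.5, n))        # bursty, clustered inter-event gaps
mags = 4.0 + rng.exponential(0.5, n)         # Gutenberg-Richter-like magnitudes

orig = first_return_times(times, mags, 5.0)
rand = first_return_times(times, rng.permutation(mags), 5.0)   # rearranged catalog

# Compare a simple summary statistic of the two return-time distributions.
print(orig.std() / orig.mean(), rand.std() / rand.mean())
```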

  5. Constant strain accumulation rate between major earthquakes on the North Anatolian Fault.

    Science.gov (United States)

    Hussain, Ekbal; Wright, Tim J; Walters, Richard J; Bekaert, David P S; Lloyd, Ryan; Hooper, Andrew

    2018-04-11

    Earthquakes are caused by the release of tectonic strain accumulated between events. Recent advances in satellite geodesy mean we can now measure this interseismic strain accumulation with a high degree of accuracy. But it remains unclear how to interpret short-term geodetic observations, measured over decades, when estimating the seismic hazard of faults accumulating strain over centuries. Here, we show that strain accumulation rates calculated from geodetic measurements around a major transform fault are constant for its entire 250-year interseismic period, except in the ~10 years following an earthquake. The shear strain rate history requires a weak fault zone embedded within a strong lower crust with viscosity greater than ~10^20 Pa s. The results support the notion that short-term geodetic observations can directly contribute to long-term seismic hazard assessment and suggest that lower-crustal viscosities derived from postseismic studies are not representative of the lower crust at all spatial and temporal scales.
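
    As a back-of-the-envelope check on what a lower-crustal viscosity of ~10^20 Pa s implies, the Maxwell relaxation time can be computed assuming a typical crustal shear modulus of about 30 GPa (the modulus is an assumption, not a value given in the record):

```python
eta = 1e20                                # assumed lower-crustal viscosity, Pa s
mu = 3e10                                 # assumed shear modulus, Pa (about 30 GPa)
tau_seconds = eta / mu                    # Maxwell relaxation time
tau_years = tau_seconds / (365.25 * 24 * 3600)
print(round(tau_years))                   # on the order of a century for these values
```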

  6. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3,944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. It is concluded that researchers need to pay more attention to children and adolescents. The government should pay more attention to these people and provide more economic support.

  7. What Googling Trends Tell Us About Public Interest in Earthquakes

    Science.gov (United States)

    Tan, Y. J.; Maharjan, R.

    2017-12-01

    Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.

  8. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    Full Text Available The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the concerned anomaly or precursor, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of the predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows ranging from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. Extending the record of earthquakes of the past back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provides innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
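
    Likelihood-based comparison of forecasts, including the probability gain per earthquake, can be illustrated with a short calculation over a handful of space-time bins; the forecast rates and observed counts below are invented for illustration.

```python
import math

def poisson_log_likelihood(rates, counts):
    """Log-likelihood of observed earthquake counts in space-time bins
    under a forecast of Poisson rates for those bins."""
    return sum(c * math.log(r) - r - math.lgamma(c + 1)
               for r, c in zip(rates, counts))

forecast_a = [0.1, 0.2, 0.05, 0.4]    # e.g., a precursor-based forecast
forecast_b = [0.2, 0.2, 0.2, 0.2]     # reference (uniform) forecast
observed   = [0,   1,   0,    1]      # earthquakes actually observed per bin

ll_a = poisson_log_likelihood(forecast_a, observed)
ll_b = poisson_log_likelihood(forecast_b, observed)
gain_per_event = math.exp((ll_a - ll_b) / sum(observed))   # probability gain
print(round(ll_a, 3), round(ll_b, 3), round(gain_per_event, 2))
```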

  9. Future of Earthquake Early Warning: Quantifying Uncertainty and Making Fast Automated Decisions for Applications

    Science.gov (United States)

    Wu, Stephen

    Earthquake early warning (EEW) systems have been rapidly developing over the past decade. The Japan Meteorological Agency (JMA) has an EEW system that was operating during the 2011 M9 Tohoku earthquake in Japan, and this increased the awareness of EEW systems around the world. While longer-term earthquake prediction still faces many challenges before it becomes practical, the availability of shorter-term EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system utilizes the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time and the expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention to activate mitigation actions, and this must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach, along with machine learning techniques and decision theory from economics, to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach. Often, they assume that only a single event occurs within a short period of time, which led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm based on an existing deterministic model to extend the EEW system to the case of concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenge of uncertain information and short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications. A cost-benefit model that

  10. Short term memory in echo state networks

    OpenAIRE

    Jaeger, H.

    2001-01-01

    The report investigates the short-term memory capacity of echo state recurrent neural networks. A quantitative measure MC of short-term memory capacity is introduced. The main result is that MC ≤ N for networks with linear output units and i.i.d. input, where N is the network size. Conditions under which these maximal memory capacities are realized are described. Several theoretical and practical examples demonstrate how the short-term memory capacities of echo state networks can be exploited for...
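    A minimal sketch of how a memory-capacity measure of this kind can be computed, assuming a small randomly generated reservoir driven by i.i.d. input, with one linear readout per delay trained to reconstruct the delayed input; the reservoir size, scaling and tanh units are illustrative choices, not the report's exact setup.

```python
# Sketch of a short-term memory capacity estimate for an echo state network:
# MC is the sum over delays k of the squared correlation between the delayed
# input u(t-k) and a linear readout trained to reconstruct it.
import numpy as np

rng = np.random.default_rng(1)
N, T, max_delay = 50, 4000, 100
W = rng.normal(0, 1, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # keep spectral radius below 1
w_in = rng.uniform(-0.5, 0.5, N)

u = rng.uniform(-0.5, 0.5, T)                  # i.i.d. input signal
x = np.zeros((T, N))
for t in range(1, T):
    x[t] = np.tanh(W @ x[t - 1] + w_in * u[t])

washout = 200
MC = 0.0
for k in range(1, max_delay + 1):
    X = x[washout:]
    target = u[washout - k:T - k]              # delayed input u(t-k)
    w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
    MC += np.corrcoef(X @ w_out, target)[0, 1] ** 2
print(f"estimated memory capacity MC = {MC:.1f} (upper bound is N = {N})")
```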

  11. Short presentation on some researches activities about near field earthquakes

    International Nuclear Information System (INIS)

    Donald, John

    2002-01-01

    The major hazard posed by earthquakes is often thought to be due to moderate to large magnitude events. However, there have been many cases where earthquakes of moderate and even small magnitude have caused very significant destruction when they have coincided with population centres. Even though the area of intense ground shaking caused by such events is generally small, the epicentral motions can be severe enough to cause damage even in well-engineered structures. Two issues are addressed here, the first being the identification of the minimum earthquake magnitude likely to cause damage to engineered structures and the limits of the near-field for small-to-moderate magnitude earthquakes. The second issue addressed is whether features of near-field ground motions such as directivity, which can significantly enhance the destructive potential, occur in small-to-moderate magnitude events. The accelerograms from the 1986 San Salvador (El Salvador) earthquake indicate that it may be non-conservative to assume that near-field directivity effects only need to be considered for earthquakes of moment magnitude 6.5 and greater. (author)

  12. Short term and medium term power distribution load forecasting by neural networks

    International Nuclear Information System (INIS)

    Yalcinoz, T.; Eminoglu, U.

    2005-01-01

    Load forecasting is an important subject for power distribution systems and has been studied from different points of view. In general, load forecasts should be performed over a broad spectrum of time intervals, which can be classified into short term, medium term and long term forecasts. Several research groups have proposed various techniques for short term, medium term or long term load forecasting. This paper presents a neural network (NN) model for short term peak load forecasting, short term total load forecasting and medium term monthly load forecasting in power distribution systems. The NN is used to learn the relationships among past, current and future temperatures and loads. The neural network was trained to recognize the peak load of the day, the total load of the day and the monthly electricity consumption. The suitability of the proposed approach is illustrated through an application to real load shapes from the Turkish Electricity Distribution Corporation (TEDAS) in Nigde. The data represent the daily and monthly electricity consumption in Nigde, Turkey.
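    A minimal sketch of a neural-network load forecaster in the spirit of the model described above, using synthetic temperature and load series; the features, network size and data are illustrative assumptions, not the TEDAS configuration.

```python
# Sketch: a small neural network mapping past load and temperature to the next
# day's peak load. Synthetic data; not the paper's architecture or data set.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
days = 365
temp = 15 + 10 * np.sin(2 * np.pi * np.arange(days) / 365) + rng.normal(0, 2, days)
load = 50 + 0.8 * temp + rng.normal(0, 1.5, days)        # synthetic daily peak load (MW)

# Features: yesterday's load, yesterday's and today's temperature.
X = np.column_stack([load[:-1], temp[:-1], temp[1:]])
y = load[1:]

model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000, random_state=0)
model.fit(X[:300], y[:300])
print("test MAE (MW):", np.mean(np.abs(model.predict(X[300:]) - y[300:])))
```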

  13. DOE (Department of Energy) natural phenomena guidelines earthquake design and evaluation

    International Nuclear Information System (INIS)

    Short, S.A.; Murray, R.C.; Kennedy, R.P.

    1989-01-01

    Design and evaluation guidelines for DOE (Department of Energy) facilities subjected to earthquake, wind/tornado, and flood have been developed. This paper describes the philosophy and procedures for the design or evaluation of facilities for earthquake ground shaking. The guidelines are intended to meet probabilistic-based performance goals expressed in terms of the annual probability of exceedance of some level of structural damage. Meeting performance goals can be accomplished by specifying hazard probabilities of exceedance along with seismic behavior evaluation procedures in which the level of conservatism introduced is controlled such that the desired performance can be achieved. Limited inelastic behavior is accommodated by allowing demand determined from elastic response spectrum analyses to exceed capacity by an allowable inelastic demand-capacity ratio specified in the guidelines for different materials and construction.
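    A small worked example of the acceptance check implied above, in which elastic demand may exceed capacity by an allowable inelastic demand-capacity ratio; the numerical values and the function name are hypothetical, not taken from the DOE guidelines.

```python
# Sketch of the demand-capacity acceptance check: elastic demand from a
# response-spectrum analysis may exceed capacity by an allowable ratio F.
# All numbers below are illustrative only.
def seismic_check(elastic_demand, capacity, allowable_ratio):
    """Return True if demand/capacity does not exceed the allowable inelastic ratio."""
    return elastic_demand / capacity <= allowable_ratio

# e.g. a member with 420 kN elastic demand, 300 kN capacity, and a
# hypothetical allowable inelastic demand-capacity ratio of 1.5
print(seismic_check(elastic_demand=420.0, capacity=300.0, allowable_ratio=1.5))  # True: 1.4 <= 1.5
```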

  14. Earthquake chemical precursors in groundwater: a review

    Science.gov (United States)

    Paudel, Shukra Raj; Banjara, Sushant Prasad; Wagle, Amrita; Freund, Friedemann T.

    2018-03-01

    We review changes in groundwater chemistry as precursory signs for earthquakes. In particular, we discuss pH, total dissolved solids (TDS), electrical conductivity, and dissolved gases in relation to their significance for earthquake prediction or forecasting. These parameters are widely believed to vary in response to seismic and pre-seismic activity. However, the same parameters also vary in response to non-seismic processes. The inability to reliably distinguish changes caused by seismic or pre-seismic activity from changes caused by non-seismic processes has impeded progress in earthquake science. Short-term earthquake prediction is unlikely to be achieved by pH, TDS, electrical conductivity, and dissolved gas measurements alone. On the other hand, the production of free hydroxyl radicals (•OH) and subsequent reactions, such as the formation of H2O2 and the oxidation of As(III) to As(V) in groundwater, have distinctive precursory characteristics. This study deviates from the prevailing mechanical mantra: it addresses earthquake-related non-seismic mechanisms, focusing on the stress-induced electrification of rocks, the generation of positive hole charge carriers and their long-distance propagation through the rock column, and on electrochemical processes at the rock-water interface.

  15. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started about 15 years ago has already pushed global information storage capacity beyond 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regrettably, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), the claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the error diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null hypothesis as a metric of the alerted space, are evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of the SHA method and determining the optimal choice of parameters in regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable though not necessarily perfect, with related recommendations on the level of risk for decision making in regard to engineering design, insurance

  16. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Remotely Triggered Earthquakes Recorded by EarthScope's Transportable Array and Regional Seismic Networks: A Case Study Of Four Large Earthquakes

    Science.gov (United States)

    Velasco, A. A.; Cerda, I.; Linville, L.; Kilb, D. L.; Pankow, K. L.

    2013-05-01

    Changes in the field stress required to trigger earthquakes have been classified in two basic ways: static and dynamic triggering. Static triggering occurs when an earthquake that releases accumulated strain along a fault stress-loads a nearby fault. Dynamic triggering occurs when an earthquake is induced by the passing of seismic waves from a large mainshock located at least two or more fault lengths away. We investigate details of dynamic triggering using data collected from EarthScope's USArray and regional seismic networks located in the United States. Triggered events are identified using an optimized automated detector based on the ratio of the short-term to the long-term average (STA/LTA; Antelope software). Following the automated processing, the flagged waveforms are individually analyzed, in both the time and frequency domains, to determine if the increased detection rates correspond to local earthquakes (i.e., potentially remotely triggered aftershocks). Here, we show results using this automated scheme applied to data from four large, but characteristically different, earthquakes -- Chile (Mw 8.8, 2010), Tohoku-Oki (Mw 9.0, 2011), Baja California (Mw 7.2, 2010) and Wells, Nevada (Mw 6.0, 2008). For each of our four mainshocks, the number of detections within the 10-hour time windows spans a large range (1 to over 200), and statistically >20% of the waveforms show evidence of anomalous signals following the mainshock. The results will help provide a better understanding of the physical mechanisms involved in dynamic earthquake triggering and will help identify zones in the continental U.S. that may be more susceptible to dynamic earthquake triggering.
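    A minimal sketch of the short-term-average/long-term-average (STA/LTA) detection idea mentioned above, written directly in NumPy rather than with the Antelope software; the window lengths, threshold and synthetic trace are illustrative assumptions.

```python
# Sketch of an STA/LTA detector: ratio of a short moving average to a long
# moving average of the squared signal, with a simple trigger threshold.
import numpy as np

def sta_lta(trace, fs, sta_win=1.0, lta_win=30.0):
    """Return the STA/LTA ratio of the squared signal (characteristic function)."""
    cf = trace ** 2
    nsta, nlta = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(cf, np.ones(nsta) / nsta, mode="same")
    lta = np.convolve(cf, np.ones(nlta) / nlta, mode="same")
    return sta / np.maximum(lta, 1e-12)

fs = 100.0
rng = np.random.default_rng(3)
trace = rng.normal(0, 1, int(600 * fs))
trace[30000:30300] += 8 * rng.normal(0, 1, 300)        # synthetic local event
ratio = sta_lta(trace, fs)
triggers = np.where(ratio > 5.0)[0]                    # flag candidate detections
print("first trigger at t =", triggers[0] / fs if triggers.size else None, "s")
```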

  18. Development of compact long-term broadband ocean bottom seismometer for seafloor observation of slow earthquakes

    Science.gov (United States)

    Yamashita, Y.; Shinohara, M.; Yamada, T.; Shiobara, H.

    2017-12-01

    It is important to understand coupling between plates in a subduction zone for studies of earthquake generation. Recently, low-frequency tremor and very low frequency earthquakes (VLFEs) were discovered at the plate boundary near a trench. These events (slow earthquakes) at the shallow plate boundary should be related to slow slip on the plate boundary. For observation of slow earthquakes, the Broadband Ocean Bottom Seismometer (BBOBS) is useful; however, the number of BBOBSs is limited due to cost. On the other hand, a number of long-term OBSs (LT-OBSs) with a recording period of one year are available. However, the LT-OBS has a seismometer with a natural period of 1 second, so its frequency band is slightly narrow for observing slow earthquakes. We therefore developed a compact long-term broadband OBS by replacing the seismic sensor of the LT-OBS with a broadband seismometer. We adopted a seismic sensor with a natural period of 20 seconds (Trillium Compact Broadband Seismometer, Nanometrics). Because the tilt of an OBS on the seafloor cannot be controlled due to free-fall deployment, a leveling system for the seismic sensor is necessary. The broadband seismic sensor has a cylindrical shape with a diameter of 90 mm and a height of 100 mm, and the developed leveling system can mount the sensor without modification of its shape. The leveling system has a diameter of 160 mm and a height of 110 mm, the same size as the existing leveling system of the LT-OBS. It has two horizontal axes, each driven by a motor. Leveling can be performed up to 20 degrees using a microprocessor (Arduino), with a resolution of less than one degree. The system starts leveling immediately when the controller is powered on. After leveling, the seismic sensor is powered and the controller records the leveling angles to SD RAM; the controller is then shut down so that it consumes no power. The compact long-term broadband ocean bottom seismometer is useful for seafloor observation of slow earthquakes. In addition

  19. Short-Term Memory and Aphasia: From Theory to Treatment.

    Science.gov (United States)

    Minkina, Irene; Rosenberg, Samantha; Kalinyak-Fliszar, Michelene; Martin, Nadine

    2017-02-01

    This article reviews existing research on the interactions between verbal short-term memory and language processing impairments in aphasia. Theoretical models of short-term memory are reviewed, starting with a model assuming a separation between short-term memory and language, and progressing to models that view verbal short-term memory as a cognitive requirement of language processing. The review highlights a verbal short-term memory model derived from an interactive activation model of word retrieval. This model holds that verbal short-term memory encompasses the temporary activation of linguistic knowledge (e.g., semantic, lexical, and phonological features) during language production and comprehension tasks. Empirical evidence supporting this model, which views short-term memory in the context of the processes it subserves, is outlined. Studies that use a classic measure of verbal short-term memory (i.e., number of words/digits correctly recalled in immediate serial recall) as well as those that use more intricate measures (e.g., serial position effects in immediate serial recall) are discussed. Treatment research that uses verbal short-term memory tasks in an attempt to improve language processing is then summarized, with a particular focus on word retrieval. A discussion of the limitations of current research and possible future directions concludes the review. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  20. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    Science.gov (United States)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2018-01-01

    Liquefaction-induced ground failure is one of the leading causes of infrastructure damage due to the impact of large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Bramhaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has a moderately shallow groundwater table, especially in the post-monsoon season. In view of its burgeoning population, there has been unplanned expansion of settlements under hazardous geological, geomorphological, and hydrological conditions, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli Intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations of sand boils, lateral spreading, ground subsidence, etc., thus posing a strong case for liquefaction potential analysis in the terrain. Motivated by a federal initiative to assess the seismic hazard, vulnerability, and risk of all Indian metros and growing urban centers located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading employing modern multivariate techniques, and to predict a deterministic liquefaction scenario of the city for a probabilistic seismic hazard level with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km2 area. The stochastically

  1. An Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers-Part I

    Science.gov (United States)

    Courey, Karim; Wright, Clara; Asfour, Shihab; Bayliss, Jon; Ludwig, Larry

    2008-01-01

    Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has a currently unknown probability associated with it. Due to contact resistance, electrical shorts may not occur at lower voltage levels. In this experiment, we study the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From this data, we can estimate the probability of an electrical short, as a function of voltage, given that a free tin whisker has bridged two adjacent exposed electrical conductors. In addition, three tin whiskers grown from the same Space Shuttle Orbiter card guide used in the aforementioned experiment were cross sectioned and studied using a focused ion beam (FIB).
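    One plausible way to turn bridged-whisker test outcomes into a probability of shorting as a function of voltage is a logistic fit, sketched below on synthetic data; this is an assumed analysis, not the paper's actual measurements or model.

```python
# Sketch: logistic fit of short-circuit outcomes versus applied voltage,
# conditional on a whisker having bridged two conductors. Synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(4)
voltage = rng.uniform(0.5, 30.0, 200).reshape(-1, 1)        # applied volts
true_p = 1 / (1 + np.exp(-(voltage.ravel() - 10.0) / 2.0))  # hidden breakdown curve
shorted = rng.uniform(size=200) < true_p                    # 1 = contact resistance broke down

model = LogisticRegression().fit(voltage, shorted)
for v in (5.0, 10.0, 20.0):
    p = model.predict_proba([[v]])[0, 1]
    print(f"estimated P(short | whisker bridged, {v:.0f} V) = {p:.2f}")
```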

  2. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
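    A minimal sketch of the rate-spike idea described above: compare per-minute counts of "earthquake" tweets against the low background rate and flag minutes that greatly exceed it; the counts and threshold are illustrative assumptions, not the USGS detector.

```python
# Sketch: detect a tweet-rate spike above a low background rate.
# Counts are synthetic; the threshold is an arbitrary illustrative choice.
import numpy as np

rng = np.random.default_rng(5)
minutes = 120
counts = rng.poisson(1 / 60.0, minutes)             # ~1 "earthquake" tweet per hour
counts[60:66] += [150, 120, 90, 60, 40, 25]         # synthetic post-event surge

threshold = 10                                      # tweets/minute, far above background
detections = np.where(counts > threshold)[0]
print("first detection at minute:", detections[0] if detections.size else None)
```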

  3. Earthquake forecasting studies using radon time series data in Taiwan

    Science.gov (United States)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of seismogeochemical data interpreted as geochemical precursory signals of impending earthquakes, and radon has been identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. Continuous time series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtering of these environmental parameters, in order to create a real-time database that supports our earthquake precursory study. In recent years, an automated real-time database has been developed using R, an open-source programming language, to carry out statistical computations on the data. To integrate the data with our working procedure, we use the popular open-source web application stack AMP (Apache, MySQL, and PHP), creating a website that effectively displays and helps us manage the real-time database.

  4. Short-term memories with a stochastic perturbation

    International Nuclear Information System (INIS)

    Pontes, Jose C.A. de; Batista, Antonio M.; Viana, Ricardo L.; Lopes, Sergio R.

    2005-01-01

    We investigate short-term memories in linear and weakly nonlinear coupled map lattices with a periodic external input. We use locally coupled maps to present numerical results about short-term memory formation adding a stochastic perturbation in the maps and in the external input

  5. Short-term memory

    Science.gov (United States)

    Toulouse, G.

    This is a rather bold attempt to bridge the gap between neuron structure and psychological data. We try to answer the question: Is there a relation between the neuronal connectivity in the human cortex (around 5,000) and the short-term memory capacity (7±2)? Our starting point is the Hopfield model (Hopfield 1982), presented in this volume by D.J. Amit.

  6. Earthquakes and Tectonics Expert Judgment Elicitation Project

    International Nuclear Information System (INIS)

    Coppersmith, K.J.; Perman, R.C.; Youngs, R.R.

    1993-02-01

    This report summarizes the results of the Earthquakes and Tectonics Expert Judgement Elicitation Project sponsored by the Electric Power Research Institute (EPRI). The objectives of this study were two-fold: (1) to demonstrate methods for the elicitation of expert judgement, and (2) to quantify the uncertainties associated with earthquake and tectonics issues for use in the EPRI-HLW performance assessment. Specifically, the technical issue considered is the probability of differential fault displacement through the proposed repository at Yucca Mountain, Nevada. For this study, a strategy for quantifying uncertainties was developed that relies on the judgements of multiple experts. A panel of seven geologists and seismologists was assembled to quantify the uncertainties associated with earthquake and tectonics issues for the performance assessment model. A series of technical workshops focusing on these issues was conducted. Finally, each expert was individually interviewed in order to elicit his judgement regarding the technical issues and to provide the technical basis for his assessment. This report summarizes the methodologies used to elicit the judgements of the earthquakes and tectonics experts (termed "specialists"), and summarizes the technical assessments made by the expert panel.

  7. Long-term change of activity of very low-frequency earthquakes in southwest Japan

    Science.gov (United States)

    Baba, S.; Takeo, A.; Obara, K.; Kato, A.; Maeda, T.; Matsuzawa, T.

    2017-12-01

    On the plate interface near the seismogenic zone of megathrust earthquakes, various types of slow earthquakes have been detected, including non-volcanic tremor, slow slip events (SSEs) and very low-frequency earthquakes (VLFEs). VLFEs are classified into deep VLFEs, which occur on the downdip side of the seismogenic zone, and shallow VLFEs, which occur on the updip side, i.e. at several kilometers depth in southwest Japan. As members of the slow earthquake family, VLFEs are expected to be a proxy for inter-plate slip, because they have the same mechanisms as inter-plate slip and are detected during episodic tremor and slip (ETS). However, the long-term change in VLFE seismicity has not been as well constrained as that of deep low-frequency tremor. We thus studied long-term changes in the activity of VLFEs in southwest Japan, where ETS and long-term SSEs have been most intensive. We used continuous seismograms from F-net broadband seismometers operated by NIED from April 2004 to March 2017. After applying a band-pass filter with a frequency range of 0.02-0.05 Hz, we adopted the matched-filter technique to detect VLFEs. We prepared templates by calculating synthetic waveforms for each hypocenter grid, assuming typical focal mechanisms of VLFEs. The correlation coefficients between the templates and the continuous F-net seismograms were calculated at each grid every 1 s in all components. The grid interval is 0.1 degree in both longitude and latitude. Each VLFE was detected as an event if the average of the correlation coefficients exceeded the detection threshold, which we defined as eight times the median absolute deviation of the distribution. At grids in the Bungo channel, where long-term SSEs occurred frequently, the cumulative number of detected VLFEs increased rapidly in 2010 and 2014, modulated by stress loading from the long-term SSEs. At inland grids near the Bungo channel, the cumulative number increases steeply every half a year. This stepwise
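    A minimal single-channel sketch of the matched-filter step described above: slide a template along a continuous trace, compute correlation coefficients, and declare detections where the value exceeds eight times the median absolute deviation; the template, trace and scaling are synthetic assumptions, not the F-net processing.

```python
# Sketch of matched-filter detection with an 8 x MAD threshold (single channel).
import numpy as np

rng = np.random.default_rng(6)
trace = rng.normal(0, 1, 20000)
template = np.sin(2 * np.pi * np.arange(200) / 50) * np.hanning(200)
trace[12000:12200] += 3 * template                  # bury a synthetic "VLFE" in noise

n = template.size
t_norm = (template - template.mean()) / template.std()
cc = np.empty(trace.size - n)
for i in range(cc.size):                            # correlation coefficient per lag
    w = trace[i:i + n]
    cc[i] = np.dot(t_norm, (w - w.mean()) / (w.std() + 1e-12)) / n

mad = np.median(np.abs(cc - np.median(cc)))
detections = np.where(cc > 8 * mad)[0]              # threshold = 8 x MAD, as in the study
print("detections near sample:", detections[:3], "...")
```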

  8. What Can We Learn from a Simple Physics-Based Earthquake Simulator?

    Science.gov (United States)

    Artale Harris, Pietro; Marzocchi, Warner; Melini, Daniele

    2018-03-01

    Physics-based earthquake simulators are becoming a popular tool to investigate the earthquake occurrence process. So far, the development of earthquake simulators has commonly been led by the approach "the more physics, the better". However, this approach may hamper the comprehension of the outcomes of the simulator; in fact, within complex models, it may be difficult to understand which physical parameters are the most relevant to the features of the seismic catalog in which we are interested. For this reason, here we take the opposite approach and analyze the behavior of a purposely simple earthquake simulator applied to a set of California faults. The idea is that a simple simulator may be more informative than a complex one for some specific scientific objectives, because it is more understandable. Our earthquake simulator has three main components: the first is a realistic tectonic setting, i.e., a fault data set of California; the second is the application of quantitative laws for earthquake generation on each single fault; and the last is fault interaction modeled through the Coulomb failure function. The analysis of this simple simulator shows that: (1) the short-term clustering can be reproduced by a set of faults with an almost periodic behavior, which interact according to a Coulomb failure function model; (2) a long-term behavior showing supercycles of the seismic activity exists only in a markedly deterministic framework, and quickly disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault; (3) faults that are strongly coupled in terms of the Coulomb failure function model are synchronized in time only in a markedly deterministic framework, and, as before, such synchronization disappears when a small degree of stochasticity is introduced in the recurrence of earthquakes on a fault. Overall, the results show that even in a simple and perfectly known earthquake occurrence world, introducing a small degree of
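    A toy sketch in the spirit of the simple simulator described above, assuming a handful of faults loaded at constant rates that fail at a threshold and transfer a Coulomb-like stress increment to the others; all parameter values are invented for illustration, not those of the California fault data set.

```python
# Toy multi-fault simulator: constant tectonic loading, threshold failure,
# and stress transfer to neighbouring faults through a coupling matrix.
import numpy as np

rng = np.random.default_rng(7)
n_faults, n_steps = 5, 20000
loading = rng.uniform(0.8, 1.2, n_faults) * 1e-3     # stress gain per step
threshold = np.ones(n_faults)
coupling = 0.05 * rng.uniform(0, 1, (n_faults, n_faults))
np.fill_diagonal(coupling, 0.0)                      # no self-interaction

stress = rng.uniform(0, 1, n_faults)
catalog = []                                         # (time step, fault index)
for t in range(n_steps):
    stress += loading
    for f in np.where(stress >= threshold)[0]:
        catalog.append((t, f))
        stress += coupling[:, f]                     # Coulomb-like stress transfer
        stress[f] = 0.0                              # drop the failed fault's stress
print("number of simulated events:", len(catalog))
```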

  9. Roles of Radon-222 and other natural radionuclides in earthquake prediction

    International Nuclear Information System (INIS)

    Smith, A.R.; Wollenberg, H.A.; Mosier, D.F.

    1980-01-01

    The concentration of 222Rn in subsurface waters is one of the natural parameters being investigated to help develop the capability to predict destructive earthquakes. Since 1966, scientists in several nations have sought to link radon variations with ongoing seismic activity, primarily through the dilatancy model for earthquake occurrence. Within the range of these studies, alpha-, beta-, and gamma-radiation detection techniques have been used in both discrete-sampling and continuous-monitoring programs. These measurement techniques are reviewed in terms of instrumentation adapted to seismic-monitoring purposes. A recent Lawrence Berkeley Laboratory study conducted in central California incorporated discrete sampling of wells in the aftershock area of the 1975 Oroville earthquake and continuous monitoring of water radon in a well on the San Andreas Fault. The results presented show short-term radon variations that may be associated with aftershocks, and diurnal changes that may reflect earth tidal forces.

  10. Competitive short-term and long-term memory processes in spatial habituation.

    Science.gov (United States)

    Sanderson, David J; Bannerman, David M

    2011-04-01

    Exposure to a spatial location leads to habituation of exploration such that, in a novelty preference test, rodents subsequently prefer exploring a novel location to the familiar location. According to Wagner's (1981) theory of memory, short-term and long-term habituation are caused by separate and sometimes opponent processes. In the present study, this dual-process account of memory was tested. Mice received a series of exposure training trials to a location before receiving a novelty preference test. The novelty preference was greater when tested after a short, rather than a long, interval. In contrast, the novelty preference was weaker when exposure training trials were separated by a short, rather than a long interval. Furthermore, it was found that long-term habituation was determined by the independent effects of the amount of exposure training and the number of exposure training trials when factors such as the intertrial interval and the cumulative intertrial interval were controlled. A final experiment demonstrated that a long-term reduction of exploration could be caused by a negative priming effect due to associations formed during exploration. These results provide evidence against a single-process account of habituation and suggest that spatial habituation is determined by both short-term, recency-based memory and long-term, incrementally strengthened memory.

  11. Retrieval-Induced Inhibition in Short-Term Memory.

    Science.gov (United States)

    Kang, Min-Suk; Choi, Joongrul

    2015-07-01

    We used a visual illusion called motion repulsion as a model system for investigating competition between two mental representations. Subjects were asked to remember two random-dot-motion displays presented in sequence and then to report the motion directions for each. Remembered motion directions were shifted away from the actual motion directions, an effect similar to the motion repulsion observed during perception. More important, the item retrieved second showed greater repulsion than the item retrieved first. This suggests that earlier retrieval exerted greater inhibition on the other item being held in short-term memory. This retrieval-induced motion repulsion could be explained neither by reduced cognitive resources for maintaining short-term memory nor by continued inhibition between short-term memory representations. These results indicate that retrieval of memory representations inhibits other representations in short-term memory. We discuss mechanisms of retrieval-induced inhibition and their implications for the structure of memory. © The Author(s) 2015.

  12. Radon anomaly in soil gas as an earthquake precursor

    International Nuclear Information System (INIS)

    Miklavcic, I.; Radolic, V.; Vukovic, B.; Poje, M.; Varga, M.; Stanic, D.; Planinic, J.

    2008-01-01

    The mechanical processes of earthquake preparation are always accompanied by deformations, after which complex short- or long-term precursory phenomena can appear. Anomalies in radon concentrations in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured by LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, as well as by a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of the meteorological parameters on the temporal radon variations, and we determined the equation of the multiple regression that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies indicated 46% of the seismic events (criterion M≥3, R<200 km) at site A, and 21% at site B. Empirical relations between earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake that will occur at epicentral distance R from the monitoring site within the expected precursor time T.
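    A minimal sketch of the multiple-regression correction described above, assuming synthetic daily series: the radon signal is regressed on barometric pressure, rainfall and temperature, and the residuals are screened for anomalies; the data and the 3-sigma screening rule are illustrative assumptions.

```python
# Sketch: remove meteorological effects from a radon time series by multiple
# regression, then look for anomalies in the residuals. Synthetic data only.
import numpy as np

rng = np.random.default_rng(8)
days = 365
pressure = 1013 + rng.normal(0, 5, days)
rain = rng.exponential(2.0, days)
temp = 12 + 10 * np.sin(2 * np.pi * np.arange(days) / 365)
radon = 8000 - 3.0 * (pressure - 1013) + 50 * rain + 40 * temp + rng.normal(0, 300, days)
radon[200:205] += 2500                               # synthetic pre-earthquake anomaly

X = np.column_stack([np.ones(days), pressure, rain, temp])
beta, *_ = np.linalg.lstsq(X, radon, rcond=None)     # multiple regression fit
residual = radon - X @ beta                          # meteorology removed
anomalies = np.where(residual > 3 * residual.std())[0]
print("candidate anomaly days:", anomalies)
```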

  13. Radon anomaly in soil gas as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Miklavcic, I.; Radolic, V.; Vukovic, B.; Poje, M.; Varga, M.; Stanic, D. [Department of Physics, University of Osijek, Trg Ljudevita Gaja 6, POB 125, 31000 Osijek (Croatia); Planinic, J. [Department of Physics, University of Osijek, Trg Ljudevita Gaja 6, POB 125, 31000 Osijek (Croatia)], E-mail: planinic@ffos.hr

    2008-10-15

    The mechanical processes of earthquake preparation are always accompanied by deformations, after which complex short- or long-term precursory phenomena can appear. Anomalies in radon concentrations in soil gas are registered a few weeks or months before many earthquakes. Radon concentrations in soil gas were continuously measured by LR-115 nuclear track detectors at site A (Osijek) during a 4-year period, as well as by a Barasol semiconductor detector at site B (Kasina) during 2 years. We investigated the influence of the meteorological parameters on the temporal radon variations, and we determined the equation of the multiple regression that enabled the reduction (deconvolution) of the radon variation caused by barometric pressure, rainfall and temperature. The pre-earthquake radon anomalies indicated 46% of the seismic events (criterion M≥3, R<200 km) at site A, and 21% at site B. Empirical relations between earthquake magnitude, epicentral distance and precursor time enabled estimation or prediction of an earthquake that will occur at epicentral distance R from the monitoring site within the expected precursor time T.

  14. Evaluation of Short Term Memory Span Function In Children

    Directory of Open Access Journals (Sweden)

    Barış ERGÜL

    2016-12-01

    Full Text Available Information is first encoded in short-term memory, where it is stored temporarily, before being recorded in working memory at the next stage. Repeating the information mentally makes it remain in memory for a long time. Studies investigating the relationship between short-term memory and reading skills have been carried out to examine the relationship between short-term memory processes and reading comprehension. In this study, the information coming to short-term memory and the factors affecting the operation of short-term memory are investigated with a regression model. The aim of the research is to examine, through regression analysis, the factors (age, IQ and reading skills) that are expected to have an effect on short-term memory in children. One of the assumptions of regression analysis is that the error term has constant variance and a normal distribution. In this study, because the error term was not normally distributed, robust regression techniques were applied. Also, the coefficient of determination was obtained for each technique. According to the findings, increases in age, IQ and reading skills led to an increase in short-term memory in children. Among the robust regression techniques applied, the Winsorized Least Squares (WLS) technique gave the highest coefficient of determination.
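    A simplified sketch of the idea behind the robust fit described above: regress memory span on age, IQ and reading skill after winsorizing the response to limit the influence of outliers; the synthetic data and the winsorizing limits are assumptions, and this is not the study's exact Winsorized Least Squares routine.

```python
# Sketch: winsorize the response, fit ordinary least squares, and report the
# coefficient of determination. Synthetic data with heavy-tailed errors.
import numpy as np
from scipy.stats.mstats import winsorize

rng = np.random.default_rng(9)
n = 200
age = rng.uniform(7, 14, n)
iq = rng.normal(100, 15, n)
reading = rng.normal(50, 10, n)
span = 1.5 + 0.3 * age + 0.02 * iq + 0.03 * reading + rng.standard_t(2, n)

y = np.asarray(winsorize(span, limits=(0.05, 0.05)))    # trim extreme 5% on each side
X = np.column_stack([np.ones(n), age, iq, reading])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
y_hat = X @ beta
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients (intercept, age, IQ, reading):", np.round(beta, 3))
print("coefficient of determination:", round(r2, 3))
```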

  15. Critical behavior in earthquake energy dissipation

    Science.gov (United States)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes bounded by latitudes 29° S and 35.5° S and longitudes 69.501° W and 73.944° W (in the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, the time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for the probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place during the largest bursts of activity, such as major earthquakes, as opposed to smaller activations, which contribute less significantly though they have greater relative occurrence. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.
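    A minimal sketch of how a power-law exponent for burst sizes can be estimated, assuming synthetic power-law data and the standard maximum-likelihood estimator; the values are illustrative, not the Chilean catalog analysis.

```python
# Sketch: draw synthetic burst sizes from a power law and recover the exponent
# with the continuous maximum-likelihood estimator alpha = 1 + n / sum(ln(s/s_min)).
import numpy as np

rng = np.random.default_rng(10)
alpha_true, s_min, n = 1.8, 1.0, 5000
# Inverse-transform sampling from p(s) ~ s^(-alpha) for s >= s_min
sizes = s_min * (1 - rng.uniform(size=n)) ** (-1 / (alpha_true - 1))

alpha_hat = 1 + n / np.sum(np.log(sizes / s_min))
print(f"recovered power-law exponent = {alpha_hat:.2f} (true value {alpha_true})")
```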

  16. Evaluation of Short Term Memory Span Function In Children

    OpenAIRE

    Barış ERGÜL; Arzu ALTIN YAVUZ; Ebru GÜNDOĞAN AŞIK

    2016-01-01

    Information is first encoded in short-term memory, where it is stored temporarily, before being recorded in working memory at the next stage. Repeating the information mentally makes it remain in memory for a long time. Studies investigating the relationship between short-term memory and reading skills have been carried out to examine the relationship between short-term memory processes and reading comprehension. In this study, the information coming to short-term memory and the factors ...

  17. Short-term Memory as a Processing Shift

    Science.gov (United States)

    Lewis-Smith, Marion Quinn

    1975-01-01

    The series of experiments described here examined the predictions for free recall from sequential models and the shift formulation, focusing on the roles of short- and long-term memory in the primacy/recency shift and on the effects of expectancies on short- and long-term memory. (Author/RK)

  18. On the relationship between short- and long-term memory

    DEFF Research Database (Denmark)

    Sørensen, Thomas Alrik

    James (1890) divided memory into separate stores; primary and secondary – or short-term and long-term memory. The interaction between the two stores often assumes that information initially is represented in a volatile short-term store before entering and consolidating in the more durable long-term memory system (e.g. Atkinson & Shiffrin, 1968). Short-term memory seems to provide a surprising processing bottleneck where only a very limited amount of information can be represented at any given moment (Miller, 1956; Cowan, 2001). A number of studies have investigated the nature of this processing bottleneck (…, accepted). Counter to popular beliefs, this suggests that long-term memory precedes short-term memory and not vice versa.

  19. Spatial and Temporal Characteristics of the Microseismicity Preceding the 2016 M L 6.6 Meinong Earthquake in Southern Taiwan

    Science.gov (United States)

    Pu, Hsin-Chieh

    2018-02-01

    Before the ML 6.6 Meinong earthquake in 2016, intermediate-term quiescence (Qi), foreshocks, and short-term quiescence (Qs) were extracted from a comprehensive earthquake catalog. In practice, these behaviors are thought to be seismic indicators of an earthquake precursor, and their spatiotemporal characteristics may be associated with the location, magnitude, and occurrence time of the following main shock. Hence, detailed examinations were carried out to derive the spatiotemporal characteristics of these meaningful seismic behaviors. First, the spatial range of the Qi, which persisted for 96 days, was revealed in and around the Meinong earthquake. Second, a series of foreshocks was present for 1 day, clustered at the southeastern end of the Meinong earthquake. Third, Qs was present for 3 days and was pronounced after the foreshocks. Although these behaviors were difficult to record, because the Qi was characterized by microseismicity at the lower cut-off magnitude (between ML 1.2 and 1.6) and most of the foreshocks comprised earthquakes with magnitudes lower than 1.8, they carried meaningful precursory indications preceding the Meinong earthquake. These indicators provide information on (1) the hypocenter, indicated by the area including the Qi, foreshocks, and Qs; (2) the magnitude, which could be associated with the spatial range of the Qi; (3) the asperity locations, which might be related to the areas of extraordinarily low seismicity; and (4) a short-term warning lead time of 3 days, which could have been announced based on the occurrence of the Qs. In particular, Qi also appeared before strong inland earthquakes, so Qi might be an anticipatory phenomenon before a strong earthquake in Taiwan.

  20. Short-term memory and long-term memory are still different.

    Science.gov (United States)

    Norris, Dennis

    2017-09-01

    A commonly expressed view is that short-term memory (STM) is nothing more than activated long-term memory. If true, this would overturn a central tenet of cognitive psychology-the idea that there are functionally and neurobiologically distinct short- and long-term stores. Here I present an updated case for a separation between short- and long-term stores, focusing on the computational demands placed on any STM system. STM must support memory for previously unencountered information, the storage of multiple tokens of the same type, and variable binding. None of these can be achieved simply by activating long-term memory. For example, even a simple sequence of digits such as "1, 3, 1" where there are 2 tokens of the digit "1" cannot be stored in the correct order simply by activating the representations of the digits "1" and "3" in LTM. I also review recent neuroimaging data that has been presented as evidence that STM is activated LTM and show that these data are exactly what one would expect to see based on a conventional 2-store view. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  1. Fast Weight Long Short-Term Memory

    OpenAIRE

    Keller, T. Anderson; Sridhar, Sharath Nittur; Wang, Xin

    2018-01-01

    Associative memory using fast weights is a short-term memory mechanism that substantially improves the memory capacity and time scale of recurrent neural networks (RNNs). As recent studies introduced fast weights only to regular RNNs, it is unknown whether fast weight memory is beneficial to gated RNNs. In this work, we report a significant synergy between long short-term memory (LSTM) networks and fast weight associative memories. We show that this combination, in learning associative retrie...

  2. Statistical mechanics of neocortical interactions. Derivation of short-term-memory capacity

    Science.gov (United States)

    Ingber, Lester

    1984-06-01

    A theory developed by the author to describe macroscopic neocortical interactions demonstrates that empirical values of chemical and electrical parameters of synaptic interactions establish several minima of the path-integral Lagrangian as a function of excitatory and inhibitory columnar firings. The number of possible minima, their time scales of hysteresis and probable reverberations, and their nearest-neighbor columnar interactions are all consistent with well-established empirical rules of human short-term memory. Thus, aspects of conscious experience are derived from neuronal firing patterns, using modern methods of nonlinear nonequilibrium statistical mechanics to develop realistic explicit synaptic interactions.

  3. Motor Habits in Visuo-manual Tracking: Manifestation of an Unconscious Short-Term Motor Memory?

    Directory of Open Access Journals (Sweden)

    Andreas Hufschmidt

    1990-01-01

    Full Text Available Normal subjects were tested in short, repetitive trials of a tracking task, with an identical shape of target movement being used throughout one session. Analysis of the net error curves (pursuit minus target movement) revealed that subjects regularly exhibit a remoteness effect: neighbouring trials were more similar than distant ones. The effect is demonstrated to be stronger in the absence of visual cues, and was found to be absent in a patient with complete loss of proprioception when he was performing without visual feedback as well. The results are discussed in terms of a short-term memory store contributing to unconscious movement habits in tracking. This may represent part of the motor learning process, working together with conscious visuo-motor control mechanisms. Its function is probably related to the acquisition of automatic movements.

  4. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year, natural disasters such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant burdens on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as construction material, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, in order to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, which takes into account the different criteria related to operation execution, is proposed by highlighting the key issues concerning the handling of the construction

  5. Short-term power plant operation scheduling in thermal systems with long-term boundary conditions

    International Nuclear Information System (INIS)

    Wolter, H.

    1990-01-01

    For the first time, long-term quantity constraints are modeled within short-term power plant operation scheduling via their shadow prices. This corresponds to a decomposition of the quantity constraints by means of Lagrangian relaxation. The shadow prices determined by the long-term energy scheduling with respect to the long-term quantity constraints are passed into the short-term power plant scheduling, where they subsidize or penalize the use of limited energy amounts insofar as these are insufficiently or excessively drawn upon. The clear advantage of this modeling is that the short-term power plant scheduling can deviate from the envisaged energy use while still respecting the overall optimum, because the shadow prices contain all the information about the cost effect of the energy shifts in the remaining planning horizon that become necessary due to deviations in the current short-term planning period. (orig./DG) [de]

  6. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    Science.gov (United States)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may bring important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take into consideration all uncertainties, and it cannot be used to assess the risk quantitatively, because the latter unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard), for short-term, near-real-time probabilistic volcanic hazard analysis formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example of the model, we apply BET_VH_ST to assess the short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation, from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly
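    A minimal sketch of the event-tree bookkeeping behind a short-term hazard estimate of this kind: conditional probabilities are multiplied along the branches down to the probability of exceeding a tephra-load threshold at a site; every number below is invented for illustration and is not BET_VH_ST output.

```python
# Sketch: multiply conditional probabilities along an event tree's branches.
# All probabilities below are hypothetical placeholders.
p_unrest = 1.0            # unrest is ongoing during the crisis window
p_magmatic = 0.7          # P(magmatic origin | unrest)
p_eruption = 0.2          # P(eruption in the next 24 h | magmatic unrest)
p_vent_sector = 0.3       # P(vent opens in the sector upwind of the site | eruption)
p_exceed_load = 0.4       # P(tephra load > threshold at the site | that scenario)

p_site_exceedance = (p_unrest * p_magmatic * p_eruption
                     * p_vent_sector * p_exceed_load)
print(f"P(load > threshold at site, next 24 h) = {p_site_exceedance:.3f}")
```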

  7. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    Science.gov (United States)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in the temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalence relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of the recognized precursors, earthquake hazards over different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.

  8. Improving creativity performance by short-term meditation

    Science.gov (United States)

    2014-01-01

    Background One form of meditation intervention, the integrative body-mind training (IBMT), has been shown to improve attention, reduce stress and change self-reports of mood. In this paper we examine whether short-term IBMT can improve performance related to creativity and determine the role that mood may play in such improvement. Methods Forty Chinese undergraduates were randomly assigned to a short-term IBMT group or a relaxation training (RT) control group. Mood and creativity performance were assessed by the Positive and Negative Affect Schedule (PANAS) and the Torrance Tests of Creative Thinking (TTCT) questionnaire, respectively. Results As predicted, the results indicated that short-term (30 min per day for 7 days) IBMT improved creativity performance on the divergent thinking task, and yielded better emotional regulation than RT. In addition, cross-lagged analysis indicated that both positive and negative affect may influence creativity in the IBMT group (but not the RT group). Conclusions Our results suggest that an emotion-related creativity-promoting mechanism may be attributed to short-term meditation. PMID:24645871

  9. Short term memory decays and high presentation rates hurry this decay: The Murdock free recall experiments interpreted in the Tagging/Retagging model

    OpenAIRE

    Tarnow, Dr. Eugen

    2009-01-01

    I show that the curious free recall data of Murdock (1962) can be explained by the Tagging/Retagging model of short term memory (Tarnow, 2009 and 2008) in which a short term memory item is a tagged long term memory item. The tagging (linear in time) corresponds to the synaptic process of exocytosis and the loss of tagging (logarithmic in time) corresponds to synaptic endocytosis. The Murdock recent item recall probabilities follow a logarithmic decay with time of recall. The slope of the d...

  10. Constraints on Dynamic Triggering from very Short term Microearthquake Aftershocks at Parkfield

    Science.gov (United States)

    Ampuero, J.; Rubin, A.

    2004-12-01

    The study of microearthquakes helps bridge the gap between laboratory experiments and data from large earthquakes, the two disparate scales that have contributed so far to our understanding of earthquake physics. Although they are frequent, microearthquakes are difficult to analyse. Applying high precision relocation techniques, Rubin and Gillard (2000) observed a pronounced asymmetry in the spatial distribution of the earliest and nearest aftershocks of microearthquakes along the San Andreas fault (they occur more often to the NW of the mainshock). It was suggested that this could be related to the velocity contrast across the fault. Preferred directivity of dynamic rupture pulses running along a bimaterial interface (to the SE in the case of the SAF) is expected on theoretical grounds. Our numerical simulations of crack-like rupture on such interfaces show a pronounced asymmetry of the stress histories beyond the rupture ends, and suggest two possible mechanisms for the observed asymmetry: first, that it results from an asymmetry in the static stress field following arrest of the mainshock (closer to failure to the NW), or second, that it is due to a short-duration tensile pulse that propagates to the SE, which could reduce the number of aftershocks to the SE by dynamic triggering of any nucleation site close enough to failure to have otherwise produced an aftershock. To distinguish between these mechanisms we need observations of dynamic triggering in microseismicity. For small events triggered at a distance of some mainshock radii, triggering time scales are so short that seismograms of both events overlap. To detect the occurrence of compound events and very short term aftershocks in the HRSN Parkfield archived waveforms we have developed an automated search algorithm based on empirical Green's function (EGF) deconvolution. Optimal EGFs are first selected by the coherency of the cross-component convolution with respect to the target event. Then Landweber

  11. Long-term impact of earthquakes on sleep quality.

    Science.gov (United States)

    Tempesta, Daniela; Curcio, Giuseppe; De Gennaro, Luigi; Ferrara, Michele

    2013-01-01

    We investigated the impact of the 6.3 magnitude 2009 L'Aquila (Italy) earthquake on standardized self-report measures of sleep quality (Pittsburgh Sleep Quality Index, PSQI) and frequency of disruptive nocturnal behaviours (Pittsburgh Sleep Quality Index-Addendum, PSQI-A) two years after the natural disaster. Self-reported sleep quality was assessed in 665 L'Aquila citizens exposed to the earthquake compared with a different sample (n = 754) of L'Aquila citizens tested 24 months before the earthquake. In addition, sleep quality and disruptive nocturnal behaviours (DNB) of people exposed to the traumatic experience were compared with those of people who in the same period lived in different areas ranging between 40 and 115 km from the earthquake epicenter (n = 3574). The comparison between L'Aquila citizens before and after the earthquake showed a significant deterioration of sleep quality after the exposure to the trauma. In addition, two years after the earthquake L'Aquila citizens showed the highest PSQI scores and the highest incidence of DNB compared to subjects living in the surroundings. Interestingly, above-the-threshold PSQI scores were found in the participants living within 70 km from the epicenter, while trauma-related DNBs were found in people living in a range of 40 km. Multiple regressions confirmed that proximity to the epicenter is predictive of sleep disturbances and DNB, also suggesting a possible mediating effect of depression on PSQI scores. The psychological effects of an earthquake may be much more pervasive and long-lasting than its building destruction, lasting for years and involving a much larger population. A reduced sleep quality and an increased frequency of DNB after two years may be a risk factor for the development of depression and posttraumatic stress disorder.

  12. Fault activity characteristics in the northern margin of the Tibetan Plateau before the Menyuan Ms6.4 earthquake

    Directory of Open Access Journals (Sweden)

    Dongzhuo Xu

    2016-07-01

    Full Text Available Fault deformation characteristics in the northern margin of the Tibetan Plateau before the Menyuan Ms6.4 earthquake are investigated through time-series and structural geological analysis based on cross-fault observation data from the Qilian Mountain–Haiyuan Fault belt and the West Qinling Fault belt. The results indicate: (1) Group short-term abnormal variations appeared in the Qilian Mountain–Haiyuan Fault belt and the West Qinling Fault belt before the Menyuan Ms6.4 earthquake. (2) More medium- and short-term anomalies appear in the middle-eastern segment of the Qilian Mountain Fault belt and the West Qinling Fault belt, suggesting that the faults' activities are strong in these areas. The faults' activities in the middle-eastern segment of the Qilian Fault belt result from extensional stress, as before the earthquake, whereas those in the West Qinling Fault belt are mainly compressional. (3) In recent years, moderate-strong earthquakes occurred in both the Kunlun Mountain and the Qilian Mountain Fault belts, and some energy was released. It is possible that the seismicity moved eastward under this regime. Therefore, we should pay attention to the West Qinling Mountain area, where an Ms6–7 earthquake could occur in the future.

  13. The Structure and Content of Long-Term and Short-Term Mate Preferences

    Directory of Open Access Journals (Sweden)

    Peter K. Jonason

    2013-12-01

    Full Text Available This study addresses two limitations in the mate preferences literature. First, research all too often relies on single-item assessments of mate preferences, precluding more advanced statistical techniques like factor analysis. Second, when factor analysis could be done, it has been done exclusively for long-term mate preferences, to the exclusion of short-term mate preferences. In this study (N = 401), we subjected 20 items designed to measure short- and long-term mate preferences to both principal components analysis (n = 200) and confirmatory factor analysis (n = 201). In the long-term context, we replicated previous findings that there are three different categories of preferences: physical attractiveness, interpersonal warmth, and social status. In the short-term context, physical attractiveness occupied two parts of the structure, social status dropped out, and interpersonal warmth remained. Across short- and long-term contexts, there were slight changes in what defined the shared dimensions (i.e., physical attractiveness and interpersonal warmth), suggesting prior work that applies the same inventory to each context might be flawed. We also replicated sex differences and similarities in mate preferences and correlates with sociosexuality and mate value. We adopt an evolutionary paradigm to understand our results.

  14. A Probabilistic Short-Term Water Demand Forecasting Model Based on the Markov Chain

    Directory of Open Access Journals (Sweden)

    Francesca Gagliardi

    2017-07-01

    Full Text Available This paper proposes a short-term water demand forecasting method based on the use of the Markov chain. This method provides estimates of future demands by calculating probabilities that the future demand value will fall within pre-assigned intervals covering the expected total variability. More specifically, two models based on homogeneous and non-homogeneous Markov chains were developed and presented. These models, together with two benchmark models (based on artificial neural network and naïve methods), were applied to three real-life case studies for the purpose of forecasting the respective water demands from 1 to 24 h ahead. The results obtained show that the model based on a homogeneous Markov chain provides more accurate short-term forecasts than the one based on a non-homogeneous Markov chain, which is in line with the artificial neural network model. Both Markov chain models enable probabilistic information regarding the stochastic demand forecast to be easily obtained.
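
    A minimal sketch of the homogeneous Markov-chain idea described above, under the assumption that demand is discretized into pre-assigned intervals and a transition matrix is estimated from historical counts; the toy series, bin edges and horizon are illustrative, not the case-study data.

    ```python
    import numpy as np

    def fit_transition_matrix(series, bins):
        """Estimate a homogeneous Markov transition matrix from a demand series
        discretized into pre-assigned intervals (bins)."""
        states = np.digitize(series, bins)              # interval index of each observation
        n = len(bins) + 1
        counts = np.zeros((n, n))
        for s, s_next in zip(states[:-1], states[1:]):
            counts[s, s_next] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        row_sums[row_sums == 0] = 1.0                   # leave unvisited states as zero rows
        return counts / row_sums

    def forecast_distribution(P, current_state, steps):
        """Probability of the demand falling in each interval `steps` hours ahead."""
        p = np.zeros(P.shape[0])
        p[current_state] = 1.0
        return p @ np.linalg.matrix_power(P, steps)

    # Illustrative hourly demand series and interval edges (units arbitrary).
    demand = np.array([12.1, 13.4, 15.2, 14.8, 13.9, 12.5, 11.8, 13.0, 14.6, 15.9])
    edges = np.array([12.0, 14.0, 16.0])

    P = fit_transition_matrix(demand, edges)
    print(forecast_distribution(P, current_state=np.digitize(demand[-1], edges), steps=24))
    ```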

  15. Short-term energy outlook, annual supplement 1994

    International Nuclear Information System (INIS)

    1994-08-01

    The Short-Term Energy Outlook Annual Supplement (Supplement) is published once a year as a complement to the Short-Term Energy Outlook (Outlook), Quarterly Projections. The purpose of the Supplement is to review the accuracy of the forecasts published in the Outlook, make comparisons with other independent energy forecasts, and examine current energy topics that affect the forecasts

  16. Short-term energy outlook annual supplement, 1993

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-08-06

    The Short-Term Energy Outlook Annual Supplement (supplement) is published once a year as a complement to the Short-Term Energy Outlook (Outlook), Quarterly Projections. The purpose of the Supplement is to review the accuracy of the forecasts published in the Outlook, make comparisons with other independent energy forecasts, and examine current energy topics that affect the forecasts.

  17. Geosphere Stability for long-term isolation of radioactive waste. Case study for hydrological change with earthquakes and faulting

    International Nuclear Information System (INIS)

    Niwa, Masakazu

    2016-01-01

    Appropriate estimation and safety assessment of long-term changes in the geological environment are essential to improving the reliability of geological disposal. In particular, the study of faults is important both for understanding regional groundwater flow and for assessing faults as triggers of future earthquakes. Here, the possibility of earthquake-induced changes in the permeability of faulted materials was examined based on monitoring data of groundwater pressure before and after the 2011 off the Pacific coast of Tohoku Earthquake. (author)

  18. In Search of Decay in Verbal Short-Term Memory

    Science.gov (United States)

    Berman, Marc G.; Jonides, John; Lewis, Richard L.

    2009-01-01

    Is forgetting in the short term due to decay with the mere passage of time, interference from other memoranda, or both? Past research on short-term memory has revealed some evidence for decay and a plethora of evidence showing that short-term memory is worsened by interference. However, none of these studies has directly contrasted decay and…

  19. Attention Problems, Phonological Short-Term Memory, and Visuospatial Short-Term Memory: Differential Effects on Near- and Long-Term Scholastic Achievement

    Science.gov (United States)

    Sarver, Dustin E.; Rapport, Mark D.; Kofler, Michael J.; Scanlan, Sean W.; Raiker, Joseph S.; Altro, Thomas A.; Bolden, Jennifer

    2012-01-01

    The current study examined individual differences in children's phonological and visuospatial short-term memory as potential mediators of the relationship among attention problems and near- and long-term scholastic achievement. Nested structural equation models revealed that teacher-reported attention problems were associated negatively with…

  20. Developing an Empirical Model for Estimating the Probability of Electrical Short Circuits from Tin Whiskers. Part 2

    Science.gov (United States)

    Courey, Karim J.; Asfour, Shihab S.; Onar, Arzu; Bayliss, Jon A.; Ludwig, Larry L.; Wright, Maria C.

    2009-01-01

    To comply with lead-free legislation, many manufacturers have converted from tin-lead to pure tin finishes of electronic components. However, pure tin finishes have a greater propensity to grow tin whiskers than tin-lead finishes. Since tin whiskers present an electrical short circuit hazard in electronic components, simulations have been developed to quantify the risk of said short circuits occurring. Existing risk simulations make the assumption that when a free tin whisker has bridged two adjacent exposed electrical conductors, the result is an electrical short circuit. This conservative assumption is made because shorting is a random event that has an unknown probability associated with it. Note, however, that due to contact resistance, electrical shorts may not occur at lower voltage levels. In our first article we developed an empirical probability model for tin whisker shorting. In this paper, we develop a more comprehensive empirical model using a refined experiment with a larger sample size, in which we studied the effect of varying voltage on the breakdown of the contact resistance which leads to a short circuit. From the resulting data we estimated the probability distribution of an electrical short, as a function of voltage. In addition, the unexpected polycrystalline structure seen in the focused ion beam (FIB) cross section in the first experiment was confirmed in this experiment using transmission electron microscopy (TEM). The FIB was also used to cross section two card guides to facilitate the measurement of the grain size of each card guide's tin plating to determine its finish.
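
    The final step described above, estimating the probability of an electrical short as a function of voltage, can be illustrated with a simple Bernoulli maximum-likelihood fit of a logistic curve; the logistic form and the synthetic trial data are assumptions for illustration only, not the authors' model or measurements.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Synthetic whisker-bridging trials (illustrative only): applied voltage and
    # whether the contact resistance broke down into an electrical short.
    voltage = np.array([1, 2, 3, 4, 5, 6, 8, 10, 12, 15], dtype=float)
    shorted = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1], dtype=float)

    def neg_log_likelihood(theta):
        """Bernoulli likelihood of a logistic P(short | voltage) curve."""
        v50, k = theta
        p = 1.0 / (1.0 + np.exp(-k * (voltage - v50)))
        p = np.clip(p, 1e-9, 1 - 1e-9)
        return -np.sum(shorted * np.log(p) + (1 - shorted) * np.log(1 - p))

    fit = minimize(neg_log_likelihood, x0=[6.0, 1.0], method="Nelder-Mead")
    v50, k = fit.x
    print(f"estimated 50% breakdown voltage: {v50:.2f} V")
    ```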

  1. An overview of the National Earthquake Information Center acquisition software system, Edge/Continuous Waveform Buffer

    Science.gov (United States)

    Patton, John M.; Ketchum, David C.; Guy, Michelle R.

    2015-11-02

    This document provides an overview of the capabilities, design, and use cases of the data acquisition and archiving subsystem at the U.S. Geological Survey National Earthquake Information Center. The Edge and Continuous Waveform Buffer software supports the National Earthquake Information Center’s worldwide earthquake monitoring mission in direct station data acquisition, data import, short- and long-term data archiving, data distribution, query services, and playback, among other capabilities. The software design and architecture can be configured to support acquisition and (or) archiving use cases. The software continues to be developed in order to expand the acquisition, storage, and distribution capabilities.

  2. Short-Term Reciprocity in Late Parent-Child Relationships

    Science.gov (United States)

    Leopold, Thomas; Raab, Marcel

    2011-01-01

    Long-term concepts of parent-child reciprocity assume that the amount of support given and received is only balanced in a generalized fashion over the life course. We argue that reciprocity in parent-child relationships also operates in the short term. Our analysis of short-term reciprocity focuses on concurrent exchange in its main upward and…

  3. Relaxation creep model of impending earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Morgounov, V. A. [Russian Academy of Sciences, Institute of Physics of the Earth, Moscow (Russian Federation)

    2001-04-01

    An alternative view of the current status and prospects of seismic prediction studies is discussed. Within the problem of ascertaining the uncertainty relation between the cognoscibility and unpredictability of earthquakes, priority is given to work on short-term earthquake prediction, because the final stage of earthquake nucleation has the advantage of being characterized by a substantial activation of the process: its strain rate increases by orders of magnitude and the signal-to-noise ratio rises considerably. Based on the creep phenomenon under stress-relaxation conditions, a model is proposed to explain the different patterns of precursors of impending tectonic earthquakes. The onset of tertiary creep appears to correspond to the onset of instability, which inevitably ends in failure unless the system is unloaded. At this stage the process acquires, to the greatest extent, a self-regulating character and the property of irreversibility, one of the important components of prediction reliability. In situ data, which suggest that it is possible in principle to diagnose the preparation process by ground measurements of acoustic and electromagnetic emission in rocks held under constant strain and self-relaxing stress up to the moment of fracture, are discussed in this context. It was found that electromagnetic emission precedes, but does not accompany, the phase of macrocrack development.

  4. Implementing a short-term loyalty program : case: Bosch Lawn & Garden and the Ventum short-term loyalty program

    OpenAIRE

    Logvinova, Veronika

    2015-01-01

    In 2015, one of the Bosch Home and Garden divisions, Bosch Lawn and Garden, made a strategic decision to adopt a points-based short-term loyalty program called Ventum LG in German supermarkets and petrol stations. It was decided that this program would be based on the completed Ventum PT short-term loyalty program, which was managed by another division, Bosch Power Tools, and proved to be successful. This thesis aims to evaluate the worthiness of the Ventum LG loyalty program for Bosch L...

  5. Decay uncovered in nonverbal short-term memory.

    Science.gov (United States)

    Mercer, Tom; McKeown, Denis

    2014-02-01

    Decay theory posits that memory traces gradually fade away over the passage of time unless they are actively rehearsed. Much recent work exploring verbal short-term memory has challenged this theory, but there does appear to be evidence for trace decay in nonverbal auditory short-term memory. Numerous discrimination studies have reported a performance decline as the interval separating two tones is increased, consistent with a decay process. However, most of this tone comparison research can be explained in other ways, without reference to decay, and these alternative accounts were tested in the present study. In Experiment 1, signals were employed toward the end of extended retention intervals to ensure that listeners were alert to the presence and frequency content of the memoranda. In Experiment 2, a mask stimulus was employed in an attempt to distinguish between a highly detailed sensory trace and a longer-lasting short-term memory, and the distinctiveness of the stimuli was varied. Despite these precautions, slow-acting trace decay was observed. It therefore appears that the mere passage of time can lead to forgetting in some forms of short-term memory.

  6. Short-term energy outlook annual supplement, 1993

    International Nuclear Information System (INIS)

    1993-01-01

    The Energy Information Administration (EIA) prepares quarterly, short-term energy supply, demand, and price projections for publication in February, May, August, and November in the Short-Term Energy Outlook (Outlook). An annual supplement analyzes the performance of previous forecasts, compares recent cases with those of other forecasting services, and discusses current topics related to the short-term energy markets. (See Short-Term Energy Outlook Annual Supplement, DOE/EIA-0202.) The forecast period for this issue of the Outlook extends from the third quarter of 1993 through the fourth quarter of 1994. Values for the second quarter of 1993, however, are preliminary EIA estimates (for example, some monthly values for petroleum supply and disposition are derived in part from weekly data reported in the Weekly Petroleum Status Report) or are calculated from model simulations using the latest exogenous information available (for example, electricity sales and generation are simulated using actual weather data). The historical energy data are EIA data published in the Monthly Energy Review, Petroleum Supply Monthly, and other EIA publications. Minor discrepancies between the data in these publications and the historical data in this Outlook are due to independent rounding

  7. Use of "Crowd-Sourcing" and other collaborations to solve the short-term, earthquake forecasting problem

    Science.gov (United States)

    Bleier, T.; Heraud, J. A.; Dunson, J. C.

    2015-12-01

    QuakeFinder (QF) and its international collaborators have installed and currently maintain 165 three-axis induction magnetometer instrument sites in California, Peru, Taiwan, Greece, Chile and Sumatra. The data from these instruments are being analyzed for pre-quake signatures. This analysis consists of both private research by QuakeFinder and research by institutional collaborators (PUCP in Peru, NCU in Taiwan, PUCC in Chile, NOA in Greece, Syiah Kuala University in Indonesia, LASP at U of Colo., Stanford, and USGS). Recently, NASA Hq and QuakeFinder tried a new approach to help with the analysis of this huge (50+TB) data archive. A collaboration with Apirio/TopCoder, Harvard University, Amazon, QuakeFinder, and NASA Hq. resulted in an open algorithm development contest called "Quest for Quakes" in which contestants (freelance algorithm developers) attempted to identify quakes from a subset of the QuakeFinder data (3TB). The contest included a $25K prize pool, and contained 100 cases where earthquakes (and null sets) included data from up to 5 remote sites, near and far from quakes greater than M4. These data sets were made available through Amazon.com to hundreds of contestants over a two-week contest period. In a more traditional approach, several new algorithms were tried by actively sharing the QF data with universities over a longer period. These algorithms included Principal Component Analysis (PCA) and deep neural networks in an effort to automatically identify earthquake signals within typical, noise-filled environments. This presentation examines the pros and cons of employing these two approaches, from both logistical and scientific perspectives.

  8. Long-term impact of earthquake stress on fasting glucose control and diabetes prevalence among Chinese adults of Tangshan

    OpenAIRE

    An, Cuixia; Zhang, Yun; Yu, Lulu; Li, Na; Song, Mei; Wang, Lan; Zhao, Xiaochuan; Gao, Yuanyuan; Wang, Xueyi

    2014-01-01

    Objective: To investigate the long-term influence of stresses from the 1976 Tangshan earthquake on blood glucose control and the incidence of diabetes mellitus in Chinese people of Tangshan. Methods: 1,551 adults ≥ 37 years of age were recruited for this investigation in Tangshan city of China, where one of the deadliest earthquakes occurred in 1976. All subjects completed a questionnaire. 1,030 of them who experienced that earthquake were selected into the exposure group, while 521 were gathe...

  9. Near-real time 3D probabilistic earthquakes locations at Mt. Etna volcano

    Science.gov (United States)

    Barberi, G.; D'Agostino, M.; Mostaccio, A.; Patane', D.; Tuve', T.

    2012-04-01

    Automatic procedures for locating earthquakes in quasi-real time must provide a good estimate of earthquake locations within a few seconds after the event is first detected, and are strongly needed for seismic warning systems. The reliability of an automatic location algorithm is influenced by several factors such as errors in picking seismic phases, network geometry, and velocity model uncertainties. On Mt. Etna, the seismic network is managed by INGV, and the quasi-real-time earthquake locations are performed by using an automatic picking algorithm based on short-term-average to long-term-average ratios (STA/LTA) calculated from an approximate squared envelope function of the seismogram, which furnishes a list of P-wave arrival times, and the location algorithm Hypoellipse with a 1D velocity model. The main purpose of this work is to investigate the performance of a different automatic procedure to improve the quasi-real-time earthquake locations. In fact, as the automatic data processing may be affected by outliers (wrong picks), the use of traditional earthquake location techniques based on a least-squares misfit function (L2 norm) often yields unstable and unreliable solutions. Moreover, on Mt. Etna, the 1D model is often unable to represent the complex structure of the volcano (in particular the strong lateral heterogeneities), whereas the increasing accuracy of the 3D velocity models at Mt. Etna during recent years allows their use today in routine earthquake locations. Therefore, we selected, as reference locations, all the events that occurred on Mt. Etna in the last year (2011), which were automatically detected and located by means of the Hypoellipse code. By using this dataset (more than 300 events), we applied a nonlinear probabilistic earthquake location algorithm using the Equal Differential Time (EDT) likelihood function (Font et al., 2004; Lomax, 2005), which is much more robust in the presence of outliers in the data. Subsequently, by using a probabilistic
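
    A minimal sketch of the STA/LTA trigger mentioned above, computed on an approximate squared envelope with causal (trailing) windows; window lengths, the detection threshold and the synthetic trace are illustrative assumptions, not the INGV configuration.

    ```python
    import numpy as np

    def sta_lta(trace, fs, sta_win=0.5, lta_win=10.0):
        """Classic STA/LTA detector on an approximate squared envelope,
        using causal (trailing) moving averages. Returns the ratio trace."""
        env = np.asarray(trace, dtype=float) ** 2
        n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
        sta = np.convolve(env, np.ones(n_sta) / n_sta, mode="full")[: env.size]
        lta = np.convolve(env, np.ones(n_lta) / n_lta, mode="full")[: env.size]
        return sta / np.maximum(lta, 1e-12)

    # Synthetic trace: background noise with a stronger arrival at t = 30 s (illustrative).
    fs = 100.0
    t = np.arange(0.0, 60.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    trace = rng.normal(0.0, 1.0, t.size)
    trace[int(30 * fs):] += 5.0 * np.sin(2.0 * np.pi * 5.0 * t[int(30 * fs):])

    ratio = sta_lta(trace, fs)
    valid = np.arange(trace.size) > int(10.0 * fs)      # skip the LTA warm-up window
    picks = np.where((ratio > 4.0) & valid)[0]
    print(f"first trigger at ~{picks[0] / fs:.2f} s" if picks.size else "no trigger")
    ```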

  10. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3‐ETAS): Toward an operational earthquake forecast

    Science.gov (United States)

    Field, Edward; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximillan J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as
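
    The temporal core of the ETAS component referred to above can be written in a few lines: a background rate plus Omori-law contributions from all prior events, scaled exponentially by their magnitudes. The parameter values and toy catalog below are illustrative assumptions, not the UCERF3-ETAS calibration.

    ```python
    import numpy as np

    def etas_rate(t, catalog, mu=0.2, K=0.05, alpha=1.0, c=0.01, p=1.1, m_min=2.5):
        """Temporal ETAS conditional intensity: background rate plus Omori-law
        contributions from every prior event (illustrative parameter values)."""
        t_i, m_i = catalog[:, 0], catalog[:, 1]
        past = t_i < t
        trig = K * np.exp(alpha * (m_i[past] - m_min)) / (t - t_i[past] + c) ** p
        return mu + trig.sum()

    # Toy catalog: (origin time in days, magnitude) -- illustrative only.
    catalog = np.array([[0.0, 5.5], [0.3, 3.1], [1.2, 4.0]])
    print(f"rate 2 days after the first event: {etas_rate(2.0, catalog):.3f} events/day")
    ```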

  11. Parent-Offspring Conflict over Short-Term Mating Strategies

    Directory of Open Access Journals (Sweden)

    Spyroulla Georgiou

    2011-12-01

    Full Text Available Individuals engage in short-term mating strategies that enable them to obtain fitness benefits from casual relationships. These benefits, however, count for less and cost more to their parents. On this basis three hypotheses are tested. First, parents and offspring are likely to disagree over short-term mating strategies, with the former considering these as less acceptable than the latter. Second, parents are more likely to disapprove of the short-term mating strategies of their daughters than of their sons. Finally, mothers and fathers are expected to agree on how much they disagree over the short-term mating strategies of their children. Evidence from a sample of 148 Greek-Cypriot families (140 mothers, 105 fathers, 119 daughters, 77 sons) provides support for the first two hypotheses and partial support for the third hypothesis. The implications of these findings for understanding family dynamics are further discussed.

  12. Rethinking earthquake-related DC-ULF electromagnetic phenomena: towards a physics-based approach

    Directory of Open Access Journals (Sweden)

    Q. Huang

    2011-11-01

    Full Text Available Numerous electromagnetic changes possibly related to earthquakes have been independently reported, and attempts have even been made to apply them to the short-term prediction of earthquakes. However, there are active debates on the above issue because the seismogenic process is rather complicated and the studies have been mainly empirical (i.e. a kind of experience-based approach). Thus, a physics-based study would be helpful for understanding earthquake-related electromagnetic phenomena and strengthening their applications. As a potential physics-based approach, I present an integrated research scheme, taking into account the interaction among observation, methodology, and physical model. For simplicity, this work focuses only on the earthquake-related DC-ULF electromagnetic phenomena. The main approach includes the following key problems: (1) how to perform a reliable and appropriate observation with some clear physical quantities; (2) how to develop a robust methodology to reveal weak earthquake-related electromagnetic signals from a noisy background; and (3) how to develop plausible physical models based on theoretical analyses and/or laboratory experiments for the explanation of the earthquake-related electromagnetic signals observed under field conditions.

  13. Quantitative Earthquake Prediction on Global and Regional Scales

    International Nuclear Information System (INIS)

    Kossobokov, Vladimir G.

    2006-01-01

    for mega-earthquakes of M9.0+. The monitoring at regional scales may require application of a recently proposed scheme for the spatial stabilization of the intermediate-term middle-range predictions. The scheme guarantees a more objective and reliable diagnosis of times of increased probability and is less restrictive to input seismic data. It makes feasible the reestablishment of seismic monitoring aimed at the prediction of large-magnitude earthquakes in the Caucasus and Central Asia, which, to our regret, was discontinued in 1991. The first results of the monitoring (1986-1990) were encouraging, at least for M6.5+

  14. Quantitative Earthquake Prediction on Global and Regional Scales

    Science.gov (United States)

    Kossobokov, Vladimir G.

    2006-03-01

    for mega-earthquakes of M9.0+. The monitoring at regional scales may require application of a recently proposed scheme for the spatial stabilization of the intermediate-term middle-range predictions. The scheme guarantees a more objective and reliable diagnosis of times of increased probability and is less restrictive to input seismic data. It makes feasible the reestablishment of seismic monitoring aimed at the prediction of large-magnitude earthquakes in the Caucasus and Central Asia, which, to our regret, was discontinued in 1991. The first results of the monitoring (1986-1990) were encouraging, at least for M6.5+.

  15. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and considering an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and the respective model parameters. The first conditional probability is the probability that the seismic energy (e × 10²⁰ ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10²⁰ ergs). The second conditional probability is the probability that the seismic energy (a × 10²⁰ ergs/year) expected to be released per year exceeds a certain level of seismic energy per year (A × 10²⁰ ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models. A higher value of ln L suggests a better model and a lower value shows a worse model. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recently occurred 4 January 2016 Manipur earthquake (M 6.7), the 13 April 2016 Myanmar earthquake (M 6.9) and the 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area; this shows that the proposed techniques and models yield good forecasting accuracy.
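
    A sketch of the model-comparison step described above, fitting the four candidate distributions by maximum likelihood and comparing their log-likelihoods (ln L), here with scipy's parameterizations; the sample of released energies is an illustrative placeholder, not the catalog used in the study.

    ```python
    import numpy as np
    from scipy import stats

    # Illustrative sample of released energies (x 1e20 ergs); not the paper's data.
    energy = np.array([0.4, 0.7, 1.1, 1.6, 2.3, 3.0, 4.2, 5.5, 7.1, 9.8])

    candidates = {
        "gamma":        stats.gamma,
        "lognormal":    stats.lognorm,
        "weibull":      stats.weibull_min,
        "log-logistic": stats.fisk,
    }

    for name, dist in candidates.items():
        params = dist.fit(energy, floc=0)                # fix the location at zero
        ln_L = np.sum(dist.logpdf(energy, *params))      # higher ln L = better fit
        p_exceed = dist.sf(5.0, *params)                 # P(E > 5 x 1e20 ergs)
        print(f"{name:12s} ln L = {ln_L:7.2f}   P(E > 5) = {p_exceed:.3f}")
    ```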

  16. Automatic Event Detection and Picking of P, S Seismic Phases for Earthquake Early Warning: A Case Study of the 2008 Wenchuan Earthquake

    Science.gov (United States)

    WANG, Z.; Zhao, B.

    2015-12-01

    We develop an automatic seismic phase arrival detection and picking algorithm for impending earthquakes with diverse focal mechanisms and depths. Polarization analysis of the three-component seismograms is used to distinguish between P and S waves through a sliding time window. When applying the short-term average/long-term average (STA/LTA) method to the polarized data, we also construct a new characteristic function that sensitively reflects changes in the signals' amplitude and frequency, providing better detection of the phase arrival. Then an improved combination of higher-order statistics and the Akaike information criterion (AIC) picker is applied to the refined signal to lock on the arrival time with a higher degree of accuracy. We test our techniques on the aftershocks of the Ms8.0 Wenchuan earthquake, for which hundreds of three-component acceleration records with magnitudes of 4.0 to 6.4 are treated. In comparison to the analyst picks, the proposed detection algorithms are shown to perform well and can be applied to a single instrument within a network of stations for large seismic events in an Earthquake Early Warning System (EEWS).
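
    The AIC-picker half of the scheme described above can be sketched with the common variance-based form of the criterion (a simplification, not necessarily the authors' higher-order-statistics refinement); the synthetic trace is illustrative only.

    ```python
    import numpy as np

    def aic_picker(x):
        """Variance-based AIC picker: the onset is taken at the index that
        minimizes AIC(k) = k*log(var(x[:k])) + (N-k-1)*log(var(x[k:]))."""
        n = len(x)
        aic = np.full(n, np.inf)
        for k in range(2, n - 2):                        # avoid zero-variance edges
            aic[k] = (k * np.log(np.var(x[:k]) + 1e-12)
                      + (n - k - 1) * np.log(np.var(x[k:]) + 1e-12))
        return int(np.argmin(aic))

    # Synthetic trace: noise followed by a higher-amplitude arrival at sample 500.
    rng = np.random.default_rng(1)
    trace = np.concatenate([rng.normal(0, 1, 500), rng.normal(0, 8, 500)])
    print(f"picked onset at sample {aic_picker(trace)}")  # expected near 500
    ```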

  17. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    Science.gov (United States)

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.

  18. POST Earthquake Debris Management — AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on the national and local capacities. Debris removal is a major component of every post earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but also can subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  19. Circadian modulation of short-term memory in Drosophila.

    Science.gov (United States)

    Lyons, Lisa C; Roman, Gregg

    2009-01-01

    Endogenous biological clocks are widespread regulators of behavior and physiology, allowing for a more efficient allocation of efforts and resources over the course of a day. The extent that different processes are regulated by circadian oscillators, however, is not fully understood. We investigated the role of the circadian clock on short-term associative memory formation using a negatively reinforced olfactory-learning paradigm in Drosophila melanogaster. We found that memory formation was regulated in a circadian manner. The peak performance in short-term memory (STM) occurred during the early subjective night with a twofold performance amplitude after a single pairing of conditioned and unconditioned stimuli. This rhythm in memory is eliminated in both timeless and period mutants and is absent during constant light conditions. Circadian gating of sensory perception does not appear to underlie the rhythm in short-term memory as evidenced by the nonrhythmic shock avoidance and olfactory avoidance behaviors. Moreover, central brain oscillators appear to be responsible for the modulation as cryptochrome mutants, in which the antennal circadian oscillators are nonfunctional, demonstrate robust circadian rhythms in short-term memory. Together these data suggest that central, rather than peripheral, circadian oscillators modulate the formation of short-term associative memory and not the perception of the stimuli.

  20. The Role of Short-term Consolidation in Memory Persistence

    OpenAIRE

    Timothy J. Ricker

    2015-01-01

    Short-term memory, often described as working memory, is one of the most fundamental information processing systems of the human brain. Short-term memory function is necessary for language, spatial navigation, problem solving, and many other daily activities. Given its importance to cognitive function, understanding the architecture of short-term memory is of crucial importance to understanding human behavior. Recent work from several laboratories investigating the entry of information into s...

  1. The interaction of short-term and long-term memory in phonetic category formation

    Science.gov (United States)

    Harnsberger, James D.

    2002-05-01

    This study examined the role that short-term memory capacity plays in the relationship between novel stimuli (e.g., non-native speech sounds, native nonsense words) and phonetic categories in long-term memory. Thirty native speakers of American English were administered five tests: categorial AXB discrimination using nasal consonants from Malayalam; categorial identification, also using Malayalam nasals, which measured the influence of phonetic categories in long-term memory; digit span; nonword span, a short-term memory measure mediated by phonetic categories in long-term memory; and paired-associate word learning (word-word and word-nonword pairs). The results showed that almost all measures were significantly correlated with one another. The strongest predictors for the discrimination and word-nonword learning results were nonword span (r=+0.62) and digit span (r=+0.51), respectively. When the identification test results were partialed out, only nonword span significantly correlated with discrimination. The results show a strong influence of short-term memory capacity on the encoding of phonetic detail within phonetic categories and suggest that long-term memory representations regulate the capacity of short-term memory to preserve information for subsequent encoding. The results of this study will also be discussed with regard to resolving the tension between episodic and abstract models of phonetic category structure.

  2. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    Science.gov (United States)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and to create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. Then we use this model to evaluate the moment deficit on the subduction interface and the changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussion with local authorities.
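
    The static stress transfer mentioned above is usually summarized by the Coulomb failure stress change on a receiver fault; a minimal sketch of that bookkeeping is given below, with an assumed effective friction coefficient and illustrative stress values rather than results from the Pedernales analysis.

    ```python
    def coulomb_stress_change(delta_shear, delta_normal, mu_eff=0.4):
        """Static Coulomb failure stress change on a receiver fault:
        dCFS = d(shear stress in the slip direction) + mu_eff * d(normal stress),
        with unclamping (tension) taken as positive. Positive dCFS brings the
        receiver fault closer to failure."""
        return delta_shear + mu_eff * delta_normal

    # Illustrative numbers in MPa, not values from the Pedernales study.
    print(coulomb_stress_change(delta_shear=0.15, delta_normal=-0.05))  # 0.13 MPa
    ```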

  3. Semantic and phonological contributions to short-term repetition and long-term cued sentence recall.

    Science.gov (United States)

    Meltzer, Jed A; Rose, Nathan S; Deschamps, Tiffany; Leigh, Rosie C; Panamsky, Lilia; Silberberg, Alexandra; Madani, Noushin; Links, Kira A

    2016-02-01

    The function of verbal short-term memory is supported not only by the phonological loop, but also by semantic resources that may operate on both short and long time scales. Elucidation of the neural underpinnings of these mechanisms requires effective behavioral manipulations that can selectively engage them. We developed a novel cued sentence recall paradigm to assess the effects of two factors on sentence recall accuracy at short-term and long-term stages. Participants initially repeated auditory sentences immediately following a 14-s retention period. After this task was complete, long-term memory for each sentence was probed by a two-word recall cue. The sentences were either concrete (high imageability) or abstract (low imageability), and the initial 14-s retention period was filled with either an undemanding finger-tapping task or a more engaging articulatory suppression task (Exp. 1, counting backward by threes; Exp. 2, repeating a four-syllable nonword). Recall was always better for the concrete sentences. Articulatory suppression reduced accuracy in short-term recall, especially for abstract sentences, but the sentences initially recalled following articulatory suppression were retained better at the subsequent cued-recall test, suggesting that the engagement of semantic mechanisms for short-term retention promoted encoding of the sentence meaning into long-term memory. These results provide a basis for using sentence imageability and subsequent memory performance as probes of semantic engagement in short-term memory for sentences.

  4. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  5. Spatial organization of foreshocks as a tool to forecast large earthquakes.

    Science.gov (United States)

    Lippiello, E; Marzocchi, W; de Arcangelis, L; Godano, C

    2012-01-01

    An increase in the number of smaller-magnitude events, retrospectively named foreshocks, is often observed before large earthquakes. We show that the linear density probability of earthquakes occurring before and after small or intermediate mainshocks displays a symmetrical behavior, indicating that the size of the area fractured during the mainshock is encoded in the foreshock spatial organization. This observation can be used to discriminate spatial clustering due to foreshocks from that induced by aftershocks and is implemented in an alarm-based model to forecast m > 6 earthquakes. A retrospective study of the Southern California catalog for the last 19 years shows that the daily occurrence probability presents isolated peaks closely located in time and space to the epicenters of five of the six m > 6 earthquakes. We find daily probabilities as high as 25% (in cells of size 0.04 × 0.04 deg²), with significant probability gains with respect to standard models.

  6. Short-term and long-term sick-leave in Sweden

    DEFF Research Database (Denmark)

    Blank, N; Diderichsen, Finn

    1995-01-01

    The primary aim of the study was to analyse similarities and differences between repeated spells of short-term sick-leave (more than 3 spells of less than 7 days' duration in a 12-month period) and long-term absence through sickness (at least 1 spell of more than 59 days' duration in a 12-month p...

  7. Sleep deprivation accelerates delay-related loss of visual short-term memories without affecting precision.

    Science.gov (United States)

    Wee, Natalie; Asplund, Christopher L; Chee, Michael W L

    2013-06-01

    Visual short-term memory (VSTM) is an important measure of information processing capacity and supports many higher-order cognitive processes. We examined how sleep deprivation (SD) and maintenance duration interact to influence the number and precision of items in VSTM using an experimental design that limits the contribution of lapses at encoding. For each trial, participants attempted to maintain the location and color of three stimuli over a delay. After a retention interval of either 1 or 10 seconds, participants reported the color of the item at the cued location by selecting it on a color wheel. The probability of reporting the probed item, the precision of report, and the probability of reporting a nonprobed item were determined using a mixture-modeling analysis. Participants were studied twice in counterbalanced order, once after a night of normal sleep and once following a night of sleep deprivation. Sleep laboratory. Nineteen healthy college age volunteers (seven females) with regular sleep patterns. Approximately 24 hours of total SD. SD selectively reduced the number of integrated representations that can be retrieved after a delay, while leaving the precision of object information in the stored representations intact. Delay interacted with SD to lower the rate of successful recall. Visual short-term memory is compromised during sleep deprivation, an effect compounded by delay. However, when memories are retrieved, they tend to be intact.
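
    The mixture-modeling analysis mentioned above can be sketched in its simplest two-component form (in-memory reports distributed as a von Mises around the true color, plus uniform guessing); the full analysis in the abstract also includes a non-target report component. The synthetic errors and starting values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.stats import vonmises
    from scipy.optimize import minimize

    def neg_log_likelihood(theta, errors):
        """Two-component mixture over report errors (radians): in-memory reports
        (von Mises centered on the true color) plus uniform guessing."""
        p_mem, kappa = theta
        like = p_mem * vonmises.pdf(errors, kappa) + (1.0 - p_mem) / (2.0 * np.pi)
        return -np.sum(np.log(np.clip(like, 1e-12, None)))

    # Synthetic report errors: ~70% drawn from memory, ~30% guesses (illustrative).
    rng = np.random.default_rng(3)
    n = 300
    from_memory = rng.random(n) < 0.7
    errors = np.where(from_memory,
                      rng.vonmises(0.0, 8.0, n),
                      rng.uniform(-np.pi, np.pi, n))

    fit = minimize(neg_log_likelihood, x0=[0.5, 4.0], args=(errors,),
                   bounds=[(0.01, 0.99), (0.5, 100.0)], method="L-BFGS-B")
    p_mem, kappa = fit.x
    print(f"P(in memory) ~ {p_mem:.2f}, precision (kappa) ~ {kappa:.1f}")
    ```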

  8. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf

    2017-01-13

    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations for earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding also provides important information for seismic hazard analysis.

  9. Future probabilities of coastal floods in Finland

    Science.gov (United States)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.
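
    A minimal sketch of how short-term variations and a long-term mean sea level scenario can be combined into a single exceedance probability, assuming the two components are independent and simply additive; the distributions, parameter values and threshold below are illustrative assumptions, not the Finnish tide-gauge results.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Illustrative inputs (cm relative to present mean sea level), not the Finnish data:
    # short-term variation sampled from a tide-gauge-like annual-maximum distribution,
    # long-term mean sea level change in 2100 sampled from a scenario distribution
    # (net of land uplift).
    short_term = rng.gumbel(loc=60.0, scale=25.0, size=100_000)
    mean_sea_level_2100 = rng.normal(loc=20.0, scale=15.0, size=100_000)

    # Combine the two independent components and read off an exceedance probability.
    total_level = short_term + mean_sea_level_2100
    threshold = 170.0                                    # assumed planning level, cm
    p_flood = (total_level > threshold).mean()
    print(f"P(annual max > {threshold:.0f} cm in 2100) ~ {p_flood:.3f}")
    ```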

  10. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments

  11. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Türker, Tuğba, E-mail: tturker@ktu.edu.tr [Karadeniz Technical University, Department of Geophysics, Trabzon/Turkey (Turkey); Bayrak, Yusuf, E-mail: ybayrak@agri.edu.tr [Ağrı İbrahim Çeçen University, Ağrı/Turkey (Turkey)

    2016-04-18

    The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and is located in one of the regions of highest seismic activity. Very large earthquakes have been observed in the NAFZ from the past to the present. The aim of this study is to estimate the important parameters of the Gutenberg-Richter relationship (a and b values) and, taking these parameters into account, to examine the earthquakes that occurred between 1900 and 2015 in 10 different seismic source regions of the NAFZ. The occurrence probabilities and return periods of earthquakes in the fault zone in the coming years are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. The largest earthquakes in Region 2 were observed only in the historical period; no large earthquake has been observed in this region in the instrumental period. Two historical earthquakes (1766, M_S=7.3 and 1897, M_S=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 different seismic source regions, the cumulative number-magnitude relationships were determined and the a and b parameters estimated with the Gutenberg-Richter equation logN=a-bM. A homogeneous earthquake catalog of M_S ≥ 4.0 events is used for the time period between 1900 and 2015. The catalog database used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI). The earthquake data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 different seismic source regions. The highest earthquake occurrence probability among the 10 seismic source regions in the coming years is estimated for the Tokat-Erzincan region (Region 9), at 99%
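
    A compact sketch of the two steps described above: fitting logN=a-bM to cumulative counts and converting the resulting rate into a Poisson occurrence probability for a chosen time window; the synthetic magnitude sample and catalog length are illustrative assumptions, not the NAFZ data.

    ```python
    import numpy as np

    def gutenberg_richter_fit(magnitudes, m_min=4.0, dm=0.1):
        """Least-squares fit of log10 N = a - b*M to cumulative event counts."""
        mags = np.arange(m_min, magnitudes.max(), dm)
        cum_n = np.array([(magnitudes >= m).sum() for m in mags])
        valid = cum_n > 0
        slope, intercept = np.polyfit(mags[valid], np.log10(cum_n[valid]), 1)
        return intercept, -slope                         # a, b

    def poisson_occurrence_probability(a, b, m, years, catalog_years):
        """P(at least one M >= m event within `years`), assuming Poisson arrivals
        with an annual rate taken from the Gutenberg-Richter relation."""
        annual_rate = 10.0 ** (a - b * m) / catalog_years
        return 1.0 - np.exp(-annual_rate * years)

    # Illustrative magnitude sample for one source region (not the NAFZ catalog):
    # an exponential magnitude distribution above M 4.0 corresponds to b ~ 1.
    rng = np.random.default_rng(5)
    mags = 4.0 + rng.exponential(scale=1.0 / np.log(10.0), size=400)

    a, b = gutenberg_richter_fit(mags)
    print(f"a = {a:.2f}, b = {b:.2f}")
    p = poisson_occurrence_probability(a, b, m=7.0, years=100.0, catalog_years=115.0)
    print(f"P(M >= 7 within 100 yr) ~ {p:.2f}")
    ```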

  12. A Poisson method application to the assessment of the earthquake hazard in the North Anatolian Fault Zone, Turkey

    International Nuclear Information System (INIS)

    Türker, Tuğba; Bayrak, Yusuf

    2016-01-01

    The North Anatolian Fault Zone (NAFZ) is one of the most important strike-slip fault zones in the world and lies in a region of very high seismic activity; it has produced very large earthquakes from the past to the present. The aims of this study are to estimate the key parameters of the Gutenberg-Richter relationship (the a and b values) and, taking these parameters into account, to examine the earthquakes that occurred between 1900 and 2015 in 10 different seismic source regions of the NAFZ. The occurrence probabilities and return periods of future earthquakes in the fault zone are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 (the Marmara Region), where a large earthquake is expected in the coming years, experienced its largest earthquakes only in the historical period, and no large earthquake has been observed there in the instrumental period; two historical earthquakes (1766, M_S=7.3 and 1897, M_S=7.0) are therefore included for this region. For the 10 seismic source regions, the a and b parameters are estimated from the cumulative number-magnitude relationship LogN=a-bM of Gutenberg-Richter. A homogeneous earthquake catalog of M_S magnitudes equal to or larger than 4.0 is used for the time period between 1900 and 2015. The catalog was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI): the earthquake data were obtained from KOERI and the ISC for 1900 to 1974, and from KOERI for 1974 to 2015. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 seismic source regions. The highest occurrence probability in the coming years is estimated for the Tokat-Erzincan region (Region 9), at 99% with an earthquake

  13. Very short-term probabilistic forecasting of wind power with generalized logit-Normal distributions

    DEFF Research Database (Denmark)

    Pinson, Pierre

    2012-01-01

    Very-short-term probabilistic forecasts, which are essential for an optimal management of wind generation, ought to account for the non-linear and double-bounded nature of that stochastic process. They take here the form of discrete-continuous mixtures of generalized logit-normal distributions and probability masses at the bounds. Both auto-regressive and conditional parametric auto-regressive models are considered for the dynamics of their location and scale parameters. Estimation is performed in a recursive least squares framework with exponential forgetting. The superiority of this proposal over...

  14. Short-term Consumer Benefits of Dynamic Pricing

    OpenAIRE

    Dupont, Benjamin; De Jonghe, Cedric; Kessels, Kris; Belmans, Ronnie

    2011-01-01

    Consumer benefits of dynamic pricing depend on a variety of factors. Consumer characteristics and climatic circumstances differ widely, which forces a regional comparison. This paper presents a general overview of demand response programs and focuses on the short-term benefits of dynamic pricing for an average Flemish residential consumer. It presents a methodology to develop a cost-reflective dynamic pricing program and to estimate short-term bill savings. Participating in a dynamic pricing p...

  15. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
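
    A minimal sketch of the kind of analysis summarized here: the maximum-likelihood (Aki/Utsu) b value is computed separately in bins of tidal shear stress, so that a systematic decrease of b with increasing stress would show up across the bins. The catalog and stress values are synthetic, and the function names are ours, not the authors'.

```python
# Sketch: estimate the Gutenberg-Richter b value in bins of tidal shear stress,
# the kind of analysis summarized in the record. Magnitudes and stresses are
# synthetic; a real analysis would use a declustered catalog and computed tides.
import numpy as np

def b_value(mags, m_c, dm=0.1):
    """Aki/Utsu maximum-likelihood b value for magnitudes >= m_c binned at dm."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - (m_c - dm / 2.0))

rng = np.random.default_rng(42)
n = 20_000
tidal_stress = rng.uniform(-1.0, 1.0, n)            # kPa, synthetic
# Synthetic catalog in which b decreases with increasing tidal stress.
b_true = 1.0 - 0.15 * tidal_stress
mags = 2.0 + rng.exponential(1.0 / (b_true * np.log(10)))

edges = np.linspace(-1.0, 1.0, 5)
for lo, hi in zip(edges[:-1], edges[1:]):
    sel = (tidal_stress >= lo) & (tidal_stress < hi)
    print(f"tidal stress [{lo:+.2f}, {hi:+.2f}) kPa: b = {b_value(mags[sel], 2.0):.2f}")
```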

  16. Musical and Verbal Memory in Alzheimer's Disease: A Study of Long-Term and Short-Term Memory

    Science.gov (United States)

    Menard, Marie-Claude; Belleville, Sylvie

    2009-01-01

    Musical memory was tested in Alzheimer patients and in healthy older adults using long-term and short-term memory tasks. Long-term memory (LTM) was tested with a recognition procedure using unfamiliar melodies. Short-term memory (STM) was evaluated with same/different judgment tasks on short series of notes. Musical memory was compared to verbal…

  17. A least squares approach for efficient and reliable short-term versus long-term optimization

    DEFF Research Database (Denmark)

    Christiansen, Lasse Hjuler; Capolei, Andrea; Jørgensen, John Bagterp

    2017-01-01

    The uncertainties related to long-term forecasts of oil prices impose significant financial risk on ventures of oil production. To minimize risk, oil companies are inclined to maximize profit over short-term horizons ranging from months to a few years. In contrast, conventional production optimization maximizes long-term profits over horizons that span more than a decade. To address this challenge, the oil literature has introduced short-term versus long-term optimization. Ideally, this problem is solved by a posteriori multi-objective optimization methods that generate an approximation of the Pareto front [...] the balance between the objectives, leaving an unfulfilled potential to increase profits. To promote efficient and reliable short-term versus long-term optimization, this paper introduces a natural way to characterize desirable Pareto points and proposes a novel least squares (LS) method. Unlike hierarchical...

  18. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    Science.gov (United States)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 event in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record through a wide range of processes that depend on both the environment and the earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify the sediments they remobilized. For the 2011 Mw9 Japan earthquake they document the spatial extent of remobilized sediment from water depths of 626 m on the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry, and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs, and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and in the trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m) but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents, and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both the sediment failures and tectonics. Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, and in fault basins including the

  19. Is earthquake rate in south Iceland modified by seasonal loading?

    Science.gov (United States)

    Jonsson, S.; Aoki, Y.; Drouin, V.

    2017-12-01

    Several temporally varying processes have the potential to modify the rate of earthquakes in the south Iceland seismic zone, one of the two most active seismic zones in Iceland. These include solid earth tides, seasonal meteorological effects and the influence of passing weather systems, and variations in snow and glacier loads. In this study we investigate the influence these processes may have on crustal stresses and stressing rates in the seismic zone and assess whether they appear to influence the earthquake rate. While historical earthquakes in south Iceland have preferentially occurred in early summer, this tendency is less clear for small earthquakes. The local earthquake catalogue (going back to 1991, with a low magnitude of completeness) is dominated by the aftershock sequences of the two M6+ earthquakes, which occurred in June 2000 and May 2008. Standard Reasenberg earthquake declustering or more involved model-independent stochastic declustering algorithms are not capable of fully eliminating the aftershocks from the catalogue. We therefore inspected the catalogue for the time period before 2000, and it shows limited seasonal tendency in earthquake occurrence. Our preliminary results show no clear correlation between earthquake rates and short-term stressing variations induced by solid earth tides or passing storms. Seasonal meteorological effects also appear to be too small to influence the earthquake activity. Snow and glacier load variations induce significant vertical motions in the area, with peak loading occurring in spring (April-May) and maximum unloading in fall (Sept.-Oct.). The early-summer occurrence of historical earthquakes therefore correlates with early unloading rather than with the peak unloading or unloading rate, which appears to indicate a limited influence of this seasonal process on the earthquake activity.

  20. Do Short-Term Managerial Objectives Lead to Under- or Over-Investment in Long-Term Projects

    OpenAIRE

    Lucian Arye Bebchuk; Lars A. Stole

    1994-01-01

    This paper studies managerial decisions about investment in long-run projects in the presence of imperfect information (the market knows less about such investments than the firm's managers) and short-term managerial objectives (the managers are concerned about the short-term stock price as well as the long-term stock price). Prior work has suggested that imperfect information and short-term managerial objectives induce managers to underinvest in long-run projects. We show that either underin...

  1. Drought analysis and short-term forecast in the Aison River Basin (Greece)

    Directory of Open Access Journals (Sweden)

    S. Kavalieratou

    2012-05-01

    A combined regional drought analysis and forecast is elaborated and applied to the Aison River Basin (Greece). The historical frequency, duration, and severity were estimated using the standardized precipitation index (SPI) computed on variable time scales, while short-term drought forecasting was investigated by means of 3-D loglinear models. A quasi-association model with homogeneous diagonal effect was proposed to fit the observed frequencies of class transitions of the SPI values computed on the 12-month time scale. Then, an adapted submodel was selected for each data set through the backward elimination method. The analysis and forecast of the drought class transition probabilities were based on the odds of the expected frequencies, estimated by these submodels, and the respective confidence intervals of these odds. The parsimonious forecast models fitted the observed data adequately. Results gave a comprehensive insight into drought behavior, highlighting a dominant drought period (1988–1991) with extreme drought events and revealing, in most cases, smooth drought class transitions. The proposed approach can be an efficient tool in regional water resources management and short-term drought warning, especially in irrigated districts.
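
    As a rough illustration of the SPI underlying the analysis above, the sketch below aggregates monthly precipitation over a 12-month window, fits a gamma distribution, and maps the resulting quantiles to a standard normal. The precipitation record is synthetic, and the simplifications (for example, ignoring zero-precipitation months) are ours.

```python
# Sketch: compute a 12-month Standardized Precipitation Index (SPI) by fitting a
# gamma distribution to aggregated precipitation and mapping quantiles to a
# standard normal, as in the drought analysis described above. Monthly totals are
# synthetic, and the handling of zero-precipitation months is omitted for brevity.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
monthly_precip = rng.gamma(shape=2.0, scale=40.0, size=45 * 12)   # mm, synthetic record

def spi(precip, scale=12):
    """SPI series for a given aggregation scale (in months)."""
    aggregated = np.convolve(precip, np.ones(scale), mode="valid")  # rolling sums
    shape, loc, scl = stats.gamma.fit(aggregated, floc=0.0)         # fit gamma (loc fixed at 0)
    cdf = stats.gamma.cdf(aggregated, shape, loc=loc, scale=scl)
    return stats.norm.ppf(cdf)                                      # equiprobability transform

spi12 = spi(monthly_precip, scale=12)
print("fraction of months in moderate-or-worse drought (SPI <= -1):",
      np.mean(spi12 <= -1.0).round(3))
```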

  2. Aftershock Characteristics as a Means of Discriminating Explosions from Earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ford, S R; Walter, W R

    2009-05-20

    The behavior of aftershock sequences around the Nevada Test Site in the southern Great Basin is characterized as a potential discriminant between explosions and earthquakes. The aftershock model designed by Reasenberg and Jones (1989, 1994) allows for a probabilistic statement of earthquake-like aftershock behavior at any time after the mainshock. We use this model to define two types of aftershock discriminants. The first defines M{sub X}, or the minimum magnitude of an aftershock expected within a given duration after the mainshock with probability X. Of the 67 earthquakes with M > 4 in the study region, 63 produce an aftershock greater than M{sub 99} within the first seven days after the mainshock. This is contrasted with only six of the 93 explosions with M > 4 that produce an aftershock greater than M{sub 99} for the same period. If the aftershock magnitude threshold is lowered and the M{sub 90} criterion is used, then no explosions produce an aftershock greater than M{sub 90} for durations that end more than 17 days after the mainshock. The other discriminant defines N{sub X}, or the minimum cumulative number of aftershocks expected for a given time after the mainshock with probability X. Similar to the aftershock magnitude discriminant, five earthquakes do not produce more aftershocks than N{sub 99} within 7 days after the mainshock. However, within the same period all but one explosion produce fewer aftershocks than N{sub 99}. One explosion is added if the duration is shortened to two days after the mainshock. The cumulative-number aftershock discriminant is more reliable, especially at short durations, but requires a low magnitude of completeness for the given earthquake catalog. These results at NTS are quite promising and should be evaluated at other nuclear test sites to understand the effects of differences in the geologic setting and nuclear testing practices on its performance.
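
    The M{sub X} and N{sub X} discriminants rest on the Reasenberg and Jones (1989) aftershock rate model, in which the rate of aftershocks of magnitude at least M at time t after a mainshock of magnitude Mm is 10^(a + b(Mm - M)) (t + c)^(-p). The sketch below integrates that rate to get the probability of at least one aftershock in a time window; the generic parameter values are assumptions, not the NTS calibration used in the study.

```python
# Sketch: probability of at least one aftershock of magnitude >= M within a time
# window after a mainshock, using the Reasenberg & Jones (1989) rate model that
# underlies the M_X / N_X discriminants in the record. The generic parameter
# values below are illustrative assumptions, not the values calibrated for NTS.
import numpy as np
from scipy.integrate import quad

A, B, C, P = -1.67, 0.91, 0.05, 1.08      # generic California-like parameters (assumed)

def aftershock_rate(t_days, m, m_mainshock):
    """Expected rate (events/day) of aftershocks with magnitude >= m at time t."""
    return 10 ** (A + B * (m_mainshock - m)) * (t_days + C) ** (-P)

def prob_at_least_one(m, m_mainshock, t1=0.0, t2=7.0):
    """P(>= 1 aftershock with magnitude >= m between t1 and t2 days), Poisson assumption."""
    expected, _ = quad(aftershock_rate, t1, t2, args=(m, m_mainshock))
    return 1.0 - np.exp(-expected)

for m in (3.0, 4.0, 5.0):
    print(f"P(aftershock M>={m} within 7 days of an M5.5 mainshock) "
          f"~ {prob_at_least_one(m, 5.5):.2f}")
```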

  3. Temporal distribution of earthquakes using renewal process in the Dasht-e-Bayaz region

    Science.gov (United States)

    Mousavi, Mehdi; Salehi, Masoud

    2018-01-01

    The temporal distribution of earthquakes with Mw > 6 in the Dasht-e-Bayaz region, eastern Iran, has been investigated using time-dependent models. In these models, it is assumed that the times between consecutive large earthquakes follow a certain statistical distribution. For this purpose, four time-dependent inter-event distributions, the Weibull, Gamma, Lognormal, and Brownian Passage Time (BPT) distributions, are used in this study, and the associated parameters are estimated using maximum likelihood estimation. The most suitable distribution is selected based on the log-likelihood function and the Bayesian Information Criterion. The probability of occurrence of the next large earthquake during a specified interval of time was calculated for each model. Then, the concept of conditional probability was applied to forecast the next major (Mw > 6) earthquake at the site of interest. The emphasis is on statistical methods that attempt to quantify the probability of an earthquake occurring within specified time, space, and magnitude windows. According to the results obtained, the probability of occurrence of an earthquake with Mw > 6 in the near future is significantly high.
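
    The conditional-probability step described in this record can be sketched as follows: fit candidate inter-event distributions to the observed inter-event times and evaluate P(t0 < T <= t0 + dt | T > t0). The inter-event times and fitted values below are placeholders, not the Dasht-e-Bayaz estimates, and only two of the four candidate distributions are shown.

```python
# Sketch: conditional probability that the next large (Mw > 6) earthquake occurs
# within the next dt years, given that t0 years have already elapsed since the
# last one, i.e. the renewal-model calculation described in the record. The data
# and fitted parameters below are placeholders, not the Dasht-e-Bayaz estimates.
import numpy as np
from scipy import stats

inter_event_times = np.array([31.0, 12.0, 55.0, 23.0, 40.0, 18.0, 66.0])  # yr, synthetic

# Fit two of the candidate renewal distributions (location fixed at zero).
weib = stats.weibull_min(*stats.weibull_min.fit(inter_event_times, floc=0.0))
logn = stats.lognorm(*stats.lognorm.fit(inter_event_times, floc=0.0))

def conditional_probability(dist, t0, dt):
    """P(t0 < T <= t0 + dt | T > t0) for inter-event time distribution `dist`."""
    return (dist.cdf(t0 + dt) - dist.cdf(t0)) / dist.sf(t0)

t0, dt = 20.0, 10.0   # 20 years elapsed, forecast window of 10 years
print("Weibull  :", round(conditional_probability(weib, t0, dt), 3))
print("Lognormal:", round(conditional_probability(logn, t0, dt), 3))
```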

  4. Sleep Quality, Short-Term and Long-Term CPAP Adherence

    Science.gov (United States)

    Somiah, Manya; Taxin, Zachary; Keating, Joseph; Mooney, Anne M.; Norman, Robert G.; Rapoport, David M.; Ayappa, Indu

    2012-01-01

    Study Objectives: Adherence to CPAP therapy is low in patients with obstructive sleep apnea/hypopnea syndrome (OSAHS). The purpose of the present study was to evaluate the utility of measures of sleep architecture and sleep continuity on the CPAP titration study as predictors of both short- and long-term CPAP adherence. Methods: 93 patients with OSAHS (RDI 42.8 ± 34.3/h) underwent in-laboratory diagnostic polysomnography, CPAP titration, and follow-up polysomnography (NPSG) on CPAP. Adherence to CPAP was objectively monitored. Short-term (ST) CPAP adherence was averaged over the 14 days immediately following the titration study. Long-term (LT) CPAP adherence was obtained in 56/93 patients after approximately 2 months of CPAP use. Patients were grouped into CPAP adherence groups (above versus below 4 h of nightly use) for both ST and LT adherence. Sleep architecture, sleep-disordered breathing (SDB) indices, and daytime outcome variables from the diagnostic and titration NPSGs were compared between CPAP adherence groups. Results: There was a significant relationship between ST and LT CPAP adherence (r = 0.81, p < 0.001). The higher CPAP adherence groups had significantly lower %N2 and greater %REM on the titration NPSG. A model combining change in sleep efficiency and change in sleep continuity between the diagnostic and titration NPSGs predicted 17% of the variance in LT adherence (p = 0.006). Conclusions: These findings demonstrate that characteristics of sleep architecture, even on the titration NPSG, may predict some of the variance in CPAP adherence. Better sleep quality on the titration night was related to better CPAP adherence, suggesting that interventions to improve sleep on or prior to the CPAP titration study might be used as a therapeutic intervention to improve CPAP adherence. Citation: Somiah M; Taxin Z; Keating J; Mooney AM; Norman RG; Rapoport DM; Ayappa I. Sleep quality, short-term and long-term CPAP adherence. J Clin Sleep Med 2012;8(5):489-500. PMID:23066359

  5. Markers of preparatory attention predict visual short-term memory performance.

    Science.gov (United States)

    Murray, Alexandra M; Nobre, Anna C; Stokes, Mark G

    2011-05-01

    Visual short-term memory (VSTM) is limited in capacity. Therefore, it is important to encode only the visual information that is most likely to be relevant to behaviour. Here we asked which aspects of selective biasing of VSTM encoding predict subsequent memory-based performance. We measured EEG during a selective VSTM encoding task in which we varied parametrically the memory load and the precision of recall required to compare a remembered item to a subsequent probe item. On half the trials, a spatial cue indicated that participants only needed to encode items from one hemifield. We observed a typical sequence of markers of anticipatory spatial attention: early directing-attention negativity (EDAN), anterior directing-attention negativity (ADAN), and late directing-attention positivity (LDAP), as well as a marker of VSTM maintenance: contralateral delay activity (CDA). We found that individual differences in preparatory brain activity (EDAN/ADAN) predicted cue-related changes in recall accuracy, indexed by memory-probe discrimination sensitivity (d'). Importantly, our parametric manipulation of memory-probe similarity also allowed us to model the behavioural data for each participant, providing estimates of the quality of the memory representation and the probability that an item could be retrieved. We found that selective encoding primarily increased the probability of accurate memory recall, and that ERP markers of preparatory attention predicted the cue-related changes in recall probability. Copyright © 2011. Published by Elsevier Ltd.

  6. Assessing the associative deficit of older adults in long-term and short-term/working memory.

    Science.gov (United States)

    Chen, Tina; Naveh-Benjamin, Moshe

    2012-09-01

    Older adults exhibit a deficit in associative long-term memory relative to younger adults. However, the literature is inconclusive regarding whether this deficit is attenuated in short-term/working memory. To elucidate the issue, three experiments assessed younger and older adults' item and interitem associative memory and the effects of several variables that might potentially contribute to the inconsistent pattern of results in previous studies. In Experiment 1, participants were tested on item and associative recognition memory with both long-term and short-term retention intervals in a single, continuous recognition paradigm. There was an associative deficit for older adults in the short-term and long-term intervals. Using only short-term intervals, Experiment 2 utilized mixed and blocked test designs to examine the effect of test event salience. Blocking the test did not attenuate the age-related associative deficit seen in the mixed test blocks. Finally, an age-related associative deficit was found in Experiment 3, under both sequential and simultaneous presentation conditions. Even while accounting for some methodological issues, the associative deficit of older adults is evident in short-term/working memory.

  7. Posttraumatic stress disorder and somatic symptoms among child and adolescent survivors following the Lushan earthquake in China: A six-month longitudinal study.

    Science.gov (United States)

    Zhang, Jun; Zhu, Shenyue; Du, Changhui; Zhang, Ye

    2015-08-01

    To explore somatic conditions in a sample of 2299 child and adolescent survivors of an earthquake and their relationship to posttraumatic stress disorder (PTSD) symptoms. The Children's Revised Impact of Event Scale, the Patient Health Questionnaire (PHQ)-13 scale, a short version of PHQ-15 scale that omits two items involving sexual pain/problems and menstrual problems, and a project-developed questionnaire were administered to participants three and six months after the earthquake. Among child and adolescent survivors, the prevalence rates of probable PTSD were 37.4 and 24.2% three and six months, respectively, after the earthquake. The most common somatic symptoms were trouble sleeping (58.4 and 48.4%), feeling tired or having low energy (52.0 and 46.1%), and stomach pain (45.8 and 45.4%) after three and six months, respectively. Several specific somatic symptoms evaluated three months after the earthquake including trouble sleeping, headache, and shortness of breath were predictors of the overall PTSD symptoms evaluated six months after the earthquake. Additionally, the symptom of hyperarousal evaluated after three months could predict the overall somatic symptoms evaluated after six months. PTSD and somatic symptoms were common after the earthquake, and a longitudinal association between PTSD and somatic symptoms was detected among child and adolescent survivors. These findings have implications in China and possibly elsewhere. Copyright © 2015 Elsevier Inc. All rights reserved.

  8. The pedagogy of Short-Term Study-Abroad Programs

    Directory of Open Access Journals (Sweden)

    Jude Gonsalvez

    2013-10-01

    This paper focuses on establishing guidelines on the pedagogy of short term study abroad programs. This study follows 33 students who participated in a short-term study-abroad program to India with the researcher from 2006 through 2011. The study relies heavily on the student reflections and expressions as they experienced them. It is qualitative in nature. Focus groups were the main method of data collection, where participants were invited to reflect, express, and share their experiences with one another. This provided an opportunity for the participants to come together, relive their experiences, and help provide information as to how and what type of an influence this short-term study-abroad program provided.

  9. Estimation of long-term probabilities for inadvertent intrusion into radioactive waste management areas

    International Nuclear Information System (INIS)

    Eedy, W.; Hart, D.

    1988-05-01

    The risk to human health from radioactive waste management sites can be calculated as the product of the probability of accidental exposure (intrusion) times the probability of a health effect from such exposure. This report reviews the literature and evaluates methods used to predict the probabilities for unintentional intrusion into radioactive waste management areas in Canada over a 10,000-year period. Methods to predict such probabilities are available. They generally assume a long-term stability in terms of existing resource uses and society in the management area. The major potential for errors results from the unlikeliness of these assumptions holding true over such lengthy periods of prediction

  10. Scenario-based earthquake hazard and risk assessment for Baku (Azerbaijan)

    Directory of Open Access Journals (Sweden)

    G. Babayev

    2010-12-01

    A rapid growth of population, intensive civil and industrial building, land and water instabilities (e.g., landslides and significant underground water level fluctuations), and the lack of public awareness regarding seismic hazard contribute to the increasing vulnerability of Baku (the capital city of the Republic of Azerbaijan) to earthquakes. In this study, we assess earthquake risk in the city, determined as a convolution of seismic hazard (in terms of the surface peak ground acceleration, PGA), vulnerability (due to building construction fragility, population features, the gross domestic product per capita, and landslide occurrence), and exposure of infrastructure and critical facilities. The earthquake risk assessment provides useful information to identify the factors influencing the risk. A deterministic seismic hazard for Baku is analysed for four earthquake scenarios: near, far, local, and extreme events. The seismic hazard models demonstrate the level of ground shaking in the city: high PGA values are predicted in the southern coastal and north-eastern parts of the city and in some parts of the downtown. The PGA attains its maximal values for the local and extreme earthquake scenarios. We show that the quality of buildings and the probability of their damage, the distribution of the urban population, exposure, and the pattern of peak ground acceleration contribute to the seismic risk, while the vulnerability factors play a more prominent role for all earthquake scenarios. Our results can support the elaboration of strategic countermeasure plans for earthquake risk mitigation in the city of Baku.

  11. Genetic deletion of melanin-concentrating hormone neurons impairs hippocampal short-term synaptic plasticity and hippocampal-dependent forms of short-term memory.

    Science.gov (United States)

    Le Barillier, Léa; Léger, Lucienne; Luppi, Pierre-Hervé; Fort, Patrice; Malleret, Gaël; Salin, Paul-Antoine

    2015-11-01

    The cognitive role of melanin-concentrating hormone (MCH) neurons, a neuronal population located in the mammalian postero-lateral hypothalamus sending projections to all cortical areas, remains poorly understood. Mainly activated during paradoxical sleep (PS), MCH neurons have been implicated in sleep regulation. The genetic deletion of the only known MCH receptor in rodent leads to an impairment of hippocampal dependent forms of memory and to an alteration of hippocampal long-term synaptic plasticity. By using MCH/ataxin3 mice, a genetic model characterized by a selective deletion of MCH neurons in the adult, we investigated the role of MCH neurons in hippocampal synaptic plasticity and hippocampal-dependent forms of memory. MCH/ataxin3 mice exhibited a deficit in the early part of both long-term potentiation and depression in the CA1 area of the hippocampus. Post-tetanic potentiation (PTP) was diminished while synaptic depression induced by repetitive stimulation was enhanced suggesting an alteration of pre-synaptic forms of short-term plasticity in these mice. Behaviorally, MCH/ataxin3 mice spent more time and showed a higher level of hesitation as compared to their controls in performing a short-term memory T-maze task, displayed retardation in acquiring a reference memory task in a Morris water maze, and showed a habituation deficit in an open field task. Deletion of MCH neurons could thus alter spatial short-term memory by impairing short-term plasticity in the hippocampus. Altogether, these findings could provide a cellular mechanism by which PS may facilitate memory encoding. Via MCH neuron activation, PS could prepare the day's learning by increasing and modulating short-term synaptic plasticity in the hippocampus. © 2015 Wiley Periodicals, Inc.

  12. Relationship between short and long term radon measurements

    International Nuclear Information System (INIS)

    Martinez, T.; Ramirez, D.; Navarrete, M.; Cabrera, L.; Ramirez, A.; Gonzalez, P.

    2000-01-01

    In this work the radon group of the Faculty of Chemistry at the National University of Mexico presents the results obtained in establishing a relation between short- and long-term radon measurements made with passive electret detectors (E-PERM, types LLT and HST). The measurements were carried out inside single-family dwellings (open-house condition) located in the southeast of Mexico City (in Xochimilco) during the four seasons of the year 1997. A correlation, together with the equation that relates them, was established between the short-term measurements (five days) and the long-term measurements for each season as well as for the annual average. The objective and advantage of this correlation are that a short-term measurement makes it possible to predict the annual mean radon concentration, which represents a saving of human and economic resources. (author)
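
    A minimal sketch of the kind of correlation reported above: a least-squares line relating five-day electret measurements to the annual mean concentration, from which a short-term measurement can be converted into a predicted annual mean. The paired values are synthetic and for illustration only.

```python
# Sketch: relate short-term (five-day) electret radon measurements to the annual
# mean concentration with a simple least-squares line, as in the correlation
# described above. The paired measurements are synthetic, for illustration only.
import numpy as np

rng = np.random.default_rng(3)
annual_mean = rng.uniform(30.0, 200.0, 40)                     # Bq/m^3, synthetic dwellings
short_term = annual_mean * rng.normal(1.0, 0.2, 40) + 10.0     # noisy five-day measurements

# Fit annual_mean ~ slope * short_term + intercept.
slope, intercept = np.polyfit(short_term, annual_mean, deg=1)
r = np.corrcoef(short_term, annual_mean)[0, 1]

print(f"annual ~ {slope:.2f} * short_term + {intercept:.1f}   (r = {r:.2f})")
```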

  13. Impact of short-term severe accident management actions in a long-term perspective. Final Report

    International Nuclear Information System (INIS)

    2000-03-01

    The present systems for severe accident management are focused on mitigating the consequences of special severe accident phenomena and to reach a safe plant state. However, in the development of strategies and procedures for severe accident management, it is also important to consider the long-term perspective of accident management and especially to secure the safe state of the plant. The main reason for this is that certain short-term actions have an impact on the long-term scenario. Both positive and negative effects from short-term actions on the accident management in the long-term perspective have been included in this paper. Short-term actions are accident management measures taken within about 24 hours after the initiating event. The purpose of short-term actions is to reach a stable status of the plant. The main goal in the long-term perspective is to maintain the reactor in a stable state and prevent uncontrolled releases of activity. The purpose of this short Technical Note, deliberately limited in scope, is to draw attention to potential long-term problems, important to utilities and regulatory authorities, arising from the way a severe accident would be managed during the first hours. Its objective is to encourage discussions on the safest - and maybe also most economical - way to manage a severe accident in the long term by not making the situation worse through inappropriate short-term actions, and on the identification of short-term actions likely to make long-term management easier and safer. The Note is intended as a contribution to the knowledge base put at the disposal of Member countries through international collaboration. The scope of the work has been limited to a literature search. Useful further activities have been identified. However, there is no proposal, at this stage, for more detailed work to be undertaken under the auspices of the CSNI. Plant-specific applications would need to be developed by utilities

  14. Depressive symptoms and associated psychosocial factors among adolescent survivors 30 months after 2008 Wenchuan earthquake: A follow-up study

    Directory of Open Access Journals (Sweden)

    Xuliang eShi

    2016-03-01

    Purpose: This study longitudinally investigated the changes in depressive symptoms among adolescent survivors over the two and a half years after the 2008 Wenchuan earthquake in China, as well as the predictive effects of demographic characteristics, earthquake exposure, negative life events, social support, and dispositional resilience on the risk of depressive symptoms at two time points after the earthquake. Methods: Participants were 1573 adolescent survivors (720 males and 853 females; mean age at initial survey 15 ± 1.26 years) whose depressive symptoms were assessed at 6 months (T6m) and 30 months (T30m) post-earthquake. Data on demographics, earthquake exposure, and dispositional resilience were collected at T6m. Negative life events and social support were measured at T6m and 24 months (T24m) post-earthquake. Results: The prevalence rates of probable depression, 27.5% at T6m and 27.2% at T30m, remained relatively stable over time. Female gender was related to a higher risk of depressive symptoms at both T6m and T30m, while being an only child predicted a higher risk of depressive symptoms only at T30m. Negative life events and social support at T6m, as well as earthquake exposure, were concurrently associated with an increased risk of depressive symptoms at T6m but not with the risk of depressive symptoms at T30m, whereas negative life events and social support at T24m predicted depressive symptoms at T30m; all of this suggests that these variables may have a strong but short-term effect on adolescents' depressive symptoms post-earthquake. In addition, dispositional resilience was evidenced as a relatively stable negative predictor of depressive symptoms. Conclusions: These findings could inform mental health professionals regarding how to screen adolescent survivors at high risk for depression, so as to provide them with timely and appropriate mental health services based on the identified risk and protective factors for depressive symptoms.

  15. Ordered short-term memory differs in signers and speakers: Implications for models of short-term memory

    OpenAIRE

    Bavelier, Daphne; Newport, Elissa L.; Hall, Matt; Supalla, Ted; Boutla, Mrim

    2008-01-01

    Capacity limits in linguistic short-term memory (STM) are typically measured with forward span tasks in which participants are asked to recall lists of words in the order presented. Using such tasks, native signers of American Sign Language (ASL) exhibit smaller spans than native speakers (Boutla, Supalla, Newport, & Bavelier, 2004). Here, we test the hypothesis that this population difference reflects differences in the way speakers and signers maintain temporal order information in short-te...

  16. Short-term and long-term deflection of reinforced hollow core ...

    African Journals Online (AJOL)

    This paper presents a study on different methods of analysis that are currently used by design codes to predict the short-term and long-term deflection of reinforced concrete slab systems and compares the predicted deflections with measured deflections. The experimental work to measure deflections involved the testing of ...

  17. Robust short-term memory without synaptic learning.

    Directory of Open Access Journals (Sweden)

    Samuel Johnson

    Short-term memory in the brain cannot in general be explained the way long-term memory can--as a gradual modification of synaptic weights--since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory with no need of synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings.

  18. Robust short-term memory without synaptic learning.

    Science.gov (United States)

    Johnson, Samuel; Marro, J; Torres, Joaquín J

    2013-01-01

    Short-term memory in the brain cannot in general be explained the way long-term memory can--as a gradual modification of synaptic weights--since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory with no need of synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings.

  19. Robust Short-Term Memory without Synaptic Learning

    Science.gov (United States)

    Johnson, Samuel; Marro, J.; Torres, Joaquín J.

    2013-01-01

    Short-term memory in the brain cannot in general be explained the way long-term memory can – as a gradual modification of synaptic weights – since it takes place too quickly. Theories based on some form of cellular bistability, however, do not seem able to account for the fact that noisy neurons can collectively store information in a robust manner. We show how a sufficiently clustered network of simple model neurons can be instantly induced into metastable states capable of retaining information for a short time (a few seconds). The mechanism is robust to different network topologies and kinds of neural model. This could constitute a viable means available to the brain for sensory and/or short-term memory with no need of synaptic learning. Relevant phenomena described by neurobiology and psychology, such as local synchronization of synaptic inputs and power-law statistics of forgetting avalanches, emerge naturally from this mechanism, and we suggest possible experiments to test its viability in more biological settings. PMID:23349664

  20. Electrical streaming potential precursors to catastrophic earthquakes in China

    Directory of Open Access Journals (Sweden)

    F. Qian

    1997-06-01

    The majority of anomalies in self-potential at 7 stations within 160 km of the epicentre showed a similar pattern of rapid onset and slow decay during and before the M 7.8 Tangshan earthquake of 1976. Considering that some of these anomalies were associated with episodic spouting from boreholes or with increases in pore pressure in wells, the observed anomalies are interpreted as streaming potentials generated by local events of sudden movement and by the diffusion of high-pressure fluid in parallel faults. These transient events, triggered by tidal forces, exhibited a periodic nature and a statistical tendency to migrate towards the epicentre about one month before the earthquake. As a result of these events, the pore pressure reached a final equilibrium state that was higher than the initial state in a sufficiently large section of the fault region. Consequently, the local effective shear strength of the material in the fault zone decreased, and finally the catastrophic earthquake was induced. Similar phenomena also occurred one month before the M 7.3 Haicheng earthquake of 1975. Therefore, short-term earthquake prediction may be possible using electrical measurements, which are the geophysical measurements most closely related to pore-fluid behaviour in the deep crust.

  1. A Bayesian Method for Short-Term Probabilistic Forecasting of Photovoltaic Generation in Smart Grid Operation and Control

    Directory of Open Access Journals (Sweden)

    Gabriella Ferruzzi

    2013-02-01

    A new short-term probabilistic forecasting method is proposed to predict the probability density function of the hourly active power generated by a photovoltaic system. First, the probability density function of the hourly clearness index is forecast using a Bayesian auto-regressive time series model; the model takes into account the dependence of the solar radiation on meteorological variables such as cloud cover and humidity. Then, a Monte Carlo simulation procedure is used to evaluate the predictive probability density function of the hourly active power by applying the photovoltaic system model to random samples drawn from the clearness index distribution. A numerical application demonstrates the effectiveness and advantages of the proposed forecasting method.
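
    The Monte Carlo stage of the method can be sketched as follows, assuming the Bayesian model has already produced a forecast distribution for the hourly clearness index (represented here by an arbitrary Beta distribution) and using a deliberately simple photovoltaic conversion model. All parameter values are illustrative assumptions.

```python
# Sketch of the Monte Carlo stage described in the record: propagate a forecast
# clearness-index distribution through a simple photovoltaic system model to get
# the predictive distribution of hourly active power. The Beta forecast, the
# clear-sky irradiance and the PV model below are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

clearness_forecast = stats.beta(a=8.0, b=3.0)       # forecast PDF of the hourly clearness index
clear_sky_irradiance = 900.0                        # W/m^2 for the hour in question (assumed)
panel_area, efficiency = 50.0, 0.18                 # m^2, module efficiency (assumed)

def pv_power(clearness_index):
    """Very simple PV model: active power (kW) as a linear function of irradiance."""
    irradiance = clearness_index * clear_sky_irradiance
    return panel_area * efficiency * irradiance / 1000.0

samples = pv_power(clearness_forecast.rvs(size=100_000, random_state=rng))
q05, q50, q95 = np.percentile(samples, [5, 50, 95])
print(f"hourly power (kW): 5th={q05:.2f}, median={q50:.2f}, 95th={q95:.2f}")
```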

  2. Width of the Surface Rupture Zone for Thrust Earthquakes and Implications for Earthquake Fault Zoning: Chi-Chi 1999 and Wenchuan 2008 Earthquakes

    Science.gov (United States)

    Boncio, P.; Caldarella, M.

    2016-12-01

    We analyze the zones of coseismic surface faulting along thrust faults, with the aim of defining the most appropriate criteria for zoning the Surface Fault Rupture Hazard (SFRH) along thrust faults. Normal and strike-slip faults have been studied in depth in the past, while thrust faults have not received comparable attention. We analyze the 1999 Chi-Chi, Taiwan (Mw 7.6) and 2008 Wenchuan, China (Mw 7.9) earthquakes. Several different types of coseismic fault scarps characterize the two earthquakes, depending on the topography, fault geometry, and near-surface materials. For both earthquakes, we collected from the literature, or measured in GIS-georeferenced published maps, data on the Width of the coseismic Rupture Zone (WRZ). The frequency distribution of WRZ relative to the trace of the main fault shows that the surface ruptures occur mainly on and near the main fault. Ruptures located away from the main fault occur mainly in the hanging wall. Where structural complexities are present (e.g., sharp bends, step-overs), the WRZ is wider than for simple fault traces. We also fitted the WRZ dataset with probability density functions in order to define a criterion for removing outliers (e.g., by selecting the 90% or 95% probability level) and for defining the zone where the probability of SFRH is highest. This might help in sizing the zones of SFRH during seismic microzonation (SM) mapping. In order to shape zones of SFRH, a very detailed earthquake-geology study of the fault is necessary. In the absence of such a detailed study, during basic (first-level) SM mapping, a width of 350-400 m seems to be recommended (95% probability). If the fault is carefully mapped (higher-level SM), one must consider that the highest SFRH is concentrated in a narrow, 50 m-wide zone that should be considered as a "fault-avoidance (or setback) zone". These fault zones should be asymmetric: the ratio of footwall to hanging wall (FW:HW) width calculated here ranges from 1:5 to 1:3.
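
    The percentile-based sizing of hazard zones mentioned above can be sketched by fitting a probability density function to WRZ measurements and reading off the 90% and 95% levels. The widths below are synthetic and the lognormal choice is an assumption for illustration; the paper's own distribution fits and data are not reproduced here.

```python
# Sketch: fit a probability density function to width-of-rupture-zone (WRZ)
# measurements and read off the 90% and 95% levels used to size fault-rupture
# hazard zones, as described in the record. The WRZ values are synthetic and the
# choice of a lognormal model is an assumption for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
wrz_m = rng.lognormal(mean=np.log(60.0), sigma=0.9, size=300)   # widths in metres, synthetic

shape, loc, scale = stats.lognorm.fit(wrz_m, floc=0.0)
fitted = stats.lognorm(shape, loc=loc, scale=scale)

for p in (0.90, 0.95):
    print(f"{int(p*100)}% of surface ruptures fall within ~{fitted.ppf(p):.0f} m of the main fault")
```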

  3. Short-term memory binding deficits in Alzheimer's disease

    OpenAIRE

    Parra, Mario; Abrahams, S.; Fabi, K.; Logie, R.; Luzzi, S.; Della Sala, Sergio

    2009-01-01

    Alzheimer's disease impairs long term memories for related events (e.g. faces with names) more than for single events (e.g. list of faces or names). Whether or not this associative or ‘binding’ deficit is also found in short-term memory has not yet been explored. In two experiments we investigated binding deficits in verbal short-term memory in Alzheimer's disease. Experiment 1 : 23 patients with Alzheimer's disease and 23 age and education matched healthy elderly were recruited. Participants...

  4. Relationship between short-term sexual strategies and sexual jealousy.

    Science.gov (United States)

    Mathes, Eugene W

    2005-02-01

    In a classic study, Buss, Larson, Westen, and Semmelroth reported that men were more distressed by the thought of a partner's sexual infidelity (sexual jealousy) and women were more distressed by the thought of a partner's emotional infidelity (emotional jealousy). Initially, Buss and his associates explained these results by suggesting that men are concerned about uncertainty of paternity, that is, the possibility of raising another man's child while believing the child is their own. However, later they explained the results in terms of men's preference for short-term sexual strategies. The purpose of this research was to test the explanation of short-term sexual strategies. Men and women subjects were instructed to imagine themselves in a relationship which was either short-term (primarily sexual) or long-term (involving commitment) and then respond to Buss's jealousy items. It was hypothesized that, when both men and women imagined a short-term relationship, they would be more threatened by a partner's sexual infidelity, and, when they imagined a long-term relationship, they would be more threatened by a partner's emotional infidelity. Support was found for this hypothesis.

  5. Plant state display device after occurrence of earthquake

    International Nuclear Information System (INIS)

    Kitada, Yoshio; Yonekura, Kazuyoshi.

    1992-01-01

    If a nuclear power plant encounters an earthquake, previously stored earthquake response analysis values are compared with the observed ground motion to judge the severity of the earthquake. From the result of this judgement, the possibility that an abnormality has arisen in plant equipment systems after the earthquake is evaluated by comparison with a previously stored earthquake fragility database for each equipment system. The result of the evaluation is displayed in the central control room. For equipment systems judged to have a high probability of abnormality, the influence of the abnormality on plant safety is evaluated with a previously stored seismic PSA method, and this result is also displayed in the central control room. (I.S.)
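
    The comparison against a stored fragility database can be sketched with the lognormal fragility form that is standard in seismic PSA: the probability of damage at a given observed ground motion is the standard normal CDF of ln(PGA/A_m)/beta. The components, median capacities, and uncertainties below are hypothetical.

```python
# Sketch: evaluate the probability of equipment damage from a stored fragility
# curve given the observed ground motion, the kind of comparison the display
# device in the record performs. The lognormal fragility form is standard in
# seismic PSA; the median capacities and uncertainties below are hypothetical.
import math

FRAGILITY_DB = {
    # component: (median capacity A_m in g, composite log-standard deviation beta)
    "emergency_diesel_generator": (1.2, 0.45),
    "service_water_pump": (0.9, 0.40),
    "offsite_power": (0.3, 0.35),
}

def damage_probability(observed_pga_g, median_g, beta):
    """Lognormal fragility: P(damage | PGA) = Phi(ln(PGA / A_m) / beta)."""
    z = math.log(observed_pga_g / median_g) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

observed_pga = 0.35   # g, from the plant's seismic instrumentation (example value)
for component, (a_m, beta) in FRAGILITY_DB.items():
    p = damage_probability(observed_pga, a_m, beta)
    flag = "check first" if p > 0.1 else "low priority"
    print(f"{component:28s} P(damage) = {p:.3f}  -> {flag}")
```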

  6. Impact of long-term and short-term therapies on seminal parameters

    Directory of Open Access Journals (Sweden)

    Jlenia Elia

    2013-04-01

    Aim: The aim of this work was (i) to evaluate the prevalence of male partners of subfertile couples being treated with long- or short-term therapies for non-andrological diseases, and (ii) to study their seminal profile for the possible effects of their treatments on spermatogenesis and/or epididymal maturation. Methods: The study group was made up of 723 subjects, aged between 25 and 47 years. Semen analysis was performed according to World Health Organization (WHO) guidelines (1999). The Superimposed Image Analysis System (SIAS), which is based on the computerized superimposition of spermatozoa images, was used to assess sperm motility parameters. Results: The prevalence of subjects taking pharmacological treatments was 22.7% (164/723): 3.7% (27/723) for the Short-Term Group and 18.9% (137/723) for the Long-Term Group. The subjects of each group were also subdivided into subgroups according to the treatments being received. Regarding the seminal profile, we did not observe a significant difference between the Long-Term, Short-Term, or Control Groups. However, among the subgroups, we found a significant decrease in sperm number and progressive motility percentage in the subjects receiving treatment with antihypertensive drugs compared with the other subgroups and the Control Group. Conclusions: In the management of infertile couples, the potential negative impact on seminal parameters of any drugs being taken as long-term therapy should be considered. The pathogenic mechanism needs to be clarified.

  7. Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) Model - An Unified Concept for Earthquake Precursors Validation

    Science.gov (United States)

    Pulinets, S.; Ouzounov, D.

    2010-01-01

    The paper presents a concept for a complex multidisciplinary approach to the problem of clarifying the nature of short-term earthquake precursors observed in the atmosphere, in atmospheric electricity, and in the ionosphere and magnetosphere. Our approach is based on the most fundamental principles of tectonics, namely that an earthquake is the ultimate result of the relative movement of tectonic plates and blocks of different sizes. Different kinds of gases leaking from the crust, such as methane, helium, hydrogen, and carbon dioxide, can serve as carrier gases for radon, including along underwater seismically active faults. The action of radon on atmospheric gases is similar to the effect of cosmic rays in the upper layers of the atmosphere: it ionizes the air, and the ions act as nuclei for water condensation. The condensation of water vapor is accompanied by the release of latent heat, which is the main cause of the observed atmospheric thermal anomalies. The formation of large ion clusters changes the conductivity of the atmospheric boundary layer and the parameters of the global electric circuit over active tectonic faults. Variations in atmospheric electricity are the main source of ionospheric anomalies over seismically active areas. The Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model can explain most of these events as a synergy between different ground-surface, atmospheric, and ionospheric processes and anomalous variations, which are usually termed short-term earthquake precursors. A newly developed approach, the Interdisciplinary Space-Terrestrial Framework (ISTF), can also provide verification of these precursory processes in seismically active regions. The main outcome of this paper is a unified concept for the systematic validation of different types of earthquake precursors united by a common physical basis in one theory.

  8. Model documentation report: Short-Term Hydroelectric Generation Model

    International Nuclear Information System (INIS)

    1993-08-01

    The purpose of this report is to define the objectives of the Short-Term Hydroelectric Generation Model (STHGM), describe its basic approach, and provide details on the model structure. This report is intended as a reference document for model analysts, users, and the general public. Documentation of the model is in accordance with the Energy Information Administration's (EIA) legal obligation to provide adequate documentation in support of its models (Public Law 94-385, Section 57.b.2). The STHGM performs a short-term (18- to 27-month) forecast of hydroelectric generation in the United States using an autoregressive integrated moving average (ARIMA) time series model with precipitation as an explanatory variable. The model results are used as input for the Short-Term Energy Outlook.
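
    In the spirit of the STHGM, the sketch below fits an ARIMA-type model with precipitation as an exogenous regressor to synthetic monthly generation data and produces a 24-month forecast. The model order and all data are assumptions for illustration, not the specification documented by EIA.

```python
# Sketch: an ARIMA-type forecast of monthly hydroelectric generation with
# precipitation as an explanatory (exogenous) variable, in the spirit of the
# STHGM described above. Data are synthetic and the (1,0,1) order is an
# assumption, not the model specification documented by EIA.
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(13)
n = 120                                                   # 10 years of monthly data
precip = 80 + 30 * np.sin(2 * np.pi * np.arange(n) / 12) + rng.normal(0, 10, n)
generation = 20 + 0.15 * precip + rng.normal(0, 2, n)     # GWh, synthetic relationship

model = SARIMAX(generation, exog=precip, order=(1, 0, 1))
fit = model.fit(disp=False)

# Forecast 24 months ahead using an assumed precipitation scenario.
future_precip = 80 + 30 * np.sin(2 * np.pi * np.arange(n, n + 24) / 12)
forecast = fit.forecast(steps=24, exog=future_precip.reshape(-1, 1))
print(np.round(forecast[:6], 2))   # first six months of the 24-month forecast (GWh)
```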

  9. Decision making biases in the communication of earthquake risk

    Science.gov (United States)

    Welsh, M. B.; Steacy, S.; Begg, S. H.; Navarro, D. J.

    2015-12-01

    The L'Aquila trial, with 6 scientists convicted of manslaughter, shocked the scientific community, leading to an urgent re-appraisal of communication methods for low-probability, high-impact events. Before the trial, a commission investigating the earthquake recommended that risk assessment be formalised via operational earthquake forecasts and that social scientists be enlisted to assist in developing communication strategies. Psychological research has identified numerous decision biases relevant to this, including hindsight bias, where people (after the fact) overestimate an event's predictability. This affects experts as well as naïve participants, as it relates to their ability to construct a plausible causal story rather than to the likelihood of the event. Another problem is availability, which causes overestimation of the likelihood of observed rare events due to their greater noteworthiness. This, however, is complicated by the 'description-experience' gap, whereby people underestimate probabilities for events they have not experienced. That is, people who have experienced strong earthquakes judge them more likely, while those who have not judge them less likely, relative to actual probabilities. Finally, format changes alter people's decisions: people treat '1 in 10,000' as different from 0.01% despite their mathematical equivalence. Such effects fall under the broad term framing, which describes how different framings of the same event alter decisions. In particular, people's attitude to risk depends significantly on how scenarios are described. We examine the effect of biases on the communication of changes in risk. South Australian participants gave responses to scenarios describing familiar (bushfire) or unfamiliar (earthquake) risks. While bushfires are rare at any specific location, significant fire events occur each year and are extensively covered. By comparison, our study location (Adelaide) last had a M5 quake in 1954. Preliminary results suggest the description

  10. Short-term flow induced crystallization in isotactic polypropylene : how short is short?

    NARCIS (Netherlands)

    Ma, Z.; Balzano, L.; Portale, G.; Peters, G.W.M.

    2013-01-01

    The so-called "short-term flow" protocol is widely applied in experimental flow-induced crystallization studies on polymers in order to separate the nucleation and subsequent growth processes [Liedauer et al. Int. Polym. Proc. 1993, 8, 236–244]. The basis of this protocol is the assumption that

  11. Comparison of Short Term with Long Term Catheterization after Anterior Colporrhaphy Surgery

    Directory of Open Access Journals (Sweden)

    F. Movahed

    2010-07-01

    Full Text Available Introduction & Objective: This belief that overfilling the bladder after anterior colporrhaphy might have a negative influence on surgical outcome, causes routine catheterization after operation. This study was done to compare short term (24h with long term (72h catheterization after anterior colporrhaphy.Materials & Methods: This randomized clinical trial was carried out at Kosar Hospital , Qazvin (Iran in 2005-2006. One hundred cases candidating for anterior colporrhaphy , were divided in two equal groups . In the first group foley catheter was removed 24 hours and in the second group 72 hours after the operation. Before removing catheter, urine sample was obtained for culture . After removal and urination, residual volume was determinded. If the volume exceeded 200 ml or retention occured, the catheter would be fixed for more 72 hours. Need for recatheterization, urinary retention, positive urine culture,and hospital stay were surveyed. The data was analyzed using T and Fisher tests.Results: Residual volume exceeding 200 ml and the need for recatheterization occurred in one case (2% in the short term group but in the long term group none of the subjects needed recatheterization (P=1. Retention was not seen. In the both groups, one case (2% had positive urine culture with no statistically significant difference (P=1. Mean hospital stay was short in the first group (P=0.00.Conclusion: Short term catheterization after anterior colporrhaphy does not cause urinary retention and decreases hospital stay.

  12. Short-term versus long-term market opportunities and financial constraints

    International Nuclear Information System (INIS)

    Ferrari, Angelo

    1999-01-01

    This presentation discusses gas developments in Europe, the European Gas Directive, short term vs. long term issues, and Snam's new challenges. The European gas market is characterized by (1) the role of gas in meeting the demand for energy, which varies greatly from one country to another, (2) a growing market, (3) a decreasing role of domestic production, and (4) increasing imports. Within the European Union, the Gas Directive aims to transform single national markets into one integrated European market by introducing third party access to the network for eligible clients as a means of increasing the competition between operators. The Gas Directive would appear to modify the form of the market rather than its size, and in particular the sharing of responsibility and risk among operators. The market in the future will offer operators the possibility to exploit opportunities deriving mainly from demands for increased flexibility. Opportunities linked to entrepreneurial initiatives require the long-term investments characteristic of the gas business. Risks and opportunities must be balanced evenly between different operators. If everyone takes on their own risks and responsibilities, this means a wider distribution of the long-term vs. short-term risks, currently borne by the integrated gas companies, across a market that tends to favour the short term. A gradual liberalization process should allow incumbent operators to gradually diversify their activities in new gas market areas or enter new business activities. They could move beyond their local and European boundaries in pursuit of an international dimension. The market will have to make the transition from the national to the European dimension: as an example, Snam covers 90% of the Italian market, but its share of an integrated European market will be about 15%

  13. Short-Term Group Treatment for Adult Children of Alcoholics.

    Science.gov (United States)

    Cooper, Alvin; McCormack, William A.

    1992-01-01

    Adult children of alcoholics (n=24) were tested on measures of loneliness, anxiety, hostility, depression, and interpersonal dependency before and after participation in short-term group therapy. Highly significant test score changes supported effectiveness of individual therapy in short-term groups. (Author/NB)

  14. Implications of fault constitutive properties for earthquake prediction.

    Science.gov (United States)

    Dieterich, J H; Kilgore, B

    1996-04-30

    The rate- and state-dependent constitutive formulation for fault slip characterizes an exceptional variety of materials over a wide range of sliding conditions. This formulation provides a unified representation of diverse sliding phenomena including slip weakening over a characteristic sliding distance Dc, apparent fracture energy at a rupture front, time-dependent healing after rapid slip, and various other transient and slip rate effects. Laboratory observations and theoretical models both indicate that earthquake nucleation is accompanied by long intervals of accelerating slip. Strains from the nucleation process on buried faults generally could not be detected if laboratory values of Dc apply to faults in nature. However, scaling of Dc is presently an open question and the possibility exists that measurable premonitory creep may precede some earthquakes. Earthquake activity is modeled as a sequence of earthquake nucleation events. In this model, earthquake clustering arises from the sensitivity of nucleation times to the stress changes induced by prior earthquakes. The model gives the characteristic Omori aftershock decay law and assigns physical interpretation to aftershock parameters. The seismicity formulation predicts that large changes of earthquake probabilities result from stress changes. Two mechanisms for foreshocks are proposed that describe the observed frequency of occurrence of foreshock-mainshock pairs by time and magnitude. With the first mechanism, foreshocks represent a manifestation of earthquake clustering in which the stress change at the time of the foreshock increases the probability of earthquakes at all magnitudes including the eventual mainshock. With the second model, accelerating fault slip on the mainshock nucleation zone triggers foreshocks.
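
    The seismicity-rate response to a sudden stress step that follows from this rate-state framework is often written in the closed form of Dieterich (1994), and a minimal sketch of it is given below. The function name and the parameter values are illustrative assumptions, not taken from the paper; the point is only to show the predicted rate jump of exp(dtau/(A*sigma)) followed by an Omori-like 1/t decay back to the background rate over the aftershock duration t_a = A*sigma/stressing_rate.

    ```python
    import numpy as np

    def dieterich_rate(t, stress_step, a_sigma, stressing_rate, background_rate=1.0):
        """Seismicity rate R(t) after a Coulomb stress step (Dieterich, 1994 closed form).

        t in years, stress_step and a_sigma in MPa, stressing_rate in MPa/yr,
        background_rate in events/yr. All values used below are illustrative.
        """
        t_a = a_sigma / stressing_rate                      # aftershock duration
        gamma = (np.exp(-stress_step / a_sigma) - 1.0) * np.exp(-t / t_a) + 1.0
        return background_rate / gamma

    # A 0.5 MPa stress increase with A*sigma = 0.05 MPa and 0.01 MPa/yr tectonic loading
    t = np.logspace(-4, 2, 200)                             # years after the stress step
    rate = dieterich_rate(t, stress_step=0.5, a_sigma=0.05, stressing_rate=0.01)
    # rate[0] is ~exp(10) times background; the decay is roughly 1/t until t approaches
    # t_a = 5 yr, after which the rate relaxes back to the background value.
    ```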

  15. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities that has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
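
    The counting idea described above (the number of small earthquakes since the last large earthquake acting as a natural-time clock) can be illustrated with a conditional Weibull hazard in event counts. This is only a sketch of the general approach: the Weibull parameters, the thresholds and the function names are assumptions for illustration, not the calibrated values of the NTW model.

    ```python
    import math

    def weibull_survival(n, scale, shape):
        """Probability that no large earthquake has occurred by 'natural time' n,
        i.e. after n small earthquakes since the last large one."""
        return math.exp(-((n / scale) ** shape))

    def conditional_large_eq_prob(n_observed, n_ahead, scale=400.0, shape=1.4):
        """Probability of a large earthquake within the next n_ahead small events,
        given that n_observed small events have already elapsed without one.
        scale and shape are illustrative placeholders."""
        s_now = weibull_survival(n_observed, scale, shape)
        s_later = weibull_survival(n_observed + n_ahead, scale, shape)
        return 1.0 - s_later / s_now

    # Forecast update: 250 small events have occurred since the last large earthquake;
    # probability of a large event within the next 50 small events in this grid cell.
    print(round(conditional_large_eq_prob(250, 50), 3))
    ```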

  16. Frequency-specific insight into short-term memory capacity

    OpenAIRE

    Feurra, Matteo; Galli, Giulia; Pavone, Enea Francesco; Rossi, Alessandro; Rossi, Simone

    2016-01-01

    We provided novel evidence of a frequency-specific effect of transcranial alternating current stimulation (tACS) of the left posterior parietal cortex on short-term memory during a digit span task. The effect was prominent with stimulation at beta frequency for young but not for middle-aged adults, and correlated with age. Our findings highlighted a short-term memory capacity improvement by tACS application.

  17. Reconciling long-term cultural diversity and short-term collective social behavior.

    Science.gov (United States)

    Valori, Luca; Picciolo, Francesco; Allansdottir, Agnes; Garlaschelli, Diego

    2012-01-24

    An outstanding open problem is whether collective social phenomena occurring over short timescales can systematically reduce cultural heterogeneity in the long run, and whether offline and online human interactions contribute differently to the process. Theoretical models suggest that short-term collective behavior and long-term cultural diversity are mutually excluding, since they require very different levels of social influence. The latter jointly depends on two factors: the topology of the underlying social network and the overlap between individuals in multidimensional cultural space. However, while the empirical properties of social networks are intensively studied, little is known about the large-scale organization of real societies in cultural space, so that random input specifications are necessarily used in models. Here we use a large dataset to perform a high-dimensional analysis of the scientific beliefs of thousands of Europeans. We find that interopinion correlations determine a nontrivial ultrametric hierarchy of individuals in cultural space. When empirical data are used as inputs in models, ultrametricity has strong and counterintuitive effects. On short timescales, it facilitates a symmetry-breaking phase transition triggering coordinated social behavior. On long timescales, it suppresses cultural convergence by restricting it within disjoint groups. Moreover, ultrametricity implies that these results are surprisingly robust to modifications of the dynamical rules considered. Thus the empirical distribution of individuals in cultural space appears to systematically optimize the coexistence of short-term collective behavior and long-term cultural diversity, which can be realized simultaneously for the same moderate level of mutual influence in a diverse range of online and offline settings.

  18. Differences in health status between long-term and short-term benzodiazepine users.

    NARCIS (Netherlands)

    Zandstra, S.M.; Furer, J.W.; Lisdonk, E.H. van de; Bor, J.H.J.; Zitman, F.G.; Weel, C. van

    2002-01-01

    BACKGROUND: Despite generally accepted advice to keep treatment short, benzodiazepines are often prescribed for more than six months. Prevention of long-term benzodiazepine use could be facilitated by the utilisation of risk indicators for long-term use. However, the characteristics of long-term

  19. The roles of long-term phonotactic and lexical prosodic knowledge in phonological short-term memory.

    Science.gov (United States)

    Tanida, Yuki; Ueno, Taiji; Lambon Ralph, Matthew A; Saito, Satoru

    2015-04-01

    Many previous studies have explored and confirmed the influence of long-term phonological representations on phonological short-term memory. In most investigations, phonological effects have been explored with respect to phonotactic constraints or frequency. If interaction between long-term memory and phonological short-term memory is a generalized principle, then other phonological characteristics-that is, suprasegmental aspects of phonology-should also exert similar effects on phonological short-term memory. We explored this hypothesis through three immediate serial-recall experiments that manipulated Japanese nonwords with respect to lexical prosody (pitch-accent type, reflecting suprasegmental characteristics) as well as phonotactic frequency (reflecting segmental characteristics). The results showed that phonotactic frequency affected the retention not only of the phonemic sequences, but also of pitch-accent patterns, when participants were instructed to recall both the phoneme sequence and accent pattern of nonwords. In addition, accent pattern typicality influenced the retention of the accent pattern: Typical accent patterns were recalled more accurately than atypical ones. These results indicate that both long-term phonotactic and lexical prosodic knowledge contribute to phonological short-term memory performance.

  20. Visual Short-Term Memory Complexity

    DEFF Research Database (Denmark)

    Sørensen, Thomas Alrik

    Several recent studies have explored the nature and limits of visual short-term memory (VSTM) (e.g. Luck & Vogel, 1997). A general VSTM capacity limit of about 3 to 4 letters has been found, thus confirming results from earlier studies (e.g. Cattell, 1885; Sperling, 1960). However, Alvarez...

  1. Brain oscillatory substrates of visual short-term memory capacity.

    Science.gov (United States)

    Sauseng, Paul; Klimesch, Wolfgang; Heise, Kirstin F; Gruber, Walter R; Holz, Elisa; Karim, Ahmed A; Glennon, Mark; Gerloff, Christian; Birbaumer, Niels; Hummel, Friedhelm C

    2009-11-17

    The amount of information that can be stored in visual short-term memory is strictly limited to about four items. Therefore, memory capacity relies not only on the successful retention of relevant information but also on efficient suppression of distracting information, visual attention, and executive functions. However, completely separable neural signatures for these memory capacity-limiting factors remain to be identified. Because of its functional diversity, oscillatory brain activity may offer a utile solution. In the present study, we show that capacity-determining mechanisms, namely retention of relevant information and suppression of distracting information, are based on neural substrates independent of each other: the successful maintenance of relevant material in short-term memory is associated with cross-frequency phase synchronization between theta (rhythmical neural activity around 5 Hz) and gamma (> 50 Hz) oscillations at posterior parietal recording sites. On the other hand, electroencephalographic alpha activity (around 10 Hz) predicts memory capacity based on efficient suppression of irrelevant information in short-term memory. Moreover, repetitive transcranial magnetic stimulation at alpha frequency can modulate short-term memory capacity by influencing the ability to suppress distracting information. Taken together, the current study provides evidence for a double dissociation of brain oscillatory correlates of visual short-term memory capacity.

  2. SHORT-TERM MEMORY IS INDEPENDENT OF BRAIN PROTEIN SYNTHESIS

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Hasker P.; Rosenzweig, Mark R.; Jones, Oliver W.

    1980-09-01

    Male Swiss albino CD-1 mice given a single injection of a cerebral protein synthesis inhibitor, anisomycin (ANI) (1 mg/animal), 20 min prior to single trial passive avoidance training demonstrated impaired retention at tests given 3 hr, 6 hr, 1 day, and 7 days after training. Retention was not significantly different from saline controls when tests were given 0.5 or 1.5 hr after training. Prolonging inhibition of brain protein synthesis by giving either 1 or 2 additional injections of ANI 2 or 2 and 4 hr after training did not prolong short-term retention performance. The temporal development of impaired retention in ANI treated mice could not be accounted for by drug dosage, duration of protein synthesis inhibition, or nonspecific sickness at test. In contrast to the suggestion that protein synthesis inhibition prolongs short-term memory (Quinton, 1978), the results of this experiment indicate that short-term memory is not prolonged by antibiotic drugs that inhibit cerebral protein synthesis. All evidence seems consistent with the hypothesis that short-term memory is protein synthesis independent and that the establishment of long-term memory depends upon protein synthesis during or shortly after training. Evidence for a role of protein synthesis in memory maintenance is discussed.

  3. Impaired short-term memory for pitch in congenital amusia.

    Science.gov (United States)

    Tillmann, Barbara; Lévêque, Yohana; Fornoni, Lesly; Albouy, Philippe; Caclin, Anne

    2016-06-01

    Congenital amusia is a neuro-developmental disorder of music perception and production. The hypothesis is that the musical deficits arise from altered pitch processing, with impairments in pitch discrimination (i.e., pitch change detection, pitch direction discrimination and identification) and short-term memory. The present review article focuses on the deficit of short-term memory for pitch. Overall, the data discussed here suggest impairments at each level of processing in short-term memory tasks: starting with the encoding of the pitch information and the creation of the adequate memory trace, the retention of the pitch traces over time, as well as the recollection and comparison of the stored information with newly incoming information. These impairments have been related to altered brain responses in a distributed fronto-temporal network, associated with decreased connectivity between these structures, as well as abnormalities in the connectivity between the two auditory cortices. In contrast, amusic participants' short-term memory abilities for verbal material are preserved. These findings show that short-term memory deficits in congenital amusia are specific to pitch, suggesting a pitch-memory system that is, at least partly, separated from verbal memory. This article is part of a Special Issue entitled SI: Auditory working memory. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. The uranium industry: long-term planning for short-term competition

    International Nuclear Information System (INIS)

    Vottero, X.; Georges Capus, G.

    2001-01-01

    Long-term planning for short-term competition: today, uranium producers face new challenges in terms of both production (new regulatory, environmental and social constraints) and market conditions (new sources of uranium supply, very low prices and tough competition). In such a context, long-term planning is not just a prerequisite to survive in the nuclear fuel cycle industry. In fact, it also contributes to sustaining nuclear electricity generation facing fierce competition from other energy sources in increasingly deregulated markets. Firstly, the risk of investing in new mining projects in western countries is growing because of, on the one hand, very erratic market conditions and, on the other hand, increasingly lengthy, complex and unpredictable regulatory conditions. Secondly, the supply of other sources of uranium (uranium derived from nuclear weapons, uranium produced in CIS countries, ...) involves other risks, mainly related to politics and commercial restrictions. Consequently, competitive uranium supply requires not only technical competence but also financial strength and good marketing capabilities in order to anticipate long-term market trends, in terms of both demand and supply. It also requires taking into account new parameters such as politics, environment, regulations, etc. Today, a supplier dedicated to the sustainable production of nuclear electricity must manage a broad range of long-term risks inherent to the procurement of uranium. Taking all these parameters into account in the context of a short-term, fast-changing market is a great challenge for the future generation. World Uranium Civilian Supply and Demand. (authors)

  5. Insensitivity of visual short-term memory to irrelevant visual information.

    Science.gov (United States)

    Andrade, Jackie; Kemps, Eva; Werniers, Yves; May, Jon; Szmalec, Arnaud

    2002-07-01

    Several authors have hypothesized that visuo-spatial working memory is functionally analogous to verbal working memory. Irrelevant background speech impairs verbal short-term memory. We investigated whether irrelevant visual information has an analogous effect on visual short-term memory, using a dynamic visual noise (DVN) technique known to disrupt visual imagery (Quinn & McConnell, 1996b). Experiment 1 replicated the effect of DVN on pegword imagery. Experiments 2 and 3 showed no effect of DVN on recall of static matrix patterns, despite a significant effect of a concurrent spatial tapping task. Experiment 4 showed no effect of DVN on encoding or maintenance of arrays of matrix patterns, despite testing memory by a recognition procedure to encourage visual rather than spatial processing. Serial position curves showed a one-item recency effect typical of visual short-term memory. Experiment 5 showed no effect of DVN on short-term recognition of Chinese characters, despite effects of visual similarity and a concurrent colour memory task that confirmed visual processing of the characters. We conclude that irrelevant visual noise does not impair visual short-term memory. Visual working memory may not be functionally analogous to verbal working memory, and different cognitive processes may underlie visual short-term memory and visual imagery.

  6. On the long-term seismic hazard analysis in the Zhangjiakou Penglai seismotectonic zone, China

    Science.gov (United States)

    Fu, Zhengxiang; Liu, Jie; Liu, Guiping

    2004-10-01

    The Zhangjiakou-Penglai seismotectonic zone (ZPSZ) lies in the northern part of North China and extends along the Zhangjiakou-Beijing-Tianjin-Bohai Bay-Penglai-Yellow Sea trend. It is about 900 km long and some 250 km wide in a northwest direction. The great Sanhe-Pinggu (MS=8.0) earthquake of September 1679 and the Tangshan (MS=7.8) earthquake of July 1976 caused serious economic losses and loss of life. According to differences in crustal structure and the regional tectonic stress field, the ZPSZ is divided into western and eastern segments by the 117°E line for long-term seismic hazard analysis. An analysis of the Gutenberg-Richter frequency-magnitude relation and of the time process of historic and recent earthquakes along the eastern and western segments shows that the earthquake activity obeys a Poisson process, and the calculations indicate that the occurrence probability of MS=6.0-6.9 earthquakes is 0.77-0.83 in the eastern segment and the occurrence probability of MS=7.0-7.9 earthquakes is 0.78-0.80 in the western segment of the ZPSZ during the period from 2005 to 2015.
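
    Under the Poisson assumption used here, an occurrence probability over a fixed window follows directly from the annual rate implied by a Gutenberg-Richter relation, P = 1 - exp(-rate * T). The sketch below shows that arithmetic; the a and b values are invented placeholders chosen only so the result lands near the probabilities quoted in the abstract, not the parameters actually fitted for the ZPSZ.

    ```python
    import math

    def gr_annual_rate(a, b, m_low, m_high):
        """Annual rate of events with m_low <= M < m_high from a Gutenberg-Richter
        relation log10 N(>=M) = a - b*M."""
        return 10 ** (a - b * m_low) - 10 ** (a - b * m_high)

    def poisson_occurrence_prob(annual_rate, years):
        """Probability of at least one event in the window, assuming a Poisson process."""
        return 1.0 - math.exp(-annual_rate * years)

    # Placeholder segment parameters and the 2005-2015 (11-year) forecast window
    rate = gr_annual_rate(a=3.75, b=0.75, m_low=6.0, m_high=7.0)   # ~0.15 events/yr
    print(round(poisson_occurrence_prob(rate, years=11), 2))       # ~0.8
    ```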

  7. On the electric field transient anomaly observed at the time of the Kythira M=6.9 earthquake on January 2006

    Directory of Open Access Journals (Sweden)

    M. R. Varley

    2007-11-01

    The study of the Earth's electromagnetic fields prior to the occurrence of strong seismic events has repeatedly revealed cases where transient anomalies, often deemed possible earthquake precursors, were observed in electromagnetic field recordings from measurements carried out at the surface, in the atmosphere and in near space. In an attempt to understand the nature of such signals, several models have been proposed based upon the characteristics exhibited by the observed anomalies and different possible generation mechanisms, with electric earthquake precursors (EEP) appearing to be the main candidates for short-term earthquake precursors. This paper discusses the detection of a ULF electric field transient anomaly and its identification as a possible electric earthquake precursor accompanying the Kythira M=6.9 earthquake that occurred on 8 January 2006.

  8. Remembering over the short-term: the case against the standard model.

    Science.gov (United States)

    Nairne, James S

    2002-01-01

    Psychologists often assume that short-term storage is synonymous with activation, a mnemonic property that keeps information in an immediately accessible form. Permanent knowledge is activated, as a result of on-line cognitive processing, and an activity trace is established "in" short-term (or working) memory. Activation is assumed to decay spontaneously with the passage of time, so a refreshing process-rehearsal-is needed to maintain availability. Most of the phenomena of immediate retention, such as capacity limitations and word length effects, are assumed to arise from trade-offs between rehearsal and decay. This "standard model" of how we remember over the short-term still enjoys considerable popularity, although recent research questions most of its main assumptions. In this chapter I review the recent research and identify the empirical and conceptual problems that plague traditional conceptions of short-term memory. Increasingly, researchers are recognizing that short-term retention is cue driven, much like long-term memory, and that neither rehearsal nor decay is likely to explain the particulars of short-term forgetting.

  9. Gummed-up memory: chewing gum impairs short-term recall.

    Science.gov (United States)

    Kozlov, Michail D; Hughes, Robert W; Jones, Dylan M

    2012-01-01

    Several studies have suggested that short-term memory is generally improved by chewing gum. However, we report the first studies to show that chewing gum impairs short-term memory for both item order and item identity. Experiment 1 showed that chewing gum reduces serial recall of letter lists. Experiment 2 indicated that chewing does not simply disrupt vocal-articulatory planning required for order retention: Chewing equally impairs a matched task that required retention of list item identity. Experiment 3 demonstrated that manual tapping produces a similar pattern of impairment to that of chewing gum. These results clearly qualify the assertion that chewing gum improves short-term memory. They also pose a problem for short-term memory theories asserting that forgetting is based on domain-specific interference given that chewing does not interfere with verbal memory any more than tapping. It is suggested that tapping and chewing reduce the general capacity to process sequences.

  10. A Short Term Analogue Memory

    DEFF Research Database (Denmark)

    Shah, Peter Jivan

    1992-01-01

    A short term analogue memory is described. It is based on a well-known sample-hold topology in which leakage currents have been minimized partly by circuit design and partly by layout techniques. Measurements on a test chip implemented in a standard 2.4 micron analogue CMOS process show a droop...

  11. Coupled large earthquakes in the Baikal rift system: Response to bifurcations in nonlinear resonance hysteresis

    Directory of Open Access Journals (Sweden)

    Anatoly V. Klyuchevskii

    2013-11-01

    The current lithospheric geodynamics and tectonophysics in the Baikal rift are discussed in terms of a nonlinear oscillator with dissipation. The nonlinear oscillator model is applicable to the area because stress change shows up as quasi-periodic inharmonic oscillations at rifting attractor structures (RAS). The model is consistent with the space-time patterns of regional seismicity in which coupled large earthquakes, proximal in time but distant in space, may be a response to bifurcations in nonlinear resonance hysteresis in a system of three oscillators corresponding to the rifting attractors. The space-time distribution of coupled MLH > 5.5 events has been stable for the period of instrumental seismicity, with the largest events occurring in pairs, one shortly after another, at the two ends of the rift system and with couples of smaller events in the central part of the rift. The event couples appear as peaks of earthquake 'migration' rate with an approximately decadal periodicity. Thus the energy accumulated at the RAS is released in coupled large events by the mechanism of nonlinear oscillators with dissipation. This new knowledge, with special focus on space-time rifting attractors and bifurcations in a system of nonlinear resonance hysteresis, may be of theoretical and practical value for earthquake prediction. Extrapolation of the results into the near future indicates the probability of such a bifurcation in the region, i.e., there is a growing risk of a pending M ≈ 7 coupled event within a few years.

  12. Retention interval affects visual short-term memory encoding.

    Science.gov (United States)

    Bankó, Eva M; Vidnyánszky, Zoltán

    2010-03-01

    Humans can efficiently store fine-detailed facial emotional information in visual short-term memory for several seconds. However, an unresolved question is whether the same neural mechanisms underlie high-fidelity short-term memory for emotional expressions at different retention intervals. Here we show that retention interval affects the neural processes of short-term memory encoding using a delayed facial emotion discrimination task. The early sensory P100 component of the event-related potentials (ERP) was larger in the 1-s interstimulus interval (ISI) condition than in the 6-s ISI condition, whereas the face-specific N170 component was larger in the longer ISI condition. Furthermore, the memory-related late P3b component of the ERP responses was also modulated by retention interval: it was reduced in the 1-s ISI as compared with the 6-s condition. The present findings cannot be explained based on differences in sensory processing demands or overall task difficulty because there was no difference in the stimulus information and subjects' performance between the two different ISI conditions. These results reveal that encoding processes underlying high-precision short-term memory for facial emotional expressions are modulated depending on whether information has to be stored for one or for several seconds.

  13. Promise and problems in using stress triggering models for time-dependent earthquake hazard assessment

    Science.gov (United States)

    Cocco, M.

    2001-12-01

    Earthquake stress changes can promote failures on favorably oriented faults and modify the seismicity pattern over broad regions around the causative faults. Because the induced stress perturbations modify the rate of production of earthquakes, they alter the probability of seismic events in a specified time window. Comparing the Coulomb stress changes with the seismicity rate changes and aftershock patterns can statistically test the role of stress transfer in earthquake occurrence. The interaction probability may represent a further tool to test the stress trigger or shadow model. The probability model, which incorporates stress transfer, has the main advantage of including the contributions of the induced stress perturbation (a static step in its present formulation), the loading rate and the fault constitutive properties. Because the mechanical conditions of the secondary faults at the time of application of the induced load are largely unknown, stress triggering can only be tested on fault populations and not on single earthquake pairs with a specified time delay. The interaction probability can represent the most suitable tool to test the interaction between large magnitude earthquakes. Despite these important implications and the stimulating perspectives, there exist problems in understanding earthquake interaction that should motivate future research but at the same time limit its immediate social applications. One major limitation is that we are unable to predict how and if the induced stress perturbations modify the ratio of small to large magnitude earthquakes. In other words, we cannot distinguish between a change in this ratio in favor of small events or of large magnitude earthquakes, because the interaction probability is independent of magnitude. Another problem concerns the reconstruction of the stressing history. The interaction probability model is based on the response to a static step; however, we know that other processes contribute to

  14. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Science.gov (United States)

    Jackson, D. D.

    2017-12-01

    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include temporal behavior of seismic and tectonic moment rates; shape of the earthquake magnitude distribution; upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; value of crustal rigidity; and relation between faults at depth and their surface fault traces, to name just a few. In this report I'll estimate the quantitative implications for estimating large earthquake rate. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes

  15. FORESHOCKS AND TIME-DEPENDENT EARTHQUAKE HAZARD ASSESSMENT IN SOUTHERN CALIFORNIA.

    Science.gov (United States)

    Jones, Lucile M.

    1985-01-01

    The probability that an earthquake in southern California (M ≥ 3.0) will be followed by an earthquake of larger magnitude within 5 days and 10 km (i.e., will be a foreshock) is 6 ± 0.5 per cent (1 S.D.), and is not significantly dependent on the magnitude of the possible foreshock between M = 3 and M = 5. The probability that an earthquake will be followed by an M ≥ 5.0 main shock, however, increases with the magnitude of the foreshock from less than 1 per cent at M ≥ 3 to 6.5 ± 2.5 per cent (1 S.D.) at M ≥ 5. The main shock will most likely occur in the first hour after the foreshock, and the probability that a main shock will occur in the first hour decreases with elapsed time from the occurrence of the possible foreshock by approximately the inverse of time. Thus, the occurrence of an earthquake of M ≥ 3.0 in southern California increases the earthquake hazard within a small space-time window several orders of magnitude above the normal background level.
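
    The "inverse of time" statement above can be turned into a simple hazard sketch by spreading the overall foreshock-to-mainshock probability over the 5-day window with a modified-Omori 1/(t + c) kernel. The 6% total comes from the abstract; the c value and the function itself are illustrative assumptions, not the paper's method.

    ```python
    import numpy as np

    def mainshock_prob_in_window(t_start_hr, t_end_hr, total_prob=0.06,
                                 window_hr=5 * 24, c_hr=0.05):
        """Share of the total foreshock-to-mainshock probability that falls between
        t_start_hr and t_end_hr after the candidate foreshock, assuming the
        conditional mainshock rate decays as 1/(t + c) over the 5-day window."""
        norm = np.log((window_hr + c_hr) / c_hr)           # integral of 1/(t+c) over 0..window
        frac = np.log((t_end_hr + c_hr) / (t_start_hr + c_hr)) / norm
        return total_prob * frac

    print(mainshock_prob_in_window(0, 1))     # ~0.023: most of the risk falls in the first hour
    print(mainshock_prob_in_window(24, 48))   # ~0.005: much smaller a day later
    ```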

  16. Comparison of Sugammadex and Neostigmine in Short Term Surgery

    Directory of Open Access Journals (Sweden)

    Fatih Koc

    2014-03-01

    Aim: This study compared the efficacy and cost-effectiveness of sugammadex and neostigmine for reversal of neuromuscular blockade induced by rocuronium for short term elective surgery. Material and Method: After written informed consent, 33 patients aged 18–65, ASA I-III, who were undergoing short term surgery (

  17. Long-term Postseismic Deformation Following the 1964 Alaska Earthquake

    Science.gov (United States)

    Freymueller, J. T.; Cohen, S. C.; Hreinsdöttir, S.; Suito, H.

    2003-12-01

    Geodetic data provide a rich data set describing the postseismic deformation that followed the 1964 Alaska earthquake (Mw 9.2). This is particularly true for vertical deformation, since tide gauges and leveling surveys provide extensive spatial coverage. Leveling was carried out over all of the major roads of Alaska in 1964-65, and over the last several years we have resurveyed an extensive data set using GPS. Along Turnagain Arm of Cook Inlet, south of Anchorage, a trench-normal profile was surveyed repeatedly over the first decade after the earthquake, and many of these sites have been surveyed with GPS. After using a geoid model to correct for the difference between geometric and orthometric heights, the leveling+GPS surveys reveal up to 1.25 meters of uplift since 1964. The largest uplifts are concentrated in the northern part of the Kenai Peninsula, SW of Turnagain Arm. In some places, steep gradients in the cumulative uplift measurements point to a very shallow source for the deformation. The average 1964-late 1990s uplift rates were substantially higher than the present-day uplift rates, which rarely exceed 10 mm/yr. Both leveling and tide gauge data document a decay in uplift rate over time as the postseismic signal decreases. However, even today the postseismic deformation represents a substantial portion of the total observed deformation signal, illustrating that very long-lived postseismic deformation is an important element of the subduction zone earthquake cycle for the very largest earthquakes. This is in contrast to much smaller events, such as M~8 earthquakes, for which postseismic deformation in many cases decays within a few years. This suggests that the very largest earthquakes may excite different processes than smaller events.

  18. Short-term memory for scenes with affective content

    OpenAIRE

    Maljkovic, Vera; Martini, Paolo

    2005-01-01

    The emotional content of visual images can be parameterized along two dimensions: valence (pleasantness) and arousal (intensity of emotion). In this study we ask how these distinct emotional dimensions affect the short-term memory of human observers viewing a rapid stream of images and trying to remember their content. We show that valence and arousal modulate short-term memory as independent factors. Arousal influences dramatically the average speed of data accumulation in memory: Higher aro...

  19. Attention restores discrete items to visual short-term memory.

    Science.gov (United States)

    Murray, Alexandra M; Nobre, Anna C; Clark, Ian A; Cravo, André M; Stokes, Mark G

    2013-04-01

    When a memory is forgotten, is it lost forever? Our study shows that selective attention can restore forgotten items to visual short-term memory (VSTM). In our two experiments, all stimuli presented in a memory array were designed to be equally task relevant during encoding. During the retention interval, however, participants were sometimes given a cue predicting which of the memory items would be probed at the end of the delay. This shift in task relevance improved recall for that item. We found that this type of cuing improved recall for items that otherwise would have been irretrievable, providing critical evidence that attention can restore forgotten information to VSTM. Psychophysical modeling of memory performance has confirmed that restoration of information in VSTM increases the probability that the cued item is available for recall but does not improve the representational quality of the memory. We further suggest that attention can restore discrete items to VSTM.

  20. Foreshock activity and its probabilistic relation to earthquake occurrence in Albania and the surrounding area

    Directory of Open Access Journals (Sweden)

    K. Irikura

    1999-06-01

    We investigate some characteristics of the foreshock activity of moderate and large earthquakes which occurred in the present century in Albania and the surrounding area. Using a prediction algorithm based on possible foreshocks, we obtained a probabilistic relation between possible foreshocks and mainshocks. From documentary and instrumental data for the period 1901-1994 for the area between 39.0°-43.0°N and 18.5°-21.5°E we evaluated the probability of the occurrence of mainshocks immediately after their possible foreshocks. The result shows that the probability that mainshocks with magnitude M ≥ 6.0 are preceded by a foreshock with magnitude M ≥ 4.4, distance ≤ about 50 km and time ≤ 10 days is 38% (6/16). The probability that one earthquake with M ≥ 4.4 will be followed by a larger earthquake with M ≥ 6.0 within about 50 km and 10 days is 1.3% (6/468), but the probability increases to 33% (1/3) if 7 earthquakes with M ≥ 4.4 occur within about 50 km and 10 days. From instrumental data for the period 1971-1994, the probability that mainshocks with M ≥ 5.0 are preceded by a foreshock with magnitude M ≥ 4.0 is 33% (5/15). The probability that one earthquake with M ≥ 4.0 will be followed by a larger earthquake with M ≥ 5.0 within about 50 km and 10 days is 1.9% (5/262), but the probability increases to 5.6% (1/18) if 3 earthquakes with M ≥ 4.0 occur within about 50 km and 10 days. We also found a regional variation of foreshock activity, with activity decreasing from the Vlora-Elbasani-Dibra transversal seismic belt to the Ionian-Adriatic seismic zone to the interior part of the Albania seismic zone.
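
    Because these conditional probabilities come from small counts (6/16, 6/468, ...), it is worth attaching sampling uncertainty to them. The sketch below does this with a Wilson score interval; the counts are quoted from the abstract, while the interval method and function name are an illustrative choice of ours, not part of the paper.

    ```python
    from math import sqrt

    def wilson_interval(successes, trials, z=1.96):
        """Approximate 95% Wilson score interval for a binomial proportion."""
        p = successes / trials
        denom = 1.0 + z ** 2 / trials
        centre = (p + z ** 2 / (2 * trials)) / denom
        half = z * sqrt(p * (1 - p) / trials + z ** 2 / (4 * trials ** 2)) / denom
        return centre - half, centre + half

    # P(M>=6.0 mainshock preceded by an M>=4.4 foreshock), reported as 38% (6/16)
    print(wilson_interval(6, 16))    # roughly (0.18, 0.61): wide, with only 16 mainshocks
    # P(an M>=4.4 event is followed by an M>=6.0 within ~50 km and 10 days) = 1.3% (6/468)
    print(wilson_interval(6, 468))   # roughly (0.006, 0.028)
    ```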

  1. LANGUAGE REPETITION AND SHORT-TERM MEMORY: AN INTEGRATIVE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Steve Majerus

    2013-07-01

    Short-term maintenance of verbal information is a core factor of language repetition, especially when reproducing multiple or unfamiliar stimuli. Many models of language processing locate the verbal short-term maintenance function in the left posterior superior temporo-parietal area and its connections with the inferior frontal gyrus. However, research in the field of short-term memory has implicated bilateral fronto-parietal networks, involved in attention and serial order processing, as being critical for the maintenance and reproduction of verbal sequences. We present here an integrative framework aimed at bridging research in the language processing and short-term memory fields. This framework considers verbal short-term maintenance as an emergent function resulting from synchronized and integrated activation in dorsal and ventral language processing networks as well as fronto-parietal attention and serial order processing networks. To-be-maintained item representations are temporarily activated in the dorsal and ventral language processing networks, novel phoneme and word serial order information is proposed to be maintained via a right fronto-parietal serial order processing network, and activation in these different networks is proposed to be coordinated and maintained via a left fronto-parietal attention processing network. This framework provides new perspectives for our understanding of information maintenance at the nonword-, word- and sentence-level as well as of verbal maintenance deficits in case of brain injury.

  2. Language repetition and short-term memory: an integrative framework.

    Science.gov (United States)

    Majerus, Steve

    2013-01-01

    Short-term maintenance of verbal information is a core factor of language repetition, especially when reproducing multiple or unfamiliar stimuli. Many models of language processing locate the verbal short-term maintenance function in the left posterior superior temporo-parietal area and its connections with the inferior frontal gyrus. However, research in the field of short-term memory has implicated bilateral fronto-parietal networks, involved in attention and serial order processing, as being critical for the maintenance and reproduction of verbal sequences. We present here an integrative framework aimed at bridging research in the language processing and short-term memory fields. This framework considers verbal short-term maintenance as an emergent function resulting from synchronized and integrated activation in dorsal and ventral language processing networks as well as fronto-parietal attention and serial order processing networks. To-be-maintained item representations are temporarily activated in the dorsal and ventral language processing networks, novel phoneme and word serial order information is proposed to be maintained via a right fronto-parietal serial order processing network, and activation in these different networks is proposed to be coordinated and maintained via a left fronto-parietal attention processing network. This framework provides new perspectives for our understanding of information maintenance at the non-word-, word- and sentence-level as well as of verbal maintenance deficits in case of brain injury.

  3. Short-horizon regulation for long-term investors

    NARCIS (Netherlands)

    Shi, Z.; Werker, B.J.M.

    2012-01-01

    We study the effects of imposing repeated short-horizon regulatory constraints on long-term investors. We show that Value-at-Risk and Expected Shortfall constraints, when imposed dynamically, lead to similar optimal portfolios and wealth distributions. We also show that, in utility terms, the costs

  4. Hotspots, Lifelines, and the SAFRR Haywired Earthquake Sequence

    Science.gov (United States)

    Ratliff, J. L.; Porter, K.

    2014-12-01

    Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.

  5. Temporal Variation of Tectonic Tremor Activity Associated with Nearby Earthquakes

    Science.gov (United States)

    Chao, K.; Van der Lee, S.; Hsu, Y. J.; Pu, H. C.

    2017-12-01

    Tectonic tremor and slow slip events, located downdip from the seismogenic zone, hold the key to recurring patterns of typical earthquakes. Several findings of slow aseismic slip during the prenucleation processes of nearby earthquakes have provided new insight into the study of the stress transfer of slow earthquakes in fault zones prior to megathrust earthquakes. However, how tectonic tremor is associated with the occurrence of nearby earthquakes remains unclear. To enhance our understanding of the stress interaction between tremor and earthquakes, we developed an algorithm for the automatic detection and location of tectonic tremor in the collisional tectonic environment of Taiwan. Our analysis of a three-year data set indicates a short-term increase in the tremor rate starting 19 days before the 2010 ML6.4 Jiashian main shock (Chao et al., JGR, 2017). Around the time when the tremor rate began to rise, one GPS station recorded a flip in its direction of motion. We hypothesize that tremor is driven by a slow-slip event that preceded the occurrence of the shallower nearby main shock, even though the inferred slip is too small to be observed by all GPS stations. To better quantify the necessary condition for tremor to respond to nearby earthquakes, we obtained a 13-year ambient tremor catalog from 2004 to 2016 in the same region. We examine the spatiotemporal relationship between tremor and 37 ML>=5.0 (seven events with ML>=6.0) nearby earthquakes located within 0.5 degrees of the active tremor sources. The findings from this study can enhance our understanding of the interaction among tremor, slow slip, and nearby earthquakes in high seismic hazard regions.

  6. Multifractals embedded in short time series: An unbiased estimation of probability moment

    Science.gov (United States)

    Qiu, Lu; Yang, Tianguang; Yin, Yanhua; Gu, Changgui; Yang, Huijie

    2016-12-01

    An exact estimation of probability moments is the basis for several essential concepts, such as the multifractals, the Tsallis entropy, and the transfer entropy. By means of approximation theory we propose a new method called factorial-moment-based estimation of probability moments. Theoretical prediction and computational results show that it can provide an unbiased estimation of the probability moments of continuous order. Calculations on a probability redistribution model verify that it can extract multifractal behaviors exactly from several hundred recordings. Its power in monitoring the evolution of scaling behaviors is exemplified by two empirical cases, i.e., the gait time series for fast, normal, and slow trials of a healthy volunteer, and the closing price series for the Shanghai stock market. Using short time series of several hundred points, a comparison with well-established tools displays significant advantages of its performance over the other methods. The factorial-moment-based estimation can evaluate correctly the scaling behaviors in a scale range about three generations wider than the multifractal detrended fluctuation analysis and the basic estimation. The estimation of the partition function given by the wavelet transform modulus maxima has unacceptable fluctuations. Besides the scaling invariance focused on in the present paper, the proposed factorial moment of continuous order can find various uses, such as finding nonextensive behaviors of a complex system and reconstructing the causality relationship network between elements of a complex system.
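
    For integer orders the idea behind a factorial-moment estimator can be sketched directly: under multinomial sampling, the falling factorial of the box counts gives an unbiased estimate of sum_i p_i^q, whereas the naive plug-in estimate built from (n_i/N)^q is biased for short series. The sketch below covers only integer q on toy data; the paper's contribution, the extension to continuous order, is not reproduced here, and all names and values are illustrative.

    ```python
    import numpy as np

    def falling_factorial(x, q):
        """x (x-1) ... (x-q+1) for integer q >= 1."""
        return np.prod([x - k for k in range(q)], axis=0)

    def naive_moment(counts, q):
        """Biased 'plug-in' estimate sum_i (n_i / N)**q."""
        p_hat = counts / counts.sum()
        return np.sum(p_hat ** q)

    def factorial_moment(counts, q):
        """Unbiased estimate of sum_i p_i**q for integer q under multinomial sampling:
        sum_i (n_i)_q / (N)_q."""
        N = counts.sum()
        return np.sum(falling_factorial(counts.astype(float), q)) / falling_factorial(float(N), q)

    # Toy check on a known distribution with a short series of 300 samples
    rng = np.random.default_rng(0)
    p = np.array([0.5, 0.25, 0.125, 0.125])
    counts = rng.multinomial(300, p)
    q = 3
    print(np.sum(p ** q), naive_moment(counts, q), factorial_moment(counts, q))
    ```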

  7. Verbal short-term memory and vocabulary learning in polyglots.

    Science.gov (United States)

    Papagno, C; Vallar, G

    1995-02-01

    Polyglot and non-polyglot Italian subjects were given tests assessing verbal (phonological) and visuo-spatial short-term and long-term memory, general intelligence, and vocabulary knowledge in their native language. Polyglots had a superior level of performance in verbal short-term memory tasks (auditory digit span and nonword repetition) and in a paired-associate learning test, which assessed the subjects' ability to acquire new (Russian) words. By contrast, the two groups had comparable performance levels in tasks assessing general intelligence, visuo-spatial short-term memory and learning, and paired-associate learning of Italian words. These findings, which are in line with neuropsychological and developmental evidence, as well as with data from normal subjects, suggest a close relationship between the capacity of phonological memory and the acquisition of foreign languages.

  8. The stimulation of hematosis on short-term and prolonged irradiation

    International Nuclear Information System (INIS)

    Tukhtaev, T.M.

    1978-01-01

    This book studies the stimulation of hematosis under short-term and prolonged irradiation, the pathogenetic mechanisms of damage to and recovery of hematosis in critical radiation sickness, and the action of hematosis stimulators under short-term irradiation conditions

  9. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region. However, the probability of the anomalies appearing and their extent in the magnetically conjugated region are smaller than near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  10. Rapid effects of estrogens on short-term memory: Possible mechanisms.

    Science.gov (United States)

    Paletta, Pietro; Sheppard, Paul A S; Matta, Richard; Ervin, Kelsy S J; Choleris, Elena

    2018-06-01

    Estrogens affect learning and memory through rapid and delayed mechanisms. Here we review studies on rapid effects on short-term memory. Estradiol rapidly improves social and object recognition memory, spatial memory, and social learning when administered systemically. The dorsal hippocampus mediates estrogen rapid facilitation of object, social and spatial short-term memory. The medial amygdala mediates rapid facilitation of social recognition. The three estrogen receptors, α (ERα), β (ERβ) and the G-protein coupled estrogen receptor (GPER), appear to play different roles depending on the task and brain region. Both ERα and GPER agonists rapidly facilitate short-term social and object recognition and spatial memory when administered systemically or into the dorsal hippocampus, and facilitate social recognition in the medial amygdala. Conversely, only GPER can facilitate social learning after systemic treatment, and an ERβ agonist only rapidly improved short-term spatial memory when given systemically or into the hippocampus, but also facilitates social recognition in the medial amygdala. Investigations into the mechanisms behind estrogens' rapid effects on short-term memory showed an involvement of the extracellular signal-regulated kinase (ERK) and the phosphoinositide 3-kinase (PI3K) pathways. Recent evidence also showed that estrogens interact with the neuropeptide oxytocin in rapidly facilitating social recognition. Estrogens can increase the production and/or release of oxytocin and other neurotransmitters, such as dopamine and acetylcholine. Therefore, it is possible that estrogens' rapid effects on short-term memory may occur through the regulation of various neurotransmitters, although more research is needed on these interactions as well as on the mechanisms of estrogens' actions on short-term memory. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Understanding Short-Term Nonmigrating Tidal Variability in the Ionospheric Dynamo Region from SABER Using Information Theory and Bayesian Statistics

    Science.gov (United States)

    Kumari, K.; Oberheide, J.

    2017-12-01

    Nonmigrating tidal diagnostics of SABER temperature observations in the ionospheric dynamo region reveal a large amount of variability on time-scales of a few days to weeks. In this paper, we discuss the physical reasons for the observed short-term tidal variability using a novel approach based on information theory and Bayesian statistics. We diagnose short-term tidal variability as a function of season, QBO, ENSO, solar cycle and other drivers using time-dependent probability density functions, Shannon entropy and Kullback-Leibler divergence. The statistical significance of the approach and its predictive capability are exemplified using SABER tidal diagnostics, with emphasis on the responses to the QBO and solar cycle. Implications for F-region plasma density will be discussed.
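
    The two information-theoretic quantities named here are straightforward to compute from binned amplitude samples. The sketch below shows the basic definitions on toy data; the bin edges, sample distributions and function names are illustrative assumptions and are not the actual SABER diagnostics.

    ```python
    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy (bits) of a discrete probability distribution."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def kl_divergence(p, q, eps=1e-12):
        """Kullback-Leibler divergence D(p || q) in bits between two binned PDFs."""
        p = np.asarray(p, float) + eps
        q = np.asarray(q, float) + eps
        p, q = p / p.sum(), q / q.sum()
        return np.sum(p * np.log2(p / q))

    # Toy example: compare binned tidal-amplitude distributions from two seasons
    rng = np.random.default_rng(1)
    bins = np.linspace(0, 20, 21)                     # amplitude bins in K (illustrative)
    winter, _ = np.histogram(rng.gamma(4.0, 2.0, 500), bins=bins)
    summer, _ = np.histogram(rng.gamma(3.0, 2.5, 500), bins=bins)
    p, q = winter / winter.sum(), summer / summer.sum()
    print(shannon_entropy(p), kl_divergence(p, q))
    ```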

  12. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare this with social science research about successful crisis communication, to create recommendations for future advisories

  13. CN earthquake prediction algorithm and the monitoring of the future strong Vrancea events

    International Nuclear Information System (INIS)

    Moldoveanu, C.L.; Radulian, M.; Novikova, O.V.; Panza, G.F.

    2002-01-01

    The strong earthquakes originating at intermediate depth in the Vrancea region (located in the SE corner of the highly bent Carpathian arc) represent one of the most important natural disasters able to induce heavy effects (a high toll of casualties and extensive damage) in the Romanian territory. The occurrence of these earthquakes is irregular, but not infrequent. Their effects are felt over a large territory, from Central Europe to Moscow and from Greece to Scandinavia. The largest cultural and economic center exposed to the seismic risk due to the Vrancea earthquakes is Bucharest. This metropolitan area (230 km² wide) is characterized by the presence of 2.5 million inhabitants (10% of the country's population) and by a considerable number of high-risk structures and infrastructures. The best way to face strong earthquakes is to mitigate the seismic risk by using the two possible complementary approaches represented by (a) the antiseismic design of structures and infrastructures (able to support strong earthquakes without significant damage), and (b) strong earthquake prediction (in terms of alarm intervals declared for long-, intermediate- or short-term space- and time-windows). Intermediate-term medium-range earthquake prediction represents the most realistic target to be reached at the present state of knowledge. The alarm declared in this case extends over a time window of about one year or more, and a space window of a few hundred kilometers. In the case of Vrancea events the spatial uncertainty is much less, about 100 km. The main measures for the mitigation of the seismic risk allowed by intermediate-term medium-range prediction are: (a) verification of the stability of buildings and infrastructures, with reinforcement measures when required, (b) elaboration of emergency plans of action, (c) scheduling of the main actions required in order to restore the normality of social and economic life after the earthquake. The paper presents the

  14. Seismic‐hazard forecast for 2016 including induced and natural earthquakes in the central and eastern United States

    Science.gov (United States)

    Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.

    2016-01-01

    The U.S. Geological Survey (USGS) has produced a one‐year (2016) probabilistic seismic‐hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes that are constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short‐term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground‐motion models. These alternatives represent uncertainties on how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. The alternative submodel hazard maps both depict high hazard and these are combined in the final model. Results depict several ground‐shaking measures as well as intensity and include maps showing a high‐hazard level (1% probability of exceedance in 1 year or greater). Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north‐central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in
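
    To make the sensitivity tests concrete, the sketch below (not the USGS model) shows how the assumed b-value and minimum magnitude change the annual exceedance rate implied by a Gutenberg-Richter relation and the corresponding one-year Poisson probability; the a-value is a made-up placeholder.

      import numpy as np

      def annual_rate(m_min, a_value, b_value):
          """Gutenberg-Richter annual rate of events with M >= m_min: 10**(a - b*M)."""
          return 10.0 ** (a_value - b_value * m_min)

      def exceedance_probability(rate, years=1.0):
          """Poisson probability of at least one event in the given time window."""
          return 1.0 - np.exp(-rate * years)

      a = 4.0  # hypothetical catalog productivity
      for b in (1.0, 1.3):            # declustered vs. full-catalog b-value
          for m_min in (4.0, 4.7):    # alternative minimum magnitudes
              r = annual_rate(m_min, a, b)
              print(f"b={b}, Mmin={m_min}: rate={r:.2f}/yr, "
                    f"P(>=1 event in 1 yr)={exceedance_probability(r):.2f}")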

  15. Pattern recognition methodologies and deterministic evaluation of seismic hazard: A strategy to increase earthquake preparedness

    International Nuclear Information System (INIS)

    Peresan, Antonella; Panza, Giuliano F.; Gorshkov, Alexander I.; Aoudia, Abdelkrim

    2001-05-01

    Several algorithms, structured according to a general pattern-recognition scheme, have been developed for the space-time identification of strong events. Currently, two such algorithms are applied to the Italian territory, one for the recognition of earthquake-prone areas and the other, namely the CN algorithm, for earthquake prediction purposes. These procedures can be viewed as independent experts, hence they can be combined to better constrain the alerted seismogenic area. We examine here the possibility of integrating CN intermediate-term medium-range earthquake predictions, pattern recognition of earthquake-prone areas and deterministic hazard maps, in order to associate CN Times of Increased Probability (TIPs) with a set of appropriate ground-motion scenarios. The advantage of this procedure consists mainly in the time information provided by the predictions, which is useful for increasing the preparedness of safety measures and for indicating priorities for detailed seismic risk studies to be performed at a local scale. (author)

  16. Does paleoseismology forecast the historic rates of large earthquakes on the San Andreas fault system?

    Science.gov (United States)

    Biasi, Glenn; Scharer, Katherine M.; Weldon, Ray; Dawson, Timothy E.

    2016-01-01

    The 98-year open interval since the most recent ground-rupturing earthquake in the greater San Andreas boundary fault system would not be predicted by the quasi-periodic recurrence statistics from paleoseismic data. We examine whether the current hiatus could be explained by uncertainties in earthquake dating. Using seven independent paleoseismic records, we find that 100-year intervals may have occurred circa 1150, 1400, and 1700 AD, but such intervals occur in a third or less of sample records drawn at random. A second method, sampling from dates conditioned on the existence of a gap of varying length, suggests that century-long gaps occur 3-10% of the time. A combined record with more sites would lead to lower probabilities. Systematic data over-interpretation is considered an unlikely explanation. Instead, some form of non-stationary behaviour seems required, perhaps through long-range fault interaction. Earthquake occurrence since 1000 AD is not inconsistent with the long-term cyclicity suggested by long runs of earthquake simulators.
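
    A toy Monte Carlo in the spirit of the first method described (not the authors' implementation): draw event dates from their dating uncertainties and count how often a sampled record contains an open interval of 100 years or more. The site dates and uncertainties below are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(42)

      # Hypothetical mean event dates (AD) and 1-sigma dating uncertainties (years).
      mean_dates = np.array([1000.0, 1150.0, 1310.0, 1480.0, 1690.0, 1812.0, 1857.0])
      sigma = np.full(mean_dates.shape, 30.0)

      def has_long_gap(dates, gap=100.0):
          """True if any interval between successive sampled events is >= gap years."""
          return np.any(np.diff(np.sort(dates)) >= gap)

      n_trials = 20_000
      hits = sum(has_long_gap(rng.normal(mean_dates, sigma)) for _ in range(n_trials))
      print(f"fraction of sampled records with a >=100-yr interval: {hits / n_trials:.3f}")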

  17. Short-Term Energy Outlook: Quarterly projections. Fourth quarter 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-11-05

    The Energy Information Administration (EIA) prepares quarterly, short-term energy supply, demand, and price projections for publication in February, May, August, and November in the Short-Term Energy Outlook (Outlook). An annual supplement analyzes the performance of previous forecasts, compares recent cases with those of other forecasting services, and discusses current topics related to the short-term energy markets. (See Short-Term Energy Outlook Annual Supplement, DOE/EIA-0202.) The forecast period for this issue of the Outlook extends from the fourth quarter of 1993 through the fourth quarter of 1994. Values for the third quarter of 1993, however, are preliminary EIA estimates (for example, some monthly values for petroleum supply and disposition are derived in part from weekly data reported in the Weekly Petroleum Status Report) or are calculated from model simulations using the latest exogenous information available (for example, electricity sales and generation are simulated using actual weather data). The historical energy data are EIA data published in the Monthly Energy Review, Petroleum Supply Monthly, and other EIA publications.

  18. Category Specific Knowledge Modulate Capacity Limitations of Visual Short-Term Memory

    DEFF Research Database (Denmark)

    Dall, Jonas Olsen; Watanabe, Katsumi; Sørensen, Thomas Alrik

    2016-01-01

    We explore whether expertise can modulate the capacity of visual short-term memory, as some seem to argue that training affects the capacity of short-term memory [13] while others are not able to find this modulation [12]. We extend on a previous study [3] demonstrating expertise effects by investigating … and expert observers (Japanese university students). For both the picture and the letter condition we find no performance difference in memory capacity; however, in the critical hiragana condition we demonstrate a systematic difference relating to expertise differences between the groups. These results are in line with the theoretical interpretation that visual short-term memory reflects the sum of the reverberating feedback loops to representations in long-term memory.

  19. Soil structure interactions of eastern U.S. type earthquakes

    International Nuclear Information System (INIS)

    Chang Chen; Serhan, S.

    1991-01-01

    Two types of earthquakes have occurred in the eastern US in the past. One type is the infrequent major events such as the 1811-1812 New Madrid Earthquakes or the 1886 Charleston Earthquake. The other type is the frequent shallow earthquakes with high frequency content, short duration and high accelerations. Two eastern US nuclear power plants, V.C. Summer and Perry, went through extensive licensing efforts to obtain fuel load licenses after earthquakes of this type were recorded on site and exceeded the design bases in the region beyond 10 hertz. This paper discusses the soil-structure interactions of the latter type of earthquakes.

  20. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of humanity's earliest desires, and scientists have worked hard to predict earthquakes for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) statistical methods, and 2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors for earthquake prediction. The latter is time consuming and more costly, but neither approach has produced fully satisfactory results so far. In this paper a new method, the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method, electrical and magnetic precursors are measured in an area. The time and magnitude of a future earthquake are then calculated using electrical formulas, in particular those for electrical capacitors. By daily measurement of electrical resistance in an area, the method indicates whether or not the area is capable of producing an earthquake in the future. If the result is positive, the occurrence time and magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  1. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
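
    A minimal sketch of the kind of likelihood score such tests are built on, assuming independent Poisson rates per space-magnitude bin; the forecast rates and observed counts below are fabricated, and the actual RELM test suite is considerably more elaborate.

      import numpy as np
      from scipy.special import gammaln

      def poisson_joint_log_likelihood(forecast_rates, observed_counts):
          """Joint log-likelihood of bin counts under independent Poisson forecasts:
          sum_i [ -lambda_i + n_i*log(lambda_i) - log(n_i!) ]."""
          lam = np.asarray(forecast_rates, dtype=float)
          n = np.asarray(observed_counts, dtype=float)
          return np.sum(-lam + n * np.log(lam) - gammaln(n + 1.0))

      # Compare two hypothetical forecasts against the same observed catalog.
      observed = np.array([0, 2, 1, 0, 3])
      model_a = np.array([0.5, 1.5, 1.0, 0.2, 2.5])
      model_b = np.array([1.0, 1.0, 1.0, 1.0, 1.0])
      print("log-likelihood, model A:", poisson_joint_log_likelihood(model_a, observed))
      print("log-likelihood, model B:", poisson_joint_log_likelihood(model_b, observed))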

  2. Short-term mechanisms influencing volumetric brain dynamics

    NARCIS (Netherlands)

    Dieleman, Nikki; Koek, Huiberdina L.; Hendrikse, Jeroen

    2017-01-01

    With the use of magnetic resonance imaging (MRI) and brain analysis tools, it has become possible to measure brain volume changes up to around 0.5%. Besides long-term brain changes caused by atrophy in aging or neurodegenerative disease, short-term mechanisms that influence brain volume may exist.

  3. Potentials of short term and long term cryopreserved sperm of the ...

    African Journals Online (AJOL)

    To service the growing demand for male African giant catfish (Clarias gariepinus) broodstock for aquaculture in Nigeria, and to conserve valuable genetic resources, we improved both short-term (in a deep freezer at -35°C) and long-term cryopreservation (in liquid nitrogen at -196°C) of catfish sperm. Catfish sperm ...

  4. Possible deep fault slip preceding the 2004 Parkfield earthquake, inferred from detailed observations of tectonic tremor

    Science.gov (United States)

    Shelly, David R.

    2009-01-01

    Earthquake predictability depends, in part, on the degree to which sudden slip is preceded by slow aseismic slip. Recently, observations of deep tremor have enabled inferences of deep slow slip even when detection by other means is not possible, but these data are limited to certain areas and mostly the last decade. The region near Parkfield, California, provides a unique convergence of several years of high-quality tremor data bracketing a moderate earthquake, the 2004 magnitude 6.0 event. Here, I present detailed observations of tectonic tremor from mid-2001 through 2008 that indicate deep fault slip both before and after the Parkfield earthquake that cannot be detected with surface geodetic instruments. While there is no obvious short-term precursor, I find unidirectional tremor migration accompanied by elevated tremor rates in the 3 months prior to the earthquake, which suggests accelerated creep on the fault ∼16 km beneath the eventual earthquake hypocenter.

  5. Complex Non-volcanic Tremor in Guerrero Mexico Triggered by the 2010 Mw 8.8 Chilean Earthquake

    Science.gov (United States)

    Zigone, D.; Campillo, M.; Husker, A. L.; Kostoglodov, V.; Payero, J. S.; Frank, W.; Shapiro, N. M.; Voisin, C.; Cougoulat, G.; Cotte, N.

    2010-12-01

    In this study we analyze the tremors triggered in the Guerrero region (Mexico) by the 2010 magnitude 8.8 Chilean earthquake using mini-seismic array data from the French-Mexican G-GAP project and broadband data from the Servicio Sismologico Nacional of Mexico. The strong dynamic shaking from this earthquake produced the first triggered non-volcanic tremors (NVT) observed in Mexico so far, with at least 3 different types of tremors at different time scales. A slow slip event (SSE) was occurring at the time of the earthquake, which may have increased the probability of tremor triggering in the region. The first type of observed triggered tremor occurred during the S waves, Love waves and Rayleigh waves, as already reported in other subduction zones and continental faults (Miyazawa and Mori, 2005, 2006; Rubinstein et al., 2007; Gomberg et al., 2008; Peng et al, 2009…). The greatest amount of energy and duration accompanies the long-period Rayleigh waves, with smaller bursts during the S and Love waves. For this particular tremor we observed the dispersion of Rayleigh waves in the envelopes of the triggered tremors, which indicates a very strong modulation of the source by the passing surface wave. An unexpected short-term tremor occurred approximately one hour after the arrival of the surface waves at the coastal stations; NVT had previously been observed only at distances > 100 km inland. It also has a shorter frequency range (3-6 Hz) than other NVT (1-10 Hz) observed in the region. Finally, we observed a significant increase of so-called ambient tremor activity, with higher intensity than all triggered NVT, during the days after the earthquake. This study adds new types of tremors to the lexicon of triggered NVT observed in the world.

  6. Sensitivity of Earthquake Loss Estimates to Source Modeling Assumptions and Uncertainty

    Science.gov (United States)

    Reasenberg, Paul A.; Shostak, Nan; Terwilliger, Sharon

    2006-01-01

    Introduction: This report explores how uncertainty in an earthquake source model may affect estimates of earthquake economic loss. Specifically, it focuses on the earthquake source model for the San Francisco Bay region (SFBR) created by the Working Group on California Earthquake Probabilities. The loss calculations are made using HAZUS-MH, a publicly available computer program developed by the Federal Emergency Management Agency (FEMA) for calculating future losses from earthquakes, floods and hurricanes within the United States. The database built into HAZUS-MH includes a detailed building inventory, population data, data on transportation corridors, bridges, utility lifelines, etc. Earthquake hazard in the loss calculations is based upon expected (median value) ground motion maps called ShakeMaps calculated for the scenario earthquake sources defined in WGCEP. The study considers the effect of relaxing certain assumptions in the WG02 model, and explores the effect of hypothetical reductions in epistemic uncertainty in parts of the model. For example, it addresses questions such as what would happen to the calculated loss distribution if the uncertainty in slip rate in the WG02 model were reduced (say, by obtaining additional geologic data)? What would happen if the geometry or amount of aseismic slip (creep) on the region's faults were better known? And what would be the effect on the calculated loss distribution if the time-dependent earthquake probability were better constrained, either by eliminating certain probability models or by better constraining the inherent randomness in earthquake recurrence? The study does not consider the effect of reducing uncertainty in the hazard introduced through models of attenuation and local site characteristics, although these may have a comparable or greater effect than does source-related uncertainty. Nor does it consider sources of uncertainty in the building inventory, building fragility curves, and other assumptions

  7. Forecasting stock return volatility: A comparison between the roles of short-term and long-term leverage effects

    Science.gov (United States)

    Pan, Zhiyuan; Liu, Li

    2018-02-01

    In this paper, we extend the GARCH-MIDAS model proposed by Engle et al. (2013) to account for the leverage effect in short-term and long-term volatility components. Our in-sample evidence suggests that both short-term and long-term negative returns can cause higher future volatility than positive returns. Out-of-sample results show that the predictive ability of GARCH-MIDAS is significantly improved after taking the leverage effect into account. The leverage effect for short-term volatility component plays more important role than the leverage effect for long-term volatility component in affecting out-of-sample forecasting performance.
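
    A stylized simulation of a short-run GARCH component with a GJR-type leverage term (a sketch, not the authors' GARCH-MIDAS estimation code): the long-run component tau is held constant and all parameter values are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      alpha, beta, gamma = 0.05, 0.90, 0.06   # hypothetical ARCH, GARCH and leverage coefficients
      tau = 1.0                               # long-run (MIDAS) variance component, fixed here
      T = 1000

      g = np.ones(T)      # short-run variance component
      r = np.zeros(T)     # simulated returns
      for t in range(1, T):
          shock = r[t - 1] ** 2 / tau
          leverage = gamma * shock if r[t - 1] < 0 else 0.0   # negative returns raise volatility
          g[t] = (1 - alpha - beta - gamma / 2) + alpha * shock + leverage + beta * g[t - 1]
          r[t] = np.sqrt(tau * g[t]) * rng.standard_normal()

      print("mean short-run variance component:", g.mean())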

  8. Earthquake scaling laws for rupture geometry and slip heterogeneity

    Science.gov (United States)

    Thingbaijam, Kiran K. S.; Mai, P. Martin; Goda, Katsuichiro

    2016-04-01

    We analyze an extensive compilation of finite-fault rupture models to investigate earthquake scaling of source geometry and slip heterogeneity to derive new relationships for seismic and tsunami hazard assessment. Our dataset comprises 158 earthquakes with a total of 316 rupture models selected from the SRCMOD database (http://equake-rc.info/srcmod). We find that fault-length does not saturate with earthquake magnitude, while fault-width reveals inhibited growth due to the finite seismogenic thickness. For strike-slip earthquakes, fault-length grows more rapidly with increasing magnitude compared to events of other faulting types. Interestingly, our derived relationship falls between the L-model and W-model end-members. In contrast, both reverse and normal dip-slip events are more consistent with self-similar scaling of fault-length. However, fault-width scaling relationships for large strike-slip and normal dip-slip events, occurring on steeply dipping faults (δ~90° for strike-slip faults, and δ~60° for normal faults), deviate from self-similarity. Although reverse dip-slip events in general show self-similar scaling, the restricted growth of down-dip fault extent (with upper limit of ~200 km) can be seen for mega-thrust subduction events (M~9.0). Despite this fact, for a given earthquake magnitude, subduction reverse dip-slip events occupy relatively larger rupture area, compared to shallow crustal events. In addition, we characterize slip heterogeneity in terms of its probability distribution and spatial correlation structure to develop a complete stochastic random-field characterization of earthquake slip. We find that truncated exponential law best describes the probability distribution of slip, with observable scale parameters determined by the average and maximum slip. Applying Box-Cox transformation to slip distributions (to create quasi-normal distributed data) supports cube-root transformation, which also implies distinctive non-Gaussian slip
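
    A small sketch of drawing slip values from a truncated exponential distribution by inverse-CDF sampling, as one might when generating stochastic slip realizations; the scale and maximum slip below are arbitrary and are not the calibration derived in the paper.

      import numpy as np

      def sample_truncated_exponential(scale, s_max, size, rng):
          """Inverse-CDF sampling from an exponential distribution truncated to [0, s_max]."""
          u = rng.random(size)
          c = 1.0 - np.exp(-s_max / scale)      # probability mass retained on [0, s_max]
          return -scale * np.log(1.0 - u * c)

      rng = np.random.default_rng(7)
      slip = sample_truncated_exponential(scale=2.0, s_max=10.0, size=10_000, rng=rng)
      print("mean slip:", slip.mean(), " max slip:", slip.max())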

  9. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  10. Short term memory bowing effect is consistent with presentation rate dependent decay.

    Science.gov (United States)

    Tarnow, Eugen

    2010-12-01

    I reanalyze the free recall data of Murdock, J Exp Psychol 64(5):482-488 (1962) and Murdock and Okada, J Verbal Learn and Verbal Behav 86:263-267 (1970), which show the famous bowing effect in which initial and recent items are recalled better than intermediate items (primacy and recency effects). Recent-item recall probabilities follow a logarithmic decay with time of recall, consistent with the tagging/retagging theory. The slope of the decay increases with increasing presentation rate. The initial items, with an effectively low presentation rate, decay with the slowest logarithmic slope, explaining the primacy effect. The finding that presentation rate limits the duration of short-term memory suggests a basis for memory loss in busy adults, for the importance of slow music practice, and for long-term memory deficiencies in people with attention deficits who may be artificially increasing the presentation rates of their surroundings. A well-defined, quantitative measure of the primacy effect is introduced.
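
    A minimal sketch of fitting the logarithmic decay described above, p = a - b*ln(t), to recall probabilities by least squares; the data points are invented, not Murdock's.

      import numpy as np

      # Hypothetical recall probabilities of recent items vs. time of recall (seconds).
      t = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
      p = np.array([0.95, 0.85, 0.74, 0.62, 0.50])

      # Least-squares fit of p = a - b * ln(t), using the design matrix [1, -ln(t)].
      X = np.column_stack([np.ones_like(t), -np.log(t)])
      (a, b), *_ = np.linalg.lstsq(X, p, rcond=None)
      print(f"intercept a = {a:.3f}, logarithmic slope b = {b:.3f}")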

  11. Short-term and working memory impairments in aphasia.

    Science.gov (United States)

    Potagas, Constantin; Kasselimis, Dimitrios; Evdokimidis, Ioannis

    2011-08-01

    The aim of the present study is to investigate short-term memory and working memory deficits in aphasics in relation to the severity of their language impairment. Fifty-eight aphasic patients participated in this study. Based on language assessment, an aphasia score was calculated for each patient. Memory was assessed in two modalities, verbal and spatial. Mean scores for all memory tasks were lower than normal. Aphasia score was significantly correlated with performance on all memory tasks. Correlation coefficients for short-term memory and working memory were approximately of the same magnitude. According to our findings, severity of aphasia is related with both verbal and spatial memory deficits. Moreover, while aphasia score correlated with lower scores in both short-term memory and working memory tasks, the lack of substantial difference between corresponding correlation coefficients suggests a possible primary deficit in information retention rather than impairment in working memory. Copyright © 2011 Elsevier Ltd. All rights reserved.

  12. The Effects of the 1999 Turkish Earthquake on Young Children: Analyzing Traumatized Children's Completion of Short Stories

    Science.gov (United States)

    Oncu, Elif Celebi; Wise, Aysegul Metindogan

    2010-01-01

    The purpose of this exploratory study was to determine whether projective techniques could identify long-term consequences among children stemming from exposure to a traumatic event. The first group of children (n = 53; 26 female, 27 male) experienced 2 major earthquakes at age 7, 3 months apart, in Turkey, while a similarly matched control group…

  13. Short-Term Contract Work in Adult Education (I) and (II).

    Science.gov (United States)

    Hall, Dorothea; McMath, Patricia

    1986-01-01

    This two-part article discusses short-term project contracts for adult education staff. Part one covers implications of this trend for the service and for the staff involved. Part two looks at short-term contracts from the management viewpoint. (CH)

  14. Short-term Memory of Deep RNN

    OpenAIRE

    Gallicchio, Claudio

    2018-01-01

    The extension of deep learning towards temporal data processing is gaining increasing research interest. In this paper we investigate the properties of the state dynamics developed in successive levels of deep recurrent neural networks (RNNs) in terms of short-term memory abilities. Our results reveal interesting insights that shed light on the nature of layering as a factor of RNN design. Notably, higher layers in a hierarchically organized RNN architecture turn out to be inherently biased ...

  15. The effects of spatially varying earthquake impacts on mood and anxiety symptom treatments among long-term Christchurch residents following the 2010/11 Canterbury earthquakes, New Zealand.

    Science.gov (United States)

    Hogg, Daniel; Kingham, Simon; Wilson, Thomas M; Ardagh, Michael

    2016-09-01

    This study investigates the effects of disruptions to different community environments, community resilience and cumulated felt earthquake intensities on yearly mood and anxiety symptom treatments from the New Zealand Ministry of Health's administrative databases between September 2009 and August 2012. The sample includes 172,284 long-term residents from different Christchurch communities. Living in a better physical environment was associated with lower mood and anxiety treatment rates after the beginning of the Canterbury earthquake sequence whereas an inverse effect could be found for social community environment and community resilience. These results may be confounded by pre-existing patterns, as well as intensified treatment-seeking behaviour and intervention programmes in severely affected areas. Nevertheless, the findings indicate that adverse mental health outcomes can be found in communities with worse physical but stronger social environments or community resilience post-disaster. Also, they do not necessarily follow felt intensities since cumulative earthquake intensity did not show a significant effect. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)
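
    For intuition, a Monte Carlo sketch of a CTRW with heavy-tailed (Pareto) waiting times and Gaussian jumps, from which an empirical propagator at a fixed time can be histogrammed; the exponent and jump scale are arbitrary, and this is a simulation, not the paper's analytical solution.

      import numpy as np

      rng = np.random.default_rng(3)

      def ctrw_positions(n_walkers, t_final, tail_exponent=1.5, jump_scale=1.0):
          """Positions at t_final of CTRW walkers with Pareto waiting times (>= 1)
          and Gaussian jump lengths."""
          x = np.zeros(n_walkers)
          t = np.zeros(n_walkers)
          active = np.ones(n_walkers, dtype=bool)
          while active.any():
              wait = 1.0 + rng.pareto(tail_exponent, active.sum())
              t_new = t[active] + wait
              idx = np.flatnonzero(active)
              done = t_new > t_final                 # next renewal falls after the observation time
              x[idx[~done]] += jump_scale * rng.standard_normal((~done).sum())
              t[idx] = t_new
              active[idx[done]] = False
          return x

      positions = ctrw_positions(n_walkers=20_000, t_final=100.0)
      density, edges = np.histogram(positions, bins=60, density=True)
      print("peak of the empirical propagator:", density.max())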

  17. Seismic dynamics in advance and after the recent strong earthquakes in Italy and New Zealand

    Science.gov (United States)

    Nekrasova, A.; Kossobokov, V. G.

    2017-12-01

    We consider seismic events as a sequence of avalanches in the self-organized system of blocks-and-faults of the Earth's lithosphere and characterize earthquake series with the distribution of the control parameter of the Unified Scaling Law for Earthquakes (USLE), η = τ · 10^(B·(5 − M)) · L^C (where τ is the inter-event time, B is analogous to the Gutenberg-Richter b-value, and C is the fractal dimension of the seismic locus). A systematic analysis of earthquake series in Central Italy and New Zealand, 1993-2017, suggests the existence, in the long term, of different, rather steady levels of seismic activity characterized by near-constant values of η, which, in the mid-term, intermittently switch at times of transitions associated with strong catastrophic events. At such a transition, seismic activity may, in the short term, follow different scenarios with inter-event time scaling of different kinds, including constant, logarithmic, power-law, exponential rise/decay, or a mixture of those. The results do not support the presence of universality in seismic energy release. The observed variability of seismic activity in advance of and after strong (M6.0+) earthquakes in Italy and significant (M7.0+) earthquakes in New Zealand provides important constraints on modelling realistic earthquake sequences by geophysicists and can be used to improve local seismic hazard assessments, including earthquake forecast/prediction methodologies. The transitions of the seismic regime in Central Italy and New Zealand that started in 2016 are still in progress and require special attention and geotechnical monitoring. It would be premature to make any kind of definitive conclusion on the level of seismic hazard, which is evidently high at this particular moment in time in both regions. The study was supported by the Russian Science Foundation, Grant No. 16-17-00093.
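
    A one-line implementation of the control parameter as written above (a sketch; the numerical inputs are placeholders, not values from the study).

      def usle_eta(tau_days, magnitude, B, C, L_km):
          """USLE control parameter: eta = tau * 10**(B * (5 - M)) * L**C."""
          return tau_days * 10.0 ** (B * (5.0 - magnitude)) * L_km ** C

      # Placeholder values: 10-day inter-event time, M 5.5 event, b-like slope 0.9,
      # fractal dimension 1.2, cell size 50 km.
      print(usle_eta(tau_days=10.0, magnitude=5.5, B=0.9, C=1.2, L_km=50.0))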

  18. Cardioprotective Signature of Short-Term Caloric Restriction.

    Directory of Open Access Journals (Sweden)

    Hossein Noyan

    To understand the molecular pathways underlying the cardiac preconditioning effect of short-term caloric restriction (CR). Lifelong CR has been suggested to reduce the incidence of cardiovascular disease through a variety of mechanisms. However, prolonged adherence to a CR life-style is difficult. Here we reveal the pathways that are modulated by short-term CR, which are associated with protection of the mouse heart from ischemia. Male 10-12 wk old C57bl/6 mice were randomly assigned to an ad libitum (AL) diet with free access to regular chow, or CR, receiving 30% less food for 7 days (d), prior to myocardial infarction (MI) via permanent coronary ligation. At d8, the left ventricles (LV) of AL and CR mice were collected for Western blot, mRNA and microRNA (miR) analyses to identify cardioprotective gene expression signatures. In separate groups, infarct size, cardiac hemodynamics and the protein abundance of caspase 3 were measured at d2 post-MI. This short-term model of CR was associated with cardio-protection, as evidenced by decreased infarct size (18.5±2.4% vs. 26.6±1.7%, N=10/group; P=0.01). mRNA and miR profiles pre-MI (N=5/group) identified genes modulated by short-term CR to be associated with the circadian clock, oxidative stress, immune function, apoptosis, metabolism, angiogenesis, cytoskeleton and extracellular matrix (ECM). Western blots pre-MI revealed CR-associated increases in phosphorylated Akt and GSK3ß, reduced levels of phosphorylated AMPK and the mitochondria-related proteins PGC-1α, cytochrome C and cyclooxygenase (COX) IV, with no differences in the levels of phosphorylated eNOS or MAPK (ERK1/2; p38). The CR regimen was also associated with reduced protein abundance of cleaved caspase 3 in the infarcted heart and improved cardiac function.

  19. The stability of the international oil trade network from short-term and long-term perspectives

    Science.gov (United States)

    Sun, Qingru; Gao, Xiangyun; Zhong, Weiqiong; Liu, Nairong

    2017-09-01

    To examine the stability of the international oil trade network and explore the influence of countries and trade relationships on trade stability, we construct weighted and unweighted international oil trade networks based on complex network theory, using oil trading data between countries from 1996 to 2014. We analyze the stability of the international oil trade network (IOTN) from short-term and long-term perspectives. From the short-term perspective, we find that trade volumes play an important role in the stability. Moreover, the weighted IOTN is stable; however, the unweighted networks better reflect the actual evolution of the IOTN. From the long-term perspective, we identify trade relationships that are maintained during the whole sample period to reveal the situation of the whole international oil trade. We provide a way to quantitatively measure the stability of a complex network from short-term and long-term perspectives, which can be applied to measure and analyze the trade stability of other goods or services.
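
    One simple way to quantify short-term stability in such a network is the fraction of trade links retained from one year to the next; the networkx sketch below uses toy edge lists in place of the real trade data.

      import networkx as nx

      def yearly_network(records):
          """Directed oil-trade network from (exporter, importer, volume) records."""
          g = nx.DiGraph()
          for exporter, importer, volume in records:
              g.add_edge(exporter, importer, weight=volume)
          return g

      def link_retention(g_prev, g_next):
          """Fraction of last year's trade relationships still present this year."""
          prev_edges = set(g_prev.edges())
          return len(prev_edges & set(g_next.edges())) / len(prev_edges)

      g_1996 = yearly_network([("SAU", "USA", 300), ("RUS", "DEU", 150), ("NGA", "USA", 80)])
      g_1997 = yearly_network([("SAU", "USA", 320), ("RUS", "DEU", 140), ("VEN", "USA", 90)])
      print("link retention 1996 -> 1997:", link_retention(g_1996, g_1997))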

  20. Dissociating Measures of Consciousness from Measures of Short-Term Memory

    DEFF Research Database (Denmark)

    Sørensen, Thomas Alrik; Ásgeirsson, Árni Gunnar; Staugaard, Camilla Funch

    Often, the contents of consciousness are equated with the contents of short-term memory (or working memory), sometimes to a point where they are treated as identical entities. In the present study we aimed to investigate whether they may be modulated independently and thus dissociated from each other, asking if conscious content can simply be reduced to a cognitive process like short-term memory. In two experiments, we combined two different measures of short-term memory capacity, investigating how manipulations of set-size affect performance, with the Perceptual Awareness Scale (PAS) to measure conscious experience of the stimulus in every trial (Ramsøy & Overgaard, 2004; Overgaard & Sørensen, 2004). We trained observers to report their experience of a visual target stimulus on the four-point PAS scale, ranging from “no experience” to “clear experience”. To measure short-term memory we used …

  1. Applicability of short-term accelerated biofouling studies to predict long-term biofouling accumulation in reverse osmosis membrane systems

    KAUST Repository

    Sanawar, Huma

    2018-02-02

    Biofouling studies addressing biofouling control are mostly executed in short-term studies. It is unclear whether data collected from these experiments are representative of long-term biofouling as occurring in full-scale membrane systems. This study investigated whether short-term biofouling studies accelerated by biodegradable nutrient dosage to feed water were predictive for long-term biofouling development without nutrient dosage. Since the presence of a feed spacer has a strong effect on the degree of biofouling, this study employed six geometrically different feed spacers. Membrane fouling simulators (MFSs) were operated with the same (i) membrane, (ii) feed flow and (iii) feed water, but with feed spacers varying in geometry. For the short-term experiment, biofilm formation was enhanced by nutrient dosage to the MFS feed water, whereas no nutrient dosage was applied in the long-term experiment. Pressure drop development was monitored to characterize the extent of biofouling, while the accumulated viable biomass content at the end of the experimental run was quantified by adenosine triphosphate (ATP) measurements. The impact of feed spacer geometry on biofouling was compared for the short-term and long-term biofouling study. The results of the study revealed that the feed spacers exhibited the same biofouling behavior for (i) the short-term (9-d) study with nutrient dosage and (ii) the long-term (96-d) study without nutrient dosage. For the six different feed spacers, the accumulated viable biomass content (pg ATP.cm) was roughly the same, but the biofouling impact in terms of pressure drop increase in time was significantly different. The biofouling impact ranking of the six feed spacers was the same for the short-term and long-term biofouling studies. Therefore, it can be concluded that short-term accelerated biofouling studies in MFSs are a representative and suitable approach for the prediction of biofouling in membrane filtration systems after long-term

  2. Short-term fasting alters cytochrome P450-mediated drug metabolism in humans

    NARCIS (Netherlands)

    Lammers, Laureen A.; Achterbergh, Roos; de Vries, Emmely M.; van Nierop, F. Samuel; Klümpen, Heinz-Josef; Soeters, Maarten R.; Boelen, Anita; Romijn, Johannes A.; Mathôt, Ron A. A.

    2015-01-01

    Experimental studies indicate that short-term fasting alters drug metabolism. However, the effects of short-term fasting on drug metabolism in humans need further investigation. Therefore, the aim of this study was to evaluate the effects of short-term fasting (36 h) on P450-mediated drug

  3. First person: a mental health mission to post-earthquake El Salvador.

    Science.gov (United States)

    Katz, Craig L

    2013-09-01

    In this article the author excerpts and discusses salient quotes or moments from the journal he compiled while visiting El Salvador in February 2001 as head of Disaster Psychiatry Outreach (DPO) to assist survivors of a major earthquake. This case discussion of a single disaster mental health response exemplifies key issues related to both short and long term mental health service delivery to disaster affected communities. Copyright © 2013 Elsevier Inc. All rights reserved.

  4. Informal payments for healthcare services and short-term effects of the introduction of visit fee on these payments in Hungary.

    Science.gov (United States)

    Baji, Petra; Pavlova, Milena; Gulácsi, László; Zsófia, Homolyáné Csete; Groot, Wim

    2012-01-01

    The objective of this paper is to study the short-term effects of the introduction of the visit fee in Hungary in 2007 on informal patient payments. We present the pattern of informal payments in primary, out-patient specialist and in in-patient care in the period before and shortly after the visit fee was introduced. We also analyse whether in the short run, the introduction of visit fee decreased the probability of paying informally. For the analysis, we use a dataset for a representative sample of 2500 respondents collected in 2007 shortly after the introduction of the visit fee, which contains data on informal payments for healthcare services. According to our results, 9% of the patients paid informally during their last visit to GP (2 Euros on average), 14% paid informally for specialist care (35 Euros on average) and 50% paid informally for hospitalisation (58 Euros on average). We find a significant reduction in the probability of paying informally only for elderly patients in case of in-patient care. Our results suggest that informal payments are widely spread in Hungary, especially in in-patient care. The short run potential of the introduction of the visit fee to reduce informal payments seems to be minor. Copyright © 2011 John Wiley & Sons, Ltd.

  5. Dissociating Contents of Consciousness from Contents of Short-Term Memory

    DEFF Research Database (Denmark)

    Sørensen, Thomas Alrik; Ásgeirsson, Árni Gunnar; Staugaard, Camilla Funch

    2014-01-01

    The contents of consciousness and of short-term memory are hard to disentangle. As it seems intuitive that we represent attended objects in short-term memory and in experience, to many it also seems intuitive to equate this content. Here we investigated memory resolution for orientation … to a “clear experience” of a probed target. To assess memory resolution we used a Landolt-variation on the visual short-term memory (VSTM) resolution paradigm (e.g. Wilken & Ma, 2004). Set-sizes in the memory display were varied between 1, 2, or 4 elements. With increasing set-size we found that both...

  6. Some risks related to the short-term trading of natural gas

    International Nuclear Information System (INIS)

    Ahmed El Hachemi Mazighi

    2004-01-01

    Traditionally guided by long-term contracts, the international natural gas trade is experiencing new methods of operating, based on the short term and more flexibility. Today, indeed, the existence of uncommitted quantities of natural gas, combined with gas price discrepancies among different regions of the world, gives room for the expansion of the spot-trading of gas. The main objective of this paper is to discuss three fundamental risks related to the short-term trading of natural gas: volume risk, price risk and infrastructure risk. The defenders of globalisation argue that the transition from the long-term to the short-term trading of natural gas is mainly a question of access to gas reserves, decreasing costs of gas liquefaction, the building of liquefied natural gas (LNG) fleets and regasification facilities and third-party access to the infrastructure. This process needs to be as short as possible, so that the risks related to the transition process will disappear rapidly. On the other hand, the detractors of globalisation put the emphasis on the complexity of the gas value chain and on the fact that eliminating long-term contracts increases the risks inherent to the international natural gas business. In this paper, we try to untangle and assess the risks related to the short-term trading of natural gas. Our main conclusions are: the short-term trading of gas is far from riskless; volume risk requires stock-building in both consuming and producing countries. (author)

  7. Holding multiple items in short term memory: a neural mechanism.

    Directory of Open Access Journals (Sweden)

    Edmund T Rolls

    Human short term memory has a capacity of several items maintained simultaneously. We show how the number of short term memory representations that an attractor network modeling a cortical local network can simultaneously maintain active is increased by using synaptic facilitation of the type found in the prefrontal cortex. We have been able to maintain 9 short term memories active simultaneously in integrate-and-fire simulations where the proportion of neurons in each population, the sparseness, is 0.1, and have confirmed the stability of such a system with mean field analyses. Without synaptic facilitation the system can maintain many fewer memories active in the same network. The system operates because of the effectively increased synaptic strengths formed by the synaptic facilitation just for those pools to which the cue is applied, and then maintenance of this synaptic facilitation in just those pools when the cue is removed by the continuing neuronal firing in those pools. The findings have implications for understanding how several items can be maintained simultaneously in short term memory, how this may be relevant to the implementation of language in the brain, and suggest new approaches to understanding and treating the decline in short term memory that can occur with normal aging.
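
    A highly simplified sketch of the facilitation mechanism described, using a Tsodyks-Markram-style facilitation variable rather than the authors' full integrate-and-fire attractor network: presynaptic spikes transiently raise an effective release probability u, which keeps the cued pool's synapses strengthened for a while after the cue is removed. All parameter values are illustrative.

      import numpy as np

      def facilitation_trace(spike_times, t_end=3.0, dt=1e-3, U=0.2, tau_f=1.5):
          """Facilitation variable u: relaxes to baseline U with time constant tau_f
          and jumps by U*(1 - u) at each presynaptic spike."""
          n_steps = int(t_end / dt)
          u = np.full(n_steps, U)
          spike_steps = {int(round(t / dt)) for t in spike_times}
          for k in range(1, n_steps):
              u[k] = u[k - 1] + dt * (U - u[k - 1]) / tau_f    # decay toward baseline
              if k in spike_steps:
                  u[k] += U * (1.0 - u[k])                     # spike-driven facilitation
          return u

      # A 50 Hz cue burst lasting 200 ms; u stays elevated for roughly tau_f seconds afterwards.
      burst = np.arange(0.1, 0.3, 0.02)
      u = facilitation_trace(burst)
      print("u just after the burst:", round(u[310], 3), " u at 2.5 s:", round(u[2500], 3))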

  8. Holding multiple items in short term memory: a neural mechanism.

    Science.gov (United States)

    Rolls, Edmund T; Dempere-Marco, Laura; Deco, Gustavo

    2013-01-01

    Human short term memory has a capacity of several items maintained simultaneously. We show how the number of short term memory representations that an attractor network modeling a cortical local network can simultaneously maintain active is increased by using synaptic facilitation of the type found in the prefrontal cortex. We have been able to maintain 9 short term memories active simultaneously in integrate-and-fire simulations where the proportion of neurons in each population, the sparseness, is 0.1, and have confirmed the stability of such a system with mean field analyses. Without synaptic facilitation the system can maintain many fewer memories active in the same network. The system operates because of the effectively increased synaptic strengths formed by the synaptic facilitation just for those pools to which the cue is applied, and then maintenance of this synaptic facilitation in just those pools when the cue is removed by the continuing neuronal firing in those pools. The findings have implications for understanding how several items can be maintained simultaneously in short term memory, how this may be relevant to the implementation of language in the brain, and suggest new approaches to understanding and treating the decline in short term memory that can occur with normal aging.

  9. Holding Multiple Items in Short Term Memory: A Neural Mechanism

    Science.gov (United States)

    Rolls, Edmund T.; Dempere-Marco, Laura; Deco, Gustavo

    2013-01-01

    Human short term memory has a capacity of several items maintained simultaneously. We show how the number of short term memory representations that an attractor network modeling a cortical local network can simultaneously maintain active is increased by using synaptic facilitation of the type found in the prefrontal cortex. We have been able to maintain 9 short term memories active simultaneously in integrate-and-fire simulations where the proportion of neurons in each population, the sparseness, is 0.1, and have confirmed the stability of such a system with mean field analyses. Without synaptic facilitation the system can maintain many fewer memories active in the same network. The system operates because of the effectively increased synaptic strengths formed by the synaptic facilitation just for those pools to which the cue is applied, and then maintenance of this synaptic facilitation in just those pools when the cue is removed by the continuing neuronal firing in those pools. The findings have implications for understanding how several items can be maintained simultaneously in short term memory, how this may be relevant to the implementation of language in the brain, and suggest new approaches to understanding and treating the decline in short term memory that can occur with normal aging. PMID:23613789

  10. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    The paper presents a novel approach to predict the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions, and the coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in this space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed, and ellipsoidal models of uncertainty pertinent to earthquake excitation are developed. The maximum response of a structure subjected to earthquake excitation, within the ellipsoidal model of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
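
    A compact illustration of the antioptimization step for the simplest possible case: if the uncertain excitation coefficients q are confined to an ellipsoid (q - q0)^T W (q - q0) <= 1 and the response of interest were linear, r = c^T q, the least favorable response has the closed form c^T q0 + sqrt(c^T W^{-1} c). The numbers below are placeholders, and real structural responses are generally not linear in the coefficients.

      import numpy as np

      def worst_case_linear_response(c, q0, W):
          """Maximum of c @ q over the ellipsoid (q - q0)^T W (q - q0) <= 1."""
          c, q0, W = map(np.asarray, (c, q0, W))
          return float(c @ q0 + np.sqrt(c @ np.linalg.solve(W, c)))

      q0 = np.array([0.10, -0.05, 0.02])     # ellipsoid center fitted to past accelerograms
      W = np.diag([25.0, 100.0, 400.0])      # ellipsoid shape matrix
      c = np.array([1.0, 0.5, 0.25])         # sensitivity of the response to each coefficient
      print("least favorable response:", worst_case_linear_response(c, q0, W))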

  11. Qualitative similarities in the visual short-term memory of pigeons and people.

    Science.gov (United States)

    Gibson, Brett; Wasserman, Edward; Luck, Steven J

    2011-10-01

    Visual short-term memory plays a key role in guiding behavior, and individual differences in visual short-term memory capacity are strongly predictive of higher cognitive abilities. To provide a broader evolutionary context for understanding this memory system, we directly compared the behavior of pigeons and humans on a change detection task. Although pigeons had a lower storage capacity and a higher lapse rate than humans, both species stored multiple items in short-term memory and conformed to the same basic performance model. Thus, despite their very different evolutionary histories and neural architectures, pigeons and humans have functionally similar visual short-term memory systems, suggesting that the functional properties of visual short-term memory are subject to similar selective pressures across these distant species.
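
    For readers unfamiliar with how storage capacity is typically estimated in change-detection tasks, the sketch below computes Cowan's K from hit and false-alarm rates, with an optional lapse correction; this is a generic formulation and not necessarily the exact model fitted in the paper.

      def cowan_k(hit_rate, false_alarm_rate, set_size):
          """Cowan's K for single-probe change detection: K = N * (H - FA)."""
          return set_size * (hit_rate - false_alarm_rate)

      def lapse_corrected(observed_rate, lapse_rate):
          """Remove a lapse process in which the observer guesses (p = 0.5) on a
          fraction of trials: observed = (1 - lapse)*true + 0.5*lapse."""
          return (observed_rate - 0.5 * lapse_rate) / (1.0 - lapse_rate)

      h, fa, lapse, n = 0.80, 0.15, 0.10, 4
      print("raw K:", cowan_k(h, fa, n))
      print("lapse-corrected K:", cowan_k(lapse_corrected(h, lapse), lapse_corrected(fa, lapse), n))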

  12. Short-term hydropower production planning by stochastic programming

    DEFF Research Database (Denmark)

    Fleten, Stein-Erik; Kristoffersen, Trine

    2008-01-01

    Within the framework of multi-stage mixed-integer linear stochastic programming we develop a short-term production plan for a price-taking hydropower plant operating under uncertainty. Current production must comply with the day-ahead commitments of the previous day, which makes short-term production planning a matter of spatial distribution among the reservoirs of the plant. Day-ahead market prices and reservoir inflows are, however, uncertain beyond the current operation day, and water must be allocated among the reservoirs in order to strike a balance between current profits and expected future profits.
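
    A toy deterministic equivalent of a two-stage problem in this spirit (far simpler than the multi-stage mixed-integer model of the paper): commit a release for the current day, then allocate tomorrow's release under two equally likely price/inflow scenarios, solved with scipy.optimize.linprog. All numbers are invented.

      import numpy as np
      from scipy.optimize import linprog

      p_today, p_s1, p_s2 = 40.0, 55.0, 25.0                # prices today and per scenario
      storage0, inflow_s1, inflow_s2 = 60.0, 30.0, 5.0      # energy-equivalent water volumes
      release_cap = 50.0                                    # daily turbine capacity

      # Decision vector z = [x_today, y_scenario1, y_scenario2]; maximize expected revenue.
      c = np.array([-p_today, -0.5 * p_s1, -0.5 * p_s2])    # linprog minimizes, so negate
      A_ub = np.array([[1.0, 1.0, 0.0],                     # water balance, scenario 1
                       [1.0, 0.0, 1.0]])                    # water balance, scenario 2
      b_ub = np.array([storage0 + inflow_s1, storage0 + inflow_s2])
      bounds = [(0.0, release_cap)] * 3

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
      x, y1, y2 = res.x
      print(f"release today: {x:.1f}, tomorrow (s1/s2): {y1:.1f}/{y2:.1f}, "
            f"expected revenue: {-res.fun:.1f}")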

  13. Short-term variability of CYG X-1

    International Nuclear Information System (INIS)

    Oda, M.; Doi, K.; Ogawara, Y.; Takagishi, K.; Wada, M.

    1975-01-01

    The short-term X-ray variability distinguishes Cyg X-1, the most likely black hole candidate, from other X-ray sources. The present status of our knowledge of this short-term variation, mainly from the Uhuru, MIT and GSFC observations, is reviewed. The nature of the impulsive variations which make up the time variation exceeding the statistical fluctuation is discussed. There are indications that the energy spectrum of large pulses is harder than the average spectrum, or that the large pulses are characteristic of the hard component of the spectrum if it is composed of two components, soft and hard. Features of the variations may be partly simulated by the superposition of random shot-noise pulses with durations of a fraction of a second. However, the autocorrelation analysis and the dynamic spectrum analysis indicate that the correlation lasts for several seconds and that buried in the variation are some regularities which exhibit power concentrations in several frequency bands: 0.2-0.3, 0.4-0.5, 0.8, and 1.2-1.5 Hz. There are several possible interpretations of these results in terms of, e.g.: a) a mixture of shot-noise pulses with two or more constant durations, b) the shape of the basic shot-noise pulse, c) bunching of the pulses, d) superposition of wave-packets or temporal oscillations. But we have not yet reached any definite understanding of the nature of the variability. The sub-structure of the fluctuations on a millisecond time scale suggested by two investigations is also discussed. (auth.)

  14. Some risks related to the short-term trading of natural gas

    International Nuclear Information System (INIS)

    Mazighi, Ahmed El Hachemi

    2004-01-01

    Traditionally guided by long-term contracts, the international natural gas trade is experiencing new methods of operating, based on the short term and more flexibility. Today, indeed, the existence of uncommitted quantities of natural gas, combined with gas price discrepancies among different regions of the world, gives room for the expansion of the spot-trading of gas. The main objective of this paper is to discuss three fundamental risks related to the short-term trading of natural gas: volume risk, price risk and infrastructure risk. The defenders of globalisation argue that the transition from the long-term to the short-term trading of natural gas is mainly a question of access to gas reserves, decreasing costs of gas liquefaction, the building of liquefied natural gas (LNG) fleets and regasification facilities and third-party access to the infrastructure. This process needs to be as short as possible, so that the risks related to the transition process will disappear rapidly. On the other hand, the detractors of globalisation put the emphasis on the complexity of the gas value chain and on the fact that eliminating long-term contracts increases the risks inherent to the international natural gas business. In this paper, we try to untangle and assess the risks related to the short-term trading of natural gas. Our main conclusions are: the short-term trading of gas is far from riskless; volume risk requires stock-building in both consuming and producing countries; price risk, through the high volatility for gas, induces an increase in options prices; there is no evidence to suggest that money-lenders' appetite for financing gas infrastructure projects will continue in a short-term trading system. This would be a threat to consumers' security of supply. (Author)

  15. Short-term and long-term effects of violent media on aggression in children and adults.

    Science.gov (United States)

    Bushman, Brad J; Huesmann, L Rowell

    2006-04-01

    The objective was to test whether the results of the accumulated studies on media violence and aggressive behavior are consistent with the theories that have evolved to explain the effects. We tested for the existence of both short-term and long-term effects on aggressive behavior, as well as the theory-driven hypothesis that short-term effects should be greater for adults and long-term effects should be greater for children. The design was a meta-analysis covering children younger than 18 years and adults, with exposure to violent media including TV, movies, video games, music, and comic books, and outcomes comprising measures of aggressive behavior, aggressive thoughts, angry feelings, physiological arousal (eg, heart rate, blood pressure), and helping behavior. Effect size estimates were combined using meta-analytic procedures. As expected, the short-term effects of violent media were greater for adults than for children, whereas the long-term effects were greater for children than for adults. The results also showed overall modest but significant effect sizes for exposure to media violence on aggressive behaviors, aggressive thoughts, angry feelings, arousal levels, and helping behavior. The results are consistent with the theory that short-term effects are mostly due to the priming of existing well-encoded scripts, schemas, or beliefs, which adults have had more time to encode. In contrast, long-term effects require the learning (encoding) of scripts, schemas, or beliefs. Children can encode new scripts, schemas, and beliefs via observational learning with less interference and effort than adults.
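
    A minimal sketch of the inverse-variance pooling that underlies such meta-analytic combination of effect sizes (fixed-effect version only; the study-level numbers are fabricated, and the published analysis used more elaborate procedures).

      import numpy as np

      def fixed_effect_pool(effects, variances):
          """Fixed-effect meta-analysis: weight each study effect by 1/variance."""
          effects = np.asarray(effects, dtype=float)
          w = 1.0 / np.asarray(variances, dtype=float)
          pooled = np.sum(w * effects) / np.sum(w)
          se = np.sqrt(1.0 / np.sum(w))
          return pooled, se

      effects = [0.18, 0.22, 0.10, 0.31]          # hypothetical study-level effect sizes
      variances = [0.004, 0.010, 0.006, 0.020]    # and their sampling variances
      pooled, se = fixed_effect_pool(effects, variances)
      print(f"pooled effect = {pooled:.3f}, 95% CI = [{pooled - 1.96*se:.3f}, {pooled + 1.96*se:.3f}]")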

  16. Independence of long-term contextual memory and short-term perceptual hypotheses: Evidence from contextual cueing of interrupted search.

    Science.gov (United States)

    Schlagbauer, Bernhard; Mink, Maurice; Müller, Hermann J; Geyer, Thomas

    2017-02-01

    Observers are able to resume an interrupted search trial faster relative to responding to a new, unseen display. This finding of rapid resumption is attributed to short-term perceptual hypotheses generated on the current look and confirmed upon subsequent looks at the same display. It has been suggested that the contents of perceptual hypotheses are similar to those of other forms of memory acquired long-term through repeated exposure to the same search displays over the course of several trials, that is, the memory supporting "contextual cueing." In three experiments, we investigated the relationship between short-term perceptual hypotheses and long-term contextual memory. The results indicated that long-term, contextual memory of repeated displays neither affected the generation nor the confirmation of short-term perceptual hypotheses for these displays. Furthermore, the analysis of eye movements suggests that long-term memory provides an initial benefit in guiding attention to the target, whereas in subsequent looks guidance is entirely based on short-term perceptual hypotheses. Overall, the results reveal a picture of both long- and short-term memory contributing to reliable performance gains in interrupted search, while exerting their effects in an independent manner.

  17. Questioning short-term memory and its measurement: Why digit span measures long-term associative learning.

    Science.gov (United States)

    Jones, Gary; Macken, Bill

    2015-11-01

    Traditional accounts of verbal short-term memory explain differences in performance for different types of verbal material by reference to inherent characteristics of the verbal items making up memory sequences. The role of previous experience with sequences of different types is ostensibly controlled for either by deliberate exclusion or by presenting multiple trials constructed from different random permutations. We cast doubt on this general approach in a detailed analysis of the basis for the robust finding that short-term memory for digit sequences is superior to that for other sequences of verbal material. Specifically, we show across four experiments that this advantage is not due to inherent characteristics of digits as verbal items, nor are individual digits within sequences better remembered than other types of individual verbal items. Rather, the advantage for digit sequences stems from the increased frequency, compared to other verbal material, with which digits appear in random sequences in natural language, and furthermore, relatively frequent digit sequences support better short-term serial recall than less frequent ones. We also provide corpus-based computational support for the argument that performance in a short-term memory setting is a function of basic associative learning processes operating on the linguistic experience of the rememberer. The experimental and computational results raise questions not only about the role played by measurement of digit span in cognition generally, but also about the way in which long-term memory processes impact on short-term memory functioning. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Fractals and Forecasting in Earthquakes and Finance

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

    2011-12-01

    It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article, titled "A Richter Scale for the Markets" [1] summarized the emerging viewpoint that stock market crashes can be described with similar ideas as large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics did not account for the excessive number of booms and busts that characterize such markets. Here we show that both earthquakes and financial crashes can be described by a common Landau-Ginzburg-type free energy model, involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for the financial markets, it is the financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)
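
    As a concrete anchor for the magnitude-frequency (Gutenberg-Richter) relation mentioned above, the short Python sketch below estimates the b-value of a synthetic earthquake catalogue with the standard Aki maximum-likelihood formula. The catalogue, the completeness magnitude mc and the assumed b-value are illustrative choices, not data or code from the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        b_true, mc = 1.0, 2.0                 # assumed b-value and completeness magnitude
        mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=5000)  # synthetic catalogue

        def b_value(magnitudes, completeness):
            """Aki (1965) maximum-likelihood b-value for magnitudes above the completeness level."""
            m = magnitudes[magnitudes >= completeness]
            return np.log10(np.e) / (m.mean() - completeness)

        print(f"estimated b = {b_value(mags, mc):.2f}")   # recovers roughly the assumed b = 1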

  19. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)
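
    To make the role of the lower-bound magnitude concrete, the sketch below evaluates a toy hazard integral, the annual rate of exceeding a ground-motion level obtained by integrating the exceedance probability over a truncated Gutenberg-Richter magnitude distribution, for several LBM choices. The ground-motion relation, its coefficients, the activity rate and the site distance are all invented for illustration and are not taken from the study.

        import numpy as np
        from scipy.stats import norm

        def annual_exceedance_rate(lbm, m_max=7.5, b=1.0, rate_ge_5=0.05,
                                   dist_km=20.0, pga_g=0.2, sigma=0.6):
            """Toy hazard integral: rate of events times P(PGA > pga_g | M), integrated over M."""
            beta = b * np.log(10.0)
            m = np.linspace(lbm, m_max, 500)
            # truncated exponential (Gutenberg-Richter) magnitude density on [lbm, m_max]
            pdf = beta * np.exp(-beta * (m - lbm)) / (1.0 - np.exp(-beta * (m_max - lbm)))
            # annual rate of events with M >= lbm, scaled from an assumed rate of M >= 5 events
            rate = rate_ge_5 * 10.0 ** (-b * (lbm - 5.0))
            # invented ground-motion relation: ln PGA = -4.0 + 1.0*M - 1.5*ln(R), lognormal scatter
            ln_median = -4.0 + 1.0 * m - 1.5 * np.log(dist_km)
            p_exceed = 1.0 - norm.cdf((np.log(pga_g) - ln_median) / sigma)
            return rate * np.sum(p_exceed * pdf) * (m[1] - m[0])

        for lbm in (4.0, 4.5, 5.0):
            print(f"LBM {lbm}: annual exceedance rate ~ {annual_exceedance_rate(lbm):.2e}")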

  20. Ordered Short-Term Memory Differs in Signers and Speakers: Implications for Models of Short-Term Memory

    Science.gov (United States)

    Bavelier, Daphne; Newport, Elissa L.; Hall, Matt; Supalla, Ted; Boutla, Mrim

    2008-01-01

    Capacity limits in linguistic short-term memory (STM) are typically measured with forward span tasks in which participants are asked to recall lists of words in the order presented. Using such tasks, native signers of American Sign Language (ASL) exhibit smaller spans than native speakers ([Boutla, M., Supalla, T., Newport, E. L., & Bavelier, D.…

  1. Preferential attachment in evolutionary earthquake networks

    Science.gov (United States)

    Rezaei, Soghra; Moghaddasi, Hanieh; Darooneh, Amir Hossein

    2018-04-01

    Earthquakes as spatio-temporal complex systems have recently been studied using complex network theory. Seismic networks are dynamical networks, since the addition of new seismic events over time establishes new nodes and links in the network. Here we have constructed the Iran and Italy seismic networks based on the Hybrid Model and tested the preferential attachment hypothesis for the connection of new nodes, which states that newly added nodes are more likely to join highly connected nodes than less connected ones. We show that preferential attachment is present in the earthquake networks and that the attachment rate has a linear relationship with node degree. We have also identified the seismic passive points, the points most likely to be influenced by other seismic places, using their preferential attachment values.
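
    The preferential-attachment test itself is easy to illustrate. The hypothetical sketch below replays a toy growing network in time order and estimates the empirical attachment kernel Pi(k), the rate at which nodes of degree k receive new links; a roughly linear Pi(k) is the signature of preferential attachment. The growth process and all parameters are assumptions, not the Iran or Italy catalogues.

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(1)
        degree = {0: 1, 1: 1}              # start from a single link between nodes 0 and 1
        received = Counter()               # links received, keyed by the target's degree at that moment
        available = Counter()              # how many nodes held each degree when an event occurred

        for new_node in range(2, 3000):
            nodes = list(degree.keys())
            degs = np.array([degree[n] for n in nodes], dtype=float)
            target = nodes[rng.choice(len(nodes), p=degs / degs.sum())]  # toy linear-kernel growth
            received[degree[target]] += 1
            for d in degs.astype(int):
                available[d] += 1
            degree[target] += 1
            degree[new_node] = 1

        print(" k   Pi(k)      Pi(k)/k")
        for k in sorted(received)[:8]:
            pi_k = received[k] / available[k]
            print(f"{k:2d}   {pi_k:.5f}   {pi_k / k:.5f}")   # roughly constant Pi(k)/k => linear kernel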

  2. The Implications of Strike-Slip Earthquake Source Properties on the Transform Boundary Development Process

    Science.gov (United States)

    Neely, J. S.; Huang, Y.; Furlong, K.

    2017-12-01

    Subduction-Transform Edge Propagator (STEP) faults, produced by the tearing of a subducting plate, allow us to study the development of a transform plate boundary and improve our understanding of both long-term geologic processes and short-term seismic hazards. The 280 km long San Cristobal Trough (SCT), formed by the tearing of the Australia plate as it subducts under the Pacific plate near the Solomon and Vanuatu subduction zones, shows along-strike variations in earthquake behaviors. The segment of the SCT closest to the tear rarely hosts earthquakes > Mw 6, whereas the SCT sections more than 80-100 km from the tear experience Mw 7 earthquakes with repeated rupture along the same segments. To understand the effect of cumulative displacement on SCT seismicity, we analyze b-values, centroid-time delays and corner frequencies of the SCT earthquakes. We use the spectral ratio method based on Empirical Green's Functions (eGfs) to isolate source effects from propagation and site effects. We find high b-values along the SCT closest to the tear, with values decreasing with distance before finally increasing again towards the far end of the SCT. Centroid-time delays for the Mw 7 strike-slip earthquakes increase with distance from the tear, but corner frequency estimates for a recent sequence of Mw 7 earthquakes are approximately equal, indicating a growing complexity in earthquake behavior with distance from the tear due to a displacement-driven transform boundary development process. The increasing complexity possibly stems from the earthquakes along the eastern SCT rupturing through multiple asperities, resulting in multiple moment pulses. If not for the bounding Vanuatu subduction zone at the far end of the SCT, the eastern SCT section, which has experienced the most displacement, might be capable of hosting larger earthquakes. When assessing the seismic hazard of other STEP faults, cumulative fault displacement should be considered a key input in such assessments.
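
    As a rough illustration of the spectral-ratio idea mentioned above (not the authors' processing), the sketch below divides the spectrum of a synthetic "large" event by that of a co-located "small" event, which cancels common path and site terms, and fits the ratio with a Brune-type model to recover the corner frequency of the large event. The spectra, noise level and starting values are all invented.

        import numpy as np
        from scipy.optimize import curve_fit

        def brune_ratio(f, moment_ratio, fc_large, fc_small):
            """Ratio of two Brune omega-squared spectra that share path and site terms."""
            return moment_ratio * (1.0 + (f / fc_small) ** 2) / (1.0 + (f / fc_large) ** 2)

        f = np.logspace(-2, 1, 200)                                    # frequency band, Hz
        truth = brune_ratio(f, moment_ratio=1000.0, fc_large=0.08, fc_small=1.5)
        noise = np.random.default_rng(2).normal(0.0, 0.1, f.size)
        observed = truth * np.exp(noise)                               # multiplicative noise

        log_model = lambda f, mr, fcl, fcs: np.log10(brune_ratio(f, mr, fcl, fcs))
        popt, _ = curve_fit(log_model, f, np.log10(observed),
                            p0=[500.0, 0.1, 1.0], bounds=(1e-6, np.inf))
        print(f"recovered corner frequency of the large event ~ {popt[1]:.3f} Hz")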

  3. Pigeon visual short-term memory directly compared to primates.

    Science.gov (United States)

    Wright, Anthony A; Elmore, L Caitlin

    2016-02-01

    Three pigeons were trained to remember arrays of 2-6 colored squares and detect which of two squares had changed color to test their visual short-term memory. Procedures (e.g., stimuli, displays, viewing times, delays) were similar to those used to test monkeys and humans. Following extensive training, pigeons performed slightly better than similarly trained monkeys, but both animal species were considerably less accurate than humans with the same array sizes (2, 4 and 6 items). Pigeons and monkeys showed calculated memory capacities of one item or less, whereas humans showed a memory capacity of 2.5 items. Despite the differences in calculated memory capacities, the pigeons' memory results, like those from monkeys and humans, were all well characterized by an inverse power-law function fit to d' values for the five display sizes. This characterization provides a simple, straightforward summary of the fundamental processing of visual short-term memory (how visual short-term memory declines with memory load) that emphasizes species similarities based upon similar functional relationships. By closely matching pigeon testing parameters to those of monkeys and humans, these similar functional relationships suggest similar underlying processes of visual short-term memory in pigeons, monkeys and humans. Copyright © 2015 Elsevier B.V. All rights reserved.
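
    For readers unfamiliar with the two summaries used above, the sketch below computes, from hypothetical hit and false-alarm rates in a change-detection task, a Cowan-style capacity estimate K = N * (hit rate - false-alarm rate) and an inverse power-law fit of d' against display size. The rates are invented and do not reproduce the pigeon, monkey or human data.

        import numpy as np
        from scipy.optimize import curve_fit
        from scipy.stats import norm

        display_size = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
        hits = np.array([0.88, 0.80, 0.72, 0.66, 0.62])    # assumed hit rates, one per display size
        fas = np.array([0.12, 0.15, 0.18, 0.20, 0.22])     # assumed false-alarm rates

        d_prime = norm.ppf(hits) - norm.ppf(fas)            # change-detection sensitivity
        cowan_k = display_size * (hits - fas)               # simple capacity estimate per display size

        (a, b), _ = curve_fit(lambda n, a, b: a * n ** (-b), display_size, d_prime, p0=[3.0, 1.0])
        print("capacity K by display size:", np.round(cowan_k, 2))
        print(f"inverse power-law fit: d' ~ {a:.2f} * N^(-{b:.2f})")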

  4. Short-term energy outlook: Quarterly projections, Third quarter 1992

    International Nuclear Information System (INIS)

    1992-08-01

    The Energy Information Administration (EIA) prepares quarterly, short-term energy supply, demand, and price projections for publication in February, May, August, and November in the Short-Term Energy Outlook (Outlook). An annual supplement analyzes the performance of previous forecasts, compares recent cases with those of other forecasting services, and discusses current topics related to the short-term energy markets. (See Short-Term Energy Outlook Annual Supplement, DOE/EIA-0202.) The principal users of the Outlook are managers and energy analysts in private industry and government. The forecast period for this issue of the Outlook extends from the third quarter of 1992 through the fourth quarter of 1993. Values for the second quarter of 1992, however, are preliminary EIA estimates (for example, some monthly values for petroleum supply and disposition are derived in part from weekly data reported in the Weekly Petroleum Status Report) or are calculated from model simulations using the latest exogenous information available (for example, electricity sales and generation are simulated using actual weather data). The historical energy data are EIA data published in the Monthly Energy Review, Petroleum Supply Monthly, and other EIA publications. Minor discrepancies between the data in these publications and the historical data in this Outlook are due to independent rounding

  5. Auditory-Cortex Short-Term Plasticity Induced by Selective Attention

    Science.gov (United States)

    Jääskeläinen, Iiro P.; Ahveninen, Jyrki

    2014-01-01

    The ability to concentrate on relevant sounds in the acoustic environment is crucial for everyday function and communication. Converging lines of evidence suggest that transient functional changes in auditory-cortex neurons, “short-term plasticity”, might explain this fundamental function. Under conditions of strongly focused attention, enhanced processing of attended sounds can take place at very early latencies (~50 ms from sound onset) in primary auditory cortex and possibly even at earlier latencies in subcortical structures. More robust selective-attention short-term plasticity is manifested as modulation of responses peaking at ~100 ms from sound onset in functionally specialized nonprimary auditory-cortical areas, by way of stimulus-specific reshaping of neuronal receptive fields that supports filtering of selectively attended sound features from task-irrelevant ones. Such effects have been shown to take hold within seconds of shifting the attentional focus. There are findings suggesting that the reshaping of neuronal receptive fields is even stronger at longer auditory-cortex response latencies (~300 ms from sound onset). These longer-latency short-term plasticity effects seem to build up more gradually, within tens of seconds after shifting the focus of attention. Importantly, some of the auditory-cortical short-term plasticity effects observed during selective attention predict enhancements in behaviorally measured sound discrimination performance. PMID:24551458

  6. Short-Term Effects of Midseason Coach Turnover on Team Performance in Soccer

    Science.gov (United States)

    Balduck, Anne-Line; Buelens, Marc; Philippaerts, Renaat

    2010-01-01

    The present study addressed the issue of short-term performance effects of midseason coach turnover in soccer. The goal of this study was to examine this effect on subsequent short-term team performance. The purposes of this study were to (a) examine whether midseason coach turnover improved results in the short term, and (b) examine how team…

  7. Short-term Forecasting Tools for Agricultural Nutrient Management.

    Science.gov (United States)

    Easton, Zachary M; Kleinman, Peter J A; Buda, Anthony R; Goering, Dustin; Emberston, Nichole; Reed, Seann; Drohan, Patrick J; Walter, M Todd; Guinan, Pat; Lory, John A; Sommerlot, Andrew R; Sharpley, Andrew

    2017-11-01

    The advent of real-time, short-term farm management tools is motivated by the need to protect water quality above and beyond the general guidance offered by existing nutrient management plans. Advances in high-performance computing and hydrologic or climate modeling have enabled rapid dissemination of real-time information that can assist landowners and conservation personnel with short-term management planning. This paper reviews short-term decision support tools for agriculture that are under various stages of development and implementation in the United States: (i) Wisconsin's Runoff Risk Advisory Forecast (RRAF) System, (ii) New York's Hydrologically Sensitive Area Prediction Tool, (iii) Virginia's Saturated Area Forecast Model, (iv) Pennsylvania's Fertilizer Forecaster, (v) Washington's Application Risk Management (ARM) System, and (vi) Missouri's Design Storm Notification System. Although these decision support tools differ in their underlying model structure, the resolution at which they are applied, and the hydroclimates to which they are relevant, all provide forecasts (range 24-120 h) of runoff risk or soil moisture saturation derived from National Weather Service Forecast models. Although this review highlights the need for further development of robust and well-supported short-term nutrient management tools, their potential for adoption and ultimate utility requires an understanding of the appropriate context of application, the strategic and operational needs of managers, access to weather forecasts, scales of application (e.g., regional vs. field level), data requirements, and outreach communication structure. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.

  8. An Artificial Neural Network Based Short-term Dynamic Prediction of Algae Bloom

    Directory of Open Access Journals (Sweden)

    Yao Junyang

    2014-06-01

    Full Text Available This paper proposes a method for the short-term prediction of algae blooms based on an artificial neural network. First, principal component analysis is applied to the water environmental factors of algae bloom raceway ponds to identify the main factors that influence the formation of algae blooms. Then, a short-term dynamic prediction model based on a neural network is built, with the current chlorophyll_a values as input and the chlorophyll_a values at the next time step as output. Simulation results show that the model can effectively provide short-term predictions of algae blooms.
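
    A minimal, self-contained sketch of the pipeline described above: principal component analysis on simulated water-environment factors, followed by a small neural network that maps the current chlorophyll_a value (plus the leading components) to the chlorophyll_a value at the next time step. The synthetic series, the choice of two components and the network size are assumptions, not the authors' configuration.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPRegressor
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(3)
        t = np.arange(500)
        chl = 10 + 3 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.3, t.size)   # chlorophyll_a
        env = np.column_stack([                           # simulated water-environment factors
            20 + 5 * np.sin(2 * np.pi * t / 50 + 0.5),    # water temperature
            8 + rng.normal(0, 0.2, t.size),               # dissolved oxygen
            0.5 + 0.1 * np.cos(2 * np.pi * t / 50),       # total phosphorus
            rng.normal(0, 1, t.size),                     # an uninformative factor
        ])

        pcs = PCA(n_components=2).fit_transform(env)       # keep the leading principal components
        X = np.column_stack([chl[:-1], pcs[:-1]])          # state at time t
        y = chl[1:]                                        # chlorophyll_a at time t + 1

        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0))
        model.fit(X[:400], y[:400])
        print("held-out R^2:", round(model.score(X[400:], y[400:]), 3))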

  9. DYNAMICS OF THE ANXIETY DISORDERS IN THE COURSE OF SHORT-TERM PSYCHOTHERAPY

    Directory of Open Access Journals (Sweden)

    T.N. Hmylova

    2008-06-01

    Full Text Available Taking into account the tendency of modern psychotherapy concepts toward short-term forms, we carried out research aimed at studying the effect of a short-term form of personality-oriented psychotherapy on the dynamics of anxiety disorders. 103 patients with neurotic disorders were examined in the neurosis and psychotherapy department of the Bekhterev Psychoneurological Research Institute. The findings revealed that situational and personal anxiety levels objectively decreased over the course of short-term group psychotherapy. Short-term group psychotherapy proved to be an effective method in the treatment of anxiety disorders, provided its indications and limitations are considered.

  10. An ethics curriculum for short-term global health trainees

    OpenAIRE

    DeCamp, Matthew; Rodriguez, Joce; Hecht, Shelby; Barry, Michele; Sugarman, Jeremy

    2013-01-01

    Background Interest in short-term global health training and service programs continues to grow, yet such programs can be associated with a variety of ethical issues that trainees or others with limited global health experience may not be prepared to address. Therefore, there is a clear need for educational interventions concerning these ethical issues. Methods We developed and evaluated an introductory curriculum, "Ethical Challenges in Short-term Global Health Training." The curriculum was deve...

  11. Narcissism and the Strategic Pursuit of Short-Term Mating

    DEFF Research Database (Denmark)

    Schmitt, David P.; Alcalay, Lidia; Allik, Jüri

    2017-01-01

    Previous studies have documented links between sub-clinical narcissism and the active pursuit of short-term mating strategies (e.g., unrestricted sociosexuality, marital infidelity, mate poaching). Nearly all of these investigations have relied solely on samples from Western cultures. In the curr...... limitations of these cross-culturally universal findings and presents suggestions for future research into revealing the precise psychological features of narcissism that facilitate the strategic pursuit of short-term mating....

  12. Cash Management and Short-Term Investments for Colleges and Universities.

    Science.gov (United States)

    Haag, Leonard H.

    Effective cash management and short-term investing are discussed in this "how to" guide designed to benefit most institutions of higher education. The following premises are examined: proper compensation for effective cash management is not an expense but an investment; effective cash management and short-term investment programs do not depend on…

  13. Conversion of short-term to long-term memory in the novel object recognition paradigm.

    Science.gov (United States)

    Moore, Shannon J; Deshpande, Kaivalya; Stinnett, Gwen S; Seasholtz, Audrey F; Murphy, Geoffrey G

    2013-10-01

    It is well-known that stress can significantly impact learning; however, whether this effect facilitates or impairs the resultant memory depends on the characteristics of the stressor. Investigation of these dynamics can be confounded by the role of the stressor in motivating performance in a task. Positing a cohesive model of the effect of stress on learning and memory necessitates elucidating the consequences of stressful stimuli independently from task-specific functions. Therefore, the goal of this study was to examine the effect of manipulating a task-independent stressor (elevated light level) on short-term and long-term memory in the novel object recognition paradigm. Short-term memory was elicited in both low light and high light conditions, but long-term memory specifically required high light conditions during the acquisition phase (familiarization trial) and was independent of the light level during retrieval (test trial). Additionally, long-term memory appeared to be independent of stress-mediated glucocorticoid release, as both low and high light produced similar levels of plasma corticosterone, which further did not correlate with subsequent memory performance. Finally, both short-term and long-term memory showed no savings between repeated experiments suggesting that this novel object recognition paradigm may be useful for longitudinal studies, particularly when investigating treatments to stabilize or enhance weak memories in neurodegenerative diseases or during age-related cognitive decline. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. Component fragility data base for reliability and probability studies

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.; Hofmayer, C.; Kassier, M.; Pepper, S.

    1989-01-01

    Safety-related equipment in a nuclear plant plays a vital role in its proper operation and control, and failure of such equipment due to an earthquake may pose a risk to the safe operation of the plant. Therefore, in order to assess the overall reliability of a plant, the reliability of performance of the equipment should be studied first. The success of a reliability or a probability study depends to a great extent on the data base. To meet this demand, Brookhaven National Laboratory (BNL) has formed a test data base relating the seismic capacity of equipment specimens to the earthquake levels. Subsequently, the test data have been analyzed for use in reliability and probability studies. This paper describes the data base and discusses the analysis methods. The final results that can be directly used in plant reliability and probability studies are also presented in this paper
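
    Test data of this kind are commonly condensed into a lognormal fragility curve for use in seismic PRAs. The sketch below, with invented pass/fail counts at a few test levels, fits the median capacity A_m and logarithmic standard deviation beta of the model P(failure | a) = Phi(ln(a/A_m)/beta) by maximum likelihood and reports a corresponding HCLPF-type capacity; it illustrates the general approach, not BNL's actual data or procedures.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import norm

        pga = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.2])        # test level in g (assumed)
        n_tested = np.array([10, 10, 10, 10, 10, 10])          # specimens tested at each level
        n_failed = np.array([0, 1, 3, 6, 8, 10])               # invented failure counts

        def neg_log_like(params):
            a_m, beta = params
            p = norm.cdf(np.log(pga / a_m) / beta)             # lognormal fragility model
            p = np.clip(p, 1e-9, 1 - 1e-9)
            return -np.sum(n_failed * np.log(p) + (n_tested - n_failed) * np.log(1 - p))

        res = minimize(neg_log_like, x0=[0.7, 0.4], method="Nelder-Mead")
        a_m, beta = res.x
        print(f"median capacity ~ {a_m:.2f} g, log-std beta ~ {beta:.2f}")
        print(f"capacity at 1% failure probability ~ {a_m * np.exp(norm.ppf(0.01) * beta):.2f} g")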

  15. Short-term and long-term plasticity interaction in human primary motor cortex.

    Science.gov (United States)

    Iezzi, Ennio; Suppa, Antonio; Conte, Antonella; Li Voti, Pietro; Bologna, Matteo; Berardelli, Alfredo

    2011-05-01

    Repetitive transcranial magnetic stimulation (rTMS) over primary motor cortex (M1) elicits changes in motor evoked potential (MEP) size thought to reflect short- and long-term forms of synaptic plasticity, resembling short-term potentiation (STP) and long-term potentiation/depression (LTP/LTD) observed in animal experiments. We designed this study in healthy humans to investigate whether STP as elicited by 5-Hz rTMS interferes with LTP/LTD-like plasticity induced by intermittent and continuous theta-burst stimulation (iTBS and cTBS). The effects induced by 5-Hz rTMS and iTBS/cTBS were indexed as changes in MEP size. We separately evaluated changes induced by 5-Hz rTMS, iTBS and cTBS applied alone and those induced by iTBS and cTBS delivered after priming 5-Hz rTMS. Interactions between 5-Hz rTMS and iTBS/cTBS were investigated under several experimental conditions by delivering 5-Hz rTMS at suprathreshold and subthreshold intensity, allowing 1 and 5 min intervals to elapse between 5-Hz rTMS and TBS, and delivering one and ten 5-Hz rTMS trains. We also investigated whether 5-Hz rTMS induces changes in intracortical excitability tested with paired-pulse transcranial magnetic stimulation. When given alone, 5-Hz rTMS induced short-lasting and iTBS/cTBS induced long-lasting changes in MEP amplitudes. When M1 was primed with 10 suprathreshold 5-Hz rTMS trains at 1 min before iTBS or cTBS, the iTBS/cTBS-induced after-effects disappeared. The 5-Hz rTMS left intracortical excitability unchanged. We suggest that STP elicited by suprathreshold 5-Hz rTMS abolishes iTBS/cTBS-induced LTP/LTD-like plasticity through non-homeostatic metaplasticity mechanisms. Our study provides new information on interactions between short-term and long-term rTMS-induced plasticity in human M1. © 2011 The Authors. European Journal of Neuroscience © 2011 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  16. Short-term plasticity as a neural mechanism supporting memory and attentional functions.

    Science.gov (United States)

    Jääskeläinen, Iiro P; Ahveninen, Jyrki; Andermann, Mark L; Belliveau, John W; Raij, Tommi; Sams, Mikko

    2011-11-08

    Based on behavioral studies, several relatively distinct perceptual and cognitive functions have been defined in cognitive psychology such as sensory memory, short-term memory, and selective attention. Here, we review evidence suggesting that some of these functions may be supported by shared underlying neuronal mechanisms. Specifically, we present, based on an integrative review of the literature, a hypothetical model wherein short-term plasticity, in the form of transient center-excitatory and surround-inhibitory modulations, constitutes a generic processing principle that supports sensory memory, short-term memory, involuntary attention, selective attention, and perceptual learning. In our model, the size and complexity of receptive fields/level of abstraction of neural representations, as well as the length of temporal receptive windows, increases as one steps up the cortical hierarchy. Consequently, the type of input (bottom-up vs. top down) and the level of cortical hierarchy that the inputs target, determine whether short-term plasticity supports purely sensory vs. semantic short-term memory or attentional functions. Furthermore, we suggest that rather than discrete memory systems, there are continuums of memory representations from short-lived sensory ones to more abstract longer-duration representations, such as those tapped by behavioral studies of short-term memory. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. [Expression of negative emotional responses to the 2011 Great East Japan Earthquake: Analysis of big data from social media].

    Science.gov (United States)

    Miura, Asako; Komori, Masashi; Matsumura, Naohiro; Maeda, Kazutoshi

    2015-06-01

    In this article, we investigated the expression of emotional responses to the 2011 Great East Japan Earthquake by analyzing the frequency of negative emotional terms in tweets posted on Twitter, one of the most popular social media platforms. We focused on differences in time-series variations and diurnal changes between two kinds of disasters: natural disasters (earthquakes and tsunamis) and nuclear accidents. The number of tweets containing negative emotional responses increased sharply shortly after the first huge earthquake and decreased over time, whereas tweets about nuclear accidents showed no correlation with elapsed time. Expressions of anxiety about natural disasters had a circadian rhythm, with a peak at midnight, whereas expressions of anger about the nuclear accident were highly sensitive to critical events related to the accident. These findings were discussed in terms of similarities and differences compared to earlier studies on emotional responses in social media.
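
    The core of such an analysis is a simple term count over time. The sketch below, with a toy lexicon and a handful of made-up tweets, tallies posts containing negative emotional terms in hourly bins; the resulting series is the kind of signal whose time course and diurnal cycle the study examines. The lexicon and data are placeholders, not the study's dictionary or corpus.

        import pandas as pd

        negative_terms = ["anxious", "afraid", "angry", "sad"]     # toy lexicon, not the study's
        tweets = pd.DataFrame({
            "time": pd.to_datetime(["2011-03-11 15:02", "2011-03-11 15:40",
                                    "2011-03-11 16:10", "2011-03-12 00:05"]),
            "text": ["so afraid of aftershocks", "train stopped, going home",
                     "angry about the lack of information", "still anxious, cannot sleep"],
        })

        # flag tweets containing any negative term, then count them per hour
        is_negative = tweets["text"].str.contains("|".join(negative_terms), case=False)
        hourly = tweets.loc[is_negative].set_index("time").resample("1h").size()
        print(hourly)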

  18. Short- and long-term reproducibility of radioisotopic examination of gastric emptying

    Energy Technology Data Exchange (ETDEWEB)

    Jonderko, K. (Silesian School of Medicine, Katowice (Poland). Dept. of Gastroenterology)

    1990-01-01

    Reproducibility of gastric emptying (GE) of a radiolabelled solid meal was assessed. The short-term reproducibility was evaluated on the basis of 12 paired GE examinations performed 1-3 days apart. Twelve paired GE examinations taken 3-8 months apart enabled long-term reproducibility assessment. Reproducibility of GE parameters was expressed in terms of the coefficient of variation, CV. No significant between-day variation of solid GE was found either regarding the short-term or the long-term reproducibility. Although slightly higher CV values characterized the long-term reproducibility of the GE parameters considered, the variations of the differences between repeated GE examinations did not differ significantly between short- and long-term GE reproducibility. The results obtained justify the use of radioisotopic GE measurement for the assessment of early and late results of pharmacologic or surgical management. (author).

  19. Short-term and long-term effects of GDP on traffic deaths in 18 OECD countries, 1960-2011.

    Science.gov (United States)

    Dadgar, Iman; Norström, Thor

    2017-02-01

    Research suggests that increases in gross domestic product (GDP) lead to increases in traffic deaths, plausibly due to the increased road traffic induced by an expanding economy. However, there also seems to exist a long-term effect of economic growth that is manifested in improved traffic safety and reduced rates of traffic deaths. Previous studies focus on either the short-term, procyclical effect, or the long-term, protective effect. The aim of the present study is to estimate the short-term and long-term effects jointly in order to assess the net impact of GDP on traffic mortality. We extracted traffic death rates for the period 1960-2011 from the WHO Mortality Database for 18 OECD countries. Data on GDP/capita were obtained from the Maddison Project. We performed error correction modelling to estimate the short-term and long-term effects of GDP on the traffic death rates. The estimates from the error correction modelling for the entire study period suggested that a one-unit increase (US$1000) in GDP/capita yields an instantaneous short-term increase in the traffic death rate of 0.58. The findings thus indicate that an increase in GDP leads to an immediate increase in traffic deaths. However, after the mid-1970s this short-term effect is more than outweighed by a markedly stronger protective long-term effect, whereas the reverse is true for the period before the mid-1970s. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
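
    For readers unfamiliar with error correction modelling, the sketch below runs the standard two-step (Engle-Granger) version on synthetic data: a long-run regression of the traffic death rate on GDP/capita, followed by a regression of first differences on the GDP difference (the short-term effect) and the lagged long-run residual (the adjustment toward the long-term relation). The data, units and coefficients are invented and are not the study's estimates.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(4)
        n = 52
        gdp = np.cumsum(rng.normal(0.3, 0.5, n)) + 10                 # trending GDP/capita (toy units)
        deaths = 25 - 1.2 * gdp + np.cumsum(rng.normal(0, 0.2, n))    # long-run decline with GDP

        long_run = sm.OLS(deaths, sm.add_constant(gdp)).fit()
        ect = long_run.resid                                          # error-correction term

        d_deaths, d_gdp = np.diff(deaths), np.diff(gdp)
        X = sm.add_constant(np.column_stack([d_gdp, ect[:-1]]))
        ecm = sm.OLS(d_deaths, X).fit()
        print("short-term effect of GDP:", round(ecm.params[1], 3))
        print("adjustment coefficient:  ", round(ecm.params[2], 3))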

  20. An approximate method of short-term tsunami forecast and the hindcasting of some recent events

    Directory of Open Access Journals (Sweden)

    Yu. P. Korolev

    2011-11-01

    Full Text Available The paper presents a method for a short-term tsunami forecast based on sea level data from remote sites. This method is based on Green's function for the wave equation possessing the fundamental property of symmetry. This property is well known in acoustics and seismology as the reciprocity principle. Some applications of this principle on tsunami research are considered in the current study. Simple relationships and estimated transfer functions enabled us to simulate tsunami waveforms for any selected oceanic point based only on the source location and sea level data from a remote reference site. The important advantage of this method is that it is irrespective of the actual source mechanism (seismic, submarine landslide or other phenomena. The method was successfully applied to hindcast several recent tsunamis observed in the Northwest Pacific. The locations of the earthquake epicenters and the tsunami records from one of the NOAA DART sites were used as inputs for the modelling, while tsunami observations at other DART sites were used to verify the model. Tsunami waveforms for the 2006, 2007 and 2009 earthquake events near Simushir Island were simulated and found to be in good agreement with the observations. The correlation coefficients between the predicted and observed tsunami waveforms were from 0.50 to 0.85. Thus, the proposed method can be effectively used to simulate tsunami waveforms for the entire ocean and also for both regional and local tsunami warning services, assuming that they have access to the real-time sea level data from DART stations.
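
    The transfer-function step at the heart of the method can be illustrated with a toy example. In the sketch below, the "ocean" between the reference gauge and the target point is replaced by a simple delay-and-attenuation filter: a frequency-domain transfer function is estimated from one calibration pair of waveforms and then applied to a new reference record to predict the target waveform. Everything here (waveforms, delay, gain, regularisation) is an assumption for illustration, not the authors' Green's-function computation.

        import numpy as np

        fs, n = 1.0, 2048                              # 1 sample per second, ~34 min of record
        t = np.arange(n) / fs
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)

        def ocean_response(x, delay_s=300.0, gain=0.4):
            """Stand-in for propagation from the reference gauge to the target point."""
            return np.fft.irfft(gain * np.fft.rfft(x) * np.exp(-2j * np.pi * freqs * delay_s), n)

        # 1) calibration: modelled waveforms at both sites for the same source give H(f)
        ref_cal = np.exp(-((t - 400.0) / 60.0) ** 2)            # modelled reference waveform
        tgt_cal = ocean_response(ref_cal)                       # modelled target waveform
        R, T = np.fft.rfft(ref_cal), np.fft.rfft(tgt_cal)
        H = T * np.conj(R) / (np.abs(R) ** 2 + 1e-6)            # regularised transfer function

        # 2) forecast: apply H to an "observed" reference record from a new event
        ref_obs = 0.8 * np.exp(-((t - 500.0) / 80.0) ** 2)
        tgt_pred = np.fft.irfft(H * np.fft.rfft(ref_obs), n)
        tgt_true = ocean_response(ref_obs)
        print("waveform correlation:", round(np.corrcoef(tgt_pred, tgt_true)[0, 1], 3))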

  1. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    Science.gov (United States)

    Pasten, D.

    2017-12-01

    Earthquake complex networks have been shown to reveal specific features of seismic data sets. In space, these networks show scale-free behavior of the connectivity distribution for directed networks and small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large Mw 8.2 earthquake in northern Chile (near Iquique) in April 2014. An earthquake complex network is built by dividing three-dimensional space into cubic cells; if a cell contains a hypocenter, it is designated a node. The connections between nodes are generated in time: we follow the time sequence of seismic events and connect the corresponding nodes, which yields two different networks, a directed one and an undirected one. The directed network takes into account the time direction of the connections, which is very important for the connectivity of the network: the connectivity ki of the i-th node is the number of connections going out of node i plus its self-connections (if two successive seismic events occur in the same cubic cell, we have a self-connection). The undirected network is obtained by removing the direction of the connections and the self-connections from the directed network; for undirected networks we consider only whether two nodes are connected or not. We built a directed complex network and an undirected complex network before and after the large Iquique earthquake, using magnitudes greater than Mw = 1.0 and greater than Mw = 3.0. We found that this method can recognize the influence of these small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake on this complex system. The method also shows a difference in the values of the critical exponent γ (for the probability
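
    The construction described above can be sketched in a few lines (synthetic hypocentres and an arbitrary cell size are assumed): space is divided into cubic cells, every cell that contains a hypocentre becomes a node, and successive events in the time-ordered catalogue connect their cells with a directed link, with self-links when consecutive events fall in the same cell.

        import numpy as np
        from collections import defaultdict

        rng = np.random.default_rng(5)
        hypocentres = rng.uniform([0, 0, 0], [200.0, 200.0, 50.0], size=(1000, 3))  # km, time-ordered
        cell_km = 10.0

        cells = tuple(map(tuple, np.floor(hypocentres / cell_km).astype(int)))
        out_degree = defaultdict(int)      # out-links plus self-links per node (directed network)
        undirected = set()                 # unordered node pairs, self-links removed

        for a, b in zip(cells[:-1], cells[1:]):
            out_degree[a] += 1
            if a != b:
                undirected.add(frozenset((a, b)))

        k = np.array(list(out_degree.values()))
        print(f"{len(out_degree)} nodes, mean directed connectivity {k.mean():.2f}, "
              f"{len(undirected)} undirected links")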

  2. Short-term uranium price formation: a methodology

    International Nuclear Information System (INIS)

    Hsieh, L.Y.; de Graffenried, C.L.

    1987-01-01

    One of the major problems in analyzing the short-term uranium market is the lack of a well-defined spot market price. The two primary sources of price data covering the US uranium market are the series published by the US Dept. of Energy (DOE) and by the Nuclear Exchange Corporation (NUEXCO), a private brokerage firm. Because of the differences in both definition and coverage, these two series are not directly comparable. In this study, an econometric model was developed for analyzing the interrelationship between short-term uranium price (NUEXCO exchange value), supply, demand, and future price expectations formed by market participants. The validity of this model has been demonstrated by the fact that all simulation statistics derived are highly significant. Three forecasting scenarios were developed in this study

  3. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Science.gov (United States)

    Rydberg, Henny; Marrone, Gaetano; Strömdahl, Susanne; von Schreeb, Johan

    2015-01-01

    Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case--an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.

  4. A Promising Tool to Assess Long Term Public Health Effects of Natural Disasters: Combining Routine Health Survey Data and Geographic Information Systems to Assess Stunting after the 2001 Earthquake in Peru.

    Directory of Open Access Journals (Sweden)

    Henny Rydberg

    Full Text Available Research on long-term health effects of earthquakes is scarce, especially in low- and middle-income countries, which are disproportionately affected by disasters. To date, progress in this area has been hampered by the lack of tools to accurately measure these effects. Here, we explored whether long-term public health effects of earthquakes can be assessed using a combination of readily available data sources on public health and geographic distribution of seismic activity. We used childhood stunting as a proxy for public health effects. Data on stunting were attained from Demographic and Health Surveys. Earthquake data were obtained from U.S. Geological Survey's ShakeMaps, geographic information system-based maps that divide earthquake affected areas into different shaking intensity zones. We combined these two data sources to categorize the surveyed children into different earthquake exposure groups, based on how much their area of residence was affected by the earthquake. We assessed the feasibility of the approach using a real earthquake case--an 8.4 magnitude earthquake that hit southern Peru in 2001. Our results indicate that the combination of health survey data and disaster data may offer a readily accessible and accurate method for determining the long-term public health consequences of a natural disaster. Our work allowed us to make pre- and post-earthquake comparisons of stunting, an important indicator of the well-being of a society, as well as comparisons between populations with different levels of exposure to the earthquake. Furthermore, the detailed GIS based data provided a precise and objective definition of earthquake exposure. Our approach should be considered in future public health and disaster research exploring the long-term effects of earthquakes and potentially other natural disasters.
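
    Schematically, the linkage step works as sketched below: each surveyed child is assigned the shaking intensity of their place of residence and then grouped into exposure categories, so that stunting prevalence can be compared across groups. In the study the intensity comes from USGS ShakeMap grids; in this illustration a toy distance-decay function stands in for it, and all coordinates, intensities, thresholds and outcomes are fabricated.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 2000
        dist_km = rng.uniform(0, 600, n)                       # distance of residence from epicentre
        mmi = np.clip(9.0 - 0.012 * dist_km, 1.0, 9.0)         # stand-in for ShakeMap intensity
        stunted = rng.random(n) < (0.18 + 0.02 * (mmi - 5.0))  # fabricated outcome

        groups = np.digitize(mmi, bins=[5.0, 7.0])             # <V, V-VII, >=VII exposure classes
        labels = ["low exposure", "moderate exposure", "high exposure"]
        for g, name in enumerate(labels):
            sel = groups == g
            print(f"{name}: n={sel.sum():4d}, stunting prevalence {stunted[sel].mean():.2%}")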

  5. Persistent long-term facilitation at an identified synapse becomes labile with activation of short-term heterosynaptic plasticity.

    Science.gov (United States)

    Hu, Jiang-Yuan; Schacher, Samuel

    2014-04-02

    Short-term and long-term synaptic plasticity are cellular correlates of learning and memory of different durations. Little is known, however, about how these two forms of plasticity interact at the same synaptic connection. We examined the reciprocal impact of short-term heterosynaptic or homosynaptic plasticity at sensorimotor synapses of Aplysia in cell culture when expressing persistent long-term facilitation (P-LTF) evoked by serotonin [5-hydroxytryptamine (5-HT)]. Short-term heterosynaptic plasticity induced by 5-HT (facilitation) or the neuropeptide FMRFa (depression) and short-term homosynaptic plasticity induced by tetanus [post-tetanic potentiation (PTP)] or low-frequency stimulation [homosynaptic depression (HSD)] of the sensory neuron were expressed in both control synapses and synapses expressing P-LTF in the absence or presence of protein synthesis inhibitors. All forms of short-term plasticity failed to significantly affect ongoing P-LTF in the absence of protein synthesis inhibitors. However, P-LTF reversed to control levels when either 5-HT or FMRFa was applied in the presence of rapamycin. In contrast, P-LTF was unaffected when either PTP or HSD was evoked in the presence of either rapamycin or anisomycin. These results indicate that synapses expressing persistent plasticity acquire a "new" baseline and functionally express short-term changes as naive synapses do, but the new baseline becomes labile following selective activations (heterosynaptic stimuli that evoke opposite forms of plasticity), such that, when these are presented in the presence of protein synthesis inhibitors, they produce a rapid reversal of the persistent plasticity. Activity-selective induction of a labile state at synapses expressing persistent plasticity may facilitate the development of therapies for reversing inappropriate memories.

  6. Behavioural Models of Motor Control and Short-Term Memory

    OpenAIRE

    Imanaka, Kuniyasu; Funase, Kozo; Yamauchi, Masaki

    1995-01-01

    We examined in this review article the behavioural and conceptual models of motor control and short-term memory which have intensively been investigated since the 1970s. First, we reviewed both the dual-storage model of short-term memory in which movement information is stored and a typical model of motor control which emphasizes the importance of efferent factors. We then examined two models of preselection effects: a cognitive model and a cognitive/ efferent model. Following this we reviewe...

  7. Gummed-up memory: Chewing gum impairs short-term recall

    OpenAIRE

    Kozlov, Michail D; Hughes, Robert W; Jones, Dylan M

    2012-01-01

    Several studies have suggested that short-term memory is generally improved by chewing gum. However, we report the first studies to show that chewing gum impairs short-term memory for both item order and item identity. Experiment 1 showed that chewing gum reduces serial recall of letter lists. Experiment 2 indicated that chewing does not simply disrupt vocal-articulatory planning required for order retention: Chewing equally impairs a matched task that required retention of list item identity...

  8. [Short-term memory characteristics of vibration intensity tactile perception on human wrist].

    Science.gov (United States)

    Hao, Fei; Chen, Li-Juan; Lu, Wei; Song, Ai-Guo

    2014-12-25

    In this study, a recall experiment and a recognition experiment were designed to assess the short-term memory characteristics of tactile perception of vibration intensity at the human wrist, using as a test device a novel home-made vibrotactile display based on spatiotemporally combined vibrations of multiple micro vibration motors. Based on the experimental data, the short-term memory span, recognition accuracy and reaction time for vibration intensity were analyzed. The experimental results support several conclusions: (1) the average short-term memory span of tactile perception of vibration intensity is 3 ± 1 items; (2) the greater the difference between two adjacent discrete vibrotactile intensities, the better the average short-term memory span of the human wrist; (3) there is a clear difference in the average short-term memory span for vibration intensity between males and females; (4) information is retrieved from short-term memory of vibrotactile stimuli by a traversal process of scanning and comparison; (5) the recognition accuracy and reaction time of the vibrotactile display compare unfavourably with those of visual and auditory displays. The results from this study are important for designing vibrotactile display coding schemes.

  9. Pro short-term procurement - Broker/trader

    International Nuclear Information System (INIS)

    Hoellen, E.E.

    1990-01-01

    The author presents his opinion on the issue of short-term versus long-term procurement of uranium and enrichment and the impact on reliability of supply. The progression of the market has been one of increasing commoditization. Utility buyers have moved towards purchasing uranium on the spot market and linking long-term contracts to spot-market pricing. There is some logic to the argument that utilities and the industry in general would be best served by this approach. Inventories would be worked off much more quickly, and unnecessary supply would be shut off until prices recovered to profitable levels. The result would be a healthier market with no detriment to the reliability of supply

  10. Long-term effect of early-life stress from earthquake exposure on working memory in adulthood.

    Science.gov (United States)

    Li, Na; Wang, Yumei; Zhao, Xiaochuan; Gao, Yuanyuan; Song, Mei; Yu, Lulu; Wang, Lan; Li, Ning; Chen, Qianqian; Li, Yunpeng; Cai, Jiajia; Wang, Xueyi

    2015-01-01

    The present study aimed to investigate the long-term effect of 1976 Tangshan earthquake exposure in early life on performance of working memory in adulthood. A total of 907 study subjects born and raised in Tangshan were enrolled in this study. They were divided into three groups according to the dates of birth: infant exposure (3-12 months, n=274), prenatal exposure (n=269), and no exposure (born at least 1 year after the earthquake, n=364). The prenatal group was further divided into first, second, and third trimester subgroups based on the timing of exposure during pregnancy. Hopkins Verbal Learning Test-Revised and Brief Visuospatial Memory Test-Revised (BVMT-R) were used to measure the performance of working memory. Unconditional logistic regression analysis was used to analyze the influential factors for impaired working memory. The Hopkins Verbal Learning Test-Revised scores did not show significant difference across the three groups. Compared with no exposure group, the BVMT-R scores were slightly lower in the prenatal exposure group and markedly decreased in the infant exposure group. When the BVMT-R scores were analyzed in three subgroups, the results showed that the subjects whose mothers were exposed to earthquake in the second and third trimesters of pregnancy had significantly lower BVMT-R scores compared with those in the first trimester. Education level and early-life earthquake exposure were identified as independent risk factors for reduced performance of visuospatial memory indicated by lower BVMT-R scores. Infant exposure to earthquake-related stress impairs visuospatial memory in adulthood. Fetuses in the middle and late stages of development are more vulnerable to stress-induced damage that consequently results in impaired visuospatial memory. Education and early-life trauma can also influence the performance of working memory in adulthood.

  11. Short-term memory in the service of executive control functions

    Directory of Open Access Journals (Sweden)

    Farshad Alizadeh Mansouri

    2015-12-01

    Full Text Available Short-term memory is a crucial cognitive function for supporting on-going and upcoming behaviours, allowing storage of information across delay periods. The content of this memory may typically include tangible information about features such as the shape, colour or texture of an object, its location and motion relative to the body, or phonological information. The neural correlate of these short-term memories has been found in different brain areas involved in organizing perceptual or motor functions. In particular, neuronal activity in different prefrontal areas encodes task-related information corresponding to short-term memory across delay periods, and lesions in the prefrontal cortex severely affect the ability to hold this type of memory. Recent studies have further expanded the scope and possible role of short-term memory by showing that information of abstract entities such as a behaviour-guiding rule, or the occurrence of a conflict in information processing; can also be maintained in short-term memory and used for adjusting the allocation of executive control in dynamic environments. It has also been shown that neuronal activity in the dorsolateral prefrontal and orbitofrontal cortices encodes information about such abstract entities. These findings suggest that the prefrontal cortex plays crucial roles in organizing goal-directed behaviour by supporting various mnemonic processes that maintain a wide range of information in the service of executive control of on-going or upcoming behaviour.

  12. Implicit short- and long-term memory direct our gaze in visual search.

    Science.gov (United States)

    Kruijne, Wouter; Meeter, Martijn

    2016-04-01

    Visual attention is strongly affected by the past: both by recent experience and by long-term regularities in the environment that are encoded in and retrieved from memory. In visual search, intertrial repetition of targets causes speeded response times (short-term priming). Similarly, targets that are presented more often than others may facilitate search, even long after it is no longer present (long-term priming). In this study, we investigate whether such short-term priming and long-term priming depend on dissociable mechanisms. By recording eye movements while participants searched for one of two conjunction targets, we explored at what stages of visual search different forms of priming manifest. We found both long- and short- term priming effects. Long-term priming persisted long after the bias was present, and was again found even in participants who were unaware of a color bias. Short- and long-term priming affected the same stage of the task; both biased eye movements towards targets with the primed color, already starting with the first eye movement. Neither form of priming affected the response phase of a trial, but response repetition did. The results strongly suggest that both long- and short-term memory can implicitly modulate feedforward visual processing.

  13. Stacking Ensemble Learning for Short-Term Electricity Consumption Forecasting

    Directory of Open Access Journals (Sweden)

    Federico Divina

    2018-04-01

    Full Text Available The ability to predict short-term electric energy demand would provide several benefits, at both the economic and the environmental level. For example, it would allow for an efficient use of resources in order to meet the actual demand, reducing production costs as well as CO2 emissions. To this end, in this paper we propose a strategy based on ensemble learning in order to tackle the short-term load forecasting problem. In particular, our approach is based on a stacking ensemble learning scheme, where the predictions produced by three base learning methods are used by a top-level method in order to produce final predictions. We tested the proposed scheme on a dataset reporting the energy consumption in Spain over more than nine years. The obtained experimental results show that an approach for short-term electricity consumption forecasting based on ensemble learning can help in combining predictions produced by weaker learning methods in order to obtain superior results. In particular, the system produces a lower error than the existing state-of-the-art techniques used on the same dataset. More importantly, this case study has shown that using an ensemble scheme can achieve very accurate predictions, and thus that it is a suitable approach for addressing the short-term load forecasting problem.
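
    A generic sketch of such a stacking scheme: three base regressors are trained on lagged consumption and the hour of day, and a top-level model combines their predictions. The base learners, the meta-learner (a ridge regression) and the synthetic hourly-load series are placeholders; the paper's actual choice of models and its Spanish consumption dataset are not reproduced.

        import numpy as np
        from sklearn.ensemble import StackingRegressor, RandomForestRegressor
        from sklearn.linear_model import Ridge
        from sklearn.neighbors import KNeighborsRegressor
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(7)
        t = np.arange(24 * 365)
        load = (100 + 20 * np.sin(2 * np.pi * t / 24)            # daily cycle
                + 10 * np.sin(2 * np.pi * t / (24 * 7))          # weekly cycle
                + rng.normal(0, 3, t.size))

        # features: the previous 24 hourly values and the hour of day; target: next-hour load
        lags = np.column_stack([load[i:-(24 - i)] for i in range(24)])
        hour = (t[24:] % 24).reshape(-1, 1)
        X, y = np.hstack([lags, hour]), load[24:]

        stack = StackingRegressor(
            estimators=[("tree", DecisionTreeRegressor(max_depth=8)),
                        ("knn", KNeighborsRegressor(n_neighbors=10)),
                        ("rf", RandomForestRegressor(n_estimators=50, random_state=0))],
            final_estimator=Ridge(),
        )
        split = 7000
        stack.fit(X[:split], y[:split])
        print("held-out R^2:", round(stack.score(X[split:], y[split:]), 3))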

  14. Identity modulates short-term memory for facial emotion.

    Science.gov (United States)

    Galster, Murray; Kahana, Michael J; Wilson, Hugh R; Sekuler, Robert

    2009-12-01

    For some time, the relationship between processing of facial expression and facial identity has been in dispute. Using realistic synthetic faces, we reexamined this relationship for both perception and short-term memory. In Experiment 1, subjects tried to identify whether the emotional expression on a probe stimulus face matched the emotional expression on either of two remembered faces that they had just seen. The results showed that identity strongly influenced recognition short-term memory for emotional expression. In Experiment 2, subjects' similarity/dissimilarity judgments were transformed by multidimensional scaling (MDS) into a 2-D description of the faces' perceptual representations. Distances among stimuli in the MDS representation, which showed a strong linkage of emotional expression and facial identity, were good predictors of correct and false recognitions obtained previously in Experiment 1. The convergence of the results from Experiments 1 and 2 suggests that the overall structure and configuration of faces' perceptual representations may parallel their representation in short-term memory and that facial identity modulates the representation of facial emotion, both in perception and in memory. The stimuli from this study may be downloaded from http://cabn.psychonomic-journals.org/content/supplemental.

  15. Short-term and long-term memory in early temporal lobe dysfunction.

    Science.gov (United States)

    Hershey, T; Craft, S; Glauser, T A; Hale, S

    1998-01-01

    Following medial temporal damage, mature humans are impaired in retaining new information over long delays but not short delays. The question of whether a similar dissociation occurs in children was addressed by testing children (ages 7-16) with unilateral temporal lobe epilepsy (TLE) and controls on short- and long-term memory tasks, including a spatial delayed response task (SDR). Early-onset TLE did not affect performance on short delays on SDR, but it did impair performance at the longest delay (60 s), similar to adults with unilateral medial temporal damage. In addition, early-onset TLE affected performance on pattern recall, spatial span, and verbal span with rehearsal interference. No differences were found on story recall or on a response inhibition task.

  16. Distribution of Short-Term and Lifetime Predicted Risks of Cardiovascular Diseases in Peruvian Adults

    Science.gov (United States)

    Quispe, Renato; Bazo-Alvarez, Juan Carlos; Burroughs Peña, Melissa S; Poterico, Julio A; Gilman, Robert H; Checkley, William; Bernabé-Ortiz, Antonio; Huffman, Mark D; Miranda, J Jaime

    2015-01-01

    Background Short-term risk assessment tools for prediction of cardiovascular disease events are widely recommended in clinical practice and are used largely for single time-point estimations; however, persons with low predicted short-term risk may have higher risks across longer time horizons. Methods and Results We estimated short-term and lifetime cardiovascular disease risk in a pooled population from 2 studies of Peruvian populations. Short-term risk was estimated using the atherosclerotic cardiovascular disease Pooled Cohort Risk Equations. Lifetime risk was evaluated using the algorithm derived from the Framingham Heart Study cohort. Using previously published thresholds, participants were classified into 3 categories: low short-term and low lifetime risk, low short-term and high lifetime risk, and high short-term predicted risk. We also compared the distribution of these risk profiles across educational level, wealth index, and place of residence. We included 2844 participants (50% men, mean age 55.9 years [SD 10.2 years]) in the analysis. Approximately 1 of every 3 participants (34% [95% CI 33 to 36]) had a high short-term estimated cardiovascular disease risk. Among those with a low short-term predicted risk, more than half (54% [95% CI 52 to 56]) had a high lifetime predicted risk. Short-term and lifetime predicted risks were higher for participants with lower versus higher wealth indexes and educational levels and for those living in urban versus rural areas. A substantial proportion of Peruvian adults were classified as low short-term risk but high lifetime risk. Vulnerable adults, such as those from low socioeconomic status and those living in urban areas, may need greater attention regarding cardiovascular preventive strategies. PMID:26254303
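
    The three-category classification mentioned above can be illustrated as follows; the 7.5% short-term and 39% lifetime cut-offs are placeholders standing in for the "previously published thresholds", not values reported in this record.

        # Hypothetical thresholds; replace with the published cut-offs when applying.
        def risk_profile(short_term_risk, lifetime_risk,
                         short_term_cut=0.075, lifetime_cut=0.39):
            """Assign one of the three risk categories used in the pooled analysis."""
            if short_term_risk >= short_term_cut:
                return "high short-term predicted risk"
            if lifetime_risk >= lifetime_cut:
                return "low short-term and high lifetime risk"
            return "low short-term and low lifetime risk"

        print(risk_profile(0.04, 0.45))   # -> low short-term and high lifetime risk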

  17. Stochastic Optimal Wind Power Bidding Strategy in Short-Term Electricity Market

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Bak-Jensen, Birgitte

    2012-01-01

    Due to the fluctuating nature and non-perfect forecast of the wind power, the wind power owners are penalized for the imbalance costs of the regulation, when they trade wind power in the short-term liberalized electricity market. Therefore, in this paper a formulation of an imbalance cost...... minimization problem for trading wind power in the short-term electricity market is described, to help the wind power owners optimize their bidding strategy. Stochastic optimization and a Monte Carlo method are adopted to find the optimal bidding strategy for trading wind power in the short-term electricity...... market in order to deal with the uncertainty of the regulation price, the activated regulation of the power system and the forecasted wind power generation. The Danish short-term electricity market and a wind farm in western Denmark are chosen as study cases due to the high wind power penetration here...
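
    A Monte Carlo sketch of the bidding strategy described above: the day-ahead bid is chosen to maximize expected revenue (equivalently, minimize expected imbalance cost) over sampled production and regulation-price scenarios. The capacity, price ranges and distributions are assumptions, not parameters from the Danish case study.

        # Choose the day-ahead bid that maximizes expected revenue over sampled
        # wind-production and regulation-price scenarios. All numbers are assumed.
        import numpy as np

        rng = np.random.default_rng(42)
        n_scenarios = 10_000
        capacity = 100.0                   # MW of installed wind (assumed)
        spot_price = 40.0                  # EUR/MWh day-ahead price (assumed known)

        production = np.clip(rng.normal(60.0, 15.0, n_scenarios), 0.0, capacity)
        down_reg_price = rng.uniform(20.0, 40.0, n_scenarios)   # surplus settled here
        up_reg_price = rng.uniform(40.0, 70.0, n_scenarios)     # deficit bought here

        def expected_revenue(bid):
            surplus = np.maximum(production - bid, 0.0)   # delivered more than sold
            deficit = np.maximum(bid - production, 0.0)   # delivered less than sold
            revenue = bid * spot_price + surplus * down_reg_price - deficit * up_reg_price
            return revenue.mean()

        bids = np.linspace(0.0, capacity, 201)
        best_bid = max(bids, key=expected_revenue)
        print(f"optimal bid ~ {best_bid:.1f} MWh, "
              f"expected revenue ~ {expected_revenue(best_bid):.0f} EUR")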

  18. Short-term regulation of hydro powerplants. Studies on the environmental effects

    International Nuclear Information System (INIS)

    Sinisalmi, T.; Riihimaeki, J.; Vehanen, T.; Yrjaenae, T.

    1997-01-01

    The publication is a final report on a project studying effects of short-term regulation of hydro power plants. The project consists of two parts: (1) examining and developing methods for evaluation, (2) applying methods in a case study at the Oulujoki River. The economic value of short-term regulation was studied with a model consisting of an optimization model and a river simulation model. Constraints on water level or discharge variations could be given to the power plants and their economical influence could be studied. Effects on shoreline recreation use due to water level fluctuation were studied with a model where various effects are made commensurable and expressed in monetary terms. A literature survey and field experiments were used to study the methods for assessing effects of short-term regulation on river habitats. The state and development needs of fish stocks and fisheries in large regulated rivers were studied and an environmental classification was made. Remedial measures for the short-term regulated rivers were studied with a literature survey and enquiries. A comprehensive picture of the various effects of short-term regulation was gained in the case study in Oulujoki River (110 km long, 7 power plants). Harmful effects can be reduced with the given recommendations of remedial measures on environment and the usage of the hydro power plants. (orig.) 52 refs

  19. Short-term regulation of hydro powerplants. Studies on the environmental effects

    Energy Technology Data Exchange (ETDEWEB)

    Sinisalmi, T. [ed.; Forsius, J.; Muotka, J.; Soimakallio, H. [Imatran Voima Oy, Vantaa (Finland); Riihimaeki, J. [VTT, Espoo (Finland); Vehanen, T. [Finnish Game and Fisheries Research Inst. (Finland); Yrjaenae, T. [North Ostrobothnia Regional Environmental Centre, Oulu (Finland)

    1997-12-31

    The publication is a final report on a project studying effects of short-term regulation of hydro power plants. The project consists of two parts: (1) examining and developing methods for evaluation, (2) applying methods in a case study at the Oulujoki River. The economic value of short-term regulation was studied with a model consisting of an optimization model and a river simulation model. Constraints on water level or discharge variations could be given to the power plants and their economical influence could be studied. Effects on shoreline recreation use due to water level fluctuation were studied with a model where various effects are made commensurable and expressed in monetary terms. A literature survey and field experiments were used to study the methods for assessing effects of short-term regulation on river habitats. The state and development needs of fish stocks and fisheries in large regulated rivers were studied and an environmental classification was made. Remedial measures for the short-term regulated rivers were studied with a literature survey and enquiries. A comprehensive picture of the various effects of short-term regulation was gained in the case study in Oulujoki River (110 km long, 7 power plants). Harmful effects can be reduced with the given recommendations of remedial measures on environment and the usage of the hydro power plants. (orig.) 52 refs.

  20. Visual dot interaction with short-term memory.

    Science.gov (United States)

    Etindele Sosso, Faustin Armel

    2017-06-01

    Many neurodegenerative diseases have a memory component. Brain structures related to memory are affected by environmental stimuli, and it is difficult to dissociate the effects of all neuronal behaviors. Here, the visual cortex of mice was stimulated with gratings and dots, and neuronal activity was observed before and after stimulation. Bandwidth, firing rate and orientation selectivity index were evaluated. A primary communication between the primary visual cortex and short-term memory appeared to offer an interesting path for training cognitive circuitry and investigating the basic mechanisms of neuronal learning. The findings also suggested an interplay between the primary visual cortex and short-term plasticity. The properties of a visual target shape perception and affect basic encoding. Through the visual cortex, it may be possible to train memory and improve the recovery of people with cognitive disabilities or memory deficits.

  1. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    Science.gov (United States)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability between strike-slip and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture based on the following procedure. The fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. A logistic analysis was then performed on two data sets: the surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth to the top of the defined source fault. The conditional probability estimated from surface displacement indicated a higher probability for reverse faults than for strike-slip faults, a result consistent with previous similar studies (i.e. Kagawa et al., 2004; Kataoka and Kusakabe, 2005). In contrast, the probability estimated from the depth of the source fault indicated a higher probability for thrust faults than for strike-slip and reverse faults, a trend similar to the conditional probabilities from PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show a low probability. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. On the other hand, for Japanese reverse faults there is a possibility that the conditional probability of reverse faults, with fewer low-dip-angle earthquakes, shows a low probability similar to that of strike-slip faults (i.e. Takao et al., 2013). In the future, numerical simulation by considering failure condition of surface by the source
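
    The logistic analysis described above can be sketched as below, regressing the occurrence of surface rupture on the depth to the top of the simulated source fault. The synthetic data only imitate the qualitative trend (shallower fault tops giving higher rupture probability); they are not the study's dislocation-model output.

        # Logistic regression of rupture occurrence on depth to the fault top.
        # Synthetic data with an assumed generating rule, for illustration only.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(7)
        top_depth_km = rng.uniform(0.0, 15.0, 2000)
        p_true = 1.0 / (1.0 + np.exp(1.5 * (top_depth_km - 4.0)))   # shallower -> likelier
        ruptured = rng.random(2000) < p_true

        model = LogisticRegression()
        model.fit(top_depth_km.reshape(-1, 1), ruptured)

        for d in (1.0, 3.0, 6.0, 10.0):
            p = model.predict_proba([[d]])[0, 1]
            print(f"top depth {d:4.1f} km -> P(surface rupture) ~ {p:.2f}")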

  2. Short-Term Choriocapillaris Changes in Patients with Central Serous Chorioretinopathy after Half-Dose Photodynamic Therapy

    Directory of Open Access Journals (Sweden)

    Marco Nassisi

    2017-11-01

    Full Text Available Background: Although photodynamic therapy (PDT) has become the standard treatment for central serous chorioretinopathy (CSC), its mechanism of action remains unclear. It is assumed that PDT induces short-term choriocapillaris (CC) occlusion and long-term choroidal vascular remodeling. In this paper, we describe the short-term CC changes induced by Half-Dose PDT (HD-PDT) in chronic CSC using optical coherence tomography angiography (OCTA). Methods: This is a prospective interventional case series. Chronic CSC eyes underwent Spectral-Domain OCT, Fundus Autofluorescence, FA, ICGA (Heidelberg Spectralis, Heidelberg, Germany) and OCTA (RTVue XR Avanti with AngioVue; Optovue Inc., Fremont, CA, USA) before HD-PDT, with follow-up after one hour, one week, and one month. Vascular changes after PDT were analyzed within the CC layer. The CC vessel density was defined as the percentage of an area occupied by flow pixels, using ImageJ software to obtain measurements by applying a grey-level threshold. All pixels with a grey level above the threshold were considered indicators of blood flow. Results: 20 eyes of 19 patients were included. At baseline the mean CC vessel density was 94.87 ± 2.32%. It differed significantly from the density at 1 week and 1 month (92.79 ± 3.16% and 95.55 ± 2.05%, p < 0.001, respectively), but not from the value at 1 h (94.8 ± 2.28%, p = 0.516). Conclusions: CC vessel density was significantly reduced at 1 week as compared with baseline, suggesting a possible short-term effect of PDT on CC perfusion. After 1 month, however, the CC vessel density was even higher than at baseline, probably due to CC recovery. OCTA appears to be useful for visualizing CC vessels and for confirming the mechanism of action of PDT treatment in eyes with chronic CSC.
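
    The vessel-density measurement described above (percentage of pixels above a grey-level threshold) can be reproduced in a few lines; using Otsu's method to pick the threshold is an assumption, since the record does not state how the ImageJ threshold was chosen.

        # NumPy/scikit-image equivalent of the ImageJ measurement: choriocapillaris
        # vessel density = % of pixels whose grey level exceeds a threshold.
        # The Otsu threshold and the synthetic slab are assumptions.
        import numpy as np
        from skimage.filters import threshold_otsu

        def cc_vessel_density(octa_slab: np.ndarray) -> float:
            """Return vessel density (%) of a 2-D OCTA choriocapillaris en-face image."""
            thresh = threshold_otsu(octa_slab)
            flow_pixels = octa_slab > thresh          # pixels treated as blood flow
            return 100.0 * flow_pixels.mean()

        rng = np.random.default_rng(3)
        slab = rng.normal(0.6, 0.15, size=(304, 304))  # toy 304x304 en-face slab
        print(f"vessel density ~ {cc_vessel_density(slab):.1f}%")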

  3. GPS detection of ionospheric perturbation before the 13 February 2001, El Salvador earthquake

    OpenAIRE

    V. V. Plotkin

    2003-01-01

    A large earthquake of M6.6 occurred on 13 February 2001 at 14:22:05 UT in El Salvador. We detected ionospheric perturbation before this earthquake using GPS data received from CORS network. Systematic decreases of ionospheric total electron content during two days before the earthquake onset were observed at set of stations near the earthquake location and probably in region of about 1000 km from epicenter. This result is consistent with t...

  4. Pro short-term procurement - U.S. utility

    International Nuclear Information System (INIS)

    Thompson, R.D.

    1990-01-01

    The author expresses the opinion that, rather than focusing market discussions on short-term versus long-term procurement strategies, the parties need to focus on how long it will take to reach a predominantly market-based price in both uranium and enrichment. Long-term contracts are going to be around and will always be an important part of buyers' and sellers' strategies. It is evident that the annual term-contract price renegotiations around the world are resulting in continually lower prices. When these price negotiations finally arrive in the range of the market price, a commodity market that resembles other energy commodity markets can emerge

  5. Short-Term Memory in Habituation and Dishabituation

    Science.gov (United States)

    Whitlow, Jesse William, Jr.

    1975-01-01

    The present research evaluated the refractorylike response decrement, as found in habituation of auditory evoked peripheral vasoconstriction in rabbits, to determine whether or not it represents a short-term habituation process distinct from effector fatigue or sensory adaptation. (Editor)

  6. Effects of short term and long term soil warming on ecosystem phenology of a sub-arctic grassland: an NDVI-based approach

    Science.gov (United States)

    Leblans, Niki; Sigurdsson, Bjarni D.; Janssens, Ivan A.

    2014-05-01

    Phenology has been defined as the study of the timing of recurring biological events and of the causes of their timing with regard to abiotic and biotic factors. Ecosystem phenology, including the onset of the growing season and its senescence in autumn, plays an important role in the carbon, water and energy exchange between the biosphere and the atmosphere at higher latitudes. Factors that influence ecosystem phenology can therefore induce important climate-controlling feedback mechanisms. Global surface temperatures have been predicted to increase in the coming decades. Hence, a better understanding of the effect of temperature on ecosystem phenology is essential. Natural geothermal soil-temperature gradients in Iceland offer a unique opportunity to study the soil temperature (Ts) dependence of ecosystem phenology and to distinguish short-term (transient) warming effects (in recently established Ts gradients) from long-term (permanent) effects (in centuries-old Ts gradients). This research was performed in the framework of an international research project (ForHot; www.forhot.is). ForHot includes two natural grassland areas with gradients in Ts, dominated by Festuca sp. and Agrostis sp. The first warmed area was created in 2008, when an earthquake in S-Iceland caused geothermal systems to shift to previously cold soils. The second area is located about 3 km away from this newly warmed grassland; for this area, there is evidence that the natural soil warming has been continuous for at least 300 years. In the present study we focus on Ts elevation gradients of +0 to +10°C. The experiment consists of five transects with five temperature levels (+0, +1, +3, +5 and +10°C) in the two aforementioned grassland ecosystems (n = 25 in each grassland). From April until November 2013, weekly measurements of the normalized difference vegetation index (NDVI) were taken. In the short-term warmed grassland, the greening of the vegetation was advanced by 36 days at +10°C Ts and the date of 50

  7. What do short-term and long-term relationships look like? Building the relationship coordination and strategic timing (ReCAST) model.

    Science.gov (United States)

    Eastwick, Paul W; Keneski, Elizabeth; Morgan, Taylor A; McDonald, Meagan A; Huang, Sabrina A

    2018-05-01

    Close relationships research has examined committed couples (e.g., dating relationships, marriages) using intensive methods that plot relationship development over time. But a substantial proportion of people's real-life sexual experiences take place (a) before committed relationships become "official" and (b) in short-term relationships; methods that document the time course of relationships have rarely been applied to these contexts. We adapted a classic relationship trajectory-plotting technique to generate the first empirical comparisons between the features of people's real-life short-term and long-term relationships across their entire timespan. Five studies compared long-term and short-term relationships in terms of the timing of relationship milestones (e.g., flirting, first sexual intercourse) and the occurrence/intensity of important relationship experiences (e.g., romantic interest, strong sexual desire, attachment). As romantic interest was rising and partners were becoming acquainted, long-term and short-term relationships were indistinguishable. Eventually, romantic interest in short-term relationships plateaued and declined while romantic interest in long-term relationships continued to rise, ultimately reaching a higher peak. As relationships progressed, participants evidenced more features characteristic of the attachment-behavioral system (e.g., attachment, caregiving) in long-term than short-term relationships but similar levels of other features (e.g., sexual desire, self-promotion, intrasexual competition). These data inform a new synthesis of close relationships and evolutionary psychological perspectives called the Relationship Coordination and Strategic Timing (ReCAST) model. ReCAST depicts short-term and long-term relationships as partially overlapping trajectories (rather than relationships initiated with distinct strategies) that differ in their progression along a normative relationship development sequence. (PsycINFO Database Record (c

  8. Augmented Reality in Informal Learning Environments: Investigating Short-term and Long-term Effects

    DEFF Research Database (Denmark)

    Sommerauer, Peter; Müller, Oliver

    2018-01-01

    field experiment with 24 participants at a mathematics exhibition to measure the effect of AR on acquiring and retaining mathematical knowledge in an informal learning environment, both short-term (i.e., directly after visiting the exhibition) and long-term (i.e., two months after the museum visit). Our...

  9. Distribution of Short-Term and Lifetime Predicted Risks of Cardiovascular Diseases in Peruvian Adults.

    Science.gov (United States)

    Quispe, Renato; Bazo-Alvarez, Juan Carlos; Burroughs Peña, Melissa S; Poterico, Julio A; Gilman, Robert H; Checkley, William; Bernabé-Ortiz, Antonio; Huffman, Mark D; Miranda, J Jaime

    2015-08-07

    Short-term risk assessment tools for prediction of cardiovascular disease events are widely recommended in clinical practice and are used largely for single time-point estimations; however, persons with low predicted short-term risk may have higher risks across longer time horizons. We estimated short-term and lifetime cardiovascular disease risk in a pooled population from 2 studies of Peruvian populations. Short-term risk was estimated using the atherosclerotic cardiovascular disease Pooled Cohort Risk Equations. Lifetime risk was evaluated using the algorithm derived from the Framingham Heart Study cohort. Using previously published thresholds, participants were classified into 3 categories: low short-term and low lifetime risk, low short-term and high lifetime risk, and high short-term predicted risk. We also compared the distribution of these risk profiles across educational level, wealth index, and place of residence. We included 2844 participants (50% men, mean age 55.9 years [SD 10.2 years]) in the analysis. Approximately 1 of every 3 participants (34% [95% CI 33 to 36]) had a high short-term estimated cardiovascular disease risk. Among those with a low short-term predicted risk, more than half (54% [95% CI 52 to 56]) had a high lifetime predicted risk. Short-term and lifetime predicted risks were higher for participants with lower versus higher wealth indexes and educational levels and for those living in urban versus rural areas. A substantial proportion of Peruvian adults were classified as low short-term risk but high lifetime risk. Vulnerable adults, such as those from low socioeconomic status and those living in urban areas, may need greater attention regarding cardiovascular preventive strategies. © 2015 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley Blackwell.

  10. The 2007 Mentawai earthquake sequence on the Sumatra megathrust

    Science.gov (United States)

    Konca, A.; Avouac, J.; Sladen, A.; Meltzner, A. J.; Kositsky, A. P.; Sieh, K.; Fang, P.; Li, Z.; Galetzka, J.; Genrich, J.; Chlieh, M.; Natawidjaja, D. H.; Bock, Y.; Fielding, E. J.; Helmberger, D. V.

    2008-12-01

    The Sumatra Megathrust has recently produced a flurry of large interplate earthquakes, starting with the giant Mw 9.15 Aceh earthquake of 2004. All of these earthquakes occurred within the area monitored by the Sumatra Geodetic Array (SuGAr), which provided exceptional records of near-field co-seismic and postseismic ground displacements. The most recent of these major earthquakes, an Mw 8.4 event followed twelve hours later by an Mw 7.9 event, occurred in the Mentawai islands area, where devastating historical earthquakes had happened in 1797 and 1833. The 2007 earthquake sequence provides an exceptional opportunity to understand the variability of earthquakes along megathrusts and their relation to interseismic coupling. The InSAR, GPS and teleseismic modeling shows that the 2007 earthquakes ruptured a fraction of the strongly coupled Mentawai patch of the megathrust, which is itself only a fraction of the 1833 rupture area. The sequence also released a much smaller moment than the one released in 1833, or than the deficit of moment that has accumulated since. Both earthquakes of 2007 consist of two sub-events located 50 to 100 km apart. On the other hand, the northernmost slip patch of the Mw 8.4 earthquake and the southern slip patch of the Mw 7.9 earthquake abut each other, yet they ruptured 12 hours apart. Sunda megathrust earthquakes of recent years include a rupture of a strongly coupled patch that closely mimics a prior rupture of that patch and is well correlated with the interseismic coupling pattern (Nias-Simeulue section), as well as a rupture sequence of a strongly coupled patch that differs substantially in its details from its most recent predecessors (Mentawai section). We conclude that (1) seismic asperities are probably persistent features which arise from heterogeneous strain build-up in the interseismic period; and (2) the same portion of a megathrust can rupture in different ways depending on whether asperities break as isolated events or cooperate to produce

  11. Short-term Outcomes Following Concussion in the NFL: A Study of Player Longevity, Performance, and Financial Loss.

    Science.gov (United States)

    Navarro, Sergio M; Sokunbi, Olumide F; Haeberle, Heather S; Schickendantz, Mark S; Mont, Michael A; Figler, Richard A; Ramkumar, Prem N

    2017-11-01

    A short-term protocol for evaluation of National Football League (NFL) athletes incurring concussion has yet to be fully defined and framed in the context of the short-term potential team and career longevity, financial risk, and performance. To compare the short-term career outcomes for NFL players with concussions by analyzing the effect of concussions on (1) franchise release rate, (2) career length, (3) salary, and (4) performance. Cohort study; Level of evidence, 3. NFL player transaction records and publicly available injury reports from August 2005 to January 2016 were analyzed. All players sustaining documented concussions were evaluated for a change to inactive or DNP ("did not participate") status. A case-control design compared franchise release rates and remaining NFL career span. Career length was analyzed via survival analysis. Salary and performance differences were analyzed with publicly available contract data and a performance-scoring algorithm based on position/player level. Of the 5894 eligible NFL players over the 11-year period, 307 sustained publicly reported concussions resulting in the DNP injury protocol. Analysis of the probability of remaining in the league demonstrated a statistically significantly shorter career length for the concussion group at 3 and 5 years after concussion. The year-over-year change in contract value for the concussion group resulted in a mean overall salary reduction of $300,000 ± $1,300,000 per year (interquartile range, -$723,000 to $450,000 per year). The performance score reduction for all offensive scoring players sustaining concussions was statistically significant. This retrospective study demonstrated that NFL players who sustain a concussion face a higher overall franchise release rate and shorter career span. Players who sustained concussions may incur significant salary reductions and perform worse after concussion. Short-term reductions in longevity, performance, and salary after concussion exist and

  12. Short-term synaptic plasticity and heterogeneity in neural systems

    Science.gov (United States)

    Mejias, J. F.; Kappen, H. J.; Longtin, A.; Torres, J. J.

    2013-01-01

    We review some recent results on neural dynamics and information processing which arise when considering several biophysical factors of interest, in particular, short-term synaptic plasticity and neural heterogeneity. The inclusion of short-term synaptic plasticity leads to enhanced long-term memory capacities, a higher robustness of memory to noise, and irregularity in the duration of the so-called up cortical states. On the other hand, considering some level of neural heterogeneity in neuron models allows neural systems to optimize information transmission in rate coding and temporal coding, two strategies commonly used by neurons to codify information in many brain areas. In all these studies, analytical approximations can be made to explain the underlying dynamics of these neural systems.

  13. Seismic gaps previous to certain great earthquakes occurring in North China

    Energy Technology Data Exchange (ETDEWEB)

    Wei, K H; Lin, C H; Chu, H C; Chao, Y H; Chao, H L; Hou, H F

    1978-07-01

    The epicentral distributions of small and moderate earthquakes preceding nine great earthquakes (M greater than or equal to 7.0) in North China are analyzed. It can be seen that most of these earthquakes are preceded by gaps in the regions surrounding their epicenters. The relations between the parameters of the seismic gaps, such as the lengths of their long and short axes, the areas of the gaps, etc., and the parameters of the corresponding earthquakes are discussed.

  14. Perceptions of short-term medical volunteer work: a qualitative study in Guatemala.

    Science.gov (United States)

    Green, Tyler; Green, Heidi; Scandlyn, Jean; Kestler, Andrew

    2009-02-26

    Each year medical providers from wealthy countries participate in short-term medical volunteer work in resource-poor countries. Various authors have raised concern that such work has the potential to be harmful to recipient communities; however, the social science and medical literature contains little research into the perceptions of short-term medical volunteer work from the perspective of members of recipient communities. This exploratory study examines the perception of short-term medical volunteer work in Guatemala among groups of actors affected by or participating in these programs. The researchers conducted in-depth, semi-structured interviews with 72 individuals, including Guatemalan healthcare providers and health authorities, foreign medical providers, non-medical personnel working on health projects, and Guatemalan parents of children treated by a short-term volunteer group. Detailed notes and summaries of these interviews were uploaded, coded and annotated using Atlas.ti (Scientific Software Development GmbH, Berlin) to identify recurrent themes from the interviews. Informants commonly identified a need for increased access to medical services in Guatemala, and many believed that short-term medical volunteers are in a position to offer improved access to medical care in the communities where they serve. Informants most frequently cited appropriate patient selection and attention to payment systems as the best means to avoid creating dependence on foreign aid. The most frequent suggestion to improve short-term medical volunteer work was coordination with and respect for local Guatemalan healthcare providers and their communities, as insufficient understanding of the country's existing healthcare resources and needs may result in perceived harm to the recipient community. The perceived impact of short-term medical volunteer projects in Guatemala is highly variable and dependent upon the individual project. In this exploratory study, project

  15. The potential influence of short-term environmental variability on the composition of testate amoeba communities in Sphagnum peatlands.

    Science.gov (United States)

    Sullivan, Maura E; Booth, Robert K

    2011-07-01

    Testate amoebae are a group of moisture-sensitive, shell-producing protozoa that have been widely used as indicators of changes in mean water-table depth within oligotrophic peatlands. However, short-term environmental variability (i.e., sub-annual) also probably influences community composition. The objective of this study was to assess the potential influence of short-term environmental variability on the composition of testate amoeba communities in Sphagnum-dominated peatlands. Testate amoebae and environmental conditions, including hourly measurements of relative humidity within the upper centimeter of the peatland surface, were examined throughout the 2008 growing season at 72 microsites within 11 peatlands of Pennsylvania and Wisconsin, USA. Relationships among testate amoeba communities, vegetation, depth to water table, pH, and an index of short-term environmental variability (EVI), were examined using nonmetric multidimensional scaling and correlation analysis. Results suggest that EVI influences testate amoeba communities, with some taxa more abundant under highly variable conditions (e.g., Arcella discoides, Difflugia pulex, and Hyalosphenia subflava) and others more abundant when environmental conditions at the peatland surface were relatively stable (e.g., Archerella flavum and Bullinularia indica). The magnitude of environmental variability experienced at the peatland surface appears to be primarily controlled by vegetation composition and density. In particular, sites with dense Sphagnum cover had lower EVI values than sites with loose-growing Sphagnum or vegetation dominated by vascular plants and/or non-Sphagnum bryophytes. Our results suggest that more environmental information may be inferred from testate amoebae than previously recognized. Knowledge of relationships between testate amoebae and short-term environmental variability should lead to more detailed and refined environmental inferences.

  16. Qualitative similarities in the visual short-term memory of pigeons and people

    OpenAIRE

    Gibson, Brett; Wasserman, Edward; Luck, Steven J.

    2011-01-01

    Visual short-term memory plays a key role in guiding behavior, and individual differences in visual short-term memory capacity are strongly predictive of higher cognitive abilities. To provide a broader evolutionary context for understanding this memory system, we directly compared the behavior of pigeons and humans on a change detection task. Although pigeons had a lower storage capacity and a higher lapse rate than humans, both species stored multiple items in short-term memory and conforme...

  17. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on actively moving continental plates and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example seismic zoning and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for making the right decisions shortly after a disaster. Exposed areas and possibly vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, add value to remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing disaster management policies and can help reduce the risk of natural disasters such as earthquakes in Indonesia.

  18. Short-term versus long-term contracting for uranium enrichment services

    International Nuclear Information System (INIS)

    Rudy, G.P.

    1990-01-01

    The US Department of Energy (US DOE) is the world's largest and most experienced supplier of uranium enrichment services. Through the late 1970s and early 1980s, emerging market forces transformed what was once a monopoly into a highly competitive industry, and in the early 1980s the DOE lost market share. But as we enter the 1990s, new market forces have emerged. The US DOE believes that a responsible balance between long-term and short-term contracting will be the key to success and to assuring the long-term health and reliability of the nuclear fuel industry. The US DOE intends to be in this nuclear business for a long time and will continue to offer reliable and responsive services second to none

  19. Probabilistic seismic hazard assessments of Sabah, east Malaysia: accounting for local earthquake activity near Ranau

    Science.gov (United States)

    Khalil, Amin E.; Abir, Ismail A.; Ginsos, Hanteh; Abdel Hafiez, Hesham E.; Khan, Sohail

    2018-02-01

    Sabah state in eastern Malaysia, unlike most other Malaysian states, is characterized by relatively common seismological activity; an earthquake of moderate magnitude is generally experienced roughly every 20 years, originating mainly from two types of source, either local (e.g. Ranau and Lahad Datu) or regional (e.g. the Kalimantan and South Philippines subductions). The seismicity map of Sabah shows the presence of two zones of distinctive seismicity, near Ranau (close to Kota Kinabalu) and near Lahad Datu in the southeast of Sabah. The seismicity record of Ranau begins in 1991, according to the international seismicity bulletins (e.g. the United States Geological Survey and the International Seismological Centre), and this short record is not sufficient for seismic source characterization. Fortunately, active Quaternary fault systems have been delineated in the area, and the seismicity is therefore modeled as line sources referring to these faults. Two main fault systems are believed to be the source of this activity, namely the Mensaban fault zone and the Crocker fault zone, in addition to some other faults in their vicinity. Seismic hazard assessment has become an important and necessary study for the extensive development projects in Sabah, especially given the presence of earthquake activity. A probabilistic seismic hazard assessment is adopted for the present work since it can provide the probability of exceeding various ground-motion levels expected from future large earthquakes. The output results are presented in terms of spectral acceleration curves and uniform hazard curves for return periods of 500, 1000 and 2500 years. Since this is the first time that a complete hazard study has been done for the area, the output will serve as a baseline and standard for any future strategic plans in the area.
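
    A minimal hazard-curve sketch in the spirit of the assessment described above, for a single site and a single fault source with a truncated Gutenberg-Richter recurrence and a toy ground-motion model. All parameter values are assumptions for illustration; they are not the values used for the Ranau or Crocker sources.

        # Annual exceedance rates of PGA at one site from one line source, combining
        # a truncated Gutenberg-Richter recurrence with a toy lognormal GMPE.
        import numpy as np
        from scipy.stats import norm

        a_value, b_value = 3.5, 1.0            # G-R parameters (assumed)
        m_min, m_max = 5.0, 7.5
        site_distance_km = 30.0                # site-to-source distance (assumed)

        mags = np.linspace(m_min, m_max, 60)
        rate_ge_m = 10 ** (a_value - b_value * mags) - 10 ** (a_value - b_value * m_max)
        bin_rates = -np.diff(rate_ge_m)        # annual rate per magnitude bin
        bin_mags = 0.5 * (mags[:-1] + mags[1:])

        def prob_exceed_pga(m, r_km, pga_g, sigma_ln=0.6):
            """Toy GMPE: lognormal PGA with simple magnitude/distance scaling (assumed)."""
            ln_median = -3.5 + 1.0 * m - 1.5 * np.log(r_km + 10.0)
            return norm.sf((np.log(pga_g) - ln_median) / sigma_ln)

        for pga in (0.05, 0.1, 0.2, 0.4):      # PGA levels in g
            lam = np.sum(bin_rates * prob_exceed_pga(bin_mags, site_distance_km, pga))
            p50 = 1.0 - np.exp(-lam * 50.0)    # Poisson exceedance probability in 50 years
            print(f"PGA {pga:.2f} g: annual rate {lam:.2e}, "
                  f"return period {1/lam:,.0f} yr, 50-yr exceedance prob {p50:.1%}")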

  20. Communicating likelihoods and probabilities in forecasts of volcanic eruptions

    Science.gov (United States)

    Doyle, Emma E. H.; McClure, John; Johnston, David M.; Paton, Douglas

    2014-02-01

    The issuing of forecasts and warnings of natural hazard events, such as volcanic eruptions, earthquake aftershock sequences and extreme weather, often involves the use of probabilistic terms, particularly when communicated by scientific advisory groups to key decision-makers, who can differ greatly in relative expertise and function in the decision-making process. Recipients may also differ in their perception of the relative importance of political and economic influences on interpretation. Consequently, the interpretation of these probabilistic terms can vary greatly due to the framing of the statements, and due to whether verbal or numerical terms are used. We present a review from the psychology literature on how the framing of information influences communication of these probability terms. It is also unclear how people rate their perception of an event's likelihood throughout a time frame when a forecast time window is stated. Previous research has identified that, when presented with a 10-year time window forecast, participants viewed the likelihood of an event occurring 'today' as being less than that in year 10. Here we show that this skew in perception also occurs for short-term time windows (under one week) that are of most relevance for emergency warnings. In addition, unlike the long-time window statements, the use of the phrasing "within the next…" instead of "in the next…" does not mitigate this skew, nor do we observe significant differences between the perceived likelihoods of scientists and non-scientists. This finding suggests that effects occurring due to the shorter time window may be 'masking' any differences in perception due to wording or career background observed for long-time window forecasts. These results have implications for scientific advice, warning forecasts, emergency management decision-making, and public information, as any skew in perceived event likelihood towards the end of a forecast time window may result in

  1. [Impulsiveness Among Short-Term Prisoners with Antisocial Personality Disorder].

    Science.gov (United States)

    Lang, Fabian U; Otte, Stefanie; Vasic, Nenad; Jäger, Markus; Dudeck, Manuela

    2015-07-01

    The study aimed to investigate the correlation between impulsiveness and antisocial personality disorder among short-term prisoners. Impulsiveness was assessed with the Barratt Impulsiveness Scale (BIS). Short-term prisoners with antisocial personality disorder scored significantly higher on the BIS total scale than those without any personality disorder. In detail, they scored higher on each subscale, covering attentional, motor and non-planning impulsiveness. Moderate and large effect sizes were calculated. Impulsivity should therefore be considered a conceptual component of antisociality. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Potential breeding distributions of U.S. birds predicted with both short-term variability and long-term average climate data.

    Science.gov (United States)

    Bateman, Brooke L; Pidgeon, Anna M; Radeloff, Volker C; Flather, Curtis H; VanDerWal, Jeremy; Akçakaya, H Resit; Thogmartin, Wayne E; Albright, Thomas P; Vavrus, Stephen J; Heglund, Patricia J

    2016-12-01

    Climate conditions, such as temperature or precipitation, averaged over several decades strongly affect species distributions, as evidenced by experimental results and a plethora of models demonstrating statistical relations between species occurrences and long-term climate averages. However, long-term averages can conceal climate changes that have occurred in recent decades and may not capture actual species occurrence well because the distributions of species, especially at the edges of their range, are typically dynamic and may respond strongly to short-term climate variability. Our goal here was to test whether bird occurrence can be predicted by covariates based either on short-term climate variability or on long-term climate averages. We parameterized species distribution models (SDMs) based on either short-term variability or long-term average climate covariates for 320 bird species in the conterminous USA and tested whether any life-history trait-based guilds were particularly sensitive to short-term conditions. Models including short-term climate variability performed well based on their cross-validated area-under-the-curve (AUC) score (0.85), as did models based on long-term climate averages (0.84). Similarly, both models performed well compared to independent presence/absence data from the North American Breeding Bird Survey (independent AUC of 0.89 and 0.90, respectively). However, models based on short-term variability covariates more accurately classified true absences for most species (73% of true absences classified within the lowest quarter of environmental suitability vs. 68%). In addition, they have the advantage that they can reveal the dynamic relationship between species and their environment because they capture the spatial fluctuations of species' potential breeding distributions. With this information, we can identify which species and guilds are sensitive to climate variability, identify sites of high conservation value where climate
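
    The evaluation step described above can be sketched as follows: a presence/absence model is fit to climate covariates and scored with a cross-validated AUC. The synthetic covariates and the logistic-regression learner are assumptions; the record does not state which SDM algorithm was used for the 320 species.

        # Fit a presence/absence SDM to climate covariates and report cross-validated AUC.
        # Covariates, generating rule and learner are illustrative assumptions.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(11)
        n_sites = 2000
        climate = rng.normal(size=(n_sites, 4))   # e.g. temperature/precipitation anomalies
        logit = -0.5 + 1.2 * climate[:, 0] - 0.8 * climate[:, 1]
        presence = rng.random(n_sites) < 1.0 / (1.0 + np.exp(-logit))

        sdm = LogisticRegression(max_iter=1000)
        auc = cross_val_score(sdm, climate, presence, cv=5, scoring="roc_auc")
        print("cross-validated AUC:", auc.mean().round(3))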

  3. Reinsurance by short-term reinsurers in South Africa

    Directory of Open Access Journals (Sweden)

    Fernhout, C. L. R.

    2016-02-01

    Full Text Available The short-term reinsurance process usually involves three parties, namely the insurer, the reinsurer and the original policyholder, as the insurer cedes a part of the covered risk of the policyholder to the reinsurer. This research, however, addresses the perceptions of reinsurers regarding their reinsurance activities, where the reinsurer sells reinsurance to other insurance entities (viz. insurers and reinsurers), as well as buys reinsurance from other insurance entities. The crux of short-term reinsurance is therefore mutual loss sharing among the various insurance entities. The objective of this research focuses on the improvement of financial decision-making regarding the reinsurance operations of reinsurers. To achieve this objective a literature study was undertaken to provide adequate background to compile a questionnaire for the empirical survey. The primary study embodies the perceptions of the South African short-term reinsurers regarding the following aspects: the various reasons why reinsurance occurs; the contracts / methods of reinsurance; the bases / forms of reinsurance; and the factors which determine the retention levels of a reinsurer. South Africa is classified as a developing economy, is a member of the BRICS countries and has an emerging market economy. The empirical results should therefore also be valuable to other countries which are classified similarly

  4. Frequency-specific insight into short-term memory capacity.

    Science.gov (United States)

    Feurra, Matteo; Galli, Giulia; Pavone, Enea Francesco; Rossi, Alessandro; Rossi, Simone

    2016-07-01

    The digit span is one of the most widely used memory tests in clinical and experimental neuropsychology for reliably measuring short-term memory capacity. In the forward version, sequences of digits of increasing length have to be reproduced in the order in which they are presented, whereas in the backward version items must be reproduced in the reversed order. Here, we assessed whether transcranial alternating current stimulation (tACS) increases the memory span for digits of young and midlife adults. Imperceptibly weak electrical currents in the alpha (10 Hz), beta (20 Hz), theta (5 Hz), and gamma (40 Hz) range, as well as a sham stimulation, were delivered over the left posterior parietal cortex, a cortical region thought to sustain maintenance processes in short-term memory through oscillatory brain activity in the beta range. We showed a frequency-specific effect of beta-tACS that robustly increased the forward memory span of young, but not middle-aged, healthy individuals. The effect correlated with age: the younger the subjects, the greater the benefit arising from parietal beta stimulation. Our results provide evidence of a short-term memory capacity improvement in young adults by online frequency-specific tACS application. Copyright © 2016 the American Physiological Society.

  5. Assessing Lay Understanding of Common Presentations of Earthquake Hazard Information

    Science.gov (United States)

    Thompson, K. J.; Krantz, D. H.

    2010-12-01

    The Working Group on California Earthquake Probabilities (WGCEP) includes, in its introduction to earthquake rupture forecast maps, the assertion that "In daily living, people are used to making decisions based on probabilities -- from the flip of a coin (50% probability of heads) to weather forecasts (such as a 30% chance of rain) to the annual chance of being killed by lightning (about 0.0003%)." [3] However, psychology research identifies a large gap between lay and expert perception of risk for various hazards [2], and cognitive psychologists have shown in numerous studies [1,4-6] that people neglect, distort, misjudge, or misuse probabilities, even when given strong guidelines about the meaning of numerical or verbally stated probabilities [7]. The gap between lay and expert use of probability needs to be recognized more clearly by scientific organizations such as WGCEP. This study undertakes to determine how the lay public interprets earthquake hazard information, as presented in graphical map form by the Uniform California Earthquake Rupture Forecast (UCERF), compiled by the WGCEP and other bodies including the USGS and CGS. It also explores alternate ways of presenting hazard data, to determine which presentation format most effectively translates information from scientists to public. Participants both from California and from elsewhere in the United States are included, to determine whether familiarity -- either with the experience of an earthquake, or with the geography of the forecast area -- affects people's ability to interpret an earthquake hazards map. We hope that the comparisons between the interpretations by scientific experts and by different groups of laypeople will both enhance theoretical understanding of factors that affect information transmission and assist bodies such as the WGCEP in their laudable attempts to help people prepare themselves and their communities for possible natural hazards. [1] Kahneman, D & Tversky, A (1979). Prospect

  6. On the Inclusion of Short-distance Bystander Effects into a Logistic Tumor Control Probability Model.

    Science.gov (United States)

    Tempel, David G; Brodin, N Patrik; Tomé, Wolfgang A

    2018-01-01

    Currently, interactions between voxels are neglected in the tumor control probability (TCP) models used in biologically-driven intensity-modulated radiotherapy treatment planning. However, experimental data suggests that this may not always be justified when bystander effects are important. We propose a model inspired by the Ising model, a short-range interaction model, to investigate if and when it is important to include voxel to voxel interactions in biologically-driven treatment planning. This Ising-like model for TCP is derived by first showing that the logistic model of tumor control is mathematically equivalent to a non-interacting Ising model. Using this correspondence, the parameters of the logistic model are mapped to the parameters of an Ising-like model and bystander interactions are introduced as a short-range interaction as is the case for the Ising model. As an example, we apply the model to study the effect of bystander interactions in the case of radiation therapy for prostate cancer. The model shows that it is adequate to neglect bystander interactions for dose distributions that completely cover the treatment target and yield TCP estimates that lie in the shoulder of the dose response curve. However, for dose distributions that yield TCP estimates that lie on the steep part of the dose response curve or for inhomogeneous dose distributions having significant hot and/or cold regions, bystander effects may be important. Furthermore, the proposed model highlights a previously unexplored and potentially fruitful connection between the fields of statistical mechanics and tumor control probability/normal tissue complication probability modeling.
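
    The mapping described above can be sketched in a few lines; the notation below is ours, not the paper's, and the interacting form is only schematic.

        % Per-voxel logistic tumor-control probability (dose D_i in voxel i):
        \[
          p_i \;=\; \frac{1}{1 + e^{-(\alpha + \beta D_i)}},
          \qquad
          \mathrm{TCP}_{\text{non-interacting}} \;=\; \prod_i p_i .
        \]
        % A single Ising spin s_i = +1 ("controlled") / -1 ("not controlled")
        % in a local field h_i has the same functional form,
        \[
          P(s_i = +1) \;=\; \frac{e^{\beta_T h_i}}{e^{\beta_T h_i} + e^{-\beta_T h_i}}
                      \;=\; \frac{1}{1 + e^{-2\beta_T h_i}},
        \]
        % so the two coincide when 2\beta_T h_i = \alpha + \beta D_i.
        % Short-range bystander effects then enter as a nearest-neighbour coupling J:
        \[
          H \;=\; -\sum_i h_i s_i \;-\; J \sum_{\langle i,j \rangle} s_i s_j,
          \qquad
          \mathrm{TCP} \;=\; P\bigl(s_i = +1 \ \text{for all } i\bigr).
        \]

    With J = 0 the interacting form reduces to the product of independent logistic voxel probabilities, which is why neglecting bystander effects is adequate on the shoulder of the dose-response curve.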

  7. Short-term memory in Down syndrome: applying the working memory model.

    Science.gov (United States)

    Jarrold, C; Baddeley, A D

    2001-10-01

    This paper is divided into three sections. The first reviews the evidence for a verbal short-term memory deficit in Down syndrome. Existing research suggests that short-term memory for verbal information tends to be impaired in Down syndrome, in contrast to short-term memory for visual and spatial material. In addition, problems of hearing or speech do not appear to be a major cause of difficulties on tests of verbal short-term memory. This suggests that Down syndrome is associated with a specific memory problem, which we link to a potential deficit in the functioning of the 'phonological loop' of Baddeley's (1986) model of working memory. The second section considers the implications of a phonological loop problem. Because a reasonable amount is known about the normal functioning of the phonological loop, and of its role in language acquisition in typical development, we can make firm predictions as to the likely nature of the short-term memory problem in Down syndrome, and its consequences for language learning. However, we note that the existing evidence from studies with individuals with Down syndrome does not fit well with these predictions. This leads to the third section of the paper, in which we consider key questions to be addressed in future research. We suggest that there are two questions to be answered, which follow directly from the contradictory results outlined in the previous section. These are 'What is the precise nature of the verbal short-term memory deficit in Down syndrome', and 'What are the consequences of this deficit for learning'. We discuss ways in which these questions might be addressed in future work.

  8. Response probability and latency: a straight line, an operational definition of meaning and the structure of short term memory

    OpenAIRE

    Tarnow, Dr. Eugen

    2008-01-01

    The functional relationship between response probability and time is investigated in data from Rubin, Hinton and Wenzel (1999) and Anderson (1981). Recall/recognition probabilities and search times are linearly related through stimulus presentation lags from 6 seconds to 600 seconds in the former experiment and for repeated learning of words in the latter. The slope of the response time vs. probability function is related to the meaningfulness of the items used. The Rubin et al data sugges...

  9. Estimation of Extreme Response and Failure Probability of Wind Turbines under Normal Operation using Probability Density Evolution Method

    DEFF Research Database (Denmark)

    Sichani, Mahdi Teimouri; Nielsen, Søren R.K.; Liu, W. F.

    2013-01-01

    Estimation of the extreme response and failure probability of structures subjected to ultimate design loads is essential for the structural design of wind turbines according to the new standard IEC 61400-1. This task is focused on in the present paper by virtue of the probability density evolution method (PDEM......), which underlies the schemes of random vibration analysis and structural reliability assessment. The short-term rare failure probability of 5-megawatt wind turbines, for illustrative purposes, for given mean wind speeds and turbulence levels, is investigated through the scheme of the extreme value...... distribution instead of any other approximate schemes of fitted distribution currently used in statistical extrapolation techniques. Besides, comparative studies against the classical fitted distributions and the standard Monte Carlo techniques are carried out. Numerical results indicate that PDEM exhibits...
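
    The comparison mentioned above (an extreme-value fit versus direct sampling) can be illustrated as follows. The toy Gaussian response process stands in for the turbine's aeroelastic response, the threshold is arbitrary, and PDEM itself is not implemented here.

        # Short-term failure probability P(max response > threshold) estimated
        # (i) from a Gumbel fit to per-period maxima and (ii) by Monte Carlo counting.
        import numpy as np
        from scipy.stats import gumbel_r

        rng = np.random.default_rng(5)
        n_periods, n_steps = 2000, 600          # e.g. 600 one-second samples per 10-min period
        threshold = 3.8                          # normalized response limit (assumed)

        # Per-period extreme response of a toy stationary Gaussian process.
        maxima = rng.normal(size=(n_periods, n_steps)).max(axis=1)

        # (i) Extreme-value fit, then tail probability from the fitted distribution.
        loc, scale = gumbel_r.fit(maxima)
        p_fail_evd = gumbel_r.sf(threshold, loc=loc, scale=scale)

        # (ii) Crude Monte Carlo estimate from the same sample.
        p_fail_mc = (maxima > threshold).mean()

        print(f"Gumbel-fit failure probability:  {p_fail_evd:.2e}")
        print(f"Monte Carlo failure probability: {p_fail_mc:.2e}")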

  10. The ordered network structure and prediction summary for M ≥ 7 earthquakes in Xinjiang region of China

    International Nuclear Information System (INIS)

    Men, Ke-Pei; Zhao, Kai

    2014-01-01

    M ≥ 7 earthquakes have shown an obvious commensurability and orderliness in Xinjiang, China, and its adjacent region since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11-12 a, 41-43 a, 18-19 a, and 5-6 a. Guided by the information forecasting theory of Wen-Bo Weng, based on previous research results and combining ordered network structure analysis with complex network technology, we focus on the prediction summary of M ≥ 7 earthquakes using the ordered network structure, and add new information to further optimize the network, hence constructing the 2D- and 3D-ordered network structures of M ≥ 7 earthquakes. The network structure fully revealed the regularity of M ≥ 7 seismic activity in the study region during the past 210 years. On this basis, the Karakorum M7.1 earthquake in 1996, the M7.9 earthquake on the frontier of Russia, Mongolia, and China in 2003, and the two Yutian M7.3 earthquakes in 2008 and 2014 were predicted successfully. At the same time, a new prediction is presented: the next two M ≥ 7 earthquakes will probably occur around 2019-2020 and 2025-2026 in this region. The results show that large earthquakes occurring in a defined region can be predicted. The method of ordered network structure analysis produces satisfactory results for the mid- and long-term prediction of M ≥ 7 earthquakes.

  11. Effect of zinc supplementation of pregnant rats on short-term and long-term memory of their offspring

    International Nuclear Information System (INIS)

    Ali, M.A.; Ghotbeddin, Z.; Parham, G.H.

    2007-01-01

    To see the dose-dependent effects of zinc chloride on the short-term and long-term memory of rats in a shuttle box. Six pairs of adult Wistar rats were used for this experiment. One group of pregnant rats received a daily oral dose of 20 mg/kg Zn as zinc chloride, and the remaining groups received daily oral doses of 30, 50, 70 and 100 mg/kg zinc chloride for two weeks by gavage. One month after birth, a shuttle box was used to test short-term and long-term memory. Two criteria were considered in the behavioral test: latency in entering the dark chamber and time spent in the dark chamber. This experiment showed that oral administration of ZnCl2 at doses of 20, 30 and 50 mg/kg/day for two weeks during pregnancy can improve the working memory of the offspring (p<0.05), with the 30 mg/kg/day dose being more effective than the other doses (p<0.001). However, rats which received ZnCl2 at 100 mg/kg/day during pregnancy showed significant impairment in the working (short-term) memory of their offspring (p<0.05), and there was no significant difference in reference (long-term) memory for any of the groups. This study demonstrated that zinc chloride consumption at a dose of 30 mg/kg/day for two weeks during pregnancy in rats has a positive effect on the short-term memory of the offspring, whereas consumption of 100 mg/kg/day of zinc by pregnant rats can cause short-term memory impairment. On the other hand, zinc supplementation such as zinc chloride has no effect on long-term memory. (author)

  12. Short-term memory in zebrafish (Danio rerio).

    Science.gov (United States)

    Jia, Jason; Fernandes, Yohaan; Gerlai, Robert

    2014-08-15

    Learning and memory represent perhaps the most complex behavioral phenomena. Although their underlying mechanisms have been extensively analyzed, only a fraction of the potential molecular components have been identified. The zebrafish has been proposed as a screening tool with which mechanisms of complex brain functions may be systematically uncovered. However, as a relative newcomer in behavioral neuroscience, the zebrafish has not been well characterized for its cognitive and mnemonic features, thus learning and/or memory screens with adults have not been feasible. Here we study short-term memory of adult zebrafish. We show animated images of conspecifics (the stimulus) to the experimental subject during 1 min intervals on ten occasions separated by different (2, 4, 8 or 16 min long) inter-stimulus intervals (ISI), a between subject experimental design. We quantify the distance of the subject from the image presentation screen during each stimulus presentation interval, during each of the 1-min post-stimulus intervals immediately following the stimulus presentations and during each of the 1-min intervals furthest away from the last stimulus presentation interval and just before the next interval (pre-stimulus interval), respectively. Our results demonstrate significant retention of short-term memory even in the longest ISI group but suggest no acquisition of reference memory. Because in the employed paradigm both stimulus presentation and behavioral response quantification is computer automated, we argue that high-throughput screening for drugs or mutations that alter short-term memory performance of adult zebrafish is now becoming feasible. Copyright © 2014 Elsevier B.V. All rights reserved.

  13. 34 CFR 664.11 - What is a short-term seminar project?

    Science.gov (United States)

    2010-07-01

    ... A short-term seminar project is (a) designed to help integrate international studies into an institution ... (34 CFR 664.11, Regulations of the Offices of the Department of Education, Office of ...)

  14. Reproductive and Birth Outcomes in Haiti Before and After the 2010 Earthquake.

    Science.gov (United States)

    Harville, Emily W; Do, Mai

    2016-02-01

    We aimed to examine the relationship between exposure to the 2010 Haiti earthquake and pregnancy wantedness, interpregnancy interval, and birth weight. From the nationally representative Haiti 2012 Demographic and Health Survey, information on "size of child at birth" (too small or not) was available for 7280 singleton births in the previous 5 years, whereas information on birth weight was available for 1607 births. Pregnancy wantedness, short interpregnancy interval, and size of child at birth were compared before and after the earthquake and by level of damage. Multiple logistic regression and linear regression analyses were conducted. Post-earthquake births were less likely to be wanted and more likely to be born after a short interpregnancy interval. Earthquake exposure was associated with increased likelihood of a child being born too small: timing of birth (after earthquake vs. before earthquake, adjusted odds ratio [aOR]: 1.27, 95% confidence interval [CI]: 1.12-1.45), region (hardest-hit vs. rest of country; aOR: 1.43, 95% CI: 1.14-1.80), and house damage (aOR: 1.27, 95% CI: 1.02-1.58). Mean birth weight was 150 to 300 g lower in those exposed to the earthquake. Experience with the earthquake was associated with worse reproductive and birth outcomes, which underscores the need to provide reproductive health services as part of relief efforts.
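
    The adjusted odds ratios quoted above come from multiple logistic regression. The sketch below shows how such an adjusted model and its 95% confidence intervals can be fitted; the variable names, covariates and synthetic stand-in data are assumptions for illustration, not the authors' DHS analysis file.

      # Python sketch (illustrative): adjusted logistic regression for "born too small".
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Synthetic stand-in data so the snippet runs end to end; a real analysis
      # would use the DHS birth records instead.
      rng = np.random.default_rng(0)
      n = 2000
      births = pd.DataFrame({
          "post_earthquake": rng.integers(0, 2, n),
          "hardest_hit_region": rng.integers(0, 2, n),
          "house_damaged": rng.integers(0, 2, n),
          "maternal_age": rng.integers(15, 45, n),
      })
      logit_p = -1.5 + 0.24 * births.post_earthquake + 0.36 * births.hardest_hit_region
      births["too_small"] = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

      model = smf.logit("too_small ~ post_earthquake + hardest_hit_region"
                        " + house_damaged + maternal_age", data=births).fit(disp=False)
      aor = np.exp(model.params)                 # adjusted odds ratios (ignore the intercept row)
      ci = np.exp(model.conf_int()).rename(columns={0: "2.5%", 1: "97.5%"})
      print(pd.concat([aor.rename("aOR"), ci], axis=1))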

  15. Multivesicular release underlies short term synaptic potentiation independent of release probability change in the supraoptic nucleus.

    Directory of Open Access Journals (Sweden)

    Michelle E Quinlan

    Magnocellular neurons of the supraoptic nucleus (SON) receive glutamatergic excitatory inputs that regulate the firing activity of, and hormone release from, these neurons. A strong, brief activation of these excitatory inputs induces a lingering barrage of tetrodotoxin-resistant miniature EPSCs (mEPSCs) that lasts for tens of minutes. This is known to be accompanied by an immediate increase in large-amplitude mEPSCs. However, it remains unknown how long this amplitude increase can last and whether it is simply a byproduct of greater release probability. Using in vitro patch-clamp recording on acute rat brain slices, we found that a brief, high-frequency stimulation (HFS) of afferents induced a potentiation of mEPSC amplitude lasting up to 20 min. This amplitude potentiation did not correlate with changes in mEPSC frequency, suggesting that it does not reflect changes in presynaptic release probability. Nonetheless, neither a postsynaptic calcium chelator nor an NMDA receptor antagonist blocked the potentiation. Together with the known calcium dependency of HFS-induced potentiation of mEPSCs, our results imply that the mEPSC amplitude increase requires presynaptic calcium. Further analysis showed a multimodal distribution of mEPSC amplitudes, suggesting that large mEPSCs were due to multivesicular glutamate release, even late after HFS when the frequency is no longer elevated. In conclusion, high-frequency activation of excitatory synapses induces lasting multivesicular release in the SON, which is independent of changes in release probability. This represents a novel form of synaptic plasticity that may contribute to the prolonged excitatory tone necessary for generation of burst firing of magnocellular neurons.
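
    One generic way to substantiate a claim of multimodality in mEPSC amplitudes is to compare Gaussian mixture models with increasing numbers of components by an information criterion. The sketch below uses BIC for that comparison; it is a common approach under stated assumptions, not necessarily the exact analysis used in the study.

      # Python sketch (illustrative): BIC comparison of Gaussian mixtures on mEPSC amplitudes.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      def best_n_modes(amplitudes_pA, max_components=4):
          x = np.asarray(amplitudes_pA, float).reshape(-1, 1)
          bics = [GaussianMixture(n_components=k, random_state=0).fit(x).bic(x)
                  for k in range(1, max_components + 1)]
          return int(np.argmin(bics)) + 1   # more than one favoured component suggests multimodality

      # Synthetic amplitudes (pA): a unimodal sample versus a bimodal one.
      rng = np.random.default_rng(0)
      unimodal = rng.normal(20, 4, 500)
      bimodal = np.concatenate([rng.normal(20, 4, 300), rng.normal(45, 5, 200)])
      print(best_n_modes(unimodal), best_n_modes(bimodal))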

  16. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, is not yet fully understood. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain with this conventional description have been found, for instance on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. A more detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  17. Geomorphological and geological property of short active fault in fore-arc region of Japan

    International Nuclear Information System (INIS)

    Sasaki, Toshinori; Inoue, Daiei; Ueta, Keiichi; Miyakoshi, Katsuyoshi

    2009-01-01

    An important issue in earthquake magnitude evaluation is the classification of short active faults or lineaments, since it is necessary to determine which type of active fault should be included in the evaluation. One particular group comprises surface earthquake faults that are presumed to be branched faults of large interplate earthquakes in subduction zones. We have classified short lineaments in two fore-arc regions of Japan using geological and geomorphological methods based on field surveys and aerial photograph interpretation. The first survey was conducted at the Enmeiji Fault in the Boso Peninsula, which is known to have been displaced by the 1923 Taisho Kanto earthquake. The altitude distributions of marine terrace surfaces differ on the two sides of the fault; in other words, this fault has been displaced repeatedly by large interplate earthquakes in the past. However, the recurrence interval of this fault, calculated from the slip rate and the displacement per event, is far longer than that of the large interplate earthquakes. The second survey was conducted on the western side of the Muroto Peninsula, where several short lineaments are distributed. We found several fault outcrops along a few particular lineaments; the faults in this region have properties similar to the Enmeiji Fault. The other short lineaments, by contrast, were found to be structural landforms. The comparison of the two groups enables us to classify short lineaments based on the geomorphological properties and geological causes of these faults: the displacement per event is far larger than the displacement deduced from the length of the active fault, the recurrence interval of the short active fault is far longer than that of large interplate earthquakes, and the displacement of the short active fault is cumulative. The earthquake magnitude of faults with these characteristics needs to be evaluated in terms of the plate-boundary fault or the long branched seismogenic fault. (author)

  18. Aspects of stochastic models for short-term hydropower scheduling and bidding

    Energy Technology Data Exchange (ETDEWEB)

    Belsnes, Michael Martin [Sintef Energy, Trondheim (Norway); Follestad, Turid [Sintef Energy, Trondheim (Norway); Wolfgang, Ove [Sintef Energy, Trondheim (Norway); Fosso, Olav B. [Dep. of electric power engineering NTNU, Trondheim (Norway)

    2012-07-01

    This report discusses challenges met when turning from deterministic to stochastic decision support models for short-term hydropower scheduling and bidding. The report describes characteristics of the short-term scheduling and bidding problem, different market and bidding strategies, and how a stochastic optimization model can be formulated. A review of approaches for stochastic short-term modelling and stochastic modelling for the input variables inflow and market prices is given. The report discusses methods for approximating the predictive distribution of uncertain variables by scenario trees. Benefits of using a stochastic over a deterministic model are illustrated by a case study, where increased profit is obtained to a varying degree depending on the reservoir filling and price structure. Finally, an approach for assessing the effect of using a size-restricted scenario tree to approximate the predictive distribution for stochastic input variables is described. The report is a summary of the findings of Work package 1 of the research project "Optimal short-term scheduling of wind and hydro resources". The project aims at developing a prototype for an operational stochastic short-term scheduling model. Based on the investigations summarized in the report, it is concluded that using a deterministic equivalent formulation of the stochastic optimization problem is convenient and sufficient for obtaining a working prototype. (author)
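
    A deterministic-equivalent formulation, as the report's conclusion recommends, simply enumerates the scenarios and weights each scenario's recourse decision by its probability inside one linear program. The toy sketch below illustrates that structure for a single reservoir with a shared energy budget and three price scenarios; all numbers, and the two-stage layout itself, are illustrative assumptions far simpler than an operational scheduling model.

      # Python sketch (illustrative): tiny deterministic equivalent of a two-stage problem.
      import numpy as np
      from scipy.optimize import linprog

      p_today = 30.0                               # known price today (EUR/MWh)
      scen_prices = np.array([20.0, 35.0, 60.0])   # price scenarios for tomorrow
      probs = np.array([0.3, 0.4, 0.3])            # scenario probabilities
      budget, gmax = 120.0, 100.0                  # shared energy budget, daily capacity (MWh)

      # Decision vector x = [g_today, g_s1, g_s2, g_s3]; linprog minimizes, so negate profit.
      c = -np.concatenate(([p_today], probs * scen_prices))
      A_ub = np.hstack([np.ones((3, 1)), np.eye(3)])   # g_today + g_s <= budget for each scenario
      b_ub = np.full(3, budget)
      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0.0, gmax)] * 4)
      print("schedule:", res.x, "expected profit:", -res.fun)

    With these illustrative numbers the expected scenario price (38 EUR/MWh) exceeds today's price, so the optimal schedule keeps most of the budget for the recourse stage.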

  19. 22 CFR 62.21 - Short-term scholars.

    Science.gov (United States)

    2010-04-01

    22 CFR 62.21 (Short-term scholars), Department of State, Public Diplomacy and Exchanges, Exchange Visitor Program: ... programs, confer on common problems and projects, and promote professional relationships and communications ...

  20. Scalable data-driven short-term traffic prediction

    NARCIS (Netherlands)

    Friso, K.; Wismans, L. J.J.; Tijink, M. B.

    2017-01-01

    Short-term traffic prediction has a lot of potential for traffic management. However, most research has traditionally focused either on traffic models, which do not scale very well to large networks computationally, or on data-driven methods for freeways, leaving out urban arterials completely. Urban

  1. Short-Term Saved Leave Scheme

    CERN Multimedia

    2007-01-01

    As announced at the meeting of the Standing Concertation Committee (SCC) on 26 June 2007 and in Bulletin No. 28/2007, the existing Saved Leave Scheme will be discontinued as of 31 December 2007. Staff participating in the Scheme will shortly receive a contract amendment stipulating the end of financial contributions compensated by saved leave. Leave already accumulated on saved leave accounts can continue to be taken in accordance with the rules applicable to the current scheme. A new system of saved leave will enter into force on 1 January 2008 and will be the subject of a new implementation procedure entitled "Short-term saved leave scheme" dated 1 January 2008. At its meeting on 4 December 2007, the SCC agreed to recommend the Director-General to approve this procedure, which can be consulted on the HR Department’s website at the following address: https://cern.ch/hr-services/services-Ben/sls_shortterm.asp All staff wishing to participate in the new scheme a...

  2. Short-Term Saved Leave Scheme

    CERN Multimedia

    HR Department

    2007-01-01

    As announced at the meeting of the Standing Concertation Committee (SCC) on 26 June 2007 and in Bulletin No. 28/2007, the existing Saved Leave Scheme will be discontinued as of 31 December 2007. Staff participating in the Scheme will shortly receive a contract amendment stipulating the end of financial contributions compensated by saved leave. Leave already accumulated on saved leave accounts can continue to be taken in accordance with the rules applicable to the current scheme. A new system of saved leave will enter into force on 1 January 2008 and will be the subject of a new implementation procedure entitled "Short-term saved leave scheme" dated 1 January 2008. At its meeting on 4 December 2007, the SCC agreed to recommend the Director-General to approve this procedure, which can be consulted on the HR Department’s website at the following address: https://cern.ch/hr-services/services-Ben/sls_shortterm.asp All staff wishing to participate in the new scheme ...

  3. Maternal haemoglobin and short-term neonatal outcome in preterm neonates.

    Directory of Open Access Journals (Sweden)

    Elodie Savajols

    To determine whether there is a significant association between maternal haemoglobin measured before delivery and short-term neonatal outcome in very preterm neonates, we prospectively included all live births occurring from 25 to 32+6 weeks of gestation in a tertiary care centre between January 1st, 2009 and December 31st, 2011. Outborn infants and infants presenting with lethal malformations were excluded. Three hundred and thirty-nine mothers and 409 infants met the inclusion criteria. For each mother-infant pair, epidemiologic data were recorded prospectively, and the maternal haemoglobin concentration measured within 24 hours before delivery was retrieved retrospectively. Maternal haemoglobin was divided into quartiles, with the second and third quartiles regarded as the reference because they comprised normal haemoglobin values. Short-term outcome was defined as poor in case of death during the hospital stay and/or grade III/IV intraventricular haemorrhage and/or periventricular leukomalacia and/or the necessity of a ventriculoperitoneal shunt. The overall rate of poor short-term neonatal outcome was 11.4% and was significantly associated with low maternal haemoglobin values. This association remained significant after adjustment for antenatal corticosteroid therapy, gestational age, parity, mechanism of preterm birth, mode of delivery and birth weight (aOR = 2.97, 95% CI [1.36-6.47]). There was no relation between short-term neonatal outcome and high maternal haemoglobin concentration values. We show that a low maternal haemoglobin concentration at delivery is an independent risk factor for poor short-term neonatal outcome in very preterm neonates. This study is one of the first to show such an association within the preterm population.
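
    The quartile coding described above (extreme quartiles compared against the pooled middle quartiles) is easy to express in an adjusted logistic model. The sketch below is a minimal illustration; the column names and the covariate list are assumptions mirroring the adjustment variables mentioned in the abstract, not the authors' code.

      # Python sketch (illustrative): quartile coding with pooled middle quartiles as reference.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      def fit_hb_model(df):
          """Fit an adjusted logistic model with the middle haemoglobin quartiles as reference."""
          q = pd.qcut(df["maternal_hb_gdl"], 4, labels=False)               # quartile index 0..3
          df = df.assign(hb_group=np.select([q == 0, q == 3], ["low", "high"], "ref"))
          return smf.logit(
              "poor_outcome ~ C(hb_group, Treatment('ref')) + antenatal_steroids"
              " + gestational_age + parity + birth_weight",
              data=df,
          ).fit(disp=False)

      # Adjusted odds ratios, e.g. for the low-haemoglobin quartile versus normal values:
      # np.exp(fit_hb_model(cohort).params)     # 'cohort' is a hypothetical analysis DataFrame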

  4. Transfer of Information from Short- to Long-Term Memory

    Science.gov (United States)

    Modigliani, Vito; Seamon, John G.

    1974-01-01

    The present study examined current hypotheses concerning information transfer from short-term memory (STM) to long-term memory (LTM) using a Peterson STM task with word triplets presented over retention intervals of 0, 3, 6, 9, and 18 sec. (Editor)

  5. Short-term energy outlook, July 1998

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-07-01

    The Energy Information Administration (EIA) prepares the Short-Term Energy Outlook (energy supply, demand, and price projections) monthly for distribution on the internet at: www.eia.doe.gov/emeu/steo/pub/contents.html. In addition, printed versions of the report are available to subscribers in January, April, July and October. The forecast period for this issue of the Outlook extends from July 1998 through December 1999. Values for the second quarter of 1998, however, are preliminary EIA estimates (for example, some monthly values for petroleum supply and disposition are derived in part from weekly data reported in EIA's Weekly Petroleum Status Report) or are calculated from model simulations that use the latest exogenous information available (for example, electricity sales and generation are simulated by using actual weather data). The historical energy data, compiled in the July 1998 version of the Short-Term Integrated Forecasting System (STIFS) database, are mostly EIA data regularly published in the Monthly Energy Review, Petroleum Supply Monthly, and other EIA publications. Minor discrepancies between the data in these publications and the historical data in this Outlook are due to independent rounding. 28 figs., 19 tabs.

  6. Short- and long-term memory: differential involvement of neurotransmitter systems and signal transduction cascades

    Directory of Open Access Journals (Sweden)

    MÔNICA R.M. VIANNA

    2000-09-01

    Since William James (1890) first distinguished primary from secondary memory, equivalent to short- and long-term memory, respectively, it has been assumed that short-term memory processes are in charge of cognition while long-term memory is being consolidated. Since then, a major question has been whether short-term memory is merely an initial phase of long-term memory or a separate phenomenon. Recent experiments have shown that many treatments with specific molecular actions given into the hippocampus and related brain areas after one-trial avoidance learning can effectively cancel short-term memory without affecting long-term memory formation. This shows that short-term memory and long-term memory involve separate mechanisms and are processed independently. Other treatments, however, influence both memory types similarly, suggesting links between the two at the receptor and at the post-receptor level, which should not be surprising as they both deal with nearly the same sensorimotor representations. This review examines recent advances in short- and long-term memory mechanisms based on the effect of intra-hippocampal infusion of drugs acting upon neurotransmitter and signal transduction systems on both memory types.

  7. Dispositional optimism as predictor of outcome in short- and long-term psychotherapy.

    Science.gov (United States)

    Heinonen, Erkki; Heiskanen, Tiia; Lindfors, Olavi; Härkäpää, Kristiina; Knekt, Paul

    2017-09-01

    Dispositional optimism predicts various beneficial outcomes in somatic health and treatment, but has been little studied in psychotherapy. This study investigated whether an optimistic disposition differentially predicts patients' ability to benefit from short-term versus long-term psychotherapy. A total of 326 adult outpatients with mood and/or anxiety disorder were randomized into short-term (solution-focused or short-term psychodynamic) or long-term psychodynamic therapy and followed up for 3 years. Dispositional optimism was assessed by patients at baseline with the self-rated Life Orientation Test (LOT) questionnaire. Outcome was assessed at baseline and seven times during the follow-up, in terms of depressive (BDI, HDRS), anxiety (SCL-90-ANX, HARS), and general psychiatric symptoms (SCL-90-GSI), all seven follow-up points including patients' self-reports and three including interview-based measures. Lower dispositional optimism predicted faster symptom reduction in short-term than in long-term psychotherapy. Higher optimism predicted equally rapid and eventually greater benefits in long-term, as compared to short-term, psychotherapy. Weaker optimism appeared to predict persistence of problems early in long-term therapy. Stronger optimism seems to best facilitate engaging in and benefiting from a long-term therapy process. Further research might clarify the psychological processes responsible for these effects and help fine-tune both briefer and longer interventions to optimize treatment effectiveness for particular patients and their psychological qualities. Weaker dispositional optimism does not appear to inhibit brief therapy from effecting symptomatic recovery. Patients with weaker optimism do not seem to gain added benefits from long-term therapy, but instead may be susceptible to prolonged psychiatric symptoms in the early stages of long-term therapy. © 2016 The British Psychological Society.

  8. Deficits in verbal long-term memory and learning in children with poor phonological short-term memory skills.

    Science.gov (United States)

    Gathercole, Susan E; Briscoe, Josie; Thorn, Annabel; Tiffany, Claire

    2008-03-01

    Possible links between phonological short-term memory and both longer term memory and learning in 8-year-old children were investigated in this study. Performance on a range of tests of long-term memory and learning was compared for a group of 16 children with poor phonological short-term memory skills and a comparison group of children of the same age with matched nonverbal reasoning abilities but memory scores in the average range. The low-phonological-memory group were impaired on longer term memory and learning tasks that taxed memory for arbitrary verbal material such as names and nonwords. However, the two groups performed at comparable levels on tasks requiring the retention of visuo-spatial information and of meaningful material and at carrying out prospective memory tasks in which the children were asked to carry out actions at a future point in time. The results are consistent with the view that poor short-term memory function impairs the longer-term retention and ease of learning of novel verbal material.

  9. Insensitivity of visual short-term memory to irrelevant visual information

    OpenAIRE

    Andrade, Jackie; Kemps, Eva; Werniers, Yves; May, Jon; Szmalec, Arnaud

    2002-01-01

    Several authors have hypothesised that visuo-spatial working memory is functionally analogous to verbal working memory. Irrelevant background speech impairs verbal short-term memory. We investigated whether irrelevant visual information has an analogous effect on visual short-term memory, using a dynamic visual noise (DVN) technique known to disrupt visual imagery (Quinn & McConnell, 1996a). Experiment 1 replicated the effect of DVN on pegword imagery. Experiments 2 and 3 showed no effect of ...

  10. Distraction in Verbal Short-Term Memory: Insights from Developmental Differences

    OpenAIRE

    Elliott, Emily; Hughes, Robert W.; Briganti, A; Joseph, Tanya Nicolette; Marsh, John Everett; Macken, William J.

    2016-01-01

    The contribution of two mechanisms of auditory distraction in verbal serial short-term memory (interference with the serial rehearsal processes used to support short-term recall, and general attentional diversion) was investigated by exploiting differences in auditory distraction in children and adults. Experiment 1 showed that serial rehearsal plays a role in children's as well as adults' distractibility: Auditory distraction from irrelevant speech was greater for both children and adults as th...

  11. Short-Term and Long-Term Educational Mobility of Families: A Two-Sex Approach.

    Science.gov (United States)

    Song, Xi; Mare, Robert D

    2017-02-01

    We use a multigenerational perspective to investigate how families reproduce and pass their educational advantages to succeeding generations. Unlike traditional mobility studies that have typically focused on one-sex influences from fathers to sons, we rely on a two-sex approach that accounts for interactions between males and females: the process in which males and females mate and have children with those of similar educational statuses and jointly determine the educational status attainment of their offspring. Using data from the Panel Study of Income Dynamics, we approach this issue from both a short-term and a long-term perspective. For the short term, grandparents' educational attainments have a direct association with grandchildren's education as well as an indirect association that is mediated by parents' education and demographic behaviors. For the long term, initial educational advantages of families may benefit as many as three subsequent generations, but such advantages are later offset by the lower fertility of highly educated persons. Yet, all families eventually achieve the same educational distribution of descendants because of intermarriages between families of high- and low-education origin.

  12. Risk-averse decision-making for civil infrastructure exposed to low-probability, high-consequence events

    International Nuclear Information System (INIS)

    Cha, Eun Jeong; Ellingwood, Bruce R.

    2012-01-01

    Quantitative analysis and assessment of risk to civil infrastructure has two components: probability of a potentially damaging event and consequence of damage, measured in terms of financial or human losses. Decision models that have been utilized during the past three decades take into account the probabilistic component rationally, but address decision-maker attitudes toward consequences and risk only to a limited degree. The application of models reflecting these attitudes to decisions involving low-probability, high-consequence events that may impact civil infrastructure requires a fundamental understanding of risk acceptance attitudes and how they affect individual and group choices. In particular, the phenomenon of risk aversion may be a significant factor in decisions for civil infrastructure exposed to low-probability events with severe consequences, such as earthquakes, hurricanes or floods. This paper utilizes cumulative prospect theory to investigate the role and characteristics of risk-aversion in assurance of structural safety.
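
    For readers unfamiliar with cumulative prospect theory, the two building blocks it contributes to such decision models are an S-shaped value function and an inverse-S probability weighting function that overweights small probabilities, which is what makes low-probability, high-consequence hazards loom large. The sketch below uses the standard Tversky-Kahneman (1992) functional forms with their commonly cited parameter estimates, which are not necessarily the values adopted in this paper.

      # Python sketch (illustrative): standard CPT value and probability weighting functions.
      import numpy as np

      def value(x, alpha=0.88, beta=0.88, lam=2.25):
          """S-shaped value: concave for gains, convex and steeper (loss-averse) for losses."""
          x = np.asarray(x, float)
          return np.where(x >= 0, np.abs(x) ** alpha, -lam * np.abs(x) ** beta)

      def weight(p, gamma=0.61):
          """Inverse-S probability weighting (gains parameter); small p are overweighted."""
          p = np.asarray(p, float)
          return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

      # A hazard with a 2% chance of exceedance in 50 years (annual p of roughly 1/2475)
      # receives a decision weight far above its raw probability.
      p = 1.0 / 2475.0
      print(float(weight(p)), float(weight(p) / p))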

  13. A neuromorphic circuit mimicking biological short-term memory.

    Science.gov (United States)

    Barzegarjalali, Saeid; Parker, Alice C

    2016-08-01

    Research shows that remembering things for a few seconds relies on a different mechanism than remembering things for a longer time. Short-term memory is based on persistently firing neurons, whereas storing information for a longer time is based on strengthening synapses or even forming new neural connections. Information about the location and the appearance of an object is segregated and processed by separate neurons. Furthermore, neurons can continue firing using different mechanisms. Here, we have designed a biomimetic neuromorphic circuit that mimics short-term memory through persistently firing neurons, using biological mechanisms to remember the location and shape of an object. Our neuromorphic circuit has a hybrid architecture: neurons are designed in 45 nm CMOS technology and synapses are designed with carbon nanotubes (CNTs).

  14. Short-term depression and transient memory in sensory cortex.

    Science.gov (United States)

    Gillary, Grant; Heydt, Rüdiger von der; Niebur, Ernst

    2017-12-01

    Persistent neuronal activity is usually studied in the context of short-term memory localized in central cortical areas. Recent studies show that early sensory areas can also have persistent representations of stimuli, which emerge quickly (over tens of milliseconds) and decay slowly (over seconds). Traditional positive feedback models cannot explain sensory persistence for at least two reasons: (i) They show attractor dynamics, with transient perturbations resulting in a quasi-permanent change of system state, whereas sensory systems return to the original state after a transient. (ii) As we show, those positive feedback models which decay to baseline lose their persistence when their recurrent connections are subject to short-term depression, a common property of excitatory connections in early sensory areas. Dual time constant network behavior has also been implemented by nonlinear afferents producing a large transient input followed by a much smaller steady-state input. We show that such networks require unphysiologically large onset transients to produce the rise and decay observed in sensory areas. Our study explores how memory and persistence can be implemented in another model class, derivative feedback networks. We show that these networks can operate with two vastly different time courses, changing their state quickly when new information arrives but retaining it for a long time, and that these capabilities are robust to short-term depression. Specifically, derivative feedback networks with short-term depression that acts differentially on positive and negative feedback projections are capable of dynamically changing their time constant, thus allowing fast onset and slow decay of responses without requiring unrealistically large input transients.
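
    The second point above (that a decaying positive-feedback network loses its lengthened time constant once the recurrent synapses depress) can be illustrated with a minimal rate model. The sketch below compares the decay of activity after a transient input with and without depression of the recurrent connection; the parameters and the specific depression equation are illustrative assumptions, not the authors' model.

      # Python sketch (illustrative): positive feedback with vs. without short-term depression.
      import numpy as np

      def simulate(depression, T=4.0, dt=1e-4):
          tau, w = 0.02, 0.98      # membrane time constant (s); recurrent gain just below 1,
                                   # so the effective decay time is tau / (1 - w) = 1 s
          tau_D, U = 0.5, 0.2      # depression recovery time (s), utilization per unit rate
          r, D = 0.0, 1.0
          trace = np.empty(int(T / dt))
          for k in range(trace.size):
              t = k * dt
              I = 1.0 if 0.5 <= t < 1.0 else 0.0          # transient external drive
              drdt = (-r + w * D * r + I) / tau
              dDdt = ((1.0 - D) / tau_D - U * D * r) if depression else 0.0
              r += dt * drdt
              D += dt * dDdt
              trace[k] = r
          return trace

      dt = 1e-4
      off, later = int(1.0 / dt), int(2.0 / dt)           # input offset and 1 s later
      for dep in (False, True):
          tr = simulate(dep, dt=dt)
          print(f"depression={dep}: activity retained after 1 s = {tr[later] / tr[off]:.1%}")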

  15. Short-term energy outlook. Quarterly projections, first quarter 1995

    International Nuclear Information System (INIS)

    1995-02-01

    The Energy Information Administration (EIA) prepares quarterly, short-term energy supply, demand, and price projections for publication in February, May, August, and November in the Short-Term Energy Outlook (Outlook). The forecast period for this issue of the Outlook extends from the first quarter of 1995 through the fourth quarter of 1996. Values for the fourth quarter of 1994, however, are preliminary EIA estimates or are calculated from model simulations using the latest exogenous information available (for example, electricity sales and generation are simulated using actual weather data). The historical energy data, compiled into the first quarter 1995 version of the Short-Term Integrated Forecasting System (STIFS) database, are mostly EIA data regularly published in the Monthly Energy Review, Petroleum Supply Monthly, and other EIA publications. Minor discrepancies between the data in these publications and the historical data in this Outlook are due to independent rounding. The STIFS database is archived quarterly and is available from the National Technical Information Service. The cases are produced using the Short-Term Integrated Forecasting System (STIFS). The STIFS model is driven principally by three sets of assumptions or inputs: estimates of key macroeconomic variables, world oil price assumptions, and assumptions about the severity of weather. Macroeconomic estimates are produced by DRI/McGraw-Hill but are adjusted by EIA to reflect EIA assumptions about the world price of crude oil, energy product prices, and other assumptions which may affect the macroeconomic outlook. The EIA model is available on computer tape from the National Technical Information Service

  16. Intermediate-depth earthquakes facilitated by eclogitization-related stresses

    Science.gov (United States)

    Nakajima, Junichi; Uchida, Naoki; Shiina, Takahiro; Hasegawa, Akira; Hacker, Bradley R.; Kirby, Stephen H.

    2013-01-01

    Eclogitization of the basaltic and gabbroic layer in the oceanic crust involves a volume reduction of 10%–15%. One consequence of the negative volume change is the formation of a paired stress field as a result of strain compatibility across the reaction front. Here we use waveform analysis of a tiny seismic cluster in the lower crust of the downgoing Pacific plate and reveal new evidence in favor of this mechanism: tensional earthquakes lying 1 km above compressional earthquakes, and earthquakes with highly similar waveforms lying on well-defined planes with complementary rupture areas. The tensional stress is interpreted to be caused by the dimensional mismatch between crust transformed to eclogite and underlying untransformed crust, and the earthquakes are probably facilitated by reactivation of fossil faults extant in the subducting plate. These observations provide seismic evidence for the role of volume change–related stresses and, possibly, fluid-related embrittlement as viable processes for nucleating earthquakes in downgoing oceanic lithosphere.

  17. Self-reported immature defense style as a predictor of outcome in short-term and long-term psychotherapy.

    Science.gov (United States)

    Laaksonen, Maarit A; Sirkiä, Carlos; Knekt, Paul; Lindfors, Olavi

    2014-07-01

    Identification of pretreatment patient characteristics predictive of psychotherapy outcome could help to guide treatment choices. This study evaluates patients' initial level of immature defense style as a predictor of the outcome of short-term versus long-term psychotherapy. In the Helsinki Psychotherapy Study, 326 adult outpatients with mood or anxiety disorder were randomized to individual short-term (psychodynamic or solution-focused) or long-term (psychodynamic) psychotherapy. Their defense style was assessed at baseline using the 88-item Defense Style Questionnaire and classified as low or high around the median value of the respective score. Both specific (Beck Depression Inventory [BDI], Hamilton Depression Rating Scale [HDRS], Symptom Check List Anxiety Scale [SCL-90-Anx], Hamilton Anxiety Rating Scale [HARS]) and global (Symptom Check List Global Severity Index [SCL-90-GSI], Global Assessment of Functioning Scale [GAF]) psychiatric symptoms were measured at baseline and 3-7 times during a 3-year follow-up. Patients with high use of immature defense style experienced greater symptom reduction in long-term than in short-term psychotherapy by the end of the 3-year follow-up (50% vs. 34%). Patients with low use of immature defense style experienced faster symptom reduction in short-term than in long-term psychotherapy during the first year of follow-up (34% vs. 19%). Knowledge of patients' initial level of immature defense style may potentially be utilized in tailoring treatments. Further research on defense styles as outcome predictors in psychotherapies of different types is needed.

  18. Measuring Short-term Energy Security

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Ensuring energy security has been at the centre of the IEA mission since its inception, following the oil crises of the early 1970s. While the security of oil supplies remains important, contemporary energy security policies must address all energy sources and cover a comprehensive range of natural, economic and political risks that affect energy sources, infrastructures and services. In response to this challenge, the IEA is currently developing a Model Of Short-term Energy Security (MOSES) to evaluate the energy security risks and resilience capacities of its member countries. The current version of MOSES covers short-term security of supply for primary energy sources and secondary fuels among IEA countries. It also lays the foundation for analysis of vulnerabilities of electricity and end-use energy sectors. MOSES contains a novel approach to analysing energy security, which can be used to identify energy security priorities, as a starting point for national energy security assessments and to track the evolution of a country's energy security profile. By grouping together countries with similar 'energy security profiles', MOSES depicts the energy security landscape of IEA countries. By extending the MOSES methodology to electricity security and energy services in the future, the IEA aims to develop a comprehensive policy-relevant perspective on global energy security. This brochure provides an overview of the analysis and results. Readers interested in an in-depth discussion of methodology are referred to the MOSES Working Paper.

  19. Remote Triggering of the Mw 6.9 Hokkaido Earthquake as a Result of the Mw 6.6 Indonesian Earthquake on September 11, 2008

    Directory of Open Access Journals (Sweden)

    Cheng-Horng Lin

    2012-01-01

    Only recently has the phenomenon of earthquakes being triggered by a distant earthquake become well established, and most of the triggered earthquakes have been limited to small events (M < 3). Moreover, the exact triggering mechanism is still not clear. Here I show how one strong earthquake (Mw = 6.6) is capable of triggering another (Mw = 6.9) at a remote distance (~4750 km). On September 11, 2008, two strong earthquakes with magnitudes (Mw) of 6.6 and 6.9 struck Indonesia and Japan, respectively, within a short interval of ~21 minutes. Careful examination of broadband seismograms recorded in Japan shows that the Hokkaido earthquake occurred just as the surface waves generated by the Indonesia earthquake arrived. Although the peak dynamic stress estimated at the focus of the Hokkaido earthquake only just reached the lower bound generally considered capable of triggering earthquakes, a more plausible triggering mechanism might be a change in fault properties caused by fluid infiltration. These observations suggest that the Hokkaido earthquake was likely triggered from a remote distance by the surface waves generated by the Indonesia earthquake. If more such cases are observed, temporary warnings of possible interactions between strong earthquakes might be considered in the future.
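
    The peak dynamic stress mentioned above is commonly estimated from the peak ground velocity of the passing surface wave using the approximation sigma ~ G * v / c (shear modulus times particle velocity divided by phase velocity). The numbers below are generic illustrative values, not those of the study.

      # Python sketch (illustrative): order-of-magnitude peak dynamic stress of a surface wave.
      G = 30e9       # crustal shear modulus, Pa (~30 GPa, generic value)
      c = 3.5e3      # surface-wave phase velocity, m/s (generic value)
      for pgv_cm_s in (0.01, 0.1, 1.0):            # peak ground velocity, cm/s
          sigma_kpa = G * (pgv_cm_s / 100.0) / c / 1e3
          print(f"PGV {pgv_cm_s:5.2f} cm/s  ->  ~{sigma_kpa:6.2f} kPa dynamic stress")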

  20. Short-term feeding strategies and pork quality

    NARCIS (Netherlands)

    Geesink, G.H.; Buren, van R.G.C.; Savenije, B.; Verstegen, M.W.A.; Ducro, B.J.; Palen, van der J.G.P.; Hemke, G.

    2004-01-01

    Two experiments were done to determine whether short-term supplementation (5 days pre-slaughter) with magnesium acetate, or a combination of magnesium acetate, tryptophan, vitamin E and vitamin C would improve pork quality. In the first experiment the pigs (Pietrain x Yorkshire, n = 96) were fed a