WorldWideScience

Sample records for abstract global earthquake

  1. Global Earthquake Hazard Distribution - Peak Ground Acceleration

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-Peak Ground Acceleration is a 2.5 by 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  2. Global Earthquake Hazard Distribution - Peak Ground Acceleration

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-peak ground acceleration is a 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  3. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
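
    The projection above rests on a nonstationary Poisson model whose rate is proportional to world population. The sketch below illustrates that idea only; the piecewise-linear population curve, the calibration to the 4 observed 20th-century events, and all numerical values are placeholder assumptions, not the authors' data or code.

```python
import numpy as np

rng = np.random.default_rng(42)

# Placeholder world-population curve (billions), 1900-2100.
years = np.arange(1900, 2101)
population = np.interp(years, [1900, 2000, 2100], [1.6, 6.1, 10.1])

# Nonstationary Poisson model: annual rate of catastrophic (>100,000-fatality) earthquakes
# proportional to population. Calibrate k so the expected 20th-century count matches the
# 4 events observed.
mask_20 = (years >= 1900) & (years <= 1999)
mask_21 = (years >= 2000) & (years <= 2099)
k = 4.0 / population[mask_20].sum()                 # events per year per billion people

expected_21 = k * population[mask_21].sum()
print(f"Expected 21st-century catastrophic events: {expected_21:.1f}")

# Monte Carlo check: independent yearly Poisson draws with the time-varying rate.
sims = rng.poisson(lam=k * population[mask_21], size=(10000, mask_21.sum())).sum(axis=1)
print(f"Simulated mean ± std: {sims.mean():.1f} ± {sims.std():.1f}")
```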

  4. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 by 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  5. Global Earthquake Total Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Total Economic Loss Risk Deciles is a 2.5 minute grid of global earthquake total economic loss risks. A process of spatially allocating Gross...

  6. Global Earthquake Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Mortality Risks and Distribution is a 2.5 minute grid of global earthquake mortality risks. Gridded Population of the World, Version 3 (GPWv3) data...

  7. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  8. Global Earthquake Proportional Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Proportional Economic Loss Risk Deciles is a 2.5 minute grid of earthquake hazard economic loss as proportions of Gross Domestic Product (GDP) per...

  9. Global Earthquake Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Mortality Risks and Distribution is a 2.5 minute grid of global earthquake mortality risks. Gridded Population of the World, Version 3 (GPWv3) data...

  10. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Co-operation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scales. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  11. Trends in global earthquake loss

    Science.gov (United States)

    Arnst, Isabel; Wenzel, Friedemann; Daniell, James

    2016-04-01

    Based on the CATDAT damage and loss database we analyse global trends of earthquake losses (in current values) and fatalities for the period between 1900 and 2015 from a statistical perspective. For this time period the data are complete for magnitudes above 6. First, we study the basic statistics of losses and find that losses below 10 billion US$ approximately satisfy a power law with an exponent of 1.7 for the cumulative distribution. Higher loss values are modelled with the Generalized Pareto Distribution (GPD). The 'transition' between power law and GPD is determined with the Mean Excess Function. We split the data set into pre-1955 and post-1955 loss data, as the exposure in those periods is significantly different due to population growth. The Annual Average Loss (AAL) for direct damage from events below 10 billion US$ differs by a factor of 6, whereas incorporating the extreme loss events increases the AAL from 25 billion US$/yr to 30 billion US$/yr. Annual Average Deaths (AAD) show little (30%) difference for events below 6,000 fatalities, and AAD values of 19,000 and 26,000 deaths per year if extreme values are incorporated. With data on the global Gross Domestic Product (GDP) that reflects the annual expenditures (consumption, investment, government spending) and on capital stock, we relate losses to the economic capacity of societies and find that GDP (in real terms) grows much faster than losses, so that the latter play a decreasing role given the growing prosperity of mankind. This reasoning does not necessarily apply on a regional scale. Main conclusions of the analysis are that (a) a correct projection of historic loss values to present-day US$ values is critical; (b) extreme value analysis is mandatory; (c) growing exposure is reflected in the AAL and AAD results for the pre- and post-1955 periods; (d) scaling loss values with global GDP data indicates that the relative size of losses, from a global perspective, decreases rapidly over time.
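
    As a rough illustration of the tail-fitting approach described above (power-law body, Generalized Pareto tail, transition chosen from the Mean Excess Function), the sketch below works on synthetic loss data; the distributional parameters, the threshold of 10 and the sample size are assumptions, not values from CATDAT.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Synthetic heavy-tailed "losses" standing in for CATDAT entries (Lomax/Pareto-type body).
losses = rng.pareto(a=1.7, size=5000)

def mean_excess(data, thresholds):
    """Mean Excess Function: average exceedance above each candidate threshold u."""
    return np.array([(data[data > u] - u).mean() for u in thresholds])

# Inspect the mean excess values to pick the body/tail transition (assumed here to be 10).
u_grid = np.quantile(losses, np.linspace(0.80, 0.99, 10))
print(np.round(mean_excess(losses, u_grid), 2))

threshold = 10.0
tail = losses[losses > threshold]

# Fit a Generalized Pareto Distribution to the exceedances above the chosen threshold.
shape, loc, scale = genpareto.fit(tail - threshold, floc=0.0)
print(f"GPD shape={shape:.2f}, scale={scale:.2f}, exceedances={tail.size}")
```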

  12. Global Significant Earthquake Database, 2150 BC to present

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  13. Do weak global stresses synchronize earthquakes?

    Science.gov (United States)

    Bendick, R.; Bilham, R.

    2017-08-01

    Insofar as slip in an earthquake is related to the strain accumulated near a fault since a previous earthquake, and this process repeats many times, the earthquake cycle approximates an autonomous oscillator. Its asymmetric slow accumulation of strain and rapid release is quite unlike the harmonic motion of a pendulum and need not be time predictable, but still resembles a class of repeating systems known as integrate-and-fire oscillators, whose behavior has been shown to demonstrate a remarkable ability to synchronize to either external or self-organized forcing. Given sufficient time and even very weak physical coupling, the phases of sets of such oscillators, with similar though not necessarily identical period, approach each other. Topological and time series analyses presented here demonstrate that earthquakes worldwide show evidence of such synchronization. Though numerous studies demonstrate that the composite temporal distribution of major earthquakes in the instrumental record is indistinguishable from random, the additional consideration of event renewal interval serves to identify earthquake groupings suggestive of synchronization that are absent in synthetic catalogs. We envisage the weak forces responsible for clustering originate from lithospheric strain induced by seismicity itself, by finite strains over teleseismic distances, or by other sources of lithospheric loading such as Earth's variable rotation. For example, quasi-periodic maxima in rotational deceleration are accompanied by increased global seismicity at multidecadal intervals.
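
    The integrate-and-fire analogy above can be made concrete with a toy simulation: oscillators accumulate "strain" linearly, reset at a threshold, and give every other oscillator a weak nudge when they fire. The pulse-coupled model below is a generic Mirollo-Strogatz-style sketch, not the authors' analysis; the coupling strength, rates and run length are arbitrary assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy pulse-coupled integrate-and-fire oscillators: slow linear accumulation, instantaneous
# reset at a threshold of 1, and a weak all-to-all advance whenever any oscillator fires.
n, eps, dt, steps = 20, 0.01, 1e-3, 60000
rate = 1.0 + 0.02 * rng.standard_normal(n)      # similar but not identical natural periods
state = rng.random(n)                            # random initial phases
last_fire = np.zeros(n)

for step in range(steps):
    state += rate * dt
    fired = state >= 1.0
    if fired.any():
        state[fired] = 0.0
        state[~fired] += eps * fired.sum()       # weak coupling pulse from each firing
        last_fire[fired] = step * dt

# Crude synchronization diagnostic: how tightly the most recent firings cluster in time.
print(f"Spread of latest firing times: {last_fire.max() - last_fire.min():.3f} (cycle length ~1)")
```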

  14. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
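
    The Kalman-filter combination of low-cost GNSS displacements with smartphone-grade accelerations can be sketched in one dimension as below. This is an illustrative filter, not the authors' implementation: the sampling rates, noise levels (acc_sigma, gnss_sigma) and the synthetic signal are assumptions.

```python
import numpy as np

def kalman_displacement(acc, gnss, dt, acc_sigma=0.05, gnss_sigma=0.5):
    """Blend high-rate accelerometer samples (prediction) with sparse, noisy GNSS
    displacements (update) in a 1-D [displacement, velocity] Kalman filter."""
    x = np.zeros(2)
    P = np.eye(2)
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])              # how acceleration enters the state
    H = np.array([[1.0, 0.0]])                   # GNSS observes displacement only
    Q = acc_sigma**2 * np.outer(B, B)            # process noise from accelerometer error
    R = np.array([[gnss_sigma**2]])
    out = np.empty(len(acc))
    for k, a in enumerate(acc):
        x = F @ x + B * a                        # predict using acceleration as control input
        P = F @ P @ F.T + Q
        if not np.isnan(gnss[k]):                # update whenever a GNSS sample exists
            y = gnss[k] - H @ x
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            x = x + (K @ y).ravel()
            P = (np.eye(2) - K @ H) @ P
        out[k] = x[0]
    return out

# Synthetic test: 100 Hz accelerometer, 1 Hz GNSS, 0.3 m harmonic ground displacement.
rng = np.random.default_rng(4)
dt = 0.01
t = np.arange(0.0, 20.0, dt)
true_disp = 0.3 * np.sin(0.5 * t)
acc = -0.3 * 0.25 * np.sin(0.5 * t) + 0.05 * rng.standard_normal(t.size)
gnss = np.full(t.size, np.nan)
gnss[::100] = true_disp[::100] + 0.5 * rng.standard_normal(t.size // 100)
est = kalman_displacement(acc, gnss, dt)
print(f"RMS error of fused displacement: {np.sqrt(np.mean((est - true_disp)**2)):.3f} m")
```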

  15. Octree-based Global Earthquake Simulations

    Science.gov (United States)

    Ramirez-Guzman, L.; Juarez, A.; Bielak, J.; Salazar Monroy, E. F.

    2017-12-01

    Seismological research has motivated recent efforts to construct more accurate three-dimensional (3D) velocity models of the Earth, perform global simulations of wave propagation to validate models, and study the interaction of seismic fields with 3D structures. However, traditional methods for seismogram computation at global scales are limited by computational resources, relying primarily on normal mode summation or two-dimensional numerical methods. We present an octree-based finite element mesh implementation to perform global earthquake simulations with 3D models, using topography and bathymetry with a staircase approximation, as modeled by the Carnegie Mellon Finite Element Toolchain Hercules (Tu et al., 2006). To verify the implementation, we compared synthetic seismograms computed in a spherical earth against waveforms calculated using normal mode summation for the Preliminary Reference Earth Model (PREM) for a point source representation of the 2014 Mw 7.3 Papanoa, Mexico earthquake. We considered a 3 km-thick ocean layer for stations with predominantly oceanic paths. Eigenfrequencies and eigenfunctions were computed for toroidal, radial, and spheroidal oscillations in the first 20 branches. Simulations are valid at frequencies up to 0.05 Hz. The match between the waveforms computed by the two approaches, especially for long-period surface waves, is excellent. Additionally, we modeled the Mw 9.0 Tohoku-Oki earthquake using the USGS finite fault inversion. Topography and bathymetry from ETOPO1 are included in a mesh with more than 3 billion elements, constrained by the available computational resources. We compared estimated velocity and GPS synthetics against observations at regional and teleseismic stations of the Global Seismographic Network and discuss the differences between observations and synthetics, revealing that heterogeneity, particularly in the crust, needs to be considered.

  16. Characteristics of global strong earthquakes and their implications ...

    Indian Academy of Sciences (India)

    Based on the Global/Harvard centroid moment tensor (CMT) catalogue, the characteristics of global strong earthquakes and the present-day stress pattern were analyzed. The majority of global strong earthquakes are located around plate boundaries, are shallow-focus, and occur in a thrust faulting (TF) regime.

  17. Global risk of big earthquakes has not recently increased.

    Science.gov (United States)

    Shearer, Peter M; Stark, Philip B

    2012-01-17

    The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences--if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past.
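
    One simple test in the spirit of the statistical argument above (though not necessarily one of the three tests the authors used) is to ask whether interevent times in a declustered catalog are exponentially distributed, as a homogeneous Poisson process requires. The catalog below is synthetic; times, sample size and threshold are placeholders.

```python
import numpy as np
from scipy.stats import kstest

rng = np.random.default_rng(3)

# Synthetic stand-in for a declustered M>=7 catalog: event times in decimal years.
event_times = np.sort(rng.uniform(1900.0, 2012.0, size=900))

gaps = np.diff(event_times)
stat, p_value = kstest(gaps, 'expon', args=(0.0, gaps.mean()))
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
# A large p-value means the gaps are indistinguishable from exponential at this sample size.
# (Strictly, estimating the scale from the same data makes the nominal p-value optimistic;
#  a parametric bootstrap would correct for that.)
```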

  18. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdahl, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product for use in seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require earthquake catalogues that are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred after 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year programme that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.
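
    Applying the time-variable cut-off magnitudes quoted above to a working catalog reduces to a simple vectorized filter; the sketch below assumes a catalog given as year and Ms arrays (placeholder values here).

```python
import numpy as np

def passes_cutoff(year, ms):
    """Time-variable ISC-GEM selection: Ms>=7.5 before 1918, >=6.25 for 1918-1963, >=5.5 after."""
    cutoff = np.where(year < 1918, 7.5, np.where(year < 1964, 6.25, 5.5))
    return ms >= cutoff

years = np.array([1910, 1930, 1950, 1970, 1990])
ms = np.array([7.6, 6.3, 6.0, 5.6, 5.4])
print(passes_cutoff(years, ms))   # [ True  True False  True False]
```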

  19. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.

  20. Characteristics of global strong earthquakes and their implications ...

    Indian Academy of Sciences (India)

    Ju Wei

    2017-10-06

    Oct 6, 2017 ... compiled in the Global/Harvard centroid moment tensor (CMT) catalogue, the characteristics of global strong earthquakes and the present-day stress pattern were analyzed based on these ...... the WSM standard were calculated for individual mechanism (figure 2). Generally, the most common stress regime ...

  1. Global Review of Induced and Triggered Earthquakes

    Science.gov (United States)

    Foulger, G. R.; Wilson, M.; Gluyas, J.; Julian, B. R.; Davies, R. J.

    2016-12-01

    Natural processes associated with very small incremental stress changes can modulate the spatial and temporal occurrence of earthquakes. These processes include tectonic stress changes, the migration of fluids in the crust, Earth tides, surface ice and snow loading, heavy rain, atmospheric pressure, sediment unloading and groundwater loss. It is thus unsurprising that large anthropogenic projects which may induce stress changes of a similar size also modulate seismicity. As human development accelerates and industrial projects become larger in scale and more numerous, the number of such cases is increasing. That mining and water-reservoir impoundment can induce earthquakes has been accepted for several decades. Now, concern is growing about earthquakes induced by activities such as hydraulic fracturing for shale-gas extraction and waste-water disposal via injection into boreholes. As hydrocarbon reservoirs enter their tertiary phases of production, seismicity may also increase there. The full extent of human activities thought to induce earthquakes is, however, much wider than generally appreciated. We have assembled as near complete a catalog as possible of cases of earthquakes postulated to have been induced by human activity. Our database contains a total of 705 cases and is probably the largest compilation made to date. We include all cases where reasonable arguments have been made for anthropogenic induction, even where these have been challenged in later publications. Our database presents the results of our search but leaves judgment about the merits of individual cases to the user. We divide anthropogenic earthquake-induction processes into: a) Surface operations, b) Extraction of mass from the subsurface, c) Introduction of mass into the subsurface, and d) Explosions. Each of these categories is divided into sub-categories. In some cases, categorization of a particular case is tentative because more than one anthropogenic activity may have preceded or been

  2. Global building inventory for earthquake loss estimation and risk management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  3. Rapid estimation of the economic consequences of global earthquakes

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is

  4. Abstracts

    International Nuclear Information System (INIS)

    1989-09-01

    The proceedings contain 106 papers of which 2 fall under the INIS Scope. One concerns seismic risk assessment at radioactive waste repositories in the U.S., the other concerns the possibility of predicting earthquakes from changes in radon 222 levels in selected ground water springs of northern Italy. (M.D.)

  5. Neutrons for global energy solutions. Book of abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2010-07-01

    The book of abstracts of the conference on neutrons for global energy solutions includes contributions on the following topics: Views from politics: What do we need in European energy research: cooperation, large facilities, more science? Fundamental research for energy supply. View from the United States. View from industry: Neutrons for nuclear reactor development in the transition stage between generation III and generation IV. Toyota's expectations for neutron analysis. Instrumentation and cross-cutting issues. Energy sources. Waste management and environment. Li-ion batteries. Photovoltaics. Savings and catalysis. Fuel cells. Hydrogen storage.

  6. Characteristics of global strong earthquakes and their implications ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 126; Issue 7. Characteristics of global strong earthquakes and their implications for ... We grouped 518 of them into 12 regions (Boxes) based on their geographical proximity and tectonic setting. For each box, the present-day stress field and regime were obtained ...

  7. How complete is the ISC-GEM Global Earthquake Catalog?

    Science.gov (United States)

    Michael, Andrew J.

    2014-01-01

    The International Seismological Centre, in collaboration with the Global Earthquake Model effort, has released a new global earthquake catalog, covering the time period from 1900 through the end of 2009. In order to use this catalog for global earthquake studies, I determined the magnitude of completeness (Mc) as a function of time by dividing the earthquakes shallower than 60 km into 7 time periods based on major changes in catalog processing and data availability and applying 4 objective methods to determine Mc, with uncertainties determined by non-parametric bootstrapping. Deeper events were divided into 2 time periods. Due to differences between the 4 methods, the final Mc was determined subjectively by examining the features that each method focused on in both the cumulative and binned magnitude frequency distributions. The time periods and Mc values for shallow events are: 1900-1917, Mc=7.7; 1918-1939, Mc=7.0; 1940-1954, Mc=6.8; 1955-1963, Mc=6.5; 1964-1975, Mc=6.0; 1976-2003, Mc=5.8; and 2004-2009, Mc=5.7. Using these Mc values for the longest time periods they are valid for (e.g. 1918-2009, 1940-2009, …), the shallow data fit a Gutenberg-Richter distribution with b=1.05 and a=8.3, within 1 standard deviation, with no declustering. The exception is for time periods that include 1900-1917, in which there are only 33 events with M≥Mc, and for those few data b=2.15±0.46. That result calls for further investigation of this time period, ideally with a larger number of earthquakes. For deep events, the results are Mc=7.1 for 1900-1963, although the early data are problematic, and Mc=5.7 for 1964-2009. For the latter time period, b=0.99 and a=7.3.
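
    The Gutenberg-Richter fit and bootstrap uncertainties described above can be reproduced in outline with the maximum-likelihood b-value estimator of Aki (1965). The catalog below is synthetic and the binning handling is simplified, so the code is a sketch of the technique rather than the paper's procedure.

```python
import numpy as np

def b_value(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value; dm is the catalog magnitude binning
    (0.1 for typical catalogs, 0 here because the synthetic magnitudes are continuous)."""
    m = mags[mags >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

def b_bootstrap(mags, mc, n=1000, seed=0):
    """Non-parametric bootstrap standard deviation of the b-value."""
    rng = np.random.default_rng(seed)
    m = mags[mags >= mc]
    draws = [b_value(rng.choice(m, size=m.size, replace=True), mc) for _ in range(n)]
    return b_value(mags, mc), float(np.std(draws))

# Synthetic Gutenberg-Richter catalog with true b = 1.0 above Mc = 5.7.
rng = np.random.default_rng(7)
mags = 5.7 + rng.exponential(scale=1.0 / np.log(10), size=5000)
b, b_err = b_bootstrap(mags, mc=5.7)
print(f"b = {b:.2f} ± {b_err:.2f}")
```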

  8. ABSTRACT

    African Journals Online (AJOL)

    University Health Services, Ahmadu Bello University, Zaria, Nigeria. ABSTRACT. Physico-chemical methods were used to analyse the commonly used lcualt samples bought from Zaria and Kano local markets. Blood-lead concentrations in ltuali ...

  9. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-01-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_{λ } ≥ 7.0 and M_{λ } ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_{σ } ≥ 5.1 . For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β , of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_{λ } ≥ 7.0 and found β = 0.83 ± 0.08 . We considered 15 earthquakes with M_{λ } ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_{λ } ≥ 7.0 earthquakes and found β = 0.99 ± 0.01 . The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_{λ } ≥ 7.0 in each catalog
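
    The natural-time workflow described above (count small events between successive large ones, then fit a Weibull exponent β) can be sketched as below. The catalog is synthetic and the counting is simplified to index gaps between large events, so the printed β is only illustrative; it is not expected to reproduce the 0.83 ± 0.08 reported for the CMT catalog.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(11)

# Synthetic catalog: magnitudes complete above M 5.1 with Gutenberg-Richter b = 1.
n = 20000
mags = 5.1 + rng.exponential(scale=1.0 / np.log(10), size=n)

# Natural time: use index gaps between successive M>=7.0 events as a proxy for the number
# of small events occurring between them.
large_idx = np.flatnonzero(mags >= 7.0)
counts = np.diff(large_idx)

# Weibull shape exponent beta; beta = 1 corresponds to an exponential (random) distribution.
beta, _, scale = weibull_min.fit(counts, floc=0.0)
print(f"{large_idx.size} large events, Weibull beta = {beta:.2f}")
```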

  10. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_{λ } ≥ 7.0 and M_{λ } ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_{σ } ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_{λ } ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_{λ } ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_{λ } ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_{λ } ≥ 7.0 in each catalog and

  11. Global assessment of human losses due to earthquakes

    Science.gov (United States)

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.
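
    The chain hazard curve → vulnerability function → average annual loss can be shown with toy numbers, as below. The exceedance curve, the fatality vulnerability function and the exposed population are invented placeholders, not OpenQuake results or values from the study.

```python
import numpy as np

# Toy average-annual-loss calculation: combine an annual hazard (probability of exceedance)
# curve with a fatality vulnerability function and an exposed population.
iml = np.linspace(0.05, 1.5, 30)                           # intensity measure levels (e.g. PGA, g)
poe = np.exp(-6.0 * iml)                                   # hypothetical annual prob. of exceedance
fatality_ratio = 1.0 / (1.0 + np.exp(-8.0 * (iml - 0.8)))  # hypothetical vulnerability curve
population = 5.0e6

# Turn the exceedance curve into per-bin occurrence probabilities, then take the expectation.
occurrence = -np.diff(poe, append=0.0)
aal_fatalities = population * np.sum(occurrence * fatality_ratio)
print(f"Average annual fatalities (toy numbers): {aal_fatalities:.0f}")
```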

  12. Abstract

    African Journals Online (AJOL)

    Francis

    Abstract. Aqueous, methanol and chloroform extracts from the leaves of Ficus religiosa, Thespesia populnea and Hibiscus tiliaceus were completely screened for antibacterial and antifungal activity. The chloroform extract of F. religiosa possessed a broad spectrum of antibacterial activity with a zone of inhibition of 10 to 21 ...

  13. Abstract,

    African Journals Online (AJOL)

    Abstract. A study was carried out to investigate the effect of oversowing legumes on rangeland performance in Shinyanga region, Tanzania. Four leguminous species, namely Centrosema pubescens, Clitoria ternatea, Macroptilium atropurpureum and Stylosanthes hamata, were oversown in a natural rangeland in a

  14. Abstract

    Indian Academy of Sciences (India)

    Abstract. For well over three hundred years, the monsoon has been considered to be a gigantic land-sea breeze driven by the land-ocean contrast in surface temperature. In this paper, this hypothesis ... primary driver of the monsoon in many papers and most textbooks (e.g. Lau and Li, 1984, Webster 1987a, Meehl 1994, ...

  15. ABSTRACT

    African Journals Online (AJOL)

    Email: jameskigera@yahoo.co.uk. ABSTRACT. Background: Implant orthopaedic surgery is associated with a risk of post-operative Surgical Site Infection (SSI). This can have devastating consequences in the case of arthroplasty. Due to the less than ideal circumstances under which surgery is conducted in Africa, there are ...

  16. Abstract

    African Journals Online (AJOL)

    WORKERS ON THEIR JOB PERFORMANCE IN IMO STATE, NIGERIA. NGOZI OKEREKE AND N.O. ONU. ABSTRACT. The study focused on the effect of socioeconomic characteristics of field extension workers on their job performance in the Imo State agricultural development programme, Nigeria. Data was collected with the ...

  17. Abstract

    African Journals Online (AJOL)

    PROF. OLIVER OSUAGWA

    Abstract. Many mathematical models of stochastic dynamical systems were based on the assumption that the drift and volatility coefficients were linear functions of the solution. In this work, we arrive at the drift and the volatility by observing the dynamics of change in the selected stocks in a sufficiently small interval ∆t.

  18. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    Directory of Open Access Journals (Sweden)

    W. F. Peng

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions of deviation from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugate region. However, the probability of appearance and the extent of the anomalies in the magnetically conjugate region are smaller than for the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.
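
    A common way to flag TEC anomalies of the kind discussed above, used here purely as an assumption about methodology, is to compare each value with quartile-based bounds computed from a sliding window of preceding days; the window length, the 1.5×IQR bound and the synthetic series below are placeholders.

```python
import numpy as np

def tec_anomalies(tec, window=15, k=1.5):
    """Flag values outside median ± k*IQR of the preceding `window` samples."""
    flags = np.zeros(tec.size, dtype=int)
    for i in range(window, tec.size):
        past = tec[i - window:i]
        q1, med, q3 = np.percentile(past, [25, 50, 75])
        upper, lower = med + k * (q3 - q1), med - k * (q3 - q1)
        if tec[i] > upper:
            flags[i] = 1        # positive anomaly
        elif tec[i] < lower:
            flags[i] = -1       # negative anomaly
    return flags

rng = np.random.default_rng(5)
tec = 20 + 2 * rng.standard_normal(60)   # synthetic daily TEC values (TECU)
tec[45] += 12                             # inject a positive anomaly
print(np.flatnonzero(tec_anomalies(tec)))
```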

  19. Intelligent earthquake data processing for global adjoint tomography

    Science.gov (United States)

    Chen, Y.; Hill, J.; Li, T.; Lei, W.; Ruan, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Due to the increased computational capability afforded by modern and future computing architectures, the seismology community is demanding a more comprehensive understanding of the full waveform information from the recorded earthquake seismograms. Global waveform tomography is a complex workflow that matches observed seismic data with synthesized seismograms by iteratively updating the earth model parameters based on the adjoint state method. This methodology allows us to compute a very accurate model of the Earth's interior. The synthetic data are simulated by solving the wave equation in the entire globe using a spectral-element method. In order to ensure the inversion accuracy and stability, both the synthesized and observed seismograms must be carefully pre-processed. Because the scale of the inversion problem is extremely large and there is a very large volume of data to both read and write, an efficient and reliable pre-processing workflow must be developed. We are investigating intelligent algorithms based on a machine-learning (ML) framework that will automatically tune parameters for the data processing chain. One straightforward application of ML in data processing is to classify all possible misfit calculation windows into usable and unusable ones, based on intelligent ML models such as neural networks, support vector machines or principal component analysis. The intelligent earthquake data processing framework will enable the seismology community to compute global waveform tomography using seismic data from an arbitrarily large number of earthquake events in the fastest, most efficient way.
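
    The window-classification idea can be sketched with a generic classifier on a handful of summary features; the features (cross-correlation, time shift, amplitude ratio, signal-to-noise ratio), the labels and the logistic-regression choice are all hypothetical stand-ins, not the framework described above.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 500
# Each row summarizes one candidate misfit window with hypothetical features.
X = np.column_stack([
    rng.uniform(0, 1, n),        # waveform cross-correlation coefficient
    rng.uniform(0, 10, n),       # |time shift| in seconds
    rng.uniform(0, 3, n),        # amplitude ratio (observed/synthetic)
    rng.uniform(0, 30, n),       # signal-to-noise ratio
])
# Synthetic labels: windows with high correlation, small shift and decent SNR are "usable".
y = ((X[:, 0] > 0.7) & (X[:, 1] < 5.0) & (X[:, 3] > 5.0)).astype(int)

clf = LogisticRegression(max_iter=1000).fit(X[:400], y[:400])
print(f"Held-out accuracy: {clf.score(X[400:], y[400:]):.2f}")
```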

  20. Detection of change points in underlying earthquake rates, with application to global mega-earthquakes

    Science.gov (United States)

    Touati, Sarah; Naylor, Mark; Main, Ian

    2016-02-01

    The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global 'background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and 'runs' tests, and a variety of Z-score tests. The results demonstrate that the effectiveness of these tests varies widely. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved to be relatively ineffective in reliably detecting a change point. We then apply the tested methods to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large
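
    The Information Criteria route mentioned above can be illustrated by comparing a single-rate Poisson model of annual counts with a two-rate model split at a fixed candidate year via BIC. The synthetic catalog, the candidate split and the use of BIC rather than the Bayes factor are assumptions made for this sketch.

```python
import numpy as np

def poisson_loglik(counts, rate):
    """Poisson log-likelihood up to the constant log(k!) terms, which cancel in comparisons."""
    rate = max(rate, 1e-12)
    return np.sum(counts * np.log(rate) - rate)

def bic_change_point(counts, split):
    """BIC for a single-rate model vs. a two-rate model with a change at `split` (fixed a priori)."""
    n = counts.size
    ll1 = poisson_loglik(counts, counts.mean())
    ll2 = poisson_loglik(counts[:split], counts[:split].mean()) \
        + poisson_loglik(counts[split:], counts[split:].mean())
    bic1 = -2 * ll1 + 1 * np.log(n)               # one free rate
    bic2 = -2 * ll2 + 2 * np.log(n)               # two free rates
    return bic1, bic2

rng = np.random.default_rng(9)
annual_counts = rng.poisson(lam=15.0, size=98)    # synthetic M>=7 counts, constant true rate
bic1, bic2 = bic_change_point(annual_counts, split=86)   # hypothetical change point near 2004
print(f"BIC single rate = {bic1:.1f}, two rates = {bic2:.1f}  (lower is preferred)")
```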

  1. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    Science.gov (United States)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations, and a growing globalisation of the world's economies, we are presently facing a dramatic increase in our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that no longer aim exclusively at the best possible quantification of present risks, but also keep an eye on their changes with time and allow these to be projected into the future. This applies not only to the vulnerability component of earthquake risk, but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  2. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  3. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  4. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    DEFF Research Database (Denmark)

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1997-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  5. Earthquakes.

    Science.gov (United States)

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  6. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992) and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings which have high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the

  7. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  8. Global earthquake casualties due to secondary effects: A quantitative analysis for improving PAGER losses

    Science.gov (United States)

    Wald, David J.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey’s (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER’s overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability.

  9. Characteristics of global strong earthquakes and their implications ...

    Indian Academy of Sciences (India)

    Ju Wei

    2017-10-06

    Oct 6, 2017 ... these plates along the plate boundaries will induce earthquakes. The gliding process between plates produces great but variable stress (Frisch et al. 2011). In recent years, the world has stepped into a natural disaster-prone period, and frequent natural disasters such as earthquakes pose a great threat.

  10. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology

    OpenAIRE

    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı

    2016-01-01

    Abstract Real-time seismology is a newly developing alternative approach in seismology to mitigate earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems for quickly transforming data into earthquake information in real time, to reduce earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  11. Mechanical and Statistical Evidence of Human-Caused Earthquakes - A Global Data Analysis

    Science.gov (United States)

    Klose, C. D.

    2012-12-01

    The causality of large-scale geoengineering activities and the occurrence of earthquakes with magnitudes of up to M=8 is discussed, and mechanical and statistical evidence is provided. The earthquakes were caused by artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production and fluid injections/extractions. The presented global earthquake catalog has been recently published in the Journal of Seismology and is available to the public at www.cdklose.com. The data show evidence that geomechanical relationships exist with statistical significance between a) seismic moment magnitudes of observed earthquakes, b) anthropogenic mass shifts on the Earth's crust, and c) lateral distances of the earthquake hypocenters to the locations of the mass shifts. Research findings depend on uncertainties, in particular, of source parameter estimations of seismic events before instrumental recording. First analyses, however, indicate that small- to medium-size earthquakes (M6) tend to be triggered. The rupture propagation of triggered events might be dominated by pre-existing tectonic stress conditions. Besides event-specific evidence, large earthquakes such as China's 2008 M7.9 Wenchuan earthquake fall into a global pattern and cannot be considered outliers or simply seen as an act of God. Observations also indicate that every second seismic event tends to occur after a decade, while pore pressure diffusion seems to play a role only when injecting fluids deep underground. The chance of an earthquake nucleating after two or 20 years near an area with a significant mass shift is 25% or 75%, respectively. Moreover, causative effects of seismic activities highly depend on the tectonic stress regime in the Earth's crust in which geoengineering takes place.

  12. Earthquake and nuclear explosion location using the global seismic network

    International Nuclear Information System (INIS)

    Lopez, L.M.

    1983-01-01

    The relocation of nuclear explosions, aftershock sequences and regional seismicity is addressed by using joint hypocenter determination, Lomnitz' distance domain location, and origin time and earthquake depth determination with local observations. Distance domain and joint hypocenter location are used for a stepwise relocation of nuclear explosions in the USSR. The resulting origin times are 2.5 seconds earlier than those obtained by the ISC. Local travel times from the relocated explosions are compared to the Jeffreys-Bullen tables. P times are found to be faster at 9-30° distances, the largest deviation being around 10 seconds at 13-18°. At these distances S travel times are also faster, by approximately 20 seconds. The 1977 Sumba earthquake sequence is relocated by iterative joint hypocenter determination of the events with the most station reports. Simultaneously determined station corrections are utilized for the relocation of smaller aftershocks. The relocated hypocenters indicate that the aftershocks were initially concentrated along the deep trench. Origin times and depths are recalculated for intermediate depth and deep earthquakes using local observations in and around the Japanese Islands. It is found that origin time and depth differ systematically from ISC values for intermediate depth events. Origin times obtained for events below the crust down to 100 km depth are earlier, whereas no general bias seems to exist for origin times of events in the 100-400 km depth range. The recalculated depths for earthquakes shallower than 100 km are shallower than the ISC depths. The depth estimates for earthquakes deeper than 100 km were increased by the recalculations.

  14. 100+ years of instrumental seismology: the example of the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Storchak, Dmitry; Di Giacomo, Domenico

    2015-04-01

    Systematic seismological observations of earthquakes using seismic instruments on a global scale began more than 100 years ago. Since then, seismologists have made many discoveries about the Earth's interior and the physics of earthquakes, thanks also to major developments in the seismic instrumentation deployed around the world. Moreover, since the establishment of the first global networks (the Milne and Jesuit networks), seismologists around the world have stored and exchanged the results of routine observations (e.g., picking of arrival times, amplitude-period measurements, etc.) or more sophisticated analyses (e.g., moment tensor inversion) in seismological bulletins/catalogues. With a project funded by the GEM Foundation (www.globalquakemodel.org), the ISC and the Team of International Experts released a new global earthquake catalogue, the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php), which, differently from previous global seismic catalogues, has the unique feature of covering the entire period of instrumental seismology, with locations and magnitudes re-assessed using modern approaches for the global earthquakes selected for processing (approximately 21,000 in the current version). During the 110 years covered by the ISC-GEM catalogue many seismological developments occurred in terms of instrumentation, seismological practice and knowledge of the physics of earthquakes. In this contribution we give a brief overview of the major milestones characterizing the last 100+ years of instrumental seismology that were relevant for the production of the ISC-GEM catalogue, and of the major challenges we faced to obtain a catalogue as homogeneous as possible.

  15. Global permanent deformations triggered by the Sumatra earthquake

    OpenAIRE

    Boschi, E.; Casarotti, E.; Devoti, R.; Melini, D.; Piersanti, A.; Pietrantonio, G.; Riguzzi, F.

    2005-01-01

    The giant Sumatra-Andaman earthquake of December 26, 2004 caused permanent deformation effects over a region of never previously observed extent. GPS data from the worldwide network of permanent IGS sites show significant coseismic displacements in an area exceeding 10^7 km^2. The effects of the permanent residual deformation field could be detected as far as Australia and the Philippine and Japanese archipelagos, and, to the west, as far as the Indian subcontinent. The synthetic simulations ...

  16. Earthquakes

    Science.gov (United States)

  17. Rapid, Global Assessment of the Societal Impacts of Earthquake Induced Landsliding

    Science.gov (United States)

    Godt, J. W.; Verdin, K. L.; Jibson, R. W.; Wald, D. J.; Earle, P. S.; Harp, E. L.

    2006-05-01

    We evaluate the feasibility of rapidly estimating landslide potential after large earthquakes by combining near-real-time estimates of ground shaking with a simple slope stability model that uses a new global topographic database derived from elevation data collected as part of the Shuttle Radar Topography Mission (SRTM). Landslides triggered by ground shaking during earthquakes have caused widespread loss of life and damage to critical infrastructure. For example, the magnitude-7.6 earthquake of 8 October 2005 in Pakistan-administered Kashmir generated thousands of landslides that blocked many roads and dammed rivers in the mountainous region. Overland access to many remote villages has yet to be restored 5 months after the quake. To provide timely information to emergency relief organizations on the possible societal effects of earthquakes, the USGS has developed an alarm system, PAGER (Prompt Assessment of Global Earthquakes for Response), that combines an estimate of ground shaking with a global population database. Maps of peak ground acceleration are generated in near real time using the methodology and software developed for ShakeMap (http://earthquake.usgs.gov/eqcenter/shakemap/). To evaluate seismic landslide susceptibility worldwide, we rely on the SRTM data to generate statistics (at 1-km spacing) on the distribution of topographic slope calculated from 3-arcsecond (90 m) data. Because many small areas of no more than a few square kilometers lack SRTM data, statistical methods referencing other elevation data were used to create a globally complete dataset. These topographic data are then used in a simplified Newmark analysis that uses spatially uniform material strengths and neglects the effects of groundwater to estimate the relative susceptibility to both shallow and deep landslides from a given earthquake. We present an initial application from the Muzaffarabad region of Pakistan and discuss the results in the context of field and aerial observations.
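
    In its most basic form, the simplified Newmark approach mentioned above reduces to an infinite-slope factor of safety and a critical acceleration that shaking must exceed before a slope is judged susceptible. The sketch below follows the same simplifications stated in the abstract (spatially uniform strength, no groundwater); the strength parameters are placeholder values, not those used in the USGS analysis.

```python
import math

def factor_of_safety(slope_deg, cohesion_kpa=20.0, friction_deg=30.0,
                     unit_weight_knm3=18.0, thickness_m=2.5):
    """Static infinite-slope factor of safety for dry conditions:
    FS = c / (gamma * t * sin(a)) + tan(phi) / tan(a),
    with t the slope-normal thickness of the potential failure slab.
    All strength values are illustrative placeholders."""
    a = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    return (cohesion_kpa / (unit_weight_knm3 * thickness_m * math.sin(a))
            + math.tan(phi) / math.tan(a))

def critical_acceleration_g(slope_deg, **strength):
    """Newmark critical acceleration a_c = (FS - 1) * sin(a), in units of g.
    Cells whose expected peak ground acceleration exceeds a_c would be flagged
    as relatively susceptible in this simplified screening."""
    fs = factor_of_safety(slope_deg, **strength)
    return max(fs - 1.0, 0.0) * math.sin(math.radians(slope_deg))

if __name__ == "__main__":
    for slope in (10, 20, 30, 40):
        print(f"slope {slope:2d} deg: FS = {factor_of_safety(slope):.2f}, "
              f"a_c = {critical_acceleration_g(slope):.3f} g")
```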

  18. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.

    2014-12-01

    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, the building stock inventory and the related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of the hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations on earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (the flashsourcing technique), and, more recently, the deployment of low-cost QCN (Quake Catcher Network) sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of the collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and a diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (Quakebot) and smartphone applications (iOS and Android), which only report earthquakes that matter to the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  19. Global Earthquake Activity Rate models based on version 2 of the Global Strain Rate Map

    Science.gov (United States)

    Bird, P.; Kreemer, C.; Kagan, Y. Y.; Jackson, D. D.

    2013-12-01

    Global Earthquake Activity Rate (GEAR) models have usually been based on either relative tectonic motion (fault slip rates and/or distributed strain rates), or on smoothing of seismic catalogs. However, a hybrid approach appears to perform better than either parent, at least in some retrospective tests. First, we construct a Tectonic ('T') forecast of shallow (≤ 70 km) seismicity based on global plate-boundary strain rates from version 2 of the Global Strain Rate Map. Our approach is the SHIFT (Seismic Hazard Inferred From Tectonics) method described by Bird et al. [2010, SRL], in which the character of the strain rate tensor (thrusting and/or strike-slip and/or normal) is used to select the most comparable type of plate boundary for calibration of the coupled seismogenic lithosphere thickness and corner magnitude. One difference is that activity of offshore plate boundaries is spatially smoothed using empirical half-widths [Bird & Kagan, 2004, BSSA] before conversion to seismicity. Another is that the velocity-dependence of coupling in subduction and continental-convergent boundaries [Bird et al., 2009, BSSA] is incorporated. Another forecast component is the smoothed-seismicity ('S') forecast model of [Kagan & Jackson, 1994, JGR; Kagan & Jackson, 2010, GJI], which was based on optimized smoothing of the shallow part of the GCMT catalog, years 1977-2004. Both forecasts were prepared for threshold magnitude 5.767. Then, we create hybrid forecasts by one of 3 methods: (a) taking the greater of S or T; (b) simple weighted-average of S and T; or (c) log of the forecast rate is a weighted average of the logs of S and T. In methods (b) and (c) there is one free parameter, which is the fractional contribution from S. All hybrid forecasts are normalized to the same global rate. Pseudo-prospective tests for 2005-2012 (using versions of S and T calibrated on years 1977-2004) show that many hybrid models outperform both parents (S and T), and that the optimal weight on S
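
    The three hybridization rules are simple cell-by-cell operations on the two parent forecast grids. A minimal sketch, assuming S and T are arrays of earthquake rates on a common global grid; the toy arrays, the weight and the choice to renormalize to the S total are illustrative stand-ins, not the calibrated GEAR values.

```python
import numpy as np

def hybrid_forecast(S, T, method="log", weight_s=0.6):
    """Combine smoothed-seismicity (S) and tectonic (T) rate grids cell by cell.

    method "max"    : take the greater of S or T in each cell
    method "linear" : weighted average of S and T
    method "log"    : weighted average of log rates (a weighted geometric mean)
    The hybrid is renormalized so that its global total matches the S total,
    standing in for the common global rate used to normalize all forecasts."""
    S = np.asarray(S, dtype=float)
    T = np.asarray(T, dtype=float)
    if method == "max":
        H = np.maximum(S, T)
    elif method == "linear":
        H = weight_s * S + (1.0 - weight_s) * T
    elif method == "log":
        eps = 1e-20  # guard against log(0) in cells with zero forecast rate
        H = np.exp(weight_s * np.log(S + eps) + (1.0 - weight_s) * np.log(T + eps))
    else:
        raise ValueError(f"unknown method: {method}")
    return H * (S.sum() / H.sum())

# Toy 2x2 grids standing in for global m >= 5.767 rate forecasts.
S = np.array([[0.10, 0.02], [0.005, 0.30]])
T = np.array([[0.05, 0.08], [0.001, 0.20]])
for m in ("max", "linear", "log"):
    print(m, hybrid_forecast(S, T, m).round(4))
```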

  20. PAGER-CAT: A composite earthquake catalog for calibrating global fatality models

    Science.gov (United States)

    Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.

    2009-01-01

    We have described the compilation and contents of PAGER-CAT, an earthquake catalog developed principally for calibrating earthquake fatality models. It brings together information from a range of sources in a comprehensive, easy-to-use digital format. Earthquake source information (e.g., origin time, hypocenter, and magnitude) contained in PAGER-CAT has been used to develop an Atlas of Shake Maps of historical earthquakes (Allen et al. 2008) that can subsequently be used to estimate the population exposed to various levels of ground shaking (Wald et al. 2008). These measures will ultimately yield improved earthquake loss models employing the uniform hazard mapping methods of ShakeMap. Currently PAGER-CAT does not consistently contain indicators of landslide and liquefaction occurrence prior to 1973. In future PAGER-CAT releases we plan to better document the incidence of these secondary hazards. This information is contained in some existing global catalogs but is far from complete and often difficult to parse. Landslide and liquefaction hazards can be important factors contributing to earthquake losses (e.g., Marano et al. unpublished). Consequently, the absence of secondary hazard indicators in PAGER-CAT, particularly for events prior to 1973, could be misleading to some users concerned with ground-shaking-related losses. We have applied our best judgment in the selection of PAGER-CAT's preferred source parameters and earthquake effects. We acknowledge that the creation of a composite catalog always requires subjective decisions, but we believe PAGER-CAT represents a significant step forward in bringing together the best available estimates of earthquake source parameters and reports of earthquake effects. All information considered in PAGER-CAT is stored as provided in its native catalog so that other users can modify PAGER preferred parameters based on their specific needs or opinions. As with all catalogs, the values of some parameters listed in PAGER-CAT are

  1. 7th U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research: Abstract Volume and Technical Program

    Science.gov (United States)

    Detweiler, Shane T.; Ellsworth, William L.

    2008-01-01

    The U.S. / Japan Natural Resources (UJNR) Panel on Earthquake Research promotes advanced study toward a more fundamental understanding of the earthquake process and hazard estimation. The Panel promotes basic and applied research to improve our understanding of the causes and effects of earthquakes and to facilitate the transmission of research results to those who implement hazard reduction measures on both sides of the Pacific and around the world. Meetings are held every other year and alternate between countries, with short presentations on current research and local field trips being the highlights. The 5th Joint Panel meeting was held at Asilomar, California in October 2004. The technical sessions featured reports on the September 28, 2004 Parkfield, California earthquake, progress on earthquake early warning and rapid post-event assessment technology, probabilistic earthquake forecasting and the newly discovered phenomenon of nonvolcanic tremor. The Panel visited the epicentral region of the M 6.0 Parkfield earthquake and viewed the surface ruptures along the San Andreas Fault. They also visited the San Andreas Fault Observatory at Depth (SAFOD), which had just completed the first phase of drilling into the fault. The 6th Joint Panel meeting was held in Tokushima, Japan in November 2006. The meeting included very productive exchanges of information on approaches to systematic observation of earthquake processes. Sixty-eight technical papers were presented during the meeting on a wide range of subjects, including interplate earthquakes in subduction zones, slow slip and nonvolcanic tremor, crustal deformation, recent earthquake activity and hazard mapping. Through our discussion, we reaffirmed the benefits of working together to achieve our common goal of reducing earthquake hazard, continued cooperation on issues involving densification of observation networks and the open exchange of data among scientific communities. We also reaffirmed the importance of

  2. Assessment of impact of strong earthquakes to the global economy by example of Tohoku event

    Science.gov (United States)

    Tatiana, Skufina; Peter, Skuf'in; Sergey, Baranov; Vera, Samarina; Taisiya, Shatalova

    2016-04-01

    We examine the economic consequences of strong earthquakes by the example of the M9 Tohoku event that occurred on March 11, 2011 close to the northeast shore of Honshu, Japan. This earthquake was the strongest in the whole history of seismological observations in this part of the planet. The tsunami it generated killed more than 15,700 people and damaged 332,395 buildings and 2,126 roads. The total economic loss in Japan was estimated at 309 billion. The catastrophe in Japan also impacted the global economy. To estimate its impact, we used regional and global stock indexes, production indexes, stock prices of the main Japanese, European and US companies, import and export dynamics, as well as data provided by the customs service of Japan. We also demonstrated that the catastrophe substantially affected the markets and that, in the short run, in some indicators it even exceeded the effect of the global financial crisis of 2008. The most recent strong earthquakes, in Nepal (25.04.2015, M7.8) and Chile (16.09.2015, M8.3), both renewed the relevance of cost assessments of the overall economic impact of seismic hazard. We concluded that it is necessary to treat strong earthquakes as one very important factor that affects the world economy, depending on their location. The research was supported by the Russian Foundation for Basic Research (Project 16-06-00056A).

  3. Facilitating open global data use in earthquake source modelling to improve geodetic and seismological approaches

    Science.gov (United States)

    Sudhaus, Henriette; Heimann, Sebastian; Steinberg, Andreas; Isken, Marius; Vasyura-Bathke, Hannes

    2017-04-01

    In the last few years impressive achievements have been made in improving inferences about earthquake sources by using InSAR (Interferometric Synthetic Aperture Radar) data. Several factors aided these developments. The open data basis of earthquake observations has expanded vastly with the two powerful Sentinel-1 SAR sensors up in space. Increasing computer power allows processing of large data sets for more detailed source models. Moreover, data inversion approaches for earthquake source inferences are becoming more advanced. By now data error propagation is widely implemented and the estimation of model uncertainties is a regular feature of reported optimum earthquake source models. Also, more regularly InSAR-derived surface displacements and seismological waveforms are combined, which requires finite rupture models instead of point-source approximations and layered medium models instead of homogeneous half-spaces. In other words the disciplinary differences in geodetic and seismological earthquake source modelling shrink towards common source-medium descriptions and a source near-field/far-field data point of view. We explore and facilitate the combination of InSAR-derived near-field static surface displacement maps and dynamic far-field seismological waveform data for global earthquake source inferences. We join in the community efforts with the particular goal to improve crustal earthquake source inferences in generally not well instrumented areas, where often only the global backbone observations of earthquakes are available provided by seismological broadband sensor networks and, since recently, by Sentinel-1 SAR acquisitions. We present our work on modelling standards for the combination of static and dynamic surface displacements in the source's near-field and far-field, e.g. on data and prediction error estimations as well as model uncertainty estimation. Rectangular dislocations and moment-tensor point sources are exchanged by simple planar finite

  4. Development of the Global Earthquake Model’s neotectonic fault database

    Science.gov (United States)

    Christophersen, Annemarie; Litchfield, Nicola; Berryman, Kelvin; Thomas, Richard; Basili, Roberto; Wallace, Laura; Ries, William; Hayes, Gavin P.; Haller, Kathleen M.; Yoshioka, Toshikazu; Koehler, Richard D.; Clark, Dan; Wolfson-Schwehr, Monica; Boettcher, Margaret S.; Villamor, Pilar; Horspool, Nick; Ornthammarath, Teraphan; Zuñiga, Ramon; Langridge, Robert M.; Stirling, Mark W.; Goded, Tatiana; Costa, Carlos; Yeats, Robert

    2015-01-01

    The Global Earthquake Model (GEM) aims to develop uniform, openly available, standards, datasets and tools for worldwide seismic risk assessment through global collaboration, transparent communication and adapting state-of-the-art science. GEM Faulted Earth (GFE) is one of GEM’s global hazard module projects. This paper describes GFE’s development of a modern neotectonic fault database and a unique graphical interface for the compilation of new fault data. A key design principle is that of an electronic field notebook for capturing observations a geologist would make about a fault. The database is designed to accommodate abundant as well as sparse fault observations. It features two layers, one for capturing neotectonic faults and fold observations, and the other to calculate potential earthquake fault sources from the observations. In order to test the flexibility of the database structure and to start a global compilation, five preexisting databases have been uploaded to the first layer and two to the second. In addition, the GFE project has characterised the world’s approximately 55,000 km of subduction interfaces in a globally consistent manner as a basis for generating earthquake event sets for inclusion in earthquake hazard and risk modelling. Following the subduction interface fault schema and including the trace attributes of the GFE database schema, the 2500-km-long frontal thrust fault system of the Himalaya has also been characterised. We propose the database structure to be used widely, so that neotectonic fault data can make a more complete and beneficial contribution to seismic hazard and risk characterisation globally.

  5. Seismic waves and earthquakes in a global monolithic model

    Science.gov (United States)

    Roubíček, Tomáš

    2018-03-01

    The philosophy that a single "monolithic" model can "asymptotically" replace and couple in a simple elegant way several specialized models relevant on various Earth layers is presented and, in special situations, also rigorously justified. In particular, global seismicity and tectonics is coupled to capture, e.g., (here by a simplified model) ruptures of lithospheric faults generating seismic waves which then propagate through the solid-like mantle and inner core both as shear (S) or pressure (P) waves, while S-waves are suppressed in the fluidic outer core and also in the oceans. The "monolithic-type" models have the capacity to describe all the mentioned features globally in a unified way together with corresponding interfacial conditions implicitly involved, only when scaling its parameters appropriately in different Earth's layers. Coupling of seismic waves with seismic sources due to tectonic events is thus an automatic side effect. The global ansatz is here based, rather for an illustration, only on a relatively simple Jeffreys' viscoelastic damageable material at small strains whose various scaling (limits) can lead to Boger's viscoelastic fluid or even to purely elastic (inviscid) fluid. Self-induced gravity field, Coriolis, centrifugal, and tidal forces are counted in our global model, as well. The rigorous mathematical analysis as far as the existence of solutions, convergence of the mentioned scalings, and energy conservation is briefly presented.

  6. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009) I. Location and Seismicity Patterns

    Science.gov (United States)

    Bondar, I.; Engdahl, E. R.; Villasenor, A.; Storchak, D. A.

    2012-12-01

    We present the final results of a two-year project sponsored by the GEM (Global Earthquake Model) Foundation. The ISC-GEM global catalogue consists of some 19 thousand instrumentally recorded, moderate to large earthquakes, spanning 110 years of seismicity. We relocated all events in the catalogue using a two-tier approach. The EHB location methodology (Engdahl et al., 1998) was applied first to obtain improved hypocentres with special focus on the depth determination. The locations were further refined in the next step by fixing the depths to those from the EHB analysis and applying the new ISC location algorithm (Bondár and Storchak, 2011) that reduces location bias by accounting for correlated travel-time prediction error structure. To facilitate the relocation effort, some 900,000 seismic P and S wave arrival-time data were added to the ISC database for the period between 1904 and 1963, either from original station bulletins in the ISC archive or by digitizing the scanned images of the ISS bulletin (Villaseñor and Engdahl, 2005; 2007). Although no substantial amount of new phase data were acquired for the modern period (1964-2009), the number of phases used in the location has still increased by 3 million, owing to the fact that both the EHB and ISC locators use all ak135 (Kennett et al., 1995) phases in the location. We show that the relocation effort yielded substantially improved locations, especially in the first half of the 20th century; we demonstrate significant improvements in focal depth estimates in subduction zones and other seismically active regions; and we show that the ISC-GEM catalogue provides an improved view of 110 years of global seismicity of the Earth. The ISC-GEM Global Instrumental Earthquake Catalogue represents the final product of one of the ten global components in the GEM program, and will be made available to researchers at the ISC (www.isc.ac.uk) website.

  7. Revision of earthquake hypocentre locations in global bulletin data sets using source-specific station terms

    Science.gov (United States)

    Nooshiri, Nima; Saul, Joachim; Heimann, Sebastian; Tilmann, Frederik; Dahm, Torsten

    2017-02-01

    Global earthquake locations are often associated with very large systematic travel-time residuals even for clear arrivals, especially for regional and near-regional stations in subduction zones because of their strongly heterogeneous velocity structure. Travel-time corrections can drastically reduce travel-time residuals at regional stations and, in consequence, improve the relative location accuracy. We have extended the shrinking-box source-specific station terms technique to regional and teleseismic distances and adopted the algorithm for probabilistic, nonlinear, global-search location. We evaluated the potential of the method to compute precise relative hypocentre locations on a global scale. The method has been applied to two specific test regions using existing P- and pP-phase picks. The first data set consists of 3103 events along the Chilean margin and the second one comprises 1680 earthquakes in the Tonga-Fiji subduction zone. Pick data were obtained from the GEOFON earthquake bulletin, produced using data from all available, global station networks. A set of timing corrections varying as a function of source position was calculated for each seismic station. In this way, we could correct the systematic errors introduced into the locations by the inaccuracies in the assumed velocity structure without explicitly solving for a velocity model. Residual statistics show that the median absolute deviation of the travel-time residuals is reduced by 40-60 per cent at regional distances, where the velocity anomalies are strong. Moreover, the spread of the travel-time residuals decreased by ˜20 per cent at teleseismic distances (>28°). Furthermore, strong variations in initial residuals as a function of recording distance are smoothed out in the final residuals. The relocated catalogues exhibit less scattered locations in depth and sharper images of the seismicity associated with the subducting slabs. Comparison with a high-resolution local catalogue reveals that
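
    The station-term bookkeeping at the heart of the shrinking-box approach can be sketched compactly: for each station, an event's correction is an average of the residuals of neighbouring events, and the neighbourhood radius shrinks over successive passes. The skeleton below shows only that averaging step with hypothetical data structures; the probabilistic relocation and the residual recomputation between passes, which the full method requires, are omitted.

```python
import numpy as np

def source_specific_station_terms(residuals, event_xyz,
                                  radii_km=(500.0, 250.0, 100.0)):
    """Sketch of the averaging step in shrinking-box SSST.

    residuals : {station: {event_id: travel-time residual in seconds}}
    event_xyz : {event_id: np.ndarray of Cartesian coordinates in km}
    radii_km  : neighbourhood radius used on successive (shrinking) passes

    For each station, an event's correction is the median residual of the
    events within the current radius of that event.  In the full method the
    events are relocated and the residuals recomputed between passes; that
    step is deliberately omitted here.
    """
    terms = {}
    for radius in radii_km:
        for sta, evs in residuals.items():
            ids = list(evs)
            pos = np.array([event_xyz[e] for e in ids])
            res = np.array([evs[e] for e in ids])
            sta_terms = terms.setdefault(sta, {})
            for i, ev in enumerate(ids):
                near = np.linalg.norm(pos - pos[i], axis=1) <= radius
                sta_terms[ev] = float(np.median(res[near]))
    return terms

if __name__ == "__main__":
    xyz = {"ev1": np.array([0.0, 0.0, 10.0]),
           "ev2": np.array([30.0, 0.0, 12.0]),
           "ev3": np.array([400.0, 0.0, 15.0])}
    res = {"STA1": {"ev1": 1.2, "ev2": 1.0, "ev3": -0.3}}
    print(source_specific_station_terms(res, xyz))
```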

  8. GEM1: First-year modeling and IT activities for the Global Earthquake Model

    Science.gov (United States)

    Anderson, G.; Giardini, D.; Wiemer, S.

    2009-04-01

    GEM is a public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) to build an independent standard for modeling and communicating earthquake risk worldwide. GEM is aimed at providing authoritative, open information about seismic risk and decision tools to support mitigation. GEM will also raise risk awareness and help post-disaster economic development, with the ultimate goal of reducing the toll of future earthquakes. GEM will provide a unified set of seismic hazard, risk, and loss modeling tools based on a common global IT infrastructure and consensus standards. These tools, systems, and standards will be developed in partnership with organizations around the world, with coordination by the GEM Secretariat and its Secretary General. GEM partners will develop a variety of global components, including a unified earthquake catalog, fault database, and ground motion prediction equations. To ensure broad representation and community acceptance, GEM will include local knowledge in all modeling activities, incorporate existing detailed models where possible, and independently test all resulting tools and models. When completed in five years, GEM will have a versatile, openly accessible modeling environment that can be updated as necessary, and will provide the global standard for seismic hazard, risk, and loss models to government ministers, scientists and engineers, financial institutions, and the public worldwide. GEM is now underway with key support provided by private sponsors (Munich Reinsurance Company, Zurich Financial Services, AIR Worldwide Corporation, and Willis Group Holdings); countries including Belgium, Germany, Italy, Singapore, Switzerland, and Turkey; and groups such as the European Commission. The GEM Secretariat has been selected by the OECD and will be hosted at the Eucentre at the University of Pavia in Italy; the Secretariat is now formalizing the creation of the GEM Foundation. Some of GEM's global

  9. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

    The paper deals with the prediction of global damage and future structural reliability with special emphasis on sensitivity, bias and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake....

  10. The Benefits and Limitations of Crowdsourced Information for Rapid Damage Assessment of Global Earthquakes

    Science.gov (United States)

    Bossu, R.; Landès, M.; Roussel, F.

    2017-12-01

    The Internet has hastened the collection of felt reports and macroseismic data after global earthquakes. At the European-Mediterranean Seismological Centre (EMSC), where the traditional online questionnaires have been replaced by thumbnail-based questionnaires, an average of half of the reports are collected within 10 minutes of an earthquake's occurrence. In regions where the EMSC is well identified this goes down to 5 minutes. The user simply selects the thumbnail corresponding to the observed effects, erasing language barriers and improving collection via small smartphone screens. A previous study has shown that EMSC data are well correlated with "Did You Feel It" (DYFI) data and with 3 independent, manually collected datasets. The efficiency and rapidity of felt-report collection through thumbnail-based questionnaires does not necessarily mean that they offer a complete picture of the situation for all intensity values, especially the higher ones. There are several potential limitations. Demographics probably play a role, but so might eyewitnesses' behavior: it is probably not their priority to report when their own safety and that of their loved ones is at stake. We propose to test this hypothesis on EMSC felt reports and to extend the study to LastQuake smartphone application uses. LastQuake is a free smartphone app providing very rapid information on felt earthquakes. There are currently 210,000 active users around the world, covering almost every country except for a few in Sub-Saharan Africa. Along with felt reports, we also analyze the characteristics of LastQuake app launches. For both composite datasets created from 108 earthquakes, we analyze the rapidity of eyewitnesses' reactions and how it changes with intensity values, and surmise how they reflect different types of behaviors. We will show the intrinsic limitations of crowdsourced information for rapid situation awareness. More importantly, we will show in which cases the lack of crowdsourced information could

  11. ISC-GEM: Global Instrumental Earthquake Catalogue (1900-2009), II. Location and seismicity patterns

    Science.gov (United States)

    Bondár, I.; Engdahl, E. Robert; Villaseñor, A.; Harris, James; Storchak, D.

    2015-02-01

    We present the final results of a two-year project sponsored by the Global Earthquake Model (GEM) Foundation. The ISC-GEM global catalogue consists of some 19 thousand instrumentally recorded, moderate to large earthquakes, spanning 110 years of seismicity. We relocated all events in the catalogue using a two-tier approach. The EHB location methodology (Engdahl et al., 1998) was applied first to obtain improved hypocentres with special focus on the depth determination. The locations were further refined in the next step by fixing the depths to those from the EHB analysis and applying the new International Seismological Centre (ISC) location algorithm (Bondár and Storchak, 2011) that reduces location bias by accounting for correlated travel-time prediction error structure. To facilitate the relocation effort, some one million seismic P and S wave arrival-time data were added to the ISC database for the period between 1904 and 1970, either from original station bulletins in the ISC archive or by digitizing the scanned images of the International Seismological Summary (ISS) bulletin (Villaseñor and Engdahl, 2005, 2007). Although no substantial amount of new phase data were acquired for the modern period (1964-2009), the number of phases used in the location has still increased by three million, owing to the fact that both the EHB and ISC locators use most well-recorded ak135 (Kennett et al., 1995) phases in the location. We show that the relocation effort yielded substantially improved locations, especially in the first half of the 20th century; we demonstrate significant improvements in focal depth estimates in subduction zones and other seismically active regions; and we show that the ISC-GEM catalogue provides an improved view of 110 years of global seismicity of the Earth. The ISC-GEM Global Instrumental Earthquake Catalogue represents the final product of one of the ten global components in the GEM program, and is available to researchers at the ISC (http://www.isc.ac.uk).

  12. A summary of hazard datasets and guidelines supported by the Global Earthquake Model during the first implementation phase

    Directory of Open Access Journals (Sweden)

    Marco Pagani

    2015-04-01

    The Global Earthquake Model (GEM) initiative promotes open, transparent and collaborative science aimed at the assessment of earthquake risk and its reduction worldwide. During the first implementation phase (2009-2014) GEM sponsored five projects aimed at the creation of global datasets and guidelines toward the creation of open, transparent and, as far as possible, homogeneous hazard input models. These projects concentrated on the following global databases and models: an instrumental catalogue, a historical earthquake archive and catalogue, a geodetic strain rate model, a database of active faults, and a set of ground motion prediction equations. This paper describes the main outcomes of these projects, illustrating some initial applications as well as challenges in the creation of hazard models.

  13. The orientation of disaster donations: differences in the global response to five major earthquakes.

    Science.gov (United States)

    Wei, Jiuchang; Marinova, Dora

    2016-07-01

    This study analyses the influence of gift giving, geographical location, political regime, and trade openness on disaster donation decisions, using five severe earthquakes that occurred between 2008 and 2012 as case studies. The results show that global disaster donation is not dominated by only philanthropy or trade interests, and that the determinants of donation decisions vary with the scale of the natural disaster and the characteristics of the disaster-affected countries. While gift giving exists in the case of middle-size earthquakes, political regimes play a very important part in the overall donation process. Countries with higher perceived corruption may donate more frequently, but those that are more democratic may be more generous in their donations. Generosity based on geographical proximity to the calamity is significant in the decision-making process for most natural disasters, yet it may have a negative effect on donations in Latin America and the Caribbean. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  14. Development of the U.S. Geological Survey's PAGER system (Prompt Assessment of Global Earthquakes for Response)

    Science.gov (United States)

    Wald, D.J.; Earle, P.S.; Allen, T.I.; Jaiswal, K.; Porter, K.; Hearne, M.

    2008-01-01

    The Prompt Assessment of Global Earthquakes for Response (PAGER) System plays a primary alerting role for global earthquake disasters as part of the U.S. Geological Survey’s (USGS) response protocol. We provide an overview of the PAGER system, both of its current capabilities and our ongoing research and development. PAGER monitors the USGS’s near real-time U.S. and global earthquake origins and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts. Current PAGER notifications and Web pages estimate the population exposed to each seismic intensity level. In addition to being a useful indicator of potential impact, PAGER’s intensity/exposure display provides a new standard in the dissemination of rapid earthquake information. We are currently developing and testing a more comprehensive alert system that will include casualty estimates. This is motivated by the idea that an estimated range of possible number of deaths will aid in decisions regarding humanitarian response. Underlying the PAGER exposure and loss models are global earthquake ShakeMap shaking estimates, constrained as quickly as possible by finite-fault modeling and observed ground motions and intensities, when available. Loss modeling is being developed comprehensively with a suite of candidate models that range from fully empirical to largely analytical approaches. Which of these models is most appropriate for use in a particular earthquake depends on how much is known about local building stocks and their vulnerabilities. A first-order country-specific global building inventory has been developed, as have corresponding vulnerability functions. For calibrating PAGER loss models, we have systematically generated an Atlas of 5,000 ShakeMaps for significant global earthquakes during the last 36 years. For many of these, auxiliary earthquake source and shaking intensity data are also available. Refinements to the loss models are ongoing
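
    The exposure product described here amounts to an intensity-binned population sum over co-registered shaking and population grids. A minimal sketch with toy arrays; the real PAGER system works on gridded ShakeMap intensities and a global population database, neither of which is reproduced here.

```python
import numpy as np

def population_by_intensity(mmi_grid, population_grid, levels=range(5, 11)):
    """Sum the population exposed to each (rounded) macroseismic intensity level.
    The two grids must be co-registered arrays of equal shape."""
    mmi = np.asarray(mmi_grid, dtype=float)
    pop = np.asarray(population_grid, dtype=float)
    if mmi.shape != pop.shape:
        raise ValueError("intensity and population grids must have equal shape")
    return {level: float(pop[np.rint(mmi) == level].sum()) for level in levels}

# Toy 3x3 grids standing in for a ShakeMap intensity grid and a population grid.
mmi = np.array([[4.2, 5.6, 6.1], [5.1, 6.8, 7.4], [4.9, 5.4, 6.2]])
pop = np.array([[1000, 2500, 800], [4000, 1200, 300], [900, 2200, 1500]])
print(population_by_intensity(mmi, pop))
```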

  15. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Staci R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-03-01

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  16. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Science.gov (United States)

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.

  17. Nowcasting Earthquakes

    Science.gov (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined as the cumulative probability distribution P(n < n(t)) of the counts n of small earthquakes observed between successive large earthquakes in the region. From the count n(t) of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)), which characterizes the current level of progress through the earthquake cycle in the defined region at the current time.
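
    Written out, the earthquake potential score is the empirical cumulative distribution of small-earthquake counts between past large events, evaluated at the count accumulated since the most recent large event. A minimal sketch assuming the catalog is available as a time-ordered sequence of magnitudes; the magnitude thresholds and the toy catalog are placeholders.

```python
import numpy as np

def earthquake_potential_score(magnitudes, m_small=3.0, m_large=6.0):
    """Nowcast EPS = P(n < n_current), where n is the number of small events
    (>= m_small) counted between successive large events (>= m_large) in the
    same region, and n_current is the count since the most recent large event.
    `magnitudes` is a time-ordered sequence of catalog magnitudes."""
    counts, n = [], 0
    for m in magnitudes:
        if m >= m_large:
            counts.append(n)      # close the current earthquake cycle
            n = 0
        elif m >= m_small:
            n += 1
    if not counts:
        raise ValueError("need at least one completed large-earthquake cycle")
    past_counts = np.asarray(counts)
    return float(np.mean(past_counts < n)), n

# Toy catalog: magnitudes in time order (placeholder data, not a real region).
rng = np.random.default_rng(0)
catalog = rng.choice([3.2, 3.6, 4.1, 6.3], size=400, p=[0.50, 0.30, 0.15, 0.05])
eps, n_now = earthquake_potential_score(catalog)
print(f"small-event count since last large event: {n_now}, EPS = {eps:.2f}")
```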

  18. The first IGAC scientific conference: global atmospheric-biospheric chemistry. Book of abstracts

    International Nuclear Information System (INIS)

    1993-04-01

    Various global/transfrontier air pollution problems are described. The causes of these problems are presented. The impact on ecology and the biosphere is discussed. Special attention is given to the agents causing the greenhouse effect.

  19. MaizeGDB: Global support for maize research through open access information [abstract

    Science.gov (United States)

    MaizeGDB is the open-access global repository for maize genetic and genomic information – from single genes that determine nutritional quality to whole genome-scale data for complex traits including yield and drought tolerance. The data and tools at MaizeGDB enable researchers from Ethiopia to Ghan...

  20. Information Exchange in Global Logistics Chains : An application for Model-based Auditing (abstract)

    NARCIS (Netherlands)

    Veenstra, A.W.; Hulstijn, J.; Christiaanse, R.; Tan, Y.

    2013-01-01

    An integrated data pipeline has been proposed to meet requirements for supply chain visibility and control. How can data integration be used for risk assessment, monitoring and control in global supply chains? We argue that concepts from model-based auditing can be used to model the ‘ideal’ flow of

  1. ANZSEE Biennial Conference Abstracts: Thriving through transformation: Local to global sustainability

    Directory of Open Access Journals (Sweden)

    Miriam Verbeek

    2017-06-01

    The 2015 ANZSEE Biennial Conference themes centred on ideas for transforming to a sustainable human existence at all geographical scales, particularly at the local, regional, and remote scales, but also at the national and global scales: A. Place-based perspectives on sustainability & transformation; B. Institutions for resilience & transformation; C. Economics of equity & distribution in transformation; D. Making the marginal mainstream: expanding horizons. Special sessions: Indigenous Wellbeing; Wilderness; & Local Government.

  2. Strengthening global practices for protecting nuclear material (NUMAT). Book of Abstracts

    International Nuclear Information System (INIS)

    Steinhaeusler, F.; Heissl, C.

    2002-08-01

    The International Conference on Physical Protection 'Strengthening Global Practices for Protecting Nuclear Material' was organized by the Institute of Physics and Biophysics, Salzburg University, in cooperation with and supported by the European Commission, Lawrence Livermore National Laboratory, the European Forum of Stanford University's Institute for International Studies and the Austria Institute for European Security. Its purpose was to foster the exchange of information on the policy and technical aspects required to ensure the security of nuclear material around the world. There is a general concern that the international community needs to establish effective measures to counter theft, sabotage, and other illicit uses of nuclear fissile and other radioactive materials. The main subjects addressed by this conference were: a) global and local threat development and the 'design basis'; b) standards for physical protection (PP), their adequacy and future needs; c) national practices in the PP of nuclear materials (how can the national security culture be strengthened?); d) current R and D in security and detection technologies (identification of focus points for future R and D); e) programmes to aid in the training, design, and implementation of physical protection systems (how can the efficiency and sustainability of assistance programmes be improved?). (nevyjel)

  3. Comparison of earthquake source parameters and interseismic plate coupling variations in global subduction zones (Invited)

    Science.gov (United States)

    Bilek, S. L.; Moyer, P. A.; Stankova-Pursley, J.

    2010-12-01

    Geodetically determined interseismic coupling variations have been found in subduction zones worldwide. These coupling variations have been linked to heterogeneities in interplate fault frictional conditions. These connections to fault friction imply that observed coupling variations are also important in influencing details in earthquake rupture behavior. Because of the wealth of newly available geodetic models along many subduction zones, it is now possible to examine detailed variations in coupling and compare them to seismicity characteristics. Here we use a large catalog of earthquake source time functions and slip models for moderate to large magnitude earthquakes to explore these connections, comparing earthquake source parameters with available models of geodetic coupling along segments of the Japan, Kurile, Kamchatka, Peru, Chile, and Alaska subduction zones. In addition, we use published geodetic results along the Costa Rica margin to compare with source parameters of small magnitude earthquakes recorded with an onshore-offshore network of seismometers. For the moderate to large magnitude earthquakes, preliminary results suggest a complex relationship between earthquake parameters and estimates of strongly and weakly coupled segments of the plate interface. For example, along the Kamchatka subduction zone, these earthquakes occur primarily along the transition between strong and weak coupling, with significant heterogeneity in the pattern of moment scaled duration with respect to the coupling estimates. The longest scaled duration event in this catalog occurred in a region of strong coupling. Earthquakes along the transition between strongly and weakly coupled segments exhibited the most complexity in the source time functions. Use of small magnitude (0.5 earthquake spectra, with higher corner frequencies and higher mean apparent stress for earthquakes that occur along the Osa Peninsula relative to the Nicoya Peninsula, mimicking the along-strike variations in
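
    The abstract refers to moment-scaled durations without spelling out the normalization. A common convention in source-time-function studies, assumed here purely for illustration, is to remove the expected self-similar growth of duration with seismic moment by scaling each duration to a reference moment with a cube-root factor; the reference moment and example numbers below are arbitrary.

```python
def scaled_duration(duration_s, moment_nm, ref_moment_nm=1.0e18):
    """Duration normalized by cube-root moment scaling: the duration the event
    would have at the reference moment if self-similar scaling
    (duration proportional to M0**(1/3)) holds.  The 1e18 N*m reference
    (roughly Mw 6) is an arbitrary illustrative choice."""
    return duration_s * (ref_moment_nm / moment_nm) ** (1.0 / 3.0)

# Example: a 20 s rupture for an event with M0 ~ 3.5e19 N*m (roughly Mw 7).
print(round(scaled_duration(20.0, 3.5e19), 2), "s at the Mw ~6 reference moment")
```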

  4. Academia vs Industry: vanishing boundaries between global earthquake seismology and exploration seismics.

    Science.gov (United States)

    van der Hilst, R. D.

    2011-12-01

    Global seismology and exploration seismics have long lived in parallel universes, with little cross-fertilization of methodologies and with interaction between the associated communities often limited to company recruitment of students. Fortunately, this traditional separation of technology and people has begun to disappear. This is driven not only by continuing demands for human and financial resources (for companies and academia, respectively) but increasingly also by overlapping intellectual interest. First, 'waves are waves' (that is, the fundamental physics - and math to describe/handle it - is scale invariant) and many artificial boundaries are being removed by use of better wave theory, faster computers, and new data acquisition paradigms. For example, the development of dense sensor arrays (in USA, Europe, Asia - mostly China and Japan) is increasing the attraction (and need) of industry-style interrogation of massive data sets. Examples include large scale seismic exploration of Earth's deep interior with inverse scattering of teleseismic wavefields (e.g., Van der Hilst et al., Science, 2007). On the other hand, reservoir exploration and production benefits from expertise in earthquake seismology, both for better characterization of reservoirs and their overburden and for (induced) micro-earthquake analysis. Passive source methods (including but not restricted to ambient noise tomography) are providing new, economic opportunities for velocity analysis and monitoring, and studies of (micro)seismicity (e.g., source location, parameters, and moment tensor) allow in situ stress determination, tomographic velocity analysis with natural sources in the reservoir, and 4D monitoring (e.g., for hydrocarbon production, carbon sequestration, enhanced geothermal systems, and unconventional gas production). Second, the gap between the frequency ranges traditionally considered by both communities is being bridged by better theory, new sensor technology, and through

  5. Global navigation satellite system detection of preseismic ionospheric total electron content anomalies for strong magnitude (Mw>6) Himalayan earthquakes

    Science.gov (United States)

    Sharma, Gopal; Champati ray, Prashant Kumar; Mohanty, Sarada; Gautam, Param Kirti Rao; Kannaujiya, Suresh

    2017-10-01

    Electron content in the ionosphere is very sensitive to temporary disturbances of the Earth's magnetosphere (geomagnetic storm), solar flares, and seismic activities. The Global Navigation Satellite System (GNSS)-based total electron content (TEC) measurement has emerged as an important technique for computations of earthquake precursor signals. We examined the pre-earthquake signatures for eight strong magnitude (Mw>6: 6.1 to 7.8) earthquakes with the aid of GNSS-based TEC measurement in the tectonically active Himalayan region using International GNSS Service (IGS) stations as well as local GNSS-based continuously operating reference stations (CORS). The results indicate very significant ionospheric anomalies in the vertical total electron content (vTEC) a few days before the main shock for all of the events. Geomagnetic activities were also studied during the TEC observation window to ascertain their role in ionospheric perturbations. It was also inferred that TEC variation due to low magnitude events could also be monitored if the epicenter lies closer to the GNSS or IGS station. Therefore, the study has confirmed TEC anomalies before major Himalayan earthquakes, thereby making it imperative to set up a much denser network of IGS/CORS for real-time data analysis and forewarning.
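
    For readers unfamiliar with the underlying measurement, slant TEC follows from the differential ionospheric delay between the two GNSS frequencies and is commonly mapped to vertical TEC with a thin-shell geometry. The sketch below uses the standard textbook formulas; the pseudorange values, elevation angle and shell height are illustrative, and the cited study's own processing (bias handling, smoothing) is not reproduced.

```python
import math

F1, F2 = 1575.42e6, 1227.60e6   # GPS L1 and L2 carrier frequencies, Hz
TECU = 1.0e16                    # electrons per m^2 in one TEC unit

def slant_tec(p1_m, p2_m):
    """Slant TEC (in TECU) from dual-frequency pseudoranges P1, P2 in metres:
    STEC = (P2 - P1) * f1^2 * f2^2 / (40.3 * (f1^2 - f2^2)).
    Receiver/satellite hardware biases (DCBs) are ignored in this sketch."""
    coeff = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))
    return coeff * (p2_m - p1_m) / TECU

def vertical_tec(stec_tecu, elevation_deg, shell_height_km=350.0, r_earth_km=6371.0):
    """Map slant to vertical TEC with a single-layer (thin-shell) ionosphere model."""
    z = math.radians(90.0 - elevation_deg)                    # zenith angle at receiver
    sin_zp = r_earth_km / (r_earth_km + shell_height_km) * math.sin(z)
    return stec_tecu * math.sqrt(1.0 - sin_zp**2)             # multiply by cos(z')

# Illustrative numbers: a 4.5 m ionospheric differential delay at 40 deg elevation.
stec = slant_tec(p1_m=20_000_000.0, p2_m=20_000_004.5)
print(f"{stec:.1f} TECU slant -> {vertical_tec(stec, 40.0):.1f} TECU vertical")
```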

  6. Natural Time, Nowcasting and the Physics of Earthquakes: Estimation of Seismic Risk to Global Megacities

    Science.gov (United States)

    Rundle, John B.; Luginbuhl, Molly; Giguere, Alexis; Turcotte, Donald L.

    2017-11-01

    Natural Time ("NT") refers to the concept of using small earthquake counts, for example of M > 3 events, to mark the intervals between large earthquakes, for example M > 6 events. The term was first used by Varotsos et al. (2005) and later by Holliday et al. (2006) in their studies of earthquakes. In this paper, we discuss ideas and applications arising from the use of NT to understand earthquake dynamics, in particular by use of the idea of nowcasting. Nowcasting differs from forecasting, in that the goal of nowcasting is to estimate the current state of the system, rather than the probability of a future event. Rather than focus on individual earthquake faults, we focus on a defined local geographic region surrounding a particular location. This local region is considered to be embedded in a larger regional setting from which we accumulate the relevant statistics. We apply the nowcasting idea to the practical development of methods to estimate the current state of risk for dozens of the world's seismically exposed megacities, defined as cities having populations of over 1 million persons. We compute a ranking of these cities based on their current nowcast value, and discuss the advantages and limitations of this approach. We note explicitly that the nowcast method is not a model, in that there are no free parameters to be fit to data. Rather, the method is simply a presentation of statistical data, which the user can interpret. Among other results, we find, for example, that the current nowcast ranking of the Los Angeles region is comparable to its ranking just prior to the January 17, 1994 Northridge earthquake.

  8. Global teleseismic earthquake relocation from improved travel times and procedures for depth determination

    NARCIS (Netherlands)

    Engdahl, E.R.; Hilst, R.D. van der; Buland, Raymond

    We relocate nearly 100,000 events that occurred during the period 1964 to 1995 and are well-constrained teleseismically by arrival-time data reported to the International Seismological Centre (ISC) and to the U.S. Geological Survey's National Earthquake Information Center (NEIC). Hypocenter

  9. The global historical and future economic loss and cost of earthquakes during the production of adaptive worldwide economic fragility functions

    Science.gov (United States)

    Daniell, James; Wenzel, Friedemann

    2014-05-01

    macroseismic intensity, capital stock estimate, GDP estimate, year and the combined seismic building index (a created combination of the global seismic code index, building practice factor, building age and infrastructure vulnerability). The analysis provided three key results: a) The economic fragility functions produced from the 1900-2008 events correlated very well with the economic loss and cost of earthquakes from 2009-2013, in real time. This methodology has been extended to other natural disaster types (typhoon, flood, drought). b) The reanalysis of historical earthquake events to check the associated historical losses and costs against the expected exposure in terms of intensities. The 1939 Chillan, 1948 Turkmenistan, 1950 Iran, 1972 Managua, 1980 Western Nepal and 1992 Erzincan earthquake events were seen as huge outliers compared with the modelled capital stock and GDP, and thus additional studies were undertaken to check the original loss results. c) A worldwide GIS layer database of capital stock (gross and net), GDP, infrastructure age and economic indices over the period 1900-2013 has been created in conjunction with the CATDAT database in order to define correct economic losses and costs.

  10. Building Capacity for Earthquake Monitoring: Linking Regional Networks with the Global Community

    Science.gov (United States)

    Willemann, R. J.; Lerner-Lam, A.

    2006-12-01

    Installing or upgrading a seismic monitoring network is often among the mitigation efforts after earthquake disasters, and this is happening in response to the events both in Sumatra during December 2004 and in Pakistan during October 2005. These networks can yield improved hazard assessment, more resilient buildings where they are most needed, and emergency relief directed more quickly to the worst hit areas after the next large earthquake. Several commercial organizations are well prepared for the fleeting opportunity to provide the instruments that comprise a seismic network, including sensors, data loggers, telemetry stations, and the computers and software required for the network center. But seismic monitoring requires more than hardware and software, no matter how advanced. A well-trained staff is required to select appropriate and mutually compatible components, install and maintain telemetered stations, manage and archive data, and perform the analyses that actually yield the intended benefits. Monitoring is more effective when network operators cooperate with a larger community through free and open exchange of data, sharing information about working practices, and international collaboration in research. As an academic consortium, a facility operator and a founding member of the International Federation of Digital Seismographic Networks, IRIS has access to a broad range of expertise with the skills that are required to help design, install, and operate a seismic network and earthquake analysis center, and stimulate the core training for the professional teams required to establish and maintain these facilities. But delivering expertise quickly when and where it is unexpectedly in demand requires advance planning and coordination in order to respond to the needs of organizations that are building a seismic network, either with tight time constraints imposed by the budget cycles of aid agencies following a disastrous earthquake, or as part of more informed

  11. Global correlations between maximum magnitudes of subduction zone interface thrust earthquakes and physical parameters of subduction zones

    NARCIS (Netherlands)

    Schellart, W. P.; Rawlinson, N.

    2013-01-01

    The maximum earthquake magnitude recorded for subduction zone plate boundaries varies considerably on Earth, with some subduction zone segments producing giant subduction zone thrust earthquakes (e.g. Chile, Alaska, Sumatra-Andaman, Japan) and others producing relatively small earthquakes (e.g.

  12. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  13. Abstracts presented at the 7th World Alliance for Risk Factor Surveillance (WARFS) Global Conference. October 16-19, 2011. Toronto, Ontario, Canada.

    Science.gov (United States)

    2012-01-01

    The 7th World Alliance for Risk Factor Surveillance (WARFS) Global Conference, hosted by the Public Health Agency of Canada, was held in Toronto, Ontario, Canada, from October 16 to 19, 2011. Previous WARFS conferences were held in USA (1999), Finland (2001), Australia (2003), Uruguay (2005) and Italy (2007, 2009). WARFS is a global working group on surveillance under the International Union for Health Promotion and Education (IUHPE). It supports the development of risk factor surveillance as a tool for evidence-based public health, acknowledging the importance of this source of information to inform, monitor and evaluate disease prevention and health promotion policies and programs. The theme of the 2011 Global Conference was the role of surveillance in the promotion of health. The Global Conference had 146 registered participants, making it the second most attended WARFS conference in its history. Over the three days, participants attended oral and poster presentations from 30 countries. The conference would not have been possible without the hard work of the International Scientific Committee and the Local Organizing Committee. To highlight the importance and the significance of this conference at an international level, Chronic Diseases and Injuries in Canada (CDIC) is pleased to publish this supplementary issue, which contains 70 abstracts presented at the 7th WARFS Global Conference. In the spirit of the Global Conference, this collection of abstracts brings together surveillance material on risk factors, chronic diseases, infectious diseases and injuries from around the world. By making these abstracts widely available, CDIC hopes to further the conference objectives through a continued dialogue between those interested in linking risk factor surveillance to health promotion.

  14. 23rd WiN Global Annual Conference: Women in Nuclear meet Atoms for Peace. Programme and Abstracts

    International Nuclear Information System (INIS)

    2015-01-01

    Women in Nuclear (WiN) Global is a worldwide non-profit-making association made up mostly of female professionals working in the various fields of nuclear energy and radiation applications. WiN Global aims to promote understanding and public awareness of the benefits of nuclear and radiation applications through a series of active networks, both national and international. It has approximately 25 000 members from more than 100 countries, organized in national, regional and international chapters. Every year, a chapter of WiN Global organizes the annual conference, which is a unique occasion for the WiN Global community to meet. The 23rd WiN Global Annual Conference will highlight the vital role women play in all applications of nuclear science and technology. At the same time, it will provide opportunities for networking, exchanging ideas, technical visits and obtaining the most up-to-date information on the nuclear programmes and facilities around the world as well as on employment opportunities at the International Atomic Energy Agency (IAEA).

  15. Western Pyrenees facing global change: comparison of the effects of climatic and anthropogenic change on water abstractions

    International Nuclear Information System (INIS)

    Terrasson, Isabelle; Chazot, Sebastien; Maton, Laure; Rinaudo, Jean-Daniel; Caballero, Yvan

    2014-01-01

    In the French Western Pyrenees, the decrease in low-water flows observed in recent years is expected to continue in the future. This may increase the hydric stress on aquatic ecosystems, and the competition among water uses and users for access to water resources. The research project ANR-VULCAIN compared the impacts of climatic and socio-economic change on the hydro-systems of the French Western Pyrenees. Modeling and participative prospective analysis were coupled to quantify the evolution of water abstractions under these two types of change. Socio-economic scenarios were built together with local stakeholders during workshops (urbanism/land planning on the one hand and agriculture on the other hand). Their results were quantified with the models developed so as to assess the impacts of anthropogenic change on domestic and agricultural abstractions. In parallel, the agricultural model was fed with climatic scenarios so as to assess the impacts of climate change on agricultural water needs. In the created scenarios, the evolution of agricultural water needs under climate change has a wider range than the evolution of abstractions for domestic and agricultural needs under anthropogenic change, which are of the same order of magnitude. To satisfy this evolution, there is some room to maneuver: making distribution modalities more efficient, optimizing the management of storage capacity, or using substitution resources. This paper presents the approach that has been followed, and some of the main results. (authors)

  16. Global Compilation of InSAR Earthquake Source Models: Comparisons with Seismic Catalogues and the Effects of 3D Earth Structure

    Science.gov (United States)

    Weston, J. M.; Ferreira, A. M.; Funning, G. J.

    2010-12-01

    While past progress in seismology led to extensive earthquake catalogues such as the Global Centroid Moment Tensor (GCMT) catalogue, recent advances in space geodesy have enabled earthquake parameter estimations from the measurement of the deformation of the Earth's surface, notably using InSAR data. Many earthquakes have now been studied using InSAR, but a full assessment of the quality and of the additional value of these source parameters compared to traditional seismological techniques is still lacking. In this study we present results of systematic comparisons between earthquake CMT parameters determined using InSAR and seismic data, on a global scale. We compiled a large database of source parameters obtained using InSAR data from the literature and estimated the corresponding CMT parameters, forming an ICMT compilation. Here we present results from the analysis of 58 earthquakes that occurred between 1992 and 2007, drawn from about 80 published InSAR studies. Multiple studies of the same earthquake are included in the archive, as they are valuable to assess uncertainties. Where faults are segmented, with changes in width along-strike, a weighted average based on the seismic moment in each fault has been used to determine overall earthquake parameters. For variable slip models, we have calculated source parameters taking the spatial distribution of slip into account. The parameters in our ICMT compilation are compared with those taken from the Global CMT (GCMT), ISC, EHB and NEIC catalogues. We find that earthquake fault strike, dip and rake values in the GCMT and ICMT archives are generally compatible with each other. Likewise, the differences in seismic moment in these two archives are relatively small. However, the locations of the centroid epicentres show substantial discrepancies, which are larger when comparing with GCMT locations (10-30 km differences) than for EHB, ISC and NEIC locations (5-15 km differences). Since InSAR data have a high spatial resolution, and thus

  17. A test of a global seismic system for monitoring earthquakes and underground nuclear explosions

    International Nuclear Information System (INIS)

    Bowman, J.R.; Muirhead, K.; Spiliopoulos, S.; Jepsen, D.; Leonard, M.

    1993-01-01

    Australia is a member of the Group of Scientific Experts (GSE), an ad hoc group of the United Nations Conference on Disarmament established to consider international cooperative measures to detect and identify seismic events. The GSE conducted a large-scale technical test (GSETT-2) from 22 April to 9 June 1991 that focused on the exchange and analysis of seismic parameter and waveform data. Thirty-four countries participated in GSETT-2, and data were contributed from 60 stations on all continents. GSETT-2 demonstrated the feasibility of collecting and transmitting large volumes (around 1 gigabyte) of digital data around the world, and of producing a preliminary bulletin of global seismicity within 48 hours and a final bulletin within 7 days. However, the experiment also revealed the difficulty of keeping up with the flow of data and analysis with existing resources. The Final Event Bulletins listed 3715 events for the 42 recording days of the test, about twice the number reported routinely by another international agency 5 months later. The quality of the Final Event Bulletin was limited by the uneven spatial distribution of seismic stations that contributed to GSETT-2 and by the ambiguity of associating phases detected by widely separated stations to form seismic events. A monitoring system similar to that used in GSETT-2 could provide timely and accurate reporting of global seismicity. It would need an improved distribution of stations, application of more conservative event formation rules and further development of analysis software. 8 refs., 9 figs

  18. Earthquake Monitoring with the MyShake Global Smartphone Seismic Network

    Science.gov (United States)

    Inbal, A.; Kong, Q.; Allen, R. M.; Savran, W. H.

    2017-12-01

    Smartphone arrays have the potential for significantly improving seismic monitoring in sparsely instrumented urban areas. This approach benefits from the dense spatial coverage of users, as well as from communication and computational capabilities built into smartphones, which facilitate big seismic data transfer and analysis. Advantages in data acquisition with smartphones trade off against factors such as the low-quality sensors installed in phones, high noise levels, and strong network heterogeneity, all of which limit effective seismic monitoring. Here we utilize network and array-processing schemes to assess event detectability with the MyShake global smartphone network. We examine the benefits of using this network in either triggered or continuous modes of operation. A global database of ground motions measured on stationary phones triggered by M2-6 events is used to establish detection probabilities. We establish single-phone detection probabilities for M=3 events as a function of epicentral distance, and find that locations of events detected by more than 20 nearby phones closely match the regional catalog locations. We use simulated broadband seismic data to examine how location uncertainties vary with user distribution and noise levels. To this end, we have developed an empirical noise model for the metropolitan Los-Angeles (LA) area. We find that densities larger than 100 stationary phones/km2 are required to accurately locate M 2 events in the LA basin. Given the projected MyShake user distribution, that condition may be met within the next few years.
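
    As a purely illustrative aside (this is not the network- or array-processing scheme used in the study), the value of aggregating many low-quality sensors can be sketched with a simple binomial model: if n phones lie within the detection radius and each triggers independently with probability p, the chance that at least k phones trigger is a binomial tail probability. All numbers below are hypothetical.

```python
# Illustrative binomial aggregation of single-phone detections
# (hypothetical model and numbers, not the study's detection scheme).
import math

def p_at_least_k(n, p, k):
    """P(X >= k) for X ~ Binomial(n, p): probability that at least k of n
    phones trigger, assuming independent detections."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical scenario: 60 stationary phones in the felt area, each with a
# 20% chance of triggering on a small event.
print(round(p_at_least_k(60, 0.20, 5), 3))   # chance of at least 5 triggers
print(round(p_at_least_k(60, 0.20, 20), 3))  # chance of at least 20 triggers
```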

  19. Solar wind proton density variations that preceded the M6+ earthquakes occurring on a global scale between 17 and 20 April 2014

    Science.gov (United States)

    Cataldi, Gabriele; Cataldi, Daniele; Straser, Valentino

    2015-04-01

    Between 17 and 20 April 2014, six M6+ earthquakes were recorded on Earth: a M6.2 earthquake in the Balleny Islands region on 17 April at 15:06 UTC; a M6.1 earthquake in the Solomon Islands on 18 April at 04:13 UTC; a M7.2 earthquake in Mexico on 18 April at 14:27 UTC; a M6.6 earthquake in Papua New Guinea on 19 April at 01:04 UTC; a M7.5 earthquake in Papua New Guinea on 19 April at 13:28 UTC; and a M6.2 earthquake in Papua New Guinea on 20 April at 00:15 UTC. The authors analyzed the modulation of the solar wind ion density during the period from 14 to 23 April 2014 to determine whether the six earthquakes were preceded by variations of the solar wind ion density, and to test a method that could in the future also be applied to tsunami prediction. The ion density data used for the correlation study are the solar wind ion density variations detected by the ACE (Advanced Composition Explorer) satellite, in orbit near the L1 Lagrange point, 1.5 million km from Earth in the direction of the Sun. The instrument used to measure the solar wind ion density is the Electron, Proton, and Alpha Monitor (EPAM) on board the ACE satellite. To conduct the study, the authors considered the variations of the solar wind proton density with these characteristics: differential proton flux 1060-1900 keV (p/cm^2-sec-ster-MeV); differential proton flux 761-1220 keV (p/cm^2-sec-ster-MeV); differential proton flux 310-580 keV (p/cm^2-sec-ster-MeV); and differential proton flux 115-195 keV (p/cm^2-sec-ster-MeV). This data set was marked with the times (time markers) of the M6+ earthquakes that occurred on a global scale between 17 and 20 April 2014 (the data on M6+ seismic activity are provided in real time by USGS, INGV and CSEM). The result of the analysis showed that the six M6+ earthquakes that occurred on a global scale in the reference time period were preceded by a significant variation of

  20. Global Earthquake and Volcanic Eruption Economic losses and costs from 1900-2014: 115 years of the CATDAT database - Trends, Normalisation and Visualisation

    Science.gov (United States)

    Daniell, James; Skapski, Jens-Udo; Vervaeck, Armand; Wenzel, Friedemann; Schaefer, Andreas

    2015-04-01

    tolls from historic events is discussed. The CATDAT socioeconomic databases of parameters like disaggregated population, GDP, capital stock, building typologies, food security and inter-country export interactions are used to create a current exposure view of the world. The potential for losses globally is discussed with a re-creation of each damaging event since 1900, with well in excess of 10 trillion USD in normalised losses seen over the 115 years of events. Potential worst-case volcano and earthquake events around the globe are discussed in terms of their potential for damage and huge economic loss today, and over the next century using SSP projections adjusted on a country basis, including inter-country effects.

  1. Predictable earthquakes?

    Science.gov (United States)

    Martini, D.

    2002-12-01

    acceleration) and the global number of earthquakes for this period from published literature, which give us a good picture of the dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients gives us a chance to quantitatively characterise the relation among the data series, if we suppose a linear dependence in the first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane) and the global number of earthquakes were compared. The results clearly demonstrate a common feature of both the Earth's rotation and the Earth's Z-acceleration around the Sun, and also a relation between the Earth's rotational acceleration and the earthquake number. This might mean a strong relation among these phenomena. The rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly shown in the computed cross-correlation function, which gives the dynamical characteristic of the correlation, between the Earth's orbital (Z-direction) and rotational acceleration. This basic period (29 years) was also obvious in the earthquake number data sets, with clear common features in time. Conclusion: The Core, which is involved in the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, which probably affects the global time distribution of the earthquakes. This might mean that the secular variation of the earthquakes is inseparable from changes in the Earth's magnetic field, i.e. the interior processes of the Earth's core belong to the dynamical state of the solar system. Therefore, if the described idea is real, the global distribution of earthquakes in time is predictable.

  2. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  3. BALWOIS: Abstracts

    International Nuclear Information System (INIS)

    Morell, Morell; Todorovik, Olivija; Dimitrov, Dobri

    2004-01-01

    BALWOIS 2004 is supported by the European Commission, Ministry of Education and Science of Republic of Macedonia, French Embassy in Macedonia and the International Association of Hydrological Sciences. BALWOIS Conference is prepared by l'Institut de Recherche pour le Developpement (IRD - Montpellier - France) and the Meteorological Association of the Republic of Macedonia (METEO MAK) with the kind support of the Hydrometeorological Services of Macedonia, the Hydrobiological Institute of Ohrid, the National Hydrometeorological Institutes of Albania and Bulgaria, and the Municipality of Ohrid as well. Its international shared waters (rivers, lakes and groundwater tables) make this area a land of challenges for applying the well-known concept of Integrated Water Resources Management in a context of regional climate changes and anthropogenic pressures on the environment. The role of a water observation and information system for decision support is to enhance the links between research institutions and operational centres, to help the decision makers and all water actors as well as to disseminate to the large public useful information on related water issues. The main objectives of BALWOIS are to encourage scientific exchanges between researchers coming from Balkan institutions and to offer them opportunities to improve their networking at European and global level. Ohrid is one of the most welcoming towns of the Republic of Macedonia with a very significant cultural heritage. Nobody would contest that Ohrid Lake - 358 km², several million years old and surrounded by splendid Macedonian and Albanian mountains - is among the most beautiful lakes of Europe. Together, Ohrid and Prespa lakes and their environments offer a large biodiversity with endemic species. The location of the conference on the shore of Ohrid Lake is particularly well chosen to work on topics linked with climate change, which already affects the balance of the lakes, protection of the biodiversity against increasing

  4. Reaching the global community during disasters: findings from a content analysis of the organizational use of Twitter after the 2010 Haiti earthquake.

    Science.gov (United States)

    Gurman, Tilly A; Ellenberger, Nicole

    2015-01-01

    Social networking sites provide virtual environments in which individuals and organizations exchange real-time information on a multitude of topics, including health promotion and disease prevention. The January 2010 earthquake in Haiti has been posited as a turning point in the way in which organizations use social media, such as Twitter, for crisis communication. The purpose of this content analysis was to explore whether organizations' use of Twitter changed after the 2010 Haiti earthquake. A team of 13 coders analyzed all English-language tweets (N = 2,616) during the 3 months before and post earthquake from 6 leading organizations in the Haiti disaster relief efforts. Study findings indicate that the ways in which organizations used Twitter changed over time. Chi-square analyses demonstrated that organizations decreased in their use of certain strategies to disseminate information through Twitter, such as the use of links. Organizations did not change in their use of techniques to involve users (e.g., retweet, call to action), with the exception of using tweets as a fundraising mechanism. Study findings highlight missed opportunities among organizations to maximize Twitter in order to encourage more interactive and immediate communication with the global community.

  5. Source Parameter Inversion for Recent Great Earthquakes from a Decade-long Observation of Global Gravity Fields

    Science.gov (United States)

    Han, Shin-Chan; Riva, Riccardo; Sauber, Jeanne; Okal, Emile

    2013-01-01

    We quantify gravity changes after great earthquakes present within the 10-year-long time series of monthly Gravity Recovery and Climate Experiment (GRACE) gravity fields. Using spherical harmonic normal-mode formulation, the respective source parameters of moment tensor and double-couple were estimated. For the 2004 Sumatra-Andaman earthquake, the gravity data indicate a composite moment of 1.2 × 10^23 N m with a dip of 10°, in agreement with the estimate obtained at ultralong seismic periods. For the 2010 Maule earthquake, the GRACE solutions range from 2.0 to 2.7 × 10^22 N m for dips of 12°-24° and centroid depths within the lower crust. For the 2011 Tohoku-Oki earthquake, the estimated scalar moments range from 4.1 to 6.1 × 10^22 N m, with dips of 9°-19° and centroid depths within the lower crust. For the 2012 Indian Ocean strike-slip earthquakes, the gravity data delineate a composite moment of 1.9 × 10^22 N m regardless of the centroid depth, comparing favorably with the total moment of the main ruptures and aftershocks. The smallest event we successfully analyzed with GRACE was the 2007 Bengkulu earthquake with M0 ≈ 5.0 × 10^21 N m. We found that the gravity data constrain the focal mechanism with the centroid only within the upper and lower crustal layers for thrust events. Deeper sources (i.e., in the upper mantle) could not reproduce the gravity observation as the larger rigidity and bulk modulus at mantle depths inhibit the interior from changing its volume, thus reducing the negative gravity component. Focal mechanisms and seismic moments obtained in this study represent the behavior of the sources on temporal and spatial scales exceeding the seismic and geodetic spectrum.
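
    To relate the scalar moments quoted above to moment magnitudes, the standard Hanks-Kanamori relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N m, can be applied directly. The short check below uses values from the abstract; it is only a cross-check on the quoted numbers and is not part of the GRACE inversion itself.

```python
# Standard moment-magnitude relation (Hanks & Kanamori, 1979):
# Mw = (2/3) * (log10(M0) - 9.1), with the scalar moment M0 in newton-metres.
# Used here only to cross-check the moments quoted in the abstract.
import math

def moment_magnitude(m0_newton_metres):
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

for label, m0 in [("2004 Sumatra-Andaman (composite)", 1.2e23),
                  ("2011 Tohoku-Oki (upper GRACE estimate)", 6.1e22),
                  ("2007 Bengkulu", 5.0e21)]:
    print(f"{label}: Mw ~ {moment_magnitude(m0):.1f}")
# -> roughly Mw 9.3, 9.1 and 8.4, respectively
```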

  6. PAGER--Rapid assessment of an earthquake's impact

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts, which were formerly sent based only on event magnitude and location or on population exposure to shaking, will now also be generated based on the estimated range of fatalities and economic losses.

  7. Abstract algebra

    CERN Document Server

    Garrett, Paul B

    2007-01-01

    Designed for an advanced undergraduate- or graduate-level course, Abstract Algebra provides an example-oriented, less heavily symbolic approach to abstract algebra. The text emphasizes specifics such as basic number theory, polynomials, finite fields, as well as linear and multilinear algebra. This classroom-tested, how-to manual takes a more narrative approach than the stiff formalism of many other textbooks, presenting coherent storylines to convey crucial ideas in a student-friendly, accessible manner. An unusual feature of the text is the systematic characterization of objects by universal

  8. Article Abstract

    African Journals Online (AJOL)

    Abstract. Simple learning tools to improve clinical laboratory practical skills training. B Taye, BSc, MPH. Addis Ababa University, College of Health Sciences, Addis Ababa, ... concerns about the competence of medical laboratory science graduates. ... standardised practical learning guides and assessment checklists would.

  9. Abstract Introduction

    African Journals Online (AJOL)

    Abstract. Cyclic ovarian activity and plasma progesterone (P4) concentrations were assessed for 179 days in 5 (free grazing) and 6 (free grazing + high energy and protein-supplemented) normocyclic donkeys. In addition, plasma P4 and cortisol were measured in blood samples collected at 15-min intervals in the

  10. Abstract Introduction

    African Journals Online (AJOL)


    Abstract. Hemoglobin is a tetrameric protein which is able to dissociate into dimers. The dimers can in turn associate into tetramers. It has been found that dimers are more reactive than tetramers. The difference in the reactivity of these two species has been used to determine the tetramer- dimer dissociation constant of ...

  11. The economic costs of natural disasters globally from 1900-2015: historical and normalised floods, storms, earthquakes, volcanoes, bushfires, drought and other disasters

    Science.gov (United States)

    Daniell, James; Wenzel, Friedemann; Schaefer, Andreas

    2016-04-01

    For the first time, a breakdown of natural disaster losses from 1900-2015, based on over 30,000 event economic losses globally, is given based on increased analysis within the CATDAT Damaging Natural Disaster databases. Using country-CPI and GDP deflator adjustments, over 7 trillion USD (2015-adjusted) in losses have occurred; over 40% due to flood/rainfall, 26% due to earthquake, 19% due to storm effects, 12% due to drought, 2% due to wildfire and under 1% due to volcano. Using construction cost indices, higher percentages of flood losses are seen. Depending on how the dollars are adjusted to 2015 terms (CPI vs. construction cost indices), between 6.5 and 14.0 trillion USD (2015-adjusted) of natural disaster losses have been seen from 1900-2015 globally. Significant reductions in economic losses have been seen in China and Japan from 1950 onwards. An average annual loss (AAL) of around 200 billion USD over the last 16 years has been seen, equating to around 0.25% of global GDP or around 0.1% of net capital stock per year. Normalised losses have also been calculated to examine the trends in vulnerability through time for economic losses. The global normalisation methodology using the exposure databases within CATDAT, previously applied in papers on the earthquake and volcano databases, is used for this study. When the original event-year losses are adjusted directly by capital stock change, very high losses are observed for floods over time (albeit with improved flood control structures). This shows clear trends in the improvement of building stock against natural disasters and a decreasing trend in most perils for most countries.
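
    A minimal sketch of the price-level adjustment step mentioned above ("country-CPI and GDP deflator adjustments") is given below, with hypothetical index values; CATDAT's full normalisation also uses construction cost indices and capital-stock change, which are not reproduced here.

```python
# Minimal sketch of a consumer-price-index adjustment of a historical loss to
# 2015 dollars. The index values are placeholders; the full CATDAT normalisation
# additionally uses GDP deflators, construction cost indices and capital-stock change.

def cpi_adjust(loss_nominal_usd, cpi_event_year, cpi_2015):
    """Scale a nominal loss by the ratio of price levels."""
    return loss_nominal_usd * (cpi_2015 / cpi_event_year)

# Hypothetical example: a 2.0 billion USD nominal loss in a year when the CPI
# index stood at 40, re-expressed with a 2015 CPI of 100.
print(cpi_adjust(2.0e9, 40.0, 100.0))  # -> 5000000000.0, i.e. 5.0 billion 2015-adjusted USD
```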

  12. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  13. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  14. The coseismic displacements of the 2013 Lushan Mw6.6 earthquake determined using continuous global positioning system measurements

    Directory of Open Access Journals (Sweden)

    Huang Yong

    2013-05-01

    Full Text Available Based on Continuous GPS (CGPS) observation data of the Crustal Movement Observation Network of China (CMONOC) and the Sichuan Continuous Operational Reference System (SCCORS), we calculated the horizontal coseismic displacements of CGPS sites caused by the 2013 Lushan Mw 6.6 earthquake. The results indicate that the horizontal coseismic deformations of CGPS stations are consistent with thrust-compression rupture. Furthermore, the sites closest to the epicenter underwent significant coseismic displacements. Three network stations exhibited displacements greater than 9 mm (the largest is 20.9 mm, at SCTQ), while the others were displaced approximately 1–4 mm.

  15. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  16. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  17. Earthquake Facts

    Science.gov (United States)

    ... estimated 830,000 people. In 1976 another deadly earthquake struck in Tangshan, China, where more than 250,000 people were killed. Florida and North Dakota have the smallest number of earthquakes in the United States. The deepest earthquakes typically ...

  18. Ground motion attenuation during M 7.1 Darfield and M 6.2 Christchurch, New Zealand, earthquakes and performance of global predictive models

    Science.gov (United States)

    Segou, Margaret; Kalkan, Erol

    2011-01-01

    The M 7.1 Darfield earthquake occurred 40 km west of Christchurch (New Zealand) on 4 September 2010. Six months after, the city was struck again with an M 6.2 event on 22 February local time (21 February UTC). These events resulted in significant damage to infrastructure in the city and its suburbs. The purpose of this study is to evaluate the performance of global predictive models (GMPEs) using the strong motion data obtained from these two events to improve future seismic hazard assessment and building code provisions for the Canterbury region. The Canterbury region is located on the boundary between the Pacific and Australian plates; its surface expression is the active right lateral Alpine fault (Berryman et al. 1993). Beneath the North Island and the north South Island, the Pacific plate subducts obliquely under the Australian plate, while at the southwestern part of the South Island, a reverse process takes place. Although New Zealand has experienced several major earthquakes in the past as a result of its complex seismotectonic environment (e.g., M 7.1 1888 North Canterbury, M 7.0 1929 Arthur's Pass, and M 6.2 1995 Cass), there was no evidence of prior seismic activity in Christchurch and its surroundings before the September event. The Darfield and Christchurch earthquakes occurred along the previously unmapped Greendale fault in the Canterbury basin, which is covered by Quaternary alluvial deposits (Forsyth et al. 2008). In Figure 1, site conditions of the Canterbury epicentral area are depicted on a VS30 map. This map was determined on the basis of topographic slope calculated from a 1-km grid using the method of Allen and Wald (2007). Also shown are the locations of strong motion stations. The Darfield event was generated as a result of a complex rupture mechanism; the recordings and geodetic data reveal that the earthquake consists of three sub-events (Barnhart et al. 2011, page 815 of this issue). The first event was due to rupturing of a blind reverse
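
    One common way to quantify how well a ground-motion model matches recordings, sketched below purely for illustration (it is not necessarily the statistic used in this study), is to compute residuals between observed and predicted peak ground acceleration in natural-log space and summarise their bias and scatter; all input values here are hypothetical.

```python
# Illustrative residual analysis for a ground-motion model: residual =
# ln(observed PGA) - ln(predicted PGA). A mean residual near zero indicates
# little bias; the standard deviation measures scatter. Values are hypothetical.
import math

def log_residuals(observed, predicted):
    return [math.log(o) - math.log(p) for o, p in zip(observed, predicted)]

def bias_and_sigma(residuals):
    n = len(residuals)
    mean = sum(residuals) / n
    var = sum((r - mean) ** 2 for r in residuals) / (n - 1)
    return mean, math.sqrt(var)

obs = [0.21, 0.35, 0.12, 0.08]    # observed PGA (g) at a handful of stations
pred = [0.18, 0.40, 0.10, 0.09]   # PGA (g) predicted by some GMPE
print(bias_and_sigma(log_residuals(obs, pred)))
```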

  19. Global Positioning System constraints on crustal deformation before and during the 21 February 2008 Wells, Nevada M6.0 earthquake

    Science.gov (United States)

    Hammond, William C.; Blewitt, Geoffrey; Kreemer, Corné; Murray-Moraleda, Jessica R.; Svarc, Jerry L.; dePolo, Craig M.; LaPointe, Daphne D.

    2011-01-01

    Using Global Positioning System (GPS) data from permanent sites and U.S. Geological Survey (USGS) campaign data, we have estimated co-seismic displacements and secular background crustal deformation patterns associated with the 21 February 2008 Wells, Nevada earthquake. Estimated displacements at nearby permanent GPS sites ELKO (84 km distant) and GOSH (81 km distant) are 1.0±0.2 mm and 1.1±0.3 mm, respectively. The magnitude and direction are in agreement with those predicted from a rupture model based on InSAR measurements of the near-field co-seismic surface displacement. Analysis of long GPS time series (>10 years) from the permanent sites within 250 km of the epicenter indicates that the eastern Nevada Basin and Range undergoes steady tectonic transtension with rates on the order of 1 mm/year over approximately 250 km. The azimuth of maximum horizontal crustal extension is consistent with the azimuth of the Wells earthquake co-seismic slip vector. The orientation of crustal shear is consistent with deformation associated with Pacific/North America plate boundary relative motion seen elsewhere in the Basin and Range. In response to the event, we deployed a new GPS site with the capability to telemeter high rate, low latency data that will in the future allow for rapid estimation of surface displacement should aftershocks or postseismic deformations occur. We estimated co-seismic displacements using campaign GPS data collected before and after the event; however, in most cases their uncertainties were larger than the offsets. Better precision in co-seismic displacement could have been achieved for the campaign sites if they had been surveyed more times or over a longer interval to better estimate their pre-event velocity.

  20. A Deviation-Time-Space-Thermal (DTS-T) Method for Global Earth Observation System of Systems (GEOSS)-Based Earthquake Anomaly Recognition: Criterions and Quantify Indices

    Directory of Open Access Journals (Sweden)

    Shanjun Liu

    2013-10-01

    Full Text Available The particular process of LCA (Lithosphere-Coversphere-Atmosphere) coupling referring to local tectonic structures and coversphere conditions is very important for understanding seismic anomaly from GEOSS (Global Earth Observation System of Systems). The LCA coupling based multiple-parameters analysis should be the foundation for earthquake prewarning. Three improved criterions: deviation notable enough, time quasi-synchronism, and space geo-adjacency, plus their quantify indices are defined for earthquake anomaly recognition, and applied to thermal parameters as a DTS-T (Deviation-Time-Space-Thermal) method. A normalized reliability index is preliminarily defined considering three quantify indices for deviation-time-space criterions. As an example, the DTS-T method is applied to the Ms 7.1 Yushu earthquake of 14 April 2010 in China. Furthermore, combining with the previous analysis of six recent significant earthquakes in the world, the statistical results regarding effective parameters, the occurrence of anomaly before the main shocks and a reliability index for each earthquake are introduced. It shows that the DTS-T method is reasonable and can be applied for routine monitoring and prewarning in the tectonic seismicity region.

  1. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
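
    The ranking step described above can be sketched compactly, under assumed data structures (a chronological list of regional magnitudes): the small-event counts between successive large events in the surrounding region form an empirical distribution, and the current count for the small region is converted to an earthquake potential score by reading off its cumulative rank. This is only an illustration of the counting logic, not the authors' code, and it complements the natural-time counting sketch given after the megacity nowcasting abstract above.

```python
# Sketch of the nowcast "earthquake potential score" (EPS): the current small-event
# count for the small region is ranked against the distribution of small-event
# counts observed between successive large events in the larger region.
# Data structures and thresholds are illustrative.

def interevent_small_counts(regional_mags, small_mag=3.0, large_mag=6.0):
    """Numbers of M>=small_mag events between successive M>=large_mag events
    (magnitudes given in chronological order)."""
    counts, current = [], 0
    for m in regional_mags:
        if m >= large_mag:
            counts.append(current)
            current = 0
        elif m >= small_mag:
            current += 1
    return counts

def earthquake_potential_score(current_count, interevent_counts):
    """Fraction of historical interevent counts not exceeding the current count
    (a cumulative-distribution value between 0 and 1)."""
    return sum(1 for c in interevent_counts if c <= current_count) / len(interevent_counts)

# Toy regional history: large events occurred after 12, 30 and 21 small events.
regional = [3.2] * 12 + [6.3] + [3.1] * 30 + [6.8] + [3.4] * 21 + [6.1]
counts = interevent_small_counts(regional)
print(counts, earthquake_potential_score(25, counts))  # -> [12, 30, 21] 0.666...
```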

  2. Fault model of the 2017 Jiuzhaigou Mw 6.5 earthquake estimated from coseismic deformation observed using Global Positioning System and Interferometric Synthetic Aperture Radar data

    Science.gov (United States)

    Nie, Zhaosheng; Wang, Di-Jin; Jia, Zhige; Yu, Pengfei; Li, Liangfa

    2018-04-01

    On August 8, 2017, the Jiuzhaigou Mw 6.5 earthquake occurred in Sichuan province, southwestern China, along the eastern margin of the Tibetan Plateau. The epicenter is surrounded by the Minjiang, Huya, and Tazang Faults. As the seismic activity and tectonics are very complicated, there is controversy regarding the accurate location of the epicenter and the seismic fault of the Jiuzhaigou earthquake. To investigate these aspects, first, the coseismic deformation field was derived from Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) measurements. Second, the fault geometry, coseismic slip model, and Coulomb stress changes around the seismic region were calculated using a homogeneous elastic half-space model. The coseismic deformation field derived from InSAR measurements shows that this event was mainly dominated by a left-lateral strike-slip fault. The maximal and minimal displacements were approximately 0.15 m and -0.21 m, respectively, along the line-of-sight observation. The whole deformation field follows a northwest-trending direction and is mainly concentrated west of the fault. The coseismic slip extends 28 km along strike and 18 km along dip. It is dominated by left-lateral strike-slip motion. The average and maximal fault slip are 0.18 and 0.85 m, respectively. The rupture did not fully reach the ground surface. The focal mechanism derived from GPS and InSAR data is consistent with the kinematics and geometry of the Huya Fault. Therefore, we conclude that the northern section or the Shuzheng segment of the Huya Fault is the seismogenic fault. The maximal fault slip is located at 33.25°N and 103.82°E at a depth of 11 km, and the released moment is approximately 6.635 × 10^18 N m, corresponding to a magnitude of Mw 6.49, which is consistent with results reported by the US Geological Survey, Global Centroid Moment Tensor, and other researchers. The coseismic Coulomb stress changes enhanced the stress on the northwest and

  3. Undead earthquakes

    Science.gov (United States)

    Musson, R. M. W.

    This short communication deals with the problem of fake earthquakes that keep returning into circulation. The particular events discussed are some very early earthquakes supposed to have occurred in the U.K., which all originate from a single enigmatic 18th century source.

  4. Earthquake impact scale

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
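
    The alert thresholds quoted in this abstract translate directly into a simple lookup; the sketch below (function names are ours) maps estimated fatalities or estimated economic losses to the green/yellow/orange/red alert levels.

```python
# Direct translation of the EIS thresholds quoted in the abstract: fatality-based
# alerts turn yellow/orange/red at 1, 100 and 1,000 estimated deaths, and
# loss-based alerts at 1 M, 100 M and 1 B USD. Function names are ours.

def fatality_alert(estimated_fatalities):
    if estimated_fatalities >= 1000:
        return "red"
    if estimated_fatalities >= 100:
        return "orange"
    if estimated_fatalities >= 1:
        return "yellow"
    return "green"

def loss_alert(estimated_loss_usd):
    if estimated_loss_usd >= 1e9:
        return "red"
    if estimated_loss_usd >= 1e8:
        return "orange"
    if estimated_loss_usd >= 1e6:
        return "yellow"
    return "green"

print(fatality_alert(250), loss_alert(5e7))  # -> orange yellow
```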

  5. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
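
    The moment comparison described above can be sketched as follows: compute the empirical skewness and excess kurtosis of per-interval earthquake counts and compare them with the theoretical values for a Poisson law (skewness 1/sqrt(lambda), excess kurtosis 1/lambda) and for a method-of-moments negative-binomial fit. The counts below are made up for illustration; the study itself uses GCMT and PDE catalogue counts.

```python
# Compare simple sample estimates of skewness and excess kurtosis of interval
# counts with the theoretical Poisson and negative-binomial (NBD) values.
# The counts are hypothetical; estimators are uncorrected and for illustration only.
import math

def sample_moments(counts):
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    sd = math.sqrt(var)
    skew = sum((c - mean) ** 3 for c in counts) / (n * sd ** 3)
    kurt = sum((c - mean) ** 4 for c in counts) / (n * sd ** 4) - 3.0
    return mean, var, skew, kurt

def poisson_theory(lam):
    return 1.0 / math.sqrt(lam), 1.0 / lam              # skewness, excess kurtosis

def nbd_theory(mean, var):
    p = mean / var                                      # method-of-moments fit (requires var > mean)
    r = mean ** 2 / (var - mean)
    return (2.0 - p) / math.sqrt(r * (1.0 - p)), 6.0 / r + p ** 2 / (r * (1.0 - p))

counts = [3, 7, 2, 9, 4, 15, 1, 6, 8, 3, 12, 5]          # hypothetical interval counts
mean, var, skew, kurt = sample_moments(counts)
print("empirical:", round(skew, 2), round(kurt, 2))
print("Poisson  :", tuple(round(x, 2) for x in poisson_theory(mean)))
print("NBD      :", tuple(round(x, 2) for x in nbd_theory(mean, var)))
```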

  6. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  7. GPS (global positioning system) studies of the Wasatch fault zone, Utah, with implications for elastic and viscoelastic fault behavior and earthquake hazard

    Science.gov (United States)

    Chang, Wu-Lung

    Contemporary crustal deformation along the 370-km-long Wasatch fault, Utah, has been measured with the Global Positioning System (GPS) and modeled for elastic and viscoelastic mechanisms. The Wasatch Front GPS network, including 107 campaign sites surveyed in 1992--1995, 1999, and 2001, and 11 permanent stations operating continuously from as early as mid-1996, spans a 100-km-wide area across the fault. Combined, these GPS measurements reveal surface velocities with horizontal components of 1.8 +/- 0.5 mm/yr and 2.2 +/- 1.0 mm/yr across the northern and southern parts of the Wasatch fault, respectively, directed nearly perpendicular to the fault (E-W). Analysis of the spatial variation of the strain rate field, moreover, revealed a notable strain concentration across the Salt Lake City segment of the Wasatch fault that may be produced by interseismic fault loading. Mechanisms other than fault loading that could contribute surface deformation signals to the Wasatch Front GPS observations were examined first; these include postseismic viscoelastic relaxation of the Earth's lithosphere and fluctuations of the water table and of the level of Great Salt Lake. Results showed that the deformation signals induced by these effects lie within the error ranges of the GPS horizontal velocities, which implies that the Wasatch fault may be the main tectonic feature responsible for the contemporary deformation of the Wasatch Front area. A nonlinear optimization algorithm was then applied to the GPS observations to investigate the geometry and loading rate of the Wasatch fault zone. An optimal model that best fits the observed horizontal velocity field shows a fault plane dipping 27° and creeping at 7 mm/yr at depths of 9--20 km, which may correspond to the interseismic loading zone of the Wasatch fault. Examination of the rheological properties of crustal and fault-zone rocks, on the other hand, suggests a brittle thickness of 7 to 9 km for the Wasatch fault zone and the depth
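
    A minimal sketch of the kind of nonlinear parameter estimation mentioned here, using the classical arctangent profile for a fault creeping below a locking depth as a stand-in for the dipping-dislocation model of the study itself; the station spacing, noise level and "true" parameter values are invented for illustration.

    # Hedged sketch: estimating fault loading parameters from a GPS velocity profile.
    # Uses the simple arctangent model v(x) = (s/pi)*arctan(x/D) as a stand-in for the
    # dipping-dislocation model of the actual study; all data are synthetic.
    import numpy as np
    from scipy.optimize import curve_fit

    def interseismic_profile(x_km, slip_rate_mm_yr, locking_depth_km):
        """Velocity across a buried dislocation creeping below depth D."""
        return (slip_rate_mm_yr / np.pi) * np.arctan(x_km / locking_depth_km)

    # Synthetic "observed" velocities at campaign-site distances from the fault trace.
    rng = np.random.default_rng(1)
    x_obs = np.linspace(-50, 50, 30)                         # km across the fault
    v_true = interseismic_profile(x_obs, 7.0, 12.0)          # assumed 7 mm/yr, 12 km
    v_obs = v_true + rng.normal(scale=0.5, size=x_obs.size)  # ~0.5 mm/yr GPS noise

    popt, pcov = curve_fit(interseismic_profile, x_obs, v_obs, p0=[5.0, 10.0])
    perr = np.sqrt(np.diag(pcov))
    print(f"slip rate = {popt[0]:.1f} +/- {perr[0]:.1f} mm/yr, "
          f"locking depth = {popt[1]:.1f} +/- {perr[1]:.1f} km")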

  8. ESPR 2015. Abstracts

    International Nuclear Information System (INIS)

    2015-01-01

    The volume includes the abstracts of the ESPR 2015 covering the following topics: PCG (post graduate courses): Radiography; fluoroscopy and general issue; nuclear medicine, interventional radiology and hybrid imaging, pediatric CT, pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: International aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, muscoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  9. ESPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-05-10

    The volume includes the abstracts of the ESPR 2015 covering the following topics: PCG (post graduate courses): Radiography; fluoroscopy and general issue; nuclear medicine, interventional radiology and hybrid imaging, pediatric CT, pediatric ultrasound; MRI in childhood. Scientific sessions and task force sessions: International aspects; neuroradiology, neonatal imaging, engineering techniques to simulate injury in child abuse, CT - dose and quality, challenges in the chest, cardiovascular and chest, muscoskeletal, oncology, pediatric uroradiology and abdominal imaging, fetal and postmortem imaging, education and global challenges, neuroradiology - head and neck, gastrointestinal and genitourinary.

  10. Abstracts and Abstracting in Knowledge Discovery.

    Science.gov (United States)

    Pinto, Maria; Lancaster, F. W.

    1999-01-01

    Presents various levels of criteria for judging the quality of abstracts and abstracting. Requirements for abstracts to be read by humans are compared with requirements for those to be searched by computer. Concludes that the wide availability of complete text in electronic form does not reduce the value of abstracts for information retrieval.…

  11. Global study of great (M>= 7) deep focus seismic events having regard to the May 24, 2013 Mw 8.3 earthquake in the Sea of Okhotsk, Russia

    Science.gov (United States)

    Varga, Peter; Rogozhin, Evgeny; Süle, Bálint; Andreeva, Nadezda

    2014-05-01

    The distribution of great seismic events (M >= 7.0), and consequently of the released seismic energy, along the Earth's radius is bimodal. 90% of great seismic events, which are responsible for most of the energy released, occur relatively close to the Earth's surface, at an average depth of 50 km. The vast majority of the remaining 10% are seismic events that occur very deep, at an average of 580-590 km, just above the boundary between the transition zone and the lower mantle (660 km). These very deep earthquakes (depth >= 500 km) differ significantly from the shallow events. To study the distribution of M >= 7.0 earthquakes and their radiated energy, a catalogue was compiled for the time interval between 1900 and 2013. Examination of the source zones in which both shallow and deep M >= 7.0 earthquakes occur shows that the linear distribution of deep earthquakes is considerably shorter than that found for the shallow earthquakes, which determine the length of the zone. The positions of very deep (>= 500 km) earthquake foci show where the downgoing lithospheric plates meet the upper boundary of the lower mantle, and where they probably cross it. This passage generates compression and elongation inside the slab. A comparison of the temporal distributions of shallow and deep seismic events in a given source zone suggests that there is no direct relationship between these two types of earthquake activity. The largest of these great deep earthquakes, the May 24, 2013 Mw 8.3 earthquake in the Sea of Okhotsk, was preceded by an earthquake swarm of 58 M >= 5 events that occurred between May 15 and 24, 2013 in the upper part of the sinking slab east of Kamchatka. The aftershock activity after the Okhotsk Sea earthquake was moderate: twelve events with magnitudes above M 4 were observed until June 27. These events delineate a fault area (2.64×10⁴ km²) similar to that of a shallow M 8.3 event. The effect of the Okhotsk Sea earthquake was felt throughout

  12. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  13. Globalization

    Directory of Open Access Journals (Sweden)

    Tulio Rosembuj

    2006-12-01

    Full Text Available There is no singular globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles, that the subjects who perform it are different, and that so are its objectives. The global is an invisible invasion of materials and immediate effects.

  14. Globalization

    OpenAIRE

    Tulio Rosembuj

    2006-01-01

    There is no singular globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles, that the subjects who perform it are different, and that so are its objectives. The global is an invisible invasion of materials and immediate effects.

  15. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
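
    The detection step is a standard short-term-average/long-term-average (STA/LTA) trigger applied to a tweet-frequency time series. The sketch below shows one minimal way such a trigger might look; the window lengths, threshold and synthetic tweet counts are assumptions, not the USGS settings.

    # Minimal STA/LTA trigger on a tweets-per-minute time series (illustrative only).
    import numpy as np

    def sta_lta_triggers(counts, sta_len=2, lta_len=60, threshold=5.0):
        """Return indices where the short-term average exceeds `threshold` times
        the preceding long-term average. `counts` is tweets per time bin."""
        counts = np.asarray(counts, dtype=float)
        triggers = []
        for i in range(lta_len, len(counts) - sta_len + 1):
            lta = counts[i - lta_len:i].mean()
            sta = counts[i:i + sta_len].mean()
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)
        return triggers

    # Example: quiet background of ~2 "earthquake" tweets/min, then a felt event.
    rng = np.random.default_rng(42)
    series = rng.poisson(2, size=180).astype(float)
    series[120:125] += [40, 80, 60, 30, 15]   # burst of tweets after shaking
    print(sta_lta_triggers(series))            # indices near minute 120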

  16. From Abstract Art to Abstracted Artists

    Directory of Open Access Journals (Sweden)

    Romi Mikulinsky

    2016-11-01

    Full Text Available What lineage connects early abstract films and machine-generated YouTube videos? Hans Richter’s famous piece Rhythmus 21 is considered to be the first abstract film in the experimental tradition. The Webdriver Torso YouTube channel is composed of hundreds of thousands of machine-generated test patterns designed to check frequency signals on YouTube. This article discusses geometric abstraction vis-à-vis new vision, conceptual art and algorithmic art. It argues that the Webdriver Torso is an artistic marvel indicative of a form we call mathematical abstraction, which is art performed by computers and, quite possibly, for computers.

  17. Globalization

    OpenAIRE

    Andrușcă Maria Carmen

    2013-01-01

    The field of globalization has highlighted an interdependence, implying a more harmonious understanding shaped by the daily interaction between nations through the promotion of peace and the pursuit of an efficient and effective global economy. For globalization to function, the developing countries, which can be helped by the developed ones, must be involved. The international community can contribute to the establishment of the development environment of the gl...

  18. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a magnitude 9.3 earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean region. The quake displaced an estimated…

  19. Programme and abstracts

    International Nuclear Information System (INIS)

    1975-01-01

    Abstracts of 25 papers presented at the congress are given. The abstracts cover various topics including radiotherapy, radiopharmaceuticals, radioimmunoassay, health physics, radiation protection and nuclear medicine

  20. Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Hearne, Mike

    2009-01-01

    We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
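
    A hedged sketch of the structure of such an empirical model: a country-specific fatality rate as a function of shaking intensity, applied to the population exposed in each intensity bin. The lognormal form follows the published approach, but the theta and beta values and the exposure table below are invented for illustration.

    # Hedged sketch of an empirical fatality estimate: intensity-dependent fatality
    # rate applied to the exposed population. Parameter values are invented.
    import math

    def fatality_rate(intensity, theta=14.0, beta=0.2):
        """Fraction of exposed population killed at a given MMI (lognormal CDF form)."""
        return 0.5 * (1.0 + math.erf(math.log(intensity / theta) / (beta * math.sqrt(2.0))))

    def expected_fatalities(exposure_by_mmi, theta=14.0, beta=0.2):
        """Sum fatality rate x exposed population over shaking intensity bins."""
        return sum(pop * fatality_rate(mmi, theta, beta)
                   for mmi, pop in exposure_by_mmi.items())

    # Hypothetical exposure (people per MMI bin) for a single scenario event.
    exposure = {6.0: 2_000_000, 7.0: 500_000, 8.0: 120_000, 9.0: 20_000}
    print(f"expected fatalities ~ {expected_fatalities(exposure):,.0f}")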

  1. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  2. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Full Text Available Abstract Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results About a third of patients reported pain, a prevalence of 34.6%. More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to the available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth weeks, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and that current drug treatments in this region may be adequate for emergency situations.

  3. Darwin's earthquake.

    Science.gov (United States)

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  4. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  5. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  6. Globalization

    DEFF Research Database (Denmark)

    Plum, Maja

    Globalization is often referred to as external to education - a state of affair facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucaultian sense. That is, as a complex of attentions, worries, ways...... of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish Pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA...

  7. Globalization

    OpenAIRE

    F. Gerard Adams

    2008-01-01

    The rapid globalization of the world economy is causing fundamental changes in patterns of trade and finance. Some economists have argued that globalization has arrived and that the world is "flat". While the geographic scope of markets has increased, the author argues that new patterns of trade and finance are a result of the discrepancies between "old" countries and "new". As the differences are gradually wiped out, particularly if knowledge and technology spread worldwide, the t...

  8. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    Devries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities η_M ~ 4.6 × 10²⁰ Pa s) but cannot reject models on the basis of transient Kelvin viscosity η_K. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
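
    The statistical machinery here is a two-sample Kolmogorov-Smirnov test between observed and model-predicted distributions. The sketch below shows the bare mechanics with synthetic samples standing in for the observed and predicted quantities; it is not the authors' data or code.

    # Hedged sketch: two-sample Kolmogorov-Smirnov test of whether a model-predicted
    # distribution is consistent with observations (synthetic data only).
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    observed = rng.normal(loc=1.0, scale=0.3, size=15)    # e.g. observed rate ratios
    predicted = rng.normal(loc=1.4, scale=0.3, size=500)  # model-predicted ratios

    statistic, p_value = stats.ks_2samp(observed, predicted)
    alpha = 0.05
    print(f"KS statistic = {statistic:.3f}, p = {p_value:.4f}")
    if p_value < alpha:
        print("Reject the model at the 5% significance level.")
    else:
        print("Cannot reject the model at the 5% significance level.")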

  9. Factors associated with inpatient mortality in a field hospital following the Haiti earthquake, January-May 2010.

    Science.gov (United States)

    Dulski, Theresa M; Basavaraju, Sridhar V; Hotz, Gillian A; Xu, Likang; Selent, Monica U; DeGennaro, Vincent A; Andrews, David; Ford, Henri; Coronado, Victor G; Ginzburg, Enrique

    2011-01-01

    To describe factors associated with inpatient mortality in a field hospital established following the 2010 Haiti earthquake. Data were abstracted from medical records of patients admitted to the University of Miami Global Institute/Project Medishare hospital. Decedents were compared to survivors in terms of age, sex, length of stay, admission ward, diagnosis, and, where relevant, injury mechanism and surgical procedure. Three multivariate logistic regression models were constructed to determine predictors of death among all patients, injured patients, and noninjured patients. During the study period, 1,339 patients were admitted to the hospital, with 100 inpatient deaths (7.5 percent). The highest proportion of deaths occurred among patients aged […]. Following earthquakes in resource-limited settings, survivors may require care in field hospitals for injuries or exacerbation of chronic medical conditions. Planning for sustained post-earthquake response should address these needs and include pediatric-specific preparation and long-term critical care requirements.

  10. A Decade of Giant Earthquakes - What does it mean?

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Terry C. Jr. [Los Alamos National Laboratory

    2012-07-16

    On December 26, 2004 the largest earthquake since 1964 occurred near Aceh, Indonesia. The magnitude 9.2 earthquake and subsequent tsunami killed a quarter of a million people; it also marked the beginning of a period of extraordinary seismicity. Since the Aceh earthquake there have been 16 magnitude 8 earthquakes globally, including 2 this last April. For the 100 years previous to 2004 there was an average of 1 magnitude 8 earthquake every 2.2 years; since 2004 there have been 2 per year. Since magnitude 8 earthquakes dominate global seismic energy release, this period of seismicity has seismologists rethinking what they understand about plate tectonics and the connectivity between giant earthquakes. This talk will explore this remarkable period of time and its possible implications.

  11. Program and abstracts

    International Nuclear Information System (INIS)

    1975-01-01

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled:Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Physics Education; SANCGASS; Astronomy; Plasma Physics; Physics in Industry; Applied and General Physics

  12. Earthquake Loss Estimation Uncertainties

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessments produced by worldwide systems operating in emergency mode following strong earthquakes. Timely and correct action just after an event can yield significant benefits in saving lives. In this context, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information can be provided, first of all, by global systems in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of the disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the overall adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking; and so on. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws that are poorly known (if not out of the reach of engineers for a large portion of the building stock); even if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. How the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in strong-event parameters determined by alert seismological surveys and of the simulation models used at all stages, from estimating shaking intensity
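
    One common way to make such uncertainties explicit is to propagate them through the loss chain with Monte Carlo sampling. The toy sketch below combines an uncertain shaking intensity, a crude fragility curve and an uncertain exposed population into a distribution of casualties; every distribution and parameter value is an illustrative assumption, not part of any operational system.

    # Hedged sketch: Monte Carlo propagation of uncertainty through a toy loss chain
    # (shaking intensity -> collapse fraction -> casualties). All numbers invented.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000

    intensity = rng.normal(7.5, 0.5, n)                   # uncertain shaking (MMI)
    exposed_pop = rng.normal(300_000, 50_000, n).clip(0)  # uncertain population at risk

    # Toy fragility: collapse fraction rises smoothly with intensity.
    collapse_frac = 1.0 / (1.0 + np.exp(-(intensity - 9.0)))

    # Uncertain lethality given collapse.
    lethality = rng.uniform(0.05, 0.25, n)

    casualties = exposed_pop * collapse_frac * lethality
    p16, p50, p84 = np.percentile(casualties, [16, 50, 84])
    print(f"casualties: median ~ {p50:,.0f} (16th-84th pct: {p16:,.0f} - {p84:,.0f})")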

  13. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    Science.gov (United States)

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  14. Globalization

    DEFF Research Database (Denmark)

    Plum, Maja

    Globalization is often referred to as external to education - a state of affair facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucaultian sense. That is, as a complex of attentions, worries, ways...... of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish Pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA....... It thus explores the systems of reason that educational comparative practices carry through time; focusing on the way configurations are reproduced and transformed, forming the pre-school child as a central curricular variable....

  15. Posttraumatic stress disorder: a serious post-earthquake complication

    OpenAIRE

    Farooqui, Mudassir; Quadri, Syed A.; Suriya, Sajid S.; Khan, Muhammad Adnan; Ovais, Muhammad; Sohail, Zohaib; Shoaib, Samra; Tohid, Hassaan; Hassan, Muhammad

    2017-01-01

    Abstract Objectives Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide reader...

  16. Compilation of Theses Abstracts

    National Research Council Canada - National Science Library

    2005-01-01

    This publication contains unclassified/unrestricted abstracts of classified or restricted theses submitted for the degrees of Doctor of Philosophy, Master of Business Administration, Master of Science...

  17. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls and class instantiations. Our teaching experience shows that many novice programmers find it difficult to write programs with abstractions that materialise to concrete objects later in the development process. The contribution of this paper is the idea of initiating a programming process by creating...

  18. Nuclear medicine. Abstracts; Nuklearmedizin 2000. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2000-07-01

    This issue of the journal contains the abstracts of the 183 conference papers as well as 266 posters presented at the conference. Subject fields covered are: Neurology, psychology, oncology, pediatrics, radiopharmacy, endocrinology, EDP, measuring equipment and methods, radiological protection, cardiology, and therapy. (orig./CB) [German original, translated] This issue of the journal contains the abstracts of the 183 papers given at the meeting as well as of the 226 posters presented, which addressed the following topics: neurology, psychiatry, oncology, pediatrics, radiopharmacy, endocrinology, data processing, measurement technology, radiation protection, cardiology, and therapy. (MG)

  19. Neighbourhood Abstraction in GROOVE

    NARCIS (Netherlands)

    Rensink, Arend; Zambon, Eduardo; De Lara, J.; Varro, D.

    2011-01-01

    Important classes of graph grammars have infinite state spaces and therefore cannot be verified with traditional model checking techniques. One way to address this problem is to perform graph abstraction, which allows us to generate a finite abstract state space that over-approximates the original

  20. Truthful Monadic Abstractions

    DEFF Research Database (Denmark)

    Brock-Nannestad, Taus; Schürmann, Carsten

    2012-01-01

    indefinitely, finding neither a proof nor a disproof of a given subgoal. In this paper we characterize a family of truth-preserving abstractions from intuitionistic first-order logic to the monadic fragment of classical first-order logic. Because they are truthful, these abstractions can be used to disprove...

  1. Check Sample Abstracts.

    Science.gov (United States)

    Alter, David; Grenache, David G; Bosler, David S; Karcher, Raymond E; Nichols, James; Rajadhyaksha, Aparna; Camelo-Piragua, Sandra; Rauch, Carol; Huddleston, Brent J; Frank, Elizabeth L; Sluss, Patrick M; Lewandrowski, Kent; Eichhorn, John H; Hall, Janet E; Rahman, Saud S; McPherson, Richard A; Kiechle, Frederick L; Hammett-Stabler, Catherine; Pierce, Kristin A; Kloehn, Erica A; Thomas, Patricia A; Walts, Ann E; Madan, Rashna; Schlesinger, Kathie; Nawgiri, Ranjana; Bhutani, Manoop; Kanber, Yonca; Abati, Andrea; Atkins, Kristen A; Farrar, Robert; Gopez, Evelyn Valencerina; Jhala, Darshana; Griffin, Sonya; Jhala, Khushboo; Jhala, Nirag; Bentz, Joel S; Emerson, Lyska; Chadwick, Barbara E; Barroeta, Julieta E; Baloch, Zubair W; Collins, Brian T; Middleton, Owen L; Davis, Gregory G; Haden-Pinneri, Kathryn; Chu, Albert Y; Keylock, Joren B; Ramoso, Robert; Thoene, Cynthia A; Stewart, Donna; Pierce, Arand; Barry, Michelle; Aljinovic, Nika; Gardner, David L; Barry, Michelle; Shields, Lisa B E; Arnold, Jack; Stewart, Donna; Martin, Erica L; Rakow, Rex J; Paddock, Christopher; Zaki, Sherif R; Prahlow, Joseph A; Stewart, Donna; Shields, Lisa B E; Rolf, Cristin M; Falzon, Andrew L; Hudacki, Rachel; Mazzella, Fermina M; Bethel, Melissa; Zarrin-Khameh, Neda; Gresik, M Vicky; Gill, Ryan; Karlon, William; Etzell, Joan; Deftos, Michael; Karlon, William J; Etzell, Joan E; Wang, Endi; Lu, Chuanyi M; Manion, Elizabeth; Rosenthal, Nancy; Wang, Endi; Lu, Chuanyi M; Tang, Patrick; Petric, Martin; Schade, Andrew E; Hall, Geraldine S; Oethinger, Margret; Hall, Geraldine; Picton, Avis R; Hoang, Linda; Imperial, Miguel Ranoa; Kibsey, Pamela; Waites, Ken; Duffy, Lynn; Hall, Geraldine S; Salangsang, Jo-Anne M; Bravo, Lulette Tricia C; Oethinger, Margaret D; Veras, Emanuela; Silva, Elvia; Vicens, Jimena; Silva, Elvio; Keylock, Joren; Hempel, James; Rushing, Elizabeth; Posligua, Lorena E; Deavers, Michael T; Nash, Jason W; Basturk, Olca; Perle, Mary Ann; Greco, Alba; Lee, Peng; Maru, Dipen; Weydert, Jamie Allen; Stevens, Todd M; Brownlee, Noel A; Kemper, April E; Williams, H James; Oliverio, Brock J; Al-Agha, Osama M; Eskue, Kyle L; Newlands, Shawn D; Eltorky, Mahmoud A; Puri, Puja K; Royer, Michael C; Rush, Walter L; Tavora, Fabio; Galvin, Jeffrey R; Franks, Teri J; Carter, James Elliot; Kahn, Andrea Graciela; Lozada Muñoz, Luis R; Houghton, Dan; Land, Kevin J; Nester, Theresa; Gildea, Jacob; Lefkowitz, Jerry; Lacount, Rachel A; Thompson, Hannis W; Refaai, Majed A; Quillen, Karen; Lopez, Ana Ortega; Goldfinger, Dennis; Muram, Talia; Thompson, Hannis

    2009-02-01

    The following abstracts are compiled from Check Sample exercises published in 2008. These peer-reviewed case studies assist laboratory professionals with continuing medical education and are developed in the areas of clinical chemistry, cytopathology, forensic pathology, hematology, microbiology, surgical pathology, and transfusion medicine. Abstracts for all exercises published in the program will appear annually in AJCP.

  2. Program and abstracts

    International Nuclear Information System (INIS)

    1976-01-01

    Abstracts of the papers given at the conference are presented. The abstracts are arranged under sessions entitled: Theoretical Physics; Nuclear Physics; Solid State Physics; Spectroscopy; Plasma Physics; Solar-Terrestrial Physics; Astrophysics and Astronomy; Radioastronomy; General Physics; Applied Physics; Industrial Physics

  3. Magnitudes and frequencies of earthquakes in relation to seismic risk

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1989-01-01

    Estimating the frequencies of occurrence of earthquakes of different magnitudes on a regional basis is an important task in estimating seismic risk at a construction site. Analysis of global earthquake data provides a statistical insight into the magnitude-frequency relationship. It turns out that, whereas a linear relationship between the logarithm of earthquake occurrence rates and the corresponding earthquake magnitudes fits well in the magnitude range between 5 and 7, a second-degree polynomial in M, the earthquake magnitude, provides a better description of the frequencies of earthquakes over a much wider range of magnitudes. It may be possible to adopt such magnitude-frequency relations for regions for which adequate earthquake data are not available, to carry out seismic risk calculations. (author). 32 refs., 8 tabs., 7 figs
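
    The two functional forms discussed here are the linear Gutenberg-Richter relation, log10 N(M) = a - bM, and a second-degree polynomial in M. The sketch below fits both to a synthetic set of annual rates; the catalogue values and coefficients are invented for illustration.

    # Hedged sketch: fitting log10(annual rate) vs magnitude with a straight line
    # (Gutenberg-Richter) and with a second-degree polynomial. Synthetic rates only.
    import numpy as np

    magnitudes = np.arange(5.0, 8.1, 0.25)
    # Synthetic annual rates with slight curvature at the large-magnitude end.
    log_rates = 4.5 - 1.0 * magnitudes - 0.05 * (magnitudes - 5.0) ** 2
    log_rates += np.random.default_rng(5).normal(scale=0.05, size=magnitudes.size)

    linear = np.polyfit(magnitudes, log_rates, deg=1)     # coefficients [-b, a]
    quadratic = np.polyfit(magnitudes, log_rates, deg=2)  # coefficients [c2, c1, c0]

    def rmse(coeffs):
        return float(np.sqrt(np.mean((np.polyval(coeffs, magnitudes) - log_rates) ** 2)))

    print(f"linear    a={linear[1]:.2f} b={-linear[0]:.2f} rmse={rmse(linear):.3f}")
    print(f"quadratic coeffs={np.round(quadratic, 3)} rmse={rmse(quadratic):.3f}")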

  4. Completeness of Lyapunov Abstraction

    DEFF Research Database (Denmark)

    Wisniewski, Rafal; Sloth, Christoffer

    2013-01-01

    This paper addresses the generation of complete abstractions of polynomial dynamical systems by timed automata. For the proposed abstraction, the state space is divided into cells by sublevel sets of functions. We identify a relation between these functions and their directional derivatives along the vector field, which allows the generation of a complete abstraction. To compute the functions that define the subdivision of the state space in an algorithm, we formulate a sum of squares optimization problem. This optimization problem finds the best subdivisioning functions, with respect to the ability...

  5. Science meeting. Abstracts

    International Nuclear Information System (INIS)

    2000-01-01

    The document is a collection of the science meeting abstracts in the fields of nuclear physics, medical sciences, chemistry, agriculture, environment, engineering, material sciences and different aspects of energy, and presents research done in 2000 in these fields

  6. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    To most people the concept of abstract machines is connected to the name of Alan Turing and the development of the modern computer. The Turing machine is universal, axiomatic and symbolic (e.g. operating on symbols). Inspired by Foucault, Deleuze and Guattari extended the concept of abstract...... machines to singular, non-axiomatic and diagrammatic machines. That is: machines which constitute becomings. This presentation gives a survey of the development of the concept of abstract machines in the philosophy of Deleuze and Guattari and the function of these abstract machines in the creation of works...... of art. From Difference and Repetition to Anti-Oedipus, the machines are conceived as binary machines based on the exclusive or inclusive use respectively of the three syntheses: conexa, disjuncta and conjuncta. The machines have a twofold embedment: in the desiring-production and in the social...

  7. Mathematical games, abstract games

    CERN Document Server

    Neto, Joao Pedro

    2013-01-01

    User-friendly, visually appealing collection offers both new and classic strategic board games. Includes abstract games for two and three players and mathematical games such as Nim and games on graphs.

  8. Introduction to abstract algebra

    CERN Document Server

    Smith, Jonathan D H

    2008-01-01

    Taking a slightly different approach from similar texts, Introduction to Abstract Algebra presents abstract algebra as the main tool underlying discrete mathematics and the digital world. It helps students fully understand groups, rings, semigroups, and monoids by rigorously building concepts from first principles. A Quick Introduction to Algebra The first three chapters of the book show how functional composition, cycle notation for permutations, and matrix notation for linear functions provide techniques for practical computation. The author also uses equivalence relations to introduc

  9. Abstract Storage Devices

    OpenAIRE

    Koenig, Robert; Maurer, Ueli; Tessaro, Stefano

    2007-01-01

    A quantum storage device differs radically from a conventional physical storage device. Its state can be set to any value in a certain (infinite) state space, but in general every possible read operation yields only partial information about the stored state. The purpose of this paper is to initiate the study of a combinatorial abstraction, called abstract storage device (ASD), which models deterministic storage devices with the property that only partial information about the state can be re...

  10. Abstracts of contributed papers

    Energy Technology Data Exchange (ETDEWEB)

    1994-08-01

    This volume contains 571 abstracts of contributed papers to be presented during the Twelfth US National Congress of Applied Mechanics. Abstracts are arranged in the order in which they fall in the program -- the main sessions are listed chronologically in the Table of Contents. The Author Index is in alphabetical order and lists each paper number (matching the schedule in the Final Program) with its corresponding page number in the book.

  11. Earthquake friction

    Science.gov (United States)

    Mulargia, Francesco; Bizzarri, Andrea

    2016-12-01

    Laboratory friction slip experiments on rocks provide firm evidence that the static friction coefficient μ has values ∼0.7. This would imply large amounts of heat produced by seismically active faults, but no heat flow anomaly is observed, and mineralogic evidence of frictional heating is virtually absent. This argues for lower μ values of ∼0.2, as also required by the observed orientation of faults with respect to the maximum compressive stress. We show that accounting for the thermal and mechanical energy balance of the system removes this inconsistency, implying a multi-stage strain release process. The first stage consists of a small and slow aseismic slip at high friction on pre-existing stress concentrators within the fault volume, angled with respect to the main fault like Riedel cracks. This introduces a second stage dominated by a frictional temperature increase inducing local pressurization of pore fluids around the slip patches, which is in turn followed by a third stage in which thermal diffusion extends the frictionally heated zones, making them coalesce into a connected pressurized region oriented as the fault plane. The system then enters a state of equivalent low static friction in which it can undergo the fast elastic-radiation slip prescribed by dislocation earthquake models.
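
    The heat-flow argument can be made concrete with an order-of-magnitude estimate: the frictional heat released per unit fault area in one event scales as Q ≈ μ·σn·d (friction coefficient × effective normal stress × slip). The sketch below compares μ = 0.7 with μ = 0.2 under assumed mid-crustal values; the normal stress and slip are illustrative assumptions.

    # Hedged order-of-magnitude sketch: frictional heat per unit fault area,
    # Q = mu * sigma_n * d, for mu ~ 0.7 versus mu ~ 0.2. All inputs are assumptions.
    sigma_n = 100e6   # effective normal stress at mid-crustal depth, Pa (assumed)
    slip = 2.0        # coseismic slip in a large event, m (assumed)

    for mu in (0.7, 0.2):
        q = mu * sigma_n * slip    # joules per square metre of fault
        print(f"mu = {mu:.1f}: Q ~ {q:.1e} J/m^2")
    # The several-fold difference in heat produced per event is what makes the absence
    # of a measurable heat-flow anomaly an argument for the lower value of mu.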

  12. Relationship between Lineament Density Extraction from Satellite Image and Earthquake Distribution of Taungtonelone Area, Myanmar

    OpenAIRE

    MYINT, Soe; WON-IN, KRIT; TAKASHIMA, Isao; CHARUSIRI, Punya

    2007-01-01

    [ABSTRACT] We studied the relationship between lineament density extracted from satellite images and earthquake distribution using remote sensing applications. The results of this study are aimed at setting up a complete earthquake hazard map. The selected area is located in the Taungtonelone area, northern Myanmar. Myanmar is an earthquake-prone country. It lies in a major earthquake zone of the world called the Mediterranean-Himalayan belt. As the major urban areas in Myanmar lie in earthquake-prone zones, ea...

  13. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community because of their vast number and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake-table tests to show that these accelerometers are also suitable for recording the strong shaking caused by earthquakes. We developed an Android app - MyShake - which can distinguish earthquake motion from daily human activities in the recordings made by the accelerometers in personal smartphones and upload trigger information/waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications, such as earthquake early warning. In this talk I will lay out the method we used to recognize earthquake-like movement from a single smartphone, and give an overview of the whole system that harnesses information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
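
    Distinguishing earthquake shaking from everyday handling of a phone is essentially a classification problem on short windows of acceleration. The sketch below computes two simple screening features, peak amplitude and dominant frequency, from one window; the feature choice, thresholds and sampling rate are illustrative assumptions, not the MyShake classifier.

    # Hedged sketch: simple screening features for "earthquake-like" motion from a
    # window of accelerometer samples. Not the MyShake algorithm; thresholds invented.
    import numpy as np

    def screening_features(acc, sample_rate_hz=25.0):
        """Return (peak amplitude in g, dominant frequency in Hz) of a 1-D window."""
        acc = np.asarray(acc, dtype=float)
        acc = acc - acc.mean()                        # remove gravity/DC offset
        peak = np.abs(acc).max()
        spectrum = np.abs(np.fft.rfft(acc))
        freqs = np.fft.rfftfreq(acc.size, d=1.0 / sample_rate_hz)
        dominant = freqs[spectrum[1:].argmax() + 1]   # skip the zero-frequency bin
        return peak, dominant

    def looks_like_earthquake(acc, sample_rate_hz=25.0):
        """Crude screen: strong but low-frequency shaking (thresholds are assumptions)."""
        peak, dominant = screening_features(acc, sample_rate_hz)
        return peak > 0.02 and dominant < 5.0

    # Example: a 4-second window of synthetic 2 Hz shaking sampled at 25 Hz.
    t = np.arange(0, 4, 1 / 25.0)
    window = 0.05 * np.sin(2 * np.pi * 2.0 * t)
    print(looks_like_earthquake(window))   # True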

  14. Losses Associated with Secondary Effects in Earthquakes

    Directory of Open Access Journals (Sweden)

    James E. Daniell

    2017-06-01

    Full Text Available The number of earthquakes with high damage and high losses has been limited to around 100 events since 1900. Looking at historical losses from 1900 onward, we see that around 100 key earthquakes (around 1% of damaging earthquakes) have caused around 93% of fatalities globally. What is interesting about this statistic is that within these events, secondary effects have played a major role, causing around 40% of economic losses and fatalities as compared to shaking effects. Disaggregation of secondary-effect economic losses and fatalities, demonstrating the relative influence of historical losses from direct earthquake shaking in comparison to tsunami, fire, landslides, liquefaction, fault rupture, and other types of losses, is important if we are to understand the key causes of post-earthquake losses. The trends and major event impacts of secondary effects are explored in terms of their historic impact as well as improved ways to disaggregate them, through two case studies: the Tohoku 2011 event, covering the earthquake, tsunami, liquefaction, fire, and the nuclear impact; and the Chilean 1960 earthquake and tsunami event.

  15. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  16. Metacognition and abstract reasoning.

    Science.gov (United States)

    Markovits, Henry; Thompson, Valerie A; Brisson, Janie

    2015-05-01

    The nature of people's meta-representations of deductive reasoning is critical to understanding how people control their own reasoning processes. We conducted two studies to examine whether people have a metacognitive representation of abstract validity and whether familiarity alone acts as a separate metacognitive cue. In Study 1, participants were asked to make a series of (1) abstract conditional inferences, (2) concrete conditional inferences with premises having many potential alternative antecedents and thus specifically conducive to the production of responses consistent with conditional logic, or (3) concrete problems with premises having relatively few potential alternative antecedents. Participants gave confidence ratings after each inference. Results show that confidence ratings were positively correlated with logical performance on abstract problems and concrete problems with many potential alternatives, but not with concrete problems with content less conducive to normative responses. Confidence ratings were higher with few alternatives than for abstract content. Study 2 used a generation of contrary-to-fact alternatives task to improve levels of abstract logical performance. The resulting increase in logical performance was mirrored by increases in mean confidence ratings. Results provide evidence for a metacognitive representation based on logical validity, and show that familiarity acts as a separate metacognitive cue.

  17. The Bhuj Earthquake, Gujarat, India, 2001

    Indian Academy of Sciences (India)

    Pre-seismic, co-seismic and post-seismic displacements associated with the Bhuj 2001 earthquake derived from recent and historic geodetic data · Sridevi Jade, M Mukul, I A Parvez, M B Ananda, P D Kumar, V K Gaur, R Bendick, R Bilham, F Blume, K Wallace, I A Abbasi, M Asif Khan, S Ulhadi

  18. Identification of resonant earthquake ground motion

    Indian Academy of Sciences (India)

    Abstract. Resonant ground motion has been observed in earthquake records measured at several parts of the world. This class of ground motion is characterized by its energy being contained in a narrow frequency band. This paper develops measures to quantify the frequency content of the ground motion using the.

  19. Flexure of the Indian plate and intraplate earthquakes

    Indian Academy of Sciences (India)

    1993-09-30

    Sep 30, 1993 ... by increased urban populations and by recent changes in urban construction methods (Bilham 1988). Yet to some observers the recent increase in fatal earthquakes in India appears to exceed the global average. This recent increase in fatalities from earthquakes is almost entirely attributable to intraplate ...

  20. The 2007 Bengkulu earthquake, its rupture model and implications ...

    Indian Academy of Sciences (India)

    The 12 September 2007 great Bengkulu earthquake (Mw 8.4) occurred on the west coast of Sumatra about 130 km SW of Bengkulu. The earthquake was followed by two strong aftershocks of Mw 7.9 and 7.0. We estimate coseismic offsets due to the mainshock, derived from near-field Global Positioning System (GPS) ...

  1. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  2. Abstract Objects of Verbs

    DEFF Research Database (Denmark)

    Robering, Klaus

    2014-01-01

    Verbs do often take arguments of quite different types. In an orthodox type-theoretic framework this results in an extreme polysemy of many verbs. In this article, it is shown that this unwanted consequence can be avoided when a theory of "abstract objects" is adopted according to which these obj...

  3. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  4. Operational earthquake forecasting can enhance earthquake preparedness

    Science.gov (United States)

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  5. Monadic abstract interpreters

    DEFF Research Database (Denmark)

    Sergey, Ilya; Devriese, Dominique; Might, Matthew

    2013-01-01

    -insensitive analysis. To achieve this unification, we develop a systematic method for transforming a concrete semantics into a monadically-parameterized abstract machine. Changing the monad changes the behavior of the machine. By changing the monad, we recover a spectrum of machines—from the original concrete...

  6. WWNPQFT-2013 - Abstracts

    International Nuclear Information System (INIS)

    Cessac, B.; Bianchi, E.; Bellon, M.; Fried, H.; Krajewski, T.; Schubert, C.; Barre, J.; Hofmann, R.; Muller, B.; Raffaelli, B.

    2014-01-01

    The object of this Workshop is to consolidate and publicize new efforts in non-perturbative-like Field Theories, relying on Functional Methods, the Renormalization Group, and Dyson-Schwinger Equations. One presentation deals with effective vertices and photon-photon scattering in SU(2) Yang-Mills thermodynamics. This document gathers the abstracts of the presentations

  7. 2002 NASPSA Conference Abstracts.

    Science.gov (United States)

    Journal of Sport & Exercise Psychology, 2002

    2002-01-01

    Contains abstracts from the 2002 conference of the North American Society for the Psychology of Sport and Physical Activity. The publication is divided into three sections: the preconference workshop, "Effective Teaching Methods in the Classroom;" symposia (motor development, motor learning and control, and sport psychology); and free…

  8. The Abstraction Engine

    DEFF Research Database (Denmark)

    Fortescue, Michael David

    The main thesis of this book is that abstraction, far from being confined to higher forms of cognition, language and logical reasoning, has actually been a major driving force throughout the evolution of creatures with brains. It is manifest in emotive as well as rational thought. Wending its way th...

  9. Composing Interfering Abstract Protocols

    Science.gov (United States)

    2016-04-01

    Tecnologia , Universidade Nova de Lisboa, Caparica, Portugal. This document is a companion technical report of the paper, “Composing Interfering Abstract...a Ciência e Tecnologia (Portuguese Foundation for Science and Technology) through the Carnegie Mellon Portugal Program under grant SFRH / BD / 33765

  10. Abstract Film and Beyond.

    Science.gov (United States)

    Le Grice, Malcolm

    A theoretical and historical account of the main preoccupations of makers of abstract films is presented in this book. The book's scope includes discussion of nonrepresentational forms as well as examination of experiments in the manipulation of time in films. The ten chapters discuss the following topics: art and cinematography, the first…

  11. Seismic Consequence Abstraction

    International Nuclear Information System (INIS)

    Gross, M.

    2004-01-01

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274])

  12. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  13. Abstract Objects of Verbs

    DEFF Research Database (Denmark)

    2014-01-01

    Verbs do often take arguments of quite different types. In an orthodox type-theoretic framework this results in an extreme polysemy of many verbs. In this article, it is shown that this unwanted consequence can be avoided when a theory of "abstract objects" is adopted according to which these obj...

  14. Abstracts of submitted papers

    International Nuclear Information System (INIS)

    1987-01-01

    The conference proceedings contain 152 abstracts of presented papers relating to various aspects of personnel dosimetry, the dosimetry of the working and living environment, various types of dosemeters and spectrometers, the use of radionuclides in various industrial fields, the migration of radionuclides on Czechoslovak territory after the Chernobyl accident, theoretical studies of some parameters of ionizing radiation detectors, and their calibration. (M.D.)

  15. Metaphors in Abstract Thought

    NARCIS (Netherlands)

    I. Boot (Inge)

    2010-01-01

    The aim of the dissertation was to investigate the Conceptual Metaphor Theory (CMT, Lakoff & Johnson, 1980, 1999). The CMT proposes that abstract concepts are partly structured by concrete concepts through the mechanism of metaphorical mapping. In Chapter 2 we wanted to investigate the

  16. SPR 2015. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-04-01

    The volume contains the abstracts of the SPR (Society for Pediatric Radiology) 2015 meeting covering the following issues: fetal imaging, musculoskeletal imaging, cardiac imaging, chest imaging, oncologic imaging, tools for process improvement, child abuse, contrast enhanced ultrasound, image gently - update of radiation dose recording/reporting/monitoring - meaningful or useless meaning?, pediatric thoracic imaging, ALARA.

  17. Reflective Abstraction and Representation.

    Science.gov (United States)

    Lewin, Philip

    Piaget's theory of reflective abstraction can supplement cognitive science models of representation by specifying both the act of construction and the component steps through which knowers pass as they acquire knowledge. But, while approaches suggested by cognitive science supplement Piaget by awakening researchers to the role of auxiliary factors…

  18. Building Safe Concurrency Abstractions

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    2014-01-01

    Concurrent object-oriented programming in Beta is based on semaphores and coroutines and the ability to define high-level concurrency abstractions like monitors and rendezvous-based communication, and their associated schedulers. The coroutine mechanism of SIMULA has been generalized into the no...

  19. Poster Session- Extended Abstracts

    Science.gov (United States)

    Jack D. Alexander III; Jean Findley; Brenda K. Kury; Jan L. Beyers; Douglas S. Cram; Terrell T. Baker; Jon C. Boren; Carl Edminster; Sue A. Ferguson; Steven McKay; David Nagel; Trent Piepho; Miriam Rorig; Casey Anderson; Jeanne Hoadley; Paulette L. Ford; Mark C. Andersen; Ed L. Fredrickson; Joe Truett; Gary W. Roemer; Brenda K. Kury; Jennifer Vollmer; Christine L. May; Danny C. Lee; James P. Menakis; Robert E. Keane; Zhi-Liang Zhu; Carol Miller; Brett Davis; Katharine Gray; Ken Mix; William P. Kuvlesky Jr.; D. Lynn Drawe; Marcia G. Narog; Roger D. Ottmar; Robert E. Vihnanek; Clinton S. Wright; Timothy E. Paysen; Burton K. Pendleton; Rosemary L. Pendleton; Carleton S. White; John Rogan; Doug Stow; Janet Franklin; Jennifer Miller; Lisa Levien; Chris Fischer; Emma Underwood; Robert Klinger; Peggy Moore; Clinton S. Wright

    2008-01-01

    Titles found within Poster Session-Extended Abstracts include: Assessment of emergency fire rehabilitation of four fires from the 2000 fire season on the Vale, Oregon, BLM district: review of the density sampling materials and methods: p. 329 Growth of regreen, seeded for erosion control, in the...

  20. Abstract Introduction Materials & Methods

    African Journals Online (AJOL)

    plzfg

    Abstract. Oral administration to male rats of 200mg kg-1 body weight of an extract of Calendula officinalis flowers every day for 60 days did not cause loss of body weight, but decreased significantly the weight of the testis, epididymis, seminal vesicle and ventral prostate. Sperm motility as well as sperm density were reduced ...

  1. Testing abstract behavioral specifications

    NARCIS (Netherlands)

    P.Y.H. Wong; R. Bubel (Richard); F.S. de Boer (Frank); C.P.T. de Gouw (Stijn); M. Gómez-Zamalloa; R Haehnle; K. Meinke; M.A. Sindhu

    2015-01-01

    We present a range of testing techniques for the Abstract Behavioral Specification (ABS) language and apply them to an industrial case study. ABS is a formal modeling language for highly variable, concurrent, component-based systems. The nature of these systems makes them susceptible to

  2. Impredicative concurrent abstract predicates

    DEFF Research Database (Denmark)

    Svendsen, Kasper; Birkedal, Lars

    2014-01-01

    We present impredicative concurrent abstract predicates (iCAP), a program logic for modular reasoning about concurrent, higher-order, reentrant, imperative code. Building on earlier work, iCAP uses protocols to reason about shared mutable state. A key novel feature of iCAP is the ability to define...

  3. Leadership Abstracts, 2002.

    Science.gov (United States)

    Wilson, Cynthia, Ed.; Milliron, Mark David, Ed.

    2002-01-01

    This 2002 volume of Leadership Abstracts contains issue numbers 1-12. Articles include: (1) "Skills Certification and Workforce Development: Partnering with Industry and Ourselves," by Jeffrey A. Cantor; (2) "Starting Again: The Brookhaven Success College," by Alice W. Villadsen; (3) "From Digital Divide to Digital Democracy," by Gerardo E. de los…

  4. Circularity and Lambda Abstraction

    DEFF Research Database (Denmark)

    Danvy, Olivier; Thiemann, Peter; Zerny, Ian

    2013-01-01

    unknowns from what is done to them, which we lambda-abstract with functions. The circular unknowns then become dead variables, which we eliminate. The result is a strict circular program a la Pettorossi. This transformation is reversible: given a strict circular program a la Pettorossi, we introduce...

  5. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    Schreiner, R.

    2001-01-01

    The purpose of this work is to develop the Engineered Barrier System (EBS) radionuclide transport abstraction model, as directed by a written development plan (CRWMS M&O 1999a). This abstraction is the conceptual model that will be used to determine the rate of release of radionuclides from the EBS to the unsaturated zone (UZ) in the total system performance assessment-license application (TSPA-LA). In particular, this model will be used to quantify the time-dependent radionuclide releases from a failed waste package (WP) and their subsequent transport through the EBS to the emplacement drift wall/UZ interface. The development of this conceptual model will allow Performance Assessment Operations (PAO) and its Engineered Barrier Performance Department to provide a more detailed and complete EBS flow and transport abstraction. The results from this conceptual model will allow PAO to address portions of the key technical issues (KTIs) presented in three NRC Issue Resolution Status Reports (IRSRs): (1) the Evolution of the Near-Field Environment (ENFE), Revision 2 (NRC 1999a), (2) the Container Life and Source Term (CLST), Revision 2 (NRC 1999b), and (3) the Thermal Effects on Flow (TEF), Revision 1 (NRC 1998). The conceptual model for flow and transport in the EBS will be referred to as the ''EBS RT Abstraction'' in this analysis/modeling report (AMR). The scope of this abstraction and report is limited to flow and transport processes. More specifically, this AMR does not discuss elements of the TSPA-SR and TSPA-LA that relate to the EBS but are discussed in other AMRs. These elements include corrosion processes, radionuclide solubility limits, waste form dissolution rates and concentrations of colloidal particles that are generally represented as boundary conditions or input parameters for the EBS RT Abstraction. In effect, this AMR provides the algorithms for transporting radionuclides using the flow geometry and radionuclide concentrations determined by other

  6. Earthquake swarms in South America

    Science.gov (United States)

    Holtkamp, S. G.; Pritchard, M. E.; Lohman, R. B.

    2011-10-01

    We searched for earthquake swarms in South America between 1973 and 2009 using the global Preliminary Determination of Epicenters (PDE) catalogue. Seismicity rates vary greatly over the South American continent, so we employ a manual search approach that aims to be insensitive to spatial and temporal scales or to the number of earthquakes in a potential swarm. We identify 29 possible swarms involving 5-180 earthquakes each (with total swarm moment magnitudes between 4.7 and 6.9) within a range of tectonic and volcanic locations. Some of the earthquake swarms on the subduction megathrust occur as foreshocks and delineate the limits of main shock rupture propagation for large earthquakes, including the 2010 Mw 8.8 Maule, Chile and 2007 Mw 8.1 Pisco, Peru earthquakes. Also, subduction megathrust swarms commonly occur at the location of subduction of aseismic ridges, including areas of long-standing seismic gaps in Peru and Ecuador. The magnitude-frequency relationship of swarms we observe appears to agree with previously determined magnitude-frequency scaling for swarms in Japan. We examine geodetic data covering five of the swarms to search for an aseismic component. Only two of these swarms (at Copiapó, Chile, in 2006 and near Ticsani Volcano, Peru, in 2005) have suitable satellite-based Interferometric Synthetic Aperture Radar (InSAR) observations. We invert the InSAR geodetic signal and find that the ground deformation associated with these swarms does not require a significant component of aseismic fault slip or magmatic intrusion. Three swarms in the vicinity of the volcanic arc in southern Peru appear to be triggered by the Mw= 8.5 2001 Peru earthquake, but predicted static Coulomb stress changes due to the main shock were very small at the swarm locations, suggesting that dynamic triggering processes may have had a role in their occurrence. Although we identified few swarms in volcanic regions, we suggest that particularly large volcanic swarms (those that
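
    A hedged note on the swarm magnitudes quoted above: a combined swarm moment magnitude can be obtained by summing the seismic moments of the constituent events and converting back with the standard Hanks-Kanamori relation. The sketch below illustrates this under that assumption; the function name and example magnitudes are ours, not the authors'.

      import math

      def swarm_moment_magnitude(event_magnitudes):
          """Total swarm moment magnitude from individual event magnitudes.
          Uses the Hanks-Kanamori relation with seismic moment in N*m:
          M0 = 10**(1.5*Mw + 9.1), Mw = (2/3)*(log10(M0) - 9.1)."""
          total_moment = sum(10 ** (1.5 * mw + 9.1) for mw in event_magnitudes)
          return (2.0 / 3.0) * (math.log10(total_moment) - 9.1)

      # Illustrative only: five hypothetical swarm events
      print(round(swarm_moment_magnitude([4.8, 5.1, 5.0, 4.6, 5.3]), 2))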

  7. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings of precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  8. Earthquake cycle deformation and the Moho: Implications for the rheology of continental lithosphere

    OpenAIRE

    Wright, TJ; Elliott, JR; Wang, H; Ryder, I

    2013-01-01

    The last 20 years have seen a dramatic improvement in the quantity and quality of geodetic measurements of the earthquake loading cycle. In this paper we compile and review these observations and test whether crustal thickness exerts any control. We found 78 earthquake source mechanisms for continental earthquakes derived from satellite geodesy, 187 estimates of interseismic "locking depth", and 23 earthquakes (or sequences) for which postseismic deformation has been observed. Globally we est...

  9. DEGRO 2017. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2017-06-15

    The volume includes abstracts of the Annual DEGRO Meeting 2017 covering lectures and poster sessions with the following issues: lymphoma, biology, physics, radioimmunotherapy, sarcomas and rare tumors, prostate carcinoma, lung tumors, benign lesions and new media, mamma carcinoma, gastrointestinal tumors, quality of life, care science and quality assurance, high-technology methods and palliative situation, head-and-neck tumors, brain tumors, central nervous system metastases, guidelines, radiation sensitivity, radiotherapy, radioimmunotherapy.

  10. The deleuzian abstract machines

    DEFF Research Database (Denmark)

    Werner Petersen, Erik

    2005-01-01

    in emphasis from the three syntheses to mappings and rhizomatic diagrams that cut across semiotics or “blow apart regimes of signs”. The aim here is absolute deterritorialization. Deleuze has shown how abstract machines operate in the philosophy of Foucault, the literature of Proust and Kafka, and the painting of Bacon. We will finish our presentation by showing how these machines apply to architecture...

  11. SPR 2014. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-05-15

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: Body imaging techniques: practical advice for clinic work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  12. SPR 2014. Abstracts

    International Nuclear Information System (INIS)

    2014-01-01

    The proceedings of the SPR 2014 meeting include abstracts on the following topics: Body imaging techniques: practical advice for clinic work; thoracic imaging: focus on the lungs; gastrointestinal imaging: focus on the pancreas and bowel; genitourinary imaging: focus on gonadal radiology; musculoskeletal imaging: focus on oncology; child abuse and non-child abuse: focus on radiography; impact of NMR and CT imaging on management of CHD; education and communication: art and practice in pediatric radiology.

  13. WWNPQFT-2011 - Abstracts

    International Nuclear Information System (INIS)

    Bianchi, E.; Bender, C.; Culetu, H.; Fried, H.; Grossmann, A.; Hofmann, R.; Le Bellac, M.; Martinetti, P.; Muller, B.; Patras, F.; Raffaeli, B.; Vitting Andersen, J.

    2013-01-01

    The object of this workshop is to consolidate and publicize new efforts in non-perturbative field theories. This year the presentations deal with quantum gravity, non-commutative geometry, fat-tailed wave-functions, strongly coupled field theories, space-times with two time-like dimensions, and multiplicative renormalization. A presentation is dedicated to the construction of a nucleon-nucleon potential from an analytical, non-perturbative gauge invariant QCD. This document gathers the abstracts of the presentations

  14. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    J. Prouty

    2006-01-01

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers advective transport and diffusive transport
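
    The flow model described above is essentially a sequential flux-splitting calculation: seepage is partitioned at the drip shield and again at the waste package. A minimal sketch of that bookkeeping follows; the fraction parameters are placeholders for the report's experimentally derived flux-splitting algorithms, not values taken from the document.

      def split_ebs_flux(seepage_flux, drip_shield_breach_fraction, waste_package_breach_fraction):
          """Partition the seepage flux at the drip shield and the waste package.
          Breach fractions (0 = fully diverting, 1 = fully breached) are
          illustrative stand-ins for the report's flux-splitting submodels."""
          through_drip_shield = seepage_flux * drip_shield_breach_fraction
          diverted_by_drip_shield = seepage_flux - through_drip_shield
          through_waste_package = through_drip_shield * waste_package_breach_fraction
          diverted_by_waste_package = through_drip_shield - through_waste_package
          return {
              "diverted_by_drip_shield": diverted_by_drip_shield,
              "diverted_by_waste_package": diverted_by_waste_package,
              "reaches_waste_form": through_waste_package,
          }

      # Example: 10 units of seepage, 20% passes the drip shield, half of that passes the waste package
      print(split_ebs_flux(10.0, 0.2, 0.5))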

  15. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J. Prouty

    2006-07-14

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment (TSPA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers advective transport and diffusive transport

  16. Istanbul Earthquake Early Warning System

    Science.gov (United States)

    Alcik, H.; Mert, A.; Ozel, O.; Erdik, M.

    2007-12-01

    More complex algorithms based on artificial neural networks (ANN) can also be used [Boese et al., 2003]. The ANN approach treats earthquake early warning as a pattern recognition task. The seismic patterns can be defined by the shape and frequency content of the parts of the accelerograms that are available at each time step. An ANN can extract the engineering parameters PGA, CAV and instrumental intensity from these patterns, and map them to any location in the surrounding area. Boese M., Erdik, M., Wenzel, F. (2003), Artificial Neural Networks for Earthquake Early Warning, Proceedings AGU2003 Abstracts, S42B-0155
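
    The engineering parameters named above (PGA and CAV) have standard definitions: PGA is the peak absolute acceleration, and CAV is the time integral of the absolute acceleration. The sketch below computes both from a sampled trace; it illustrates those definitions only and is not the Istanbul system's actual processing chain (the synthetic trace and sampling interval are invented for the example).

      import numpy as np

      def pga_and_cav(acceleration, dt):
          """Peak ground acceleration and cumulative absolute velocity of a
          uniformly sampled acceleration trace (dt in seconds; CAV uses a
          simple rectangle-rule integration of |a(t)|)."""
          a = np.abs(np.asarray(acceleration, dtype=float))
          return float(a.max()), float(a.sum() * dt)

      # Illustrative 1-second synthetic trace sampled at 100 Hz
      t = np.arange(0.0, 1.0, 0.01)
      trace = 0.5 * np.sin(2.0 * np.pi * 2.0 * t)
      print(pga_and_cav(trace, dt=0.01))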

  17. Responses to the 2011 Earthquake on Facebook

    DEFF Research Database (Denmark)

    Hansen, Annette Skovsted

    In my investigation of how Japanese ODA policies and practices have engendered global networks, I have frequented the Association of Overseas Technical Scholarships (AOTS) Facebook group. In the wake of the earthquake on March 11, 2011, many greetings came in from alumni who have within the last...

  18. Program and abstracts

    International Nuclear Information System (INIS)

    1978-01-01

    This volume contains the program and abstracts of the conference. The following topics are included: metal vapor molecular lasers, magnetohydrodynamics, rare gas halide and nuclear pumped lasers, transfer mechanisms in arcs, kinetic processes in rare gas halide lasers, arcs and flows, XeF kinetics and lasers, fundamental processes in excimer lasers, electrode effects and vacuum arcs, electron and ion transport, ion interactions and mobilities, glow discharges, diagnostics and afterglows, dissociative recombination, electron ionization and excitation, rare gas excimers and group VI lasers, breakdown, novel laser pumping techniques, electrode-related discharge phenomena, photon interactions, attachment, plasma chemistry and infrared lasers, electron scattering, and reactions of excited species

  19. IPR 2016. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-05-15

    The volume on the meeting of pediatric radiology includes abstracts on the following issues: chest, cardiovascular system, neuroradiology, CT radiation DRLs (diagnostic reference levels) and dose reporting guidelines, genitourinary imaging, gastrointestinal radiology, oncology and nuclear medicine, whole body imaging, fetal/neonatal imaging, child abuse, oncology and hybrid imaging, value added imaging, musculoskeletal imaging, dose and radiation safety, imaging children - immobilization and distraction techniques, information - education - QI and healthcare policy, ALARA, the knowledge, skills and competences for a technologist/radiographer in pediatric radiology, full exploitation of new technological features in pediatric CT, image quality issues in pediatrics, abdominal imaging, interventional radiology, MR contrast agents, tumor - mass imaging, cardiothoracic imaging, ultrasonography.

  20. SPR 2017. Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2017-05-15

    The conference proceedings SPR 2017 include abstracts on the following issues: gastrointestinal radiography - inflammatory bowel diseases, cardiovascular CTA, general musculoskeletal radiology, musculoskeletal congenital development diseases, general pediatric radiology - chest, musculoskeletal imaging - marrow and infectious disorders, state-of-the-art body MR imaging, practical pediatric sonography, quality and professionalism, CT imaging in congenital heart diseases, radiographic courses, body MR techniques, contrast enhanced ultrasound, machine learning, forensic imaging, the radiation dose conundrum - reconciling imaging, imagining and managing, the practice of radiology, interventional radiology, neuroradiology, PET/MR.

  1. Beyond the abstractions?

    DEFF Research Database (Denmark)

    Olesen, Henning Salling

    2006-01-01

    The anniversary of the International Journal of Lifelong Education takes place in the middle of a conceptual landslide from lifelong education to lifelong learning. Contemporary discourses of lifelong learning, etc., are however abstractions behind which new functions and agendas for adult education... are set. The ideological discourse of recent policies seems to neglect the fact that history and resources for lifelong learning are different across Europe, and also neglects the multiplicity of adult learners. Instead of refusing the new agendas, however, adult education research should try to dissolve... learning. Adult education research must fulfil its potential conversion from normative philosophy to critical and empirical social science...

  2. Parameterized Dataflow (Extended Abstract)

    Directory of Open Access Journals (Sweden)

    Dominic Duggan

    2016-10-01

    Dataflow networks have application in various forms of stream processing, for example for parallel processing of multimedia data. The description of dataflow graphs, including their firing behavior, is typically non-compositional and not amenable to separate compilation. This article considers a dataflow language with a type and effect system that captures the firing behavior of actors. This system allows definitions to abstract over actor firing rates, supporting the definition and safe composition of actor definitions where firing rates are not instantiated until a dataflow graph is launched.
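
    As a rough, language-shifted illustration of the idea (the article itself works in a typed dataflow calculus, not Python), an actor description can leave its per-firing token rates symbolic and only instantiate and check them when the graph is launched. The class, function and parameter names below are our own and the rate check is a much-simplified stand-in for the paper's type-and-effect system.

      class Actor:
          """Dataflow actor whose per-firing token rates may be symbolic
          (named parameters) until the graph is launched."""
          def __init__(self, name, consumes, produces):
              self.name, self.consumes, self.produces = name, consumes, produces

          def instantiate(self, bindings):
              resolve = lambda rate: bindings[rate] if isinstance(rate, str) else rate
              return Actor(self.name, resolve(self.consumes), resolve(self.produces))

      def check_edge(producer, consumer):
          """Require matching production and consumption rates on an edge."""
          if producer.produces != consumer.consumes:
              raise TypeError(f"rate mismatch on {producer.name} -> {consumer.name}")

      # 'n' stays abstract in the definitions and is only fixed at launch time
      src = Actor("source", consumes=0, produces="n")
      snk = Actor("sink", consumes="n", produces=0)
      check_edge(src.instantiate({"n": 4}), snk.instantiate({"n": 4}))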

  3. ABSTRACTION OF DRIFT SEEPAGE

    International Nuclear Information System (INIS)

    Wilson, Michael L.

    2001-01-01

    Drift seepage refers to flow of liquid water into repository emplacement drifts, where it can potentially contribute to degradation of the engineered systems and release and transport of radionuclides within the drifts. Because of these important effects, seepage into emplacement drifts is listed as a ''principal factor for the postclosure safety case'' in the screening criteria for grading of data in Attachment 1 of AP-3.15Q, Rev. 2, ''Managing Technical Product Inputs''. Abstraction refers to distillation of the essential components of a process model into a form suitable for use in total-system performance assessment (TSPA). Thus, the purpose of this analysis/model is to put the information generated by the seepage process modeling in a form appropriate for use in the TSPA for the Site Recommendation. This report also supports the Unsaturated-Zone Flow and Transport Process Model Report. The scope of the work is discussed below. This analysis/model is governed by the ''Technical Work Plan for Unsaturated Zone Flow and Transport Process Model Report'' (CRWMS M&O 2000a). Details of this activity are in Addendum A of the technical work plan. The original Work Direction and Planning Document is included as Attachment 7 of Addendum A. Note that the Work Direction and Planning Document contains tasks identified for both Performance Assessment Operations (PAO) and Natural Environment Program Operations (NEPO). Only the PAO tasks are documented here. The planning for the NEPO activities is now in Addendum D of the same technical work plan and the work is documented in a separate report (CRWMS M&O 2000b). The Project has been reorganized since the document was written. The responsible organizations in the new structure are the Performance Assessment Department and the Unsaturated Zone Department, respectively. The work plan for the seepage abstraction calls for determining an appropriate abstraction methodology, determining uncertainties in seepage, and providing

  4. ORGANIZED SYMPOSIA: ABSTRACTS

    OpenAIRE

    Anonymous

    2001-01-01

    Agricultural Economics Via Distance Education: Challenges and Opportunities, Allen F. Wysocki; Comparison of Agriculture and Business Practices in Southern United States and Europe and Ideas for Collaboration, Wes Harrison; Responses to the Farm Family Financial Crisis, Steve Isaacs; Global Economic Turmoil and Prospects for Recovery: Impacts, Issues and Prospects for Southern Agriculture, Mary Marchant; USDA/EPA Unified National Strategy for AFOs - Status and Implications for Southern Produc...

  5. Development of an Earthquake Impact Scale

    Science.gov (United States)

    Wald, D. J.; Marano, K. D.; Jaiswal, K. S.

    2009-12-01

    With the advent of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, domestic (U.S.) and international earthquake responders are reconsidering their automatic alert and activation levels as well as their response procedures. To help facilitate rapid and proportionate earthquake response, we propose and describe an Earthquake Impact Scale (EIS) founded on two alerting criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is more appropriate for most global events. Simple thresholds, derived from the systematic analysis of past earthquake impact and response levels, turn out to be quite effective in communicating predicted impact and response level of an event, characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses exceeding $1M, $10M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness dominate in countries where vernacular building practices typically lend themselves to high collapse and casualty rates, and it is these impacts that set prioritization for international response. In contrast, it is often financial and overall societal impacts that trigger the level of response in regions or countries where prevalent earthquake-resistant construction practices greatly reduce building collapse and associated fatalities. Any newly devised alert protocols, whether financial or casualty based, must be intuitive and consistent with established lexicons and procedures. In this analysis, we make an attempt
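
    The two alerting criteria described above amount to simple threshold tables. A minimal sketch using the fatality and dollar-loss thresholds quoted in the abstract follows; the function names and the reading of "exceeding" as a strict comparison are our assumptions, not part of the PAGER implementation.

      def fatality_alert(estimated_fatalities):
          """Alert colour from estimated fatalities (thresholds 1, 100, 1000)."""
          if estimated_fatalities >= 1000:
              return "red"
          if estimated_fatalities >= 100:
              return "orange"
          if estimated_fatalities >= 1:
              return "yellow"
          return "green"

      def loss_alert(estimated_loss_usd):
          """Alert colour from estimated economic loss in US dollars
          (thresholds exceeded at $1M, $10M, $1B)."""
          if estimated_loss_usd > 1e9:
              return "red"
          if estimated_loss_usd > 1e7:
              return "orange"
          if estimated_loss_usd > 1e6:
              return "yellow"
          return "green"

      print(fatality_alert(250), loss_alert(5e6))  # -> orange yellow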

  6. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the public in general. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, allows access by deploying tabular and cartographic data related to earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where the disaster occurred, and on the description being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the format of table and map. Data analysis allowed us to identify the following sources of error in the location of the epicenters in existing catalogs: • incorrect coordinate entry; • erroneous or mistaken place names; • data too general to locate the epicenter precisely, mainly for older earthquakes; • inconsistency between the earthquake and the tsunami occurrence, e.g., an epicenter located too far inland for an event reported as tsunamigenic. The process of completing the catalogs directly depends on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and
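
    One of the error classes listed above, tsunamigenic entries whose epicentres sit far inland, lends itself to an automated screen. The sketch below assumes a helper that returns the distance from an epicentre to the nearest coastline and a distance threshold; both the helper and the threshold are our assumptions, not part of the catalogue's actual tooling.

      def flag_suspect_tsunami_entries(records, distance_to_coast_km, max_inland_km=50.0):
          """Return catalogue entries marked tsunamigenic whose epicentre lies
          farther inland than max_inland_km. distance_to_coast_km is an assumed
          helper mapping (lat, lon) to distance from the nearest coast in km."""
          return [rec for rec in records
                  if rec.get("tsunamigenic")
                  and distance_to_coast_km(rec["lat"], rec["lon"]) > max_inland_km]

      # Illustrative use with a trivial stand-in for the coastline helper
      demo = [{"id": 1, "lat": 19.4, "lon": -99.1, "tsunamigenic": True}]
      print(flag_suspect_tsunami_entries(demo, lambda lat, lon: 250.0))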

  7. EBS Radionuclide Transport Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Schreiber

    2005-08-25

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in ''Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration'' (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport

  8. EBS Radionuclide Transport Abstraction

    International Nuclear Information System (INIS)

    J.D. Schreiber

    2005-01-01

    The purpose of this report is to develop and analyze the engineered barrier system (EBS) radionuclide transport abstraction model, consistent with Level I and Level II model validation, as identified in ''Technical Work Plan for: Near-Field Environment and Transport: Engineered Barrier System: Radionuclide Transport Abstraction Model Report Integration'' (BSC 2005 [DIRS 173617]). The EBS radionuclide transport abstraction (or EBS RT Abstraction) is the conceptual model used in the total system performance assessment for the license application (TSPA-LA) to determine the rate of radionuclide releases from the EBS to the unsaturated zone (UZ). The EBS RT Abstraction conceptual model consists of two main components: a flow model and a transport model. Both models are developed mathematically from first principles in order to show explicitly what assumptions, simplifications, and approximations are incorporated into the models used in the TSPA-LA. The flow model defines the pathways for water flow in the EBS and specifies how the flow rate is computed in each pathway. Input to this model includes the seepage flux into a drift. The seepage flux is potentially split by the drip shield, with some (or all) of the flux being diverted by the drip shield and some passing through breaches in the drip shield that might result from corrosion or seismic damage. The flux through drip shield breaches is potentially split by the waste package, with some (or all) of the flux being diverted by the waste package and some passing through waste package breaches that might result from corrosion or seismic damage. Neither the drip shield nor the waste package survives an igneous intrusion, so the flux splitting submodel is not used in the igneous scenario class. The flow model is validated in an independent model validation technical review. The drip shield and waste package flux splitting algorithms are developed and validated using experimental data. The transport model considers

  9. Interrelation of geomagnetic storms and earthquakes: Insight from lab experiments and field observations

    Science.gov (United States)

    Ruzhin, Yuri; Kamogawa, Masashi; Novikov, Victor

    Investigations of possible relations between variations of the geomagnetic field and seismicity, including Sq-variations and geomagnetic storms, are overviewed and discussed. There are many papers demonstrating positive correlations between geomagnetic field variations and subsequent earthquake occurrence, which allows their authors to argue for a triggering impact of ionospheric disturbances on earthquake sources [e.g., 1]. Nevertheless, there is another opinion, supported by statistical analysis of the correlation between variations of the geomagnetic field and global and regional seismicity, that the impact of geomagnetic disturbances on the earthquake source is negligible. In general, both points of view on this problem are based on statistical research without detailed consideration of the possible physical mechanisms that may be involved in the supposed earthquake triggering, or on very rough estimations of the possible increase of stresses in faults in a critical (near-to-failure) state. It is clear that verification of the hypothesis of earthquake triggering by geomagnetic storms should be based on physical mechanisms for the generation of additional stresses in the earthquake source, or on some secondary mechanism that results in a change of the fault properties. Recently it was shown that fluids may play a very important role in electromagnetic earthquake triggering [2], and a secondary triggering mechanism should be considered in which fluid migrating into the fault under electromagnetic action provides fault weakening up to the earthquake triggering threshold. At the same time, depending on the fault orientation, the local hydrological structure of the crust around the fault, the location of fluid reservoirs, etc., it is possible that fluid migration away from the fault provides fault strengthening; in this case the impact of a variation of the geomagnetic field may have the opposite effect, and the earthquake will not occur. In so doing, it is useless to apply only

  10. Learning from Earthquakes: 2014 Napa Valley Earthquake Reconnaissance Report

    OpenAIRE

    Fischer, Erica

    2014-01-01

    Structural damage was observed during reconnaissance after the 2014 South Napa Earthquake, and included damage to wine storage and fermentation tanks, collapse of wine storage barrel racks, unreinforced masonry building partial or full collapse, and residential building damage. This type of damage is not unique to the South Napa Earthquake, and was observed after other earthquakes such as the 1977 San Juan Earthquake, and the 2010 Maule Earthquake. Previous research and earthquakes have demon...

  11. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  12. Abstract: Implementing Infection Control Measures in Neonatology ...

    African Journals Online (AJOL)

    Abstract. Background Neonatal infection is a primary cause of morbidity and mortality globally. Objective The project's objective is to facilitate quality improvement by reduction of hospital-acquired infection (HAI) in hospitalized neonates. Methods Current infection control practices were surveyed and three main areas were ...

  13. Promoting Economic Security through Information Technology Abstract

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    2013-12-01

    Abstract. The problem of economic insecurity is a global threat to national security. In Nigeria today, we have witnessed many national security issues that risk the continued existence of the country as one indivisible political entity, with many calling for disintegration. Hitherto, many terrorist networks have ...

  14. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  15. IEEE conference record -- Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1994-01-01

    This conference covers the following areas: computational plasma physics; vacuum electronic; basic phenomena in fully ionized plasmas; plasma, electron, and ion sources; environmental/energy issues in plasma science; space plasmas; plasma processing; ball lightning/spherical plasma configurations; plasma processing; fast wave devices; magnetic fusion; basic phenomena in partially ionized plasma; dense plasma focus; plasma diagnostics; basic phenomena in weakly ionized gases; fast opening switches; MHD; fast z-pinches and x-ray lasers; intense ion and electron beams; laser-produced plasmas; microwave plasma interactions; EM and ETH launchers; solid state plasmas and switches; intense beam microwaves; and plasmas for lighting. Separate abstracts were prepared for 416 papers in this conference.

  16. IEEE conference record -- Abstracts

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    This conference covers the following areas: computational plasma physics; vacuum electronic; basic phenomena in fully ionized plasmas; plasma, electron, and ion sources; environmental/energy issues in plasma science; space plasmas; plasma processing; ball lightning/spherical plasma configurations; plasma processing; fast wave devices; magnetic fusion; basic phenomena in partially ionized plasma; dense plasma focus; plasma diagnostics; basic phenomena in weakly ionized gases; fast opening switches; MHD; fast z-pinches and x-ray lasers; intense ion and electron beams; laser-produced plasmas; microwave plasma interactions; EM and ETH launchers; solid state plasmas and switches; intense beam microwaves; and plasmas for lighting. Separate abstracts were prepared for 416 papers in this conference

  17. Problems in abstract algebra

    CERN Document Server

    Wadsworth, A R

    2017-01-01

    This is a book of problems in abstract algebra for strong undergraduates or beginning graduate students. It can be used as a supplement to a course or for self-study. The book provides more variety and more challenging problems than are found in most algebra textbooks. It is intended for students wanting to enrich their learning of mathematics by tackling problems that take some thought and effort to solve. The book contains problems on groups (including the Sylow Theorems, solvable groups, presentation of groups by generators and relations, and structure and duality for finite abelian groups); rings (including basic ideal theory and factorization in integral domains and Gauss's Theorem); linear algebra (emphasizing linear transformations, including canonical forms); and fields (including Galois theory). Hints to many problems are also included.

  18. ICENES 2007 Abstracts

    International Nuclear Information System (INIS)

    Sahin, S.

    2007-01-01

    This book includes the conference program and abstracts of the 13th International Conference on Emerging Nuclear Energy Systems (ICENES 2007), held 3-8 June 2007 in Istanbul, Turkey. The main objective of the International Conference series on Emerging Nuclear Energy Systems (ICENES) is to provide an international scientific and technical forum for scientists, engineers, industry leaders, policy makers, decision makers and young professionals who will shape future energy supply and technology, for a broad review and discussion of various advanced, innovative and non-conventional nuclear energy production systems. The main topics of the 159 accepted papers from 35 countries are fusion science and technology, fission reactors, accelerator driven systems, transmutation, lasers in nuclear technology, radiation shielding, nuclear reactions, hydrogen energy, solar energy, low energy physics and societal issues

  19. Global teaching of global seismology

    Science.gov (United States)

    Stein, S.; Wysession, M.

    2005-12-01

    Our recent textbook, Introduction to Seismology, Earthquakes, & Earth Structure (Blackwell, 2003) is used in many countries. Part of the reason for this may be our deliberate attempt to write the book for an international audience. This effort appears in several ways. We stress seismology's long tradition of global data interchange. Our brief discussions of the science's history illustrate the contributions of scientists around the world. Perhaps most importantly, our discussions of earthquakes, tectonics, and seismic hazards take a global view. Many examples are from North America, whereas others are from other areas. Our view is that non-North American students should be exposed to North American examples that are type examples, and that North American students should be similarly exposed to examples elsewhere. For example, we illustrate how the Euler vector geometry changes a plate boundary from spreading, to strike-slip, to convergence using both the Pacific-North America boundary from the Gulf of California to Alaska and the Eurasia-Africa boundary from the Azores to the Mediterranean. We illustrate diffuse plate boundary zones using western North America, the Andes, the Himalayas, the Mediterranean, and the East Africa Rift. The subduction zone discussions examine Japan, Tonga, and Chile. We discuss significant earthquakes both in the U.S. and elsewhere, and explore hazard mitigation issues in different contexts. Both comments from foreign colleagues and our experience lecturing overseas indicate that this approach works well. Beyond the specifics of our text, we believe that such a global approach is facilitated by the international traditions of the earth sciences and the world youth culture that gives students worldwide common culture. For example, a video of the scene in New Madrid, Missouri that arose from a nonsensical earthquake prediction in 1990 elicits similar responses from American and European students.

  20. Abstract: Body Work

    DEFF Research Database (Denmark)

    Otto, Lene

    2012-01-01

    social and age groups are regarded? In what ways have different practices limited or extended its involvement in the body? Has work been organized hierarchically in relation to the degree of direct body work? What happened when body work became mediated by machines and technology? Has body work as forms... This panel will explore the usefulness of the term ‘body work’ in cultural history. Body work is understood as work focusing on the bodies of others as a component in a range of occupations in health and social care, as well as in unpaid work in the family. How can the notion of body work inform... cultural history of health and illness, whether through a micro-social focus on the intercorporeal aspects of work in health and social care, or through clarifying our understanding of the times and spaces of work, or through highlighting the relationship between mundane body work and global processes...

  1. Book of abstracts

    International Nuclear Information System (INIS)

    1996-01-01

    The Philippine Atomic Energy Commission (PAEC), now the Philippine Nuclear Research Institute (PNRI), organized the 1st Philippine Nuclear Congress in 1976 to present the progress of R and D, nuclear S and T services in the applications of nuclear energy, and the nuclear power program. This year 1996 marks the centennial of the discovery of radioactivity and the organizing committee finds it relevant to convene the 2nd Philippine Nuclear Congress with the objectives to review the global/regional/national scenarios in nuclear energy applications; to provide a forum and discussions of the issues and concerns on public acceptance; and to propose policy recommendations to the government towards the role of nuclear science and technology in the 21st century

  2. Abstract: Body Work

    DEFF Research Database (Denmark)

    Otto, Lene

    2012-01-01

    This panel will explore the usefulness of the term ‘body work’ in cultural history. Body work is understood as work focusing on the bodies of others as a component in a range of occupations in health and social care, as well as in unpaid work in the family. How can the notion of body work inform... cultural history of health and illness, whether through a micro-social focus on the intercorporeal aspects of work in health and social care, or through clarifying our understanding of the times and spaces of work, or through highlighting the relationship between mundane body work and global processes... The British sociologist Julia Twigg has introduced and explored the term 'bodywork', most recently in Body Work in Health and Social Care - Critical Themes, New Agendas (2011). She extends the term body work from applying to the work that individuals undertake on their own bodies, often as part of regimens...

  3. Exoplanets and Multiverses (Abstract)

    Science.gov (United States)

    Trimble, V.

    2016-12-01

    (Abstract only) To the ancients, the Earth was the Universe, of a size to be crossed by a god in a day, by boat or chariot, and by humans in a lifetime. Thus an exoplanet would have been a multiverse. The ideas gradually separated over centuries, with gradual acceptance of a sun-centered solar system, the stars as suns likely to have their own planets, other galaxies beyond the Milky Way, and so forth. And whenever the community divided between "just one" of anything versus "many," the "manies" have won. Discoveries beginning in 1991 and 1995 have gradually led to a battalion or two of planets orbiting other stars, very few like our own little family, and to moderately serious consideration of even larger numbers of other universes, again very few like our own. I'm betting, however, on habitable (though not necessarily inhabited) exoplanets to be found, and habitable (though again not necessarily inhabited) universes. Only the former will yield pretty pictures.

  4. Book of Abstracts

    International Nuclear Information System (INIS)

    2013-06-01

    ANIMMA 2013 is the third of a series of conferences devoted to endorsing and promoting scientific and technical activities based on nuclear instrumentation and measurements. The main objective of ANIMMA conference is to unite the various scientific communities not only involved in nuclear instrumentation and measurements, but also in nuclear medicine and radiation. The conference is all about getting scientists, engineers and the industry to meet, exchange cultures and identify new scientific and technical prospects to help overcome both current and future unresolved issues. The conference provides scientists and engineers with a veritable opportunity to compare their latest research and development in different areas: physics, nuclear energy, nuclear fuel cycle, safety, security, future energies (GEN III+, GENIV, ITER, ...). The conference topics include instrumentation and measurement methods for: Fundamental physics; Fusion diagnostics and technology; Nuclear power reactors; Research reactors; Nuclear fuel cycle; Decommissioning, dismantling and remote handling; Safeguards, homeland security; Severe accident monitoring; Environmental and medical sciences; Education, training and outreach. This document brings together the abstracts of the presentations. Each presentation (full paper) is analysed separately and entered in INIS

  5. SENSE 2010, Abstracts

    International Nuclear Information System (INIS)

    Lumsden, M.D.; Argyriou, D.N.; Inosov, D.

    2012-01-01

    The microscopic origin of unconventional superconductivity continues to attract the attention of the condensed matter community. Whereas rare-earth/actinide-based intermetallic and copper oxide-based high temperature superconductors have been studied for more than twenty years, the iron-based superconductors have been in the focus of interest since their recent discovery. Inelastic neutron scattering experiments have been of particular importance for the understanding of the magnetic and superconducting properties of these compounds. With its 29 talks and 14 posters the workshop provided a forum for the 71 registered participants to review and discuss experimental achievements, recognize the observed synergies and differences, and discuss theoretical efforts to identify the symmetry of the superconducting order parameter in addition to the coupling mechanisms of the Cooper pairs. The workshop covered different topics relevant for the study of unconventional superconductivity. Magnetization and lattice dynamics such as spin resonances, phonons, magnetic and other excitations as studied by spectroscopic methods were presented. Investigations of (doping-, pressure- and magnetic-field-dependent) phase diagrams, electronic states as well as vortex physics by the various diffraction techniques were also addressed. This document gathers only the abstracts of the papers. (authors)

  6. Automated Supernova Discovery (Abstract)

    Science.gov (United States)

    Post, R. S.

    2015-12-01

    (Abstract only) We are developing a system of robotic telescopes for automatic recognition of supernovas as well as other transient events in collaboration with the Puckett Supernova Search Team. At the SAS2014 meeting, the discovery program, SNARE, was first described. Since then, it has been continuously improved to handle searches under a wide variety of atmospheric conditions. Currently, two telescopes are used to build a reference library while searching for PSN with a partial library. Since data are taken every night without clouds, we must deal with varying atmospheric conditions and high background illumination from the moon. The software is configured to identify a PSN and reshoot for verification, with options to change the run plan to acquire photometric or spectrographic data. The telescopes are 24-inch CDK24s, with Alta U230 cameras, one in CA and one in NM. Images and run plans are sent between sites so the CA telescope can search while photometry is done in NM. Our goal is to find bright PSNs with magnitude 17.5 or less, which is the limit of our planned spectroscopy. We present results from our first automated PSN discoveries and plans for PSN data acquisition.

  7. Stellar Presentations (Abstract)

    Science.gov (United States)

    Young, D.

    2015-12-01

    (Abstract only) The AAVSO is in the process of expanding its education, outreach and speakers bureau program. PowerPoint presentations prepared for specific target audiences, such as AAVSO members, educators, students, the general public, and Science Olympiad teams, coaches, event supervisors, and state directors, will be available online for members to use. The presentations range from specific and general content relating to stellar evolution and variable stars to specific activities for a workshop environment. A presentation, even with a general topic, that works for high school students will not work for educators, Science Olympiad teams, or the general public. Each audience is unique and requires a different approach. The current environment necessitates presentations that are captivating for a younger generation that is embedded in the highly visual and sound-bite world of social media, Twitter, YouTube, and mobile devices. For educators, presentations and workshops for themselves and their students must support the Next Generation Science Standards (NGSS), the Common Core Content Standards, and the Science, Technology, Engineering and Mathematics (STEM) initiative. Current best practices for developing relevant and engaging PowerPoint presentations to deliver information to a variety of targeted audiences will be presented along with several examples.

  8. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  9. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    Department of Earth Sciences, University of Roorkee. Her interest is in computer-based solutions to geophysical and other earth science problems. If we adopt the definition that an earthquake is shaking of the earth due to natural causes, then we may argue that earthquakes have been occurring since the very beginning.

  10. Bam Earthquake in Iran

    CERN Document Server

    2004-01-01

    Following a request for help addressed to members of international organisations, the Permanent Mission of the Islamic Republic of Iran has provided the following bank account number, to which you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  11. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  12. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    there are few estimates about this earthquake as it probably occurred in that early period of the earth's history about which astronomers, physicists, chemists and earth scientists are still sorting out their ideas. Yet, the notion of the earliest earthquake excites interest. We explore this theme here partly also because.

  13. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding of the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times; field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities; and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  14. Abstraction of Drift Seepage

    Energy Technology Data Exchange (ETDEWEB)

    J.T. Birkholzer

    2004-11-01

    This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages [''Risk Information to Support Prioritization of Performance Assessment Models'' (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts [''Total System Performance Assessment (TSPA) Model/Analysis for the License Application'' (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package
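
    As a rough illustration of the probabilistic treatment described above, and not the actual TSPA-LA seepage abstraction, the sketch below samples hypothetical percolation-flux and capillary-threshold distributions and propagates them to a seepage estimate; all parameter names, distributions and values are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(42)
n_realizations = 10_000

# Hypothetical inputs: percolation flux (log-normal) and a capillary-barrier
# threshold below which no seepage occurs (normal, truncated at zero).
percolation = rng.lognormal(mean=1.0, sigma=0.8, size=n_realizations)   # mm/yr
threshold = np.clip(rng.normal(10.0, 3.0, size=n_realizations), 0, None)

# Seepage occurs only for the part of the flux exceeding the local threshold.
seepage = np.clip(percolation - threshold, 0.0, None)

print(f"Probability of any seepage: {np.mean(seepage > 0):.2f}")
print(f"Mean seepage over all realizations: {seepage.mean():.2f} mm/yr")
```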

  15. Book of abstracts

    International Nuclear Information System (INIS)

    1987-01-01

    The document contains abstracts of 24 review papers, 24 invited papers, 24 oral contributions and 120 posters. 10 review papers summarize the status of laser fusion research and progress in high-power laser facilities in major world laboratories. Four papers review research programs (laser-matter interaction studies and X-ray source development) based on KrF laser systems. Other review papers discuss the problems of laser energy conversion into X-rays in laser-heated cavities, X-ray lasing at shorter wavelengths, optimization of targets for inertial fusion. Two review papers are devoted to light ion fusion. The subjects of most invited papers are special problems of current laser plasma research, such as hot electron generation, nonlinear resonance absorption, energy accumulation limits, pellet ignition, conversion of laser light into X-rays, high-pressure plasma generation. Three invited papers review laser plasma research in Czechoslovakia, Poland and Spain. One paper suggests a new method of producing muonic superdense matter. The remaining invited papers deal with the progress in XUV lasers and with laser plasma applications for further laser development. Of the papers accepted for oral presentation 12 papers discuss various problems of laser-plasma interaction; 4 papers deal with laser targets, 4 papers with laser-initiated X-ray sources, 3 papers with the diagnostics of laser-produced plasma. The last oral contribution presents the main principles of the excimer laser theory. The largest group of posters is related to laser-plasma interaction and energy absorption problems, to laser-target interaction and various methods of laser plasma diagnostics. The other posters deal with plasma applications in laser development, plasma mirrors, Brillouin and Raman scattering, X-ray emission, harmonic generation, electron acceleration, production of high-Z plasmas and other related problems. (J.U.)

  16. Abstraction of Drift Seepage

    International Nuclear Information System (INIS)

    J.T. Birkholzer

    2004-01-01

    This model report documents the abstraction of drift seepage, conducted to provide seepage-relevant parameters and their probability distributions for use in Total System Performance Assessment for License Application (TSPA-LA). Drift seepage refers to the flow of liquid water into waste emplacement drifts. Water that seeps into drifts may contact waste packages and potentially mobilize radionuclides, and may result in advective transport of radionuclides through breached waste packages [''Risk Information to Support Prioritization of Performance Assessment Models'' (BSC 2003 [DIRS 168796], Section 3.3.2)]. The unsaturated rock layers overlying and hosting the repository form a natural barrier that reduces the amount of water entering emplacement drifts by natural subsurface processes. For example, drift seepage is limited by the capillary barrier forming at the drift crown, which decreases or even eliminates water flow from the unsaturated fractured rock into the drift. During the first few hundred years after waste emplacement, when above-boiling rock temperatures will develop as a result of heat generated by the decay of the radioactive waste, vaporization of percolation water is an additional factor limiting seepage. Estimating the effectiveness of these natural barrier capabilities and predicting the amount of seepage into drifts is an important aspect of assessing the performance of the repository. The TSPA-LA therefore includes a seepage component that calculates the amount of seepage into drifts [''Total System Performance Assessment (TSPA) Model/Analysis for the License Application'' (BSC 2004 [DIRS 168504], Section 6.3.3.1)]. The TSPA-LA calculation is performed with a probabilistic approach that accounts for the spatial and temporal variability and inherent uncertainty of seepage-relevant properties and processes. Results are used for subsequent TSPA-LA components that may handle, for example, waste package corrosion or radionuclide transport

  17. Change Abstract Title

    Science.gov (United States)

    Rudich, Y.

    2017-12-01

    Investigating possible health effects of transported and resuspended dusts. People in regions such as the Eastern Mediterranean are often exposed to high levels of both transported desert dust and resuspended urban dust. Due to warming and drying trends, the frequency and intensity of dust storms have increased in the Eastern Mediterranean over the last decades. High exposure to particulate matter is a known risk factor for the exposed population, but a detailed understanding of how these dusts affect health remains elusive. In this talk I will describe aspects of how dust may impact health. First, the transport of bacteria by desert dust, its effects on the local microbiome, and its dependence on the source region will be described. Then, we will describe the biological effects due to the presence of biological components on dust. Finally, we will discuss how metals from brake and tire wear in resuspended urban dust affect oxidative stress and inflammation, and lead to oxidative damage in lung tissues. The significance of these findings in light of recent measurements from the global SPARTAN network will be presented.

  18. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

    Science.gov (United States)

    D'Alessio, M. A.

    2010-12-01

    A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aid), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts -- very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regents Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds' warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones
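
    The classroom point that "P arrives first, but is weaker" can be turned into a worked example: the warning time available to a site is roughly the difference between the S- and P-wave travel times. The short sketch below uses typical crustal velocities; these values are assumptions for illustration, not figures from the abstract.

```python
# Approximate warning time between P-wave detection and damaging S-wave arrival.
VP = 6.0  # km/s, typical crustal P-wave speed (assumed)
VS = 3.5  # km/s, typical crustal S-wave speed (assumed)

for distance_km in (20, 50, 100, 200):
    t_p = distance_km / VP
    t_s = distance_km / VS
    print(f"{distance_km:4d} km: P arrives at {t_p:5.1f} s, "
          f"S at {t_s:5.1f} s -> ~{t_s - t_p:4.1f} s of warning")
```

    The takeaway for students is that the farther a city is from the epicenter, the more warning it can receive, which is exactly the trade-off early warning systems exploit.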

  19. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    Science.gov (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquake effects from eyewitnesses, through tools such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly appeared that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on earthquake effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves, and hence that eyewitnesses can be considered ground motion sensors. Flashsourcing discriminates felt
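
    Flashsourcing, as described above, rests on spotting a sudden surge of visitors relative to normal traffic. A minimal sketch of such a surge detector is given below; the window length, threshold and toy traffic series are assumptions, not EMSC's actual implementation.

```python
import numpy as np

def detect_surge(hits_per_minute, baseline_window=60, n_sigma=4.0):
    """Flag minutes whose traffic exceeds a trailing baseline by n_sigma."""
    hits = np.asarray(hits_per_minute, dtype=float)
    flagged = []
    for i in range(baseline_window, len(hits)):
        base = hits[i - baseline_window:i]
        mu, sigma = base.mean(), base.std() + 1e-9
        if hits[i] > mu + n_sigma * sigma:
            flagged.append(i)
    return flagged

# Toy example: steady background traffic with a sudden post-earthquake spike.
rng = np.random.default_rng(1)
traffic = rng.poisson(20, 180).astype(float)
traffic[120:126] += 200            # eyewitnesses rushing to the website
print(detect_surge(traffic))
```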

  20. Icone-9 (Abstract)

    International Nuclear Information System (INIS)

    2001-01-01

    Full text of publication follows: in today's global energy environment, nuclear power plant managers need to consider many dimensions of risk in addition to nuclear safety risk. In order to stay competitive in modern energy markets, NPP managers must integrate management of production, safety, and economic risks in an effective way. This integrated risk management approach generates benefits that include the following: -) Clearer criteria for decision making; -) Levering investments made in probabilistic safety analysis (PSA) programs by applying these analyses to other areas and contexts; -) Cost consciousness and innovation in achieving nuclear safety and production goals; -) Communication improvement - more effective internal communication among all levels of the NPP operating organization, and clearer communication between the organization and its stakeholders; -) Focus on safety - ensuring an integrated focus on safety, production, and economics during times of change in the energy environment. The IAEA is now preparing a technical document that provides a comprehensive framework for Risk Management (RM) as a tool to enhance the performance of NPPs. It aims to explore the wider context of risk (safety, operations, financial/commercial, strategic), with a goal of providing a source document for use by managers of NPPs and operating organizations. This report describes the steps of the risk management process and provides examples of implementation. Because of its generic nature, this framework can be used for large-scale proposals as well as smaller ventures. The intended audience for this document encompasses all levels of operating organization management including managers responsible for setting policy on safety, operational, and commercial/financial aspects of NPP operation and the hands-on managers directly implementing the organization's policies. This report sets out a framework for application of integrated risk management to improve NPP performance

  1. Spatiotemporal correlations of earthquakes

    International Nuclear Information System (INIS)

    Farkas, J.; Kun, F.

    2007-01-01

    Complete text of publication follows. An earthquake is the result of a sudden release of energy in the Earth's crust that creates seismic waves. At the present technological level, earthquakes of magnitude larger than three can be recorded all over the world. In spite of the apparent randomness of earthquake occurrence, long-term measurements have revealed interesting scaling laws of earthquake characteristics: the rate of aftershocks following major earthquakes has a power-law decay (Omori law); the magnitude distribution of earthquakes exhibits power-law behavior (Gutenberg-Richter law); furthermore, it has recently been pointed out that epicenters form fractal networks in fault zones (Kagan law). The theoretical explanation of earthquakes is based on plate tectonics: the Earth's crust is broken into plates which slowly move under the action of the flowing magma. Neighboring plates touch each other along ridges (fault zones) where a large amount of energy is stored in deformation. Earthquakes occur when the stored energy exceeds a material-dependent threshold value and gets released in a sudden jump of the plate. The Burridge-Knopoff (BK) model of earthquakes represents the Earth's crust as a coupled system of driven oscillators in which nonlinearity occurs through a stick-slip frictional instability. Laboratory experiments have revealed that under high pressure the friction of rock interfaces exhibits a weakening with increasing velocity. In the present project we extend recent theoretical studies of the BK model by taking into account a realistic velocity-weakening friction force between tectonic plates. Varying the strength of weakening, a broad spectrum of interesting phenomena is obtained: the model reproduces the Omori and Gutenberg-Richter laws of earthquakes and, furthermore, provides information on the correlation of earthquake sequences. We showed by computer simulations that the spatial and temporal correlations of consecutive earthquakes are very
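
    The study itself uses the dynamic Burridge-Knopoff model with velocity-weakening friction, which is too involved to reproduce here. As a loosely related toy in the same spring-block family, the quasi-static sketch below shows how slow uniform driving plus local stress redistribution produces a broad distribution of event sizes; all parameters are illustrative and the velocity-weakening friction law is deliberately omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200                                 # number of blocks along the fault
f_s = 1.0                               # failure (static friction) threshold
alpha = 0.2                             # fraction of dropped stress sent to each neighbour
stress = rng.uniform(0.0, f_s, N)       # initial stress state

event_sizes = []
for step in range(5000):
    # Slow, uniform driving: load every block until the weakest-to-fail reaches f_s.
    stress += f_s - stress.max()
    unstable = np.flatnonzero(stress >= f_s)
    size = 0
    while unstable.size:                # cascade of slips = one "earthquake"
        size += unstable.size
        for i in unstable:
            drop = stress[i]
            stress[i] = 0.0             # the block slips and relaxes
            for j in (i - 1, i + 1):    # neighbours pick up part of the load
                if 0 <= j < N:
                    stress[j] += alpha * drop
        unstable = np.flatnonzero(stress >= f_s)
    event_sizes.append(size)

sizes = np.array(event_sizes)
print("largest event:", sizes.max(), "blocks; mean event size:", round(float(sizes.mean()), 2))
```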

  2. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  3. 1986 annual information meeting. Abstracts

    International Nuclear Information System (INIS)

    1986-01-01

    Abstracts are presented for the following papers: Geohydrological Research at the Y-12 Plant (C.S. Haase); Ecological Impacts of Waste Disposal Operations in Bear Creek Valley Near the Y-12 Plant (J.M. Loar); Finite Element Simulation of Subsurface Contaminant Transport: Logistic Difficulties in Handling Large Field Problems (G.T. Yeh); Dynamic Compaction of a Radioactive Waste Burial Trench (B.P. Spalding); Comparative Evaluation of Potential Sites for a High-Level Radioactive Waste Repository (E.D. Smith); Changing Priorities in Environmental Assessment and Environmental Compliance (R.M. Reed); Ecology, Ecotoxicology, and Ecological Risk Assessment (L.W. Barnthouse); Theory and Practice in Uncertainty Analysis from Ten Years of Practice (R.H. Gardner); Modeling Landscape Effects of Forest Decline (V.H. Dale); Soil Nitrogen and the Global Carbon Cycle (W.M. Post); Maximizing Wood Energy Production in Short-Rotation Plantations: Effect of Initial Spacing and Rotation Length (L.L. Wright); and Ecological Communities and Processes in Woodland Streams Exhibit Both Direct and Indirect Effects of Acidification (J.W. Elwood)

  4. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985 Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial 3-story buildings that survived both earthquakes almost undamaged were identified. Since accelerograms were recorded for the 1985 earthquake both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand

  5. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
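
    The sensitivity to the lower-bound magnitude can be illustrated with the Gutenberg-Richter relation log10 N(>=m) = a - b*m: lowering the cutoff from 5.0 to 3.75 multiplies the number of events entering the hazard integral by a factor of 10^(1.25*b). The a- and b-values below are illustrative, not taken from the LLNL or EPRI studies.

```python
# Gutenberg-Richter: log10 N(>= m) = a - b * m  (cumulative annual rate)
a, b = 4.0, 1.0          # illustrative values for a broad source zone

def annual_rate(m_min):
    """Annual rate of earthquakes with magnitude >= m_min."""
    return 10 ** (a - b * m_min)

for m_min in (3.75, 5.0):
    print(f"M >= {m_min}: {annual_rate(m_min):8.2f} events/yr")

# Ratio of event counts entering the hazard integral for the two lower bounds.
print("rate ratio:", annual_rate(3.75) / annual_rate(5.0))
```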

  6. Dynamic strains for earthquake source characterization

    Science.gov (United States)

    Barbour, Andrew J.; Crowell, Brendan W

    2017-01-01

    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
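
    The abstract describes a regression of peak dynamic strain against magnitude and distance. The sketch below fits such a relation, of the assumed form log10(strain) = a + b*M + c*log10(R), to synthetic data; the functional form, coefficients and noise level are placeholders, not the published model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": log10 peak strain modelled as a + b*M + c*log10(R).
M = rng.uniform(4.5, 7.2, 146)
R = rng.uniform(20.0, 500.0, 146)                   # hypocentral distance, km
true_a, true_b, true_c = -10.0, 1.0, -1.5
log_strain = true_a + true_b * M + true_c * np.log10(R) + rng.normal(0, 0.3, 146)

# Least-squares fit of the regression coefficients.
A = np.column_stack([np.ones_like(M), M, np.log10(R)])
coef, *_ = np.linalg.lstsq(A, log_strain, rcond=None)
print("fitted a, b, c:", np.round(coef, 2))
```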

  7. What Googling Trends Tell Us About Public Interest in Earthquakes

    Science.gov (United States)

    Tan, Y. J.; Maharjan, R.

    2017-12-01

    Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.
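
    The reported pattern, a spike in search interest followed by an exponential temporal decay, can be characterized by fitting a decay time to the interest curve. The sketch below does this on a synthetic series; the 21-day window and decay constant are illustrative assumptions, not values from the study.

```python
import numpy as np

# Synthetic daily search-interest series: a spike followed by exponential decay.
days = np.arange(0, 21)
true_tau = 4.0                                   # decay time in days (assumed)
interest = 100.0 * np.exp(-days / true_tau)
interest += np.random.default_rng(2).normal(0, 2.0, days.size)

# Fit log(interest) = log(A) - t / tau by linear least squares (positive values only).
pos = interest > 0
slope, intercept = np.polyfit(days[pos], np.log(interest[pos]), 1)
print(f"estimated decay time: {-1.0 / slope:.1f} days")
```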

  8. Earthquakes and emergence

    Science.gov (United States)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  9. Surface rupturing earthquakes repeated in the 300 years along the ISTL active fault system, central Japan

    Science.gov (United States)

    Katsube, Aya; Kondo, Hisao; Kurosawa, Hideki

    2017-06-01

    Surface rupturing earthquakes produced by intraplate active faults generally have long recurrence intervals of a few thousands to tens of thousands of years. We here report the first evidence for an extremely short recurrence interval of 300 years for surface rupturing earthquakes on an intraplate fault system in Japan. The Kamishiro fault of the Itoigawa-Shizuoka Tectonic Line (ISTL) active fault system generated a Mw 6.2 earthquake in 2014. A paleoseismic trench excavation across the 2014 surface rupture showed evidence for the 2014 event and two prior paleoearthquakes. The slip of the penultimate earthquake was similar to that of the 2014 earthquake, and its timing was constrained to after A.D. 1645. Judging from the timing, the damaged area, and the amount of slip, the penultimate earthquake most probably corresponds to a historical earthquake in A.D. 1714. The recurrence interval of the two most recent earthquakes is thus extremely short compared with intervals on other active faults known globally. Furthermore, the slip repetition during the last three earthquakes is in accordance with the time-predictable recurrence model rather than the characteristic earthquake model. In addition, the spatial extent of the 2014 surface rupture accords with the distribution of a serpentinite block, suggesting that a relatively low coefficient of friction may account for the unusually frequent earthquakes. These findings would affect long-term forecasts of earthquake probability and seismic hazard assessment on active faults.
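
    The time-predictable model invoked above states that the time to the next rupture scales with the slip of the previous event divided by the long-term slip accumulation rate. Below is a worked numerical illustration; the loading rate and slip values are assumed round numbers, not measurements from the Kamishiro fault.

```python
# Time-predictable recurrence: T_next ~ slip_of_last_event / loading_rate
loading_rate_mm_per_yr = 2.0        # assumed long-term slip accumulation rate

for slip_m in (0.2, 0.5, 1.0):      # coseismic slip of the most recent event
    t_next = slip_m * 1000.0 / loading_rate_mm_per_yr
    print(f"slip {slip_m:.1f} m -> next rupture expected after ~{t_next:.0f} yr")
```

    Under this model, a small-slip rupture can be followed by the next event after a comparatively short interval, which is consistent with the short recurrence reported in the abstract.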

  10. Mechanism of post-seismic floods after the Wenchuan earthquake in ...

    Indian Academy of Sciences (India)

    Ding Hairong

    2017-10-06

    Oct 6, 2017 ... and sediment variations and deforestation in the upper reaches of the Minjiang River; Master thesis, Southwest University (in Chinese with English abstract). Zhou R J, Li Y and Densmore A L et al. 2011 The strong motion records of the Ms 8.0 Wenchuan Earthquake by the digital strong earthquake network in Sichuan ...

  11. Global scale concentrations of volcanic activity on Venus: A summary of three 23rd Lunar and Planetary Science Conference abstracts. 1: Venus volcanism: Global distribution and classification from Magellan data. 2: A major global-scale concentration of volcanic activity in the Beta-Atla-Themis region of Venus. 3: Two global concentrations of volcanism on Venus: Geologic associations and implications for global pattern of upwelling and downwelling

    Science.gov (United States)

    Crumpler, L. S.; Aubele, Jayne C.; Head, James W.; Guest, J.; Saunders, R. S.

    1992-01-01

    As part of the analysis of data from the Magellan Mission, we have compiled a global survey of the location, dimensions, and subsidiary notes of all identified volcanic features on Venus. More than 90 percent of the surface area was examined and the final catalog comprehensively identifies 1548 individual volcanic features larger than approximately 20 km in diameter. Volcanic features included are large volcanoes, intermediate volcanoes, fields of small shield volcanoes, calderas, large lava channels, and lava floods as well as unusual features first noted on Venus such as coronae, arachnoids, and novae.

  12. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  13. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  14. Earthquake education in California

    Science.gov (United States)

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  15. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the records of the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified; their activity is closely correlated with the energy of the electromagnetic field. To explain this, a hypothesis of a cooperative character of these impulses is proposed.

  16. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available In a joint analysis of the records of the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified; their activity is closely correlated with the energy of the electromagnetic field. To explain this, a hypothesis of a cooperative character of these impulses is proposed.

  17. Injection-induced earthquakes.

    Science.gov (United States)

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  18. Adaptively smoothed seismicity earthquake forecasts for Italy

    Directory of Open Access Journals (Sweden)

    Yan Y. Kagan

    2010-11-01

    Full Text Available We present a model for estimating the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of the one proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, that forecast has performed better than the others.
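
    The general idea behind adaptively smoothed seismicity is to place a kernel on each past epicenter whose bandwidth shrinks where events are dense. The sketch below implements that idea with a Gaussian kernel and a k-th-nearest-neighbour bandwidth; it is a generic illustration rather than the authors' calibrated model, and the toy catalogue, grid and choice of k are arbitrary.

```python
import numpy as np

def adaptive_rate_map(epicenters, grid_x, grid_y, k=3):
    """Smoothed seismicity density from past epicenters.

    Each event gets a Gaussian kernel whose bandwidth is the distance to its
    k-th nearest neighbouring event (a common adaptive-smoothing choice).
    """
    epicenters = np.asarray(epicenters, dtype=float)
    # Per-event bandwidth: distance to the k-th nearest other event.
    d = np.linalg.norm(epicenters[:, None, :] - epicenters[None, :, :], axis=-1)
    bw = np.sort(d, axis=1)[:, k]          # column 0 is the zero self-distance

    gx, gy = np.meshgrid(grid_x, grid_y)
    rate = np.zeros_like(gx)
    for (ex, ey), h in zip(epicenters, bw):
        r2 = (gx - ex) ** 2 + (gy - ey) ** 2
        rate += np.exp(-r2 / (2.0 * h ** 2)) / (2.0 * np.pi * h ** 2)
    return rate / len(epicenters)

# Toy catalogue of 50 epicenters clustered around (10 E, 42 N), in degrees.
rng = np.random.default_rng(3)
cat = rng.normal([10.0, 42.0], 0.3, size=(50, 2))
density = adaptive_rate_map(cat, np.linspace(8, 12, 81), np.linspace(40, 44, 81))
print("peak smoothed density:", round(float(density.max()), 3))
```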

  19. Signals of ENPEMF Used in Earthquake Prediction

    Science.gov (United States)

    Hao, G.; Dong, H.; Zeng, Z.; Wu, G.; Zabrodin, S. M.

    2012-12-01

    The signals of the Earth's natural pulse electromagnetic field (ENPEMF) are a combination of the abnormal crustal magnetic field pulses affected by earthquakes, the induced field of the Earth's endogenous magnetic field, the induced magnetic field of the exogenous variation magnetic field, geomagnetic pulsation disturbances and other energy coupling processes between the Sun and the Earth. As an instantaneous disturbance of the variation field of natural geomagnetism, ENPEMF can be used to predict earthquakes. This theory was introduced by A.A. Vorobyov, who expressed the hypothesis that pulses can arise not only in the atmosphere but also within the Earth's crust due to processes of tectonic-to-electric energy conversion (Vorobyov, 1970; Vorobyov, 1979). The global time behaviour of ENPEMF signals is fairly stable: although the wave curves may not overlap completely in different regions, the smoothed diurnal ENPEMF patterns always exhibit the same trend within a month. This feature is a good reference for observing abnormalities of the Earth's natural magnetic field in a specific region. The frequencies of ENPEMF signals generally lie in the kHz range, and frequencies within 5-25 kHz can be applied to monitor earthquakes. In Wuhan, the best observation frequency is 14.5 kHz. Two dedicated devices are installed, aligned with the S-N and W-E directions. A dramatic deviation of the pulse waveforms obtained from the instruments from the normal reference envelope diagram should indicate a high possibility of an earthquake. The proposed ENPEMF-based earthquake detection method can improve geodynamic monitoring and enrich earthquake prediction methods. We suggest that prospective further research address the exact source composition of ENPEMF signals, the distinction between noise and useful signals, and the effect of the Earth's gravity tide and solid tidal waves. This method may also provide a promising application in
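
    The monitoring approach described above compares recorded pulse activity against a normal reference envelope. The sketch below shows a generic envelope-threshold check on a synthetic record containing a burst of 14.5 kHz pulses; the sampling rate, window length and 5x threshold are assumptions, not the instrument's actual processing.

```python
import numpy as np

def smoothed_envelope(signal, window=200):
    """Moving-RMS envelope of a band-limited pulse signal (numpy only)."""
    power = np.convolve(signal ** 2, np.ones(window) / window, mode="same")
    return np.sqrt(power)

# Toy record: background noise plus a burst of 14.5 kHz pulses (fs = 100 kHz).
fs = 100_000
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(4)
rec = 0.1 * rng.normal(size=t.size)
burst = (t > 0.40) & (t < 0.45)
rec[burst] += np.sin(2 * np.pi * 14_500 * t[burst])

env = smoothed_envelope(rec)
reference = np.median(env)                       # quiet-time reference level
print("anomalous samples:", int(np.sum(env > 5 * reference)))
```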

  20. Posttraumatic stress disorder: a serious post-earthquake complication

    Directory of Open Access Journals (Sweden)

    Mudassir Farooqui

    Full Text Available Objectives: Earthquakes are unpredictable and devastating natural disasters. They can cause massive destruction and loss of life, and survivors may suffer psychological symptoms of severe intensity. Our goal in this article is to review studies published in the last 20 years to compile what is known about posttraumatic stress disorder (PTSD) occurring after earthquakes. The review also describes other psychiatric complications that can be associated with earthquakes, to provide readers with a better overall understanding, and discusses several sociodemographic factors that can be associated with post-earthquake PTSD. Method: A search for literature was conducted on major databases such as MEDLINE, PubMed, EMBASE, and PsycINFO, in neurology and psychiatry journals, and in many other medical journals. Terms used for electronic searches included, but were not limited to, posttraumatic stress disorder (PTSD), posttraumatic symptoms, anxiety, depression, major depressive disorder, earthquake, and natural disaster. The relevant information was then utilized to determine the relationships between earthquakes and posttraumatic stress symptoms. Results: It was found that PTSD is the most commonly occurring mental health condition among earthquake survivors. Major depressive disorder, generalized anxiety disorder, obsessive compulsive disorder, social phobia, and specific phobias were also listed. Conclusion: The PTSD prevalence rate varied widely. It was dependent on multiple risk factors in target populations and also on the interval of time that had elapsed between the exposure to the deadly incident and measurement. Females seemed to be the most widely affected group, while elderly people and young children exhibit considerable psychosocial impact.

  1. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  2. Grounding abstractness: Abstract concepts and the activation of the mouth

    Directory of Open Access Journals (Sweden)

    Anna M Borghi

    2016-10-01

    Full Text Available One key issue for theories of cognition is how abstract concepts, such as freedom, are represented. According to the WAT (Words As social Tools) proposal, abstract concepts activate both sensorimotor and linguistic/social information, and their acquisition modality involves the linguistic experience more than the acquisition of concrete concepts. We report an experiment in which participants were presented with abstract and concrete definitions followed by concrete and abstract target-words. When the definition and the word matched, participants were required to press a key, either with the hand or with the mouth. Response times and accuracy were recorded. As predicted, we found that abstract definitions and abstract words yielded slower responses and more errors compared to concrete definitions and concrete words. More crucially, there was an interaction between the target-words and the effector used to respond (hand, mouth). While responses with the mouth were overall slower, the advantage of the hand over the mouth responses was more marked with concrete than with abstract concepts. The results are in keeping with grounded and embodied theories of cognition and support the WAT proposal, according to which abstract concepts evoke linguistic-social information, hence activate the mouth. The mechanisms underlying the mouth activation with abstract concepts (re-enactment of acquisition experience, or re-explanation of the word meaning, possibly through inner talk) are discussed. To our knowledge this is the first behavioral study demonstrating with real words that the advantage of the hand over the mouth is more marked with concrete than with abstract concepts, likely because of the activation of linguistic information with abstract concepts.

  3. Grounding Abstractness: Abstract Concepts and the Activation of the Mouth.

    Science.gov (United States)

    Borghi, Anna M; Zarcone, Edoardo

    2016-01-01

    One key issue for theories of cognition is how abstract concepts, such as freedom, are represented. According to the WAT (Words As social Tools) proposal, abstract concepts activate both sensorimotor and linguistic/social information, and their acquisition modality involves the linguistic experience more than the acquisition of concrete concepts. We report an experiment in which participants were presented with abstract and concrete definitions followed by concrete and abstract target-words. When the definition and the word matched, participants were required to press a key, either with the hand or with the mouth. Response times and accuracy were recorded. As predicted, we found that abstract definitions and abstract words yielded slower responses and more errors compared to concrete definitions and concrete words. More crucially, there was an interaction between the target-words and the effector used to respond (hand, mouth). While responses with the mouth were overall slower, the advantage of the hand over the mouth responses was more marked with concrete than with abstract concepts. The results are in keeping with grounded and embodied theories of cognition and support the WAT proposal, according to which abstract concepts evoke linguistic-social information, hence activate the mouth. The mechanisms underlying the mouth activation with abstract concepts (re-enactment of acquisition experience, or re-explanation of the word meaning, possibly through inner talk) are discussed. To our knowledge this is the first behavioral study demonstrating with real words that the advantage of the hand over the mouth is more marked with concrete than with abstract concepts, likely because of the activation of linguistic information with abstract concepts.

  4. Mechanical Engineering Department technical abstracts

    International Nuclear Information System (INIS)

    Denney, R.M.

    1982-01-01

    The Mechanical Engineering Department publishes listings of technical abstracts twice a year to inform readers of the broad range of technical activities in the Department, and to promote an exchange of ideas. Details of the work covered by an abstract may be obtained by contacting the author(s). Overall information about current activities of each of the Department's seven divisions precedes the technical abstracts

  5. Mechanical Engineering Department technical abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Denney, R.M. (ed.)

    1982-07-01

    The Mechanical Engineering Department publishes listings of technical abstracts twice a year to inform readers of the broad range of technical activities in the Department, and to promote an exchange of ideas. Details of the work covered by an abstract may be obtained by contacting the author(s). Overall information about current activities of each of the Department's seven divisions precedes the technical abstracts.

  6. Logical Full Abstraction and PCF

    OpenAIRE

    Longley, John R; Plotkin, Gordon

    2000-01-01

    We introduce the concept of logical full abstraction, generalising the usual equational notion. We consider the language PCF and two extensions with “parallel” operations. The main result is that, for standard interpretations, logical full abstraction is equivalent to equational full abstraction together with universality; the proof involves constructing enumeration operators. We also consider restrictions on logical complexity and on the level of types.

  7. The USGS National Earthquake Information Center's Response to the Wenchuan, China Earthquake

    Science.gov (United States)

    Earle, P. S.; Wald, D. J.; Benz, H.; Sipkin, S.; Dewey, J.; Allen, T.; Jaiswal, K.; Buland, R.; Choy, G.; Hayes, G.; Hutko, A.

    2008-12-01

    Immediately after detecting the May 12th, 2008 Mw 7.9 Wenchuan Earthquake, the USGS National Earthquake Information Center (NEIC) began a coordinated effort to understand and communicate the earthquake's seismological characteristics, tectonic context, and humanitarian impact. NEIC's initial estimates of magnitude and location were distributed within 30 minutes of the quake by e-mail and text message to 70,000 users via the Earthquake Notification System. The release of these basic parameters automatically triggered the generation of more sophisticated derivative products that were used by relief and government agencies to plan their humanitarian response to the disaster. Body-wave and centroid moment tensors identified the earthquake's mechanism. Predictive ShakeMaps provided the first estimates of the geographic extent and amplitude of shaking. The initial automated population exposure estimate generated and distributed by the Prompt Assessment of Global Earthquakes for Response (PAGER) system stated that 1.2 million people were exposed to severe-to-extreme shaking (Modified Mercalli Intensity VIII or greater), indicating a large-scale disaster had occurred. NEIC's modeling of the mainshock and aftershocks was continuously refined and expanded. The length and orientation of the fault were determined from aftershocks, finite-fault models, and back-projection source imaging. Firsthand accounts of shaking intensity were collected and mapped by the "Did You Feel It" system. These results were used to refine our ShakeMaps and PAGER exposure estimates providing a more accurate assessment of the extent and enormity of the disaster. The products were organized and distributed in an event-specific summary poster and via the USGS Earthquake Program web pages where they were viewed by millions and reproduced by major media outlets (over 1/2 billion hits were served that month). Rather than just a point showing magnitude and epicenter, several of the media's schematic maps

  8. Abstract

    African Journals Online (AJOL)


    direction, which is believed to lead to improved social life and welfare. This means that Ethiopian trade and economic ... holding better market share and customer satisfaction in their products and services. In addition, in both ...... 1995. Dominant Values and Parenting Styles: Major Limiting Factors on the Development of.

  9. ABSTRACT

    African Journals Online (AJOL)

    their effective participation under different socio- ecological constraints (IDRC, 1993;Takyiwa, 1998;. Kinikanwo, 2000; Isiugo-Abanike, 1994; UNO, 1989). The general issue here is to estimate the extent of female participation in ruminant livestock operations with a view to establishing if stereotyping such operations along.

  10. Abstract

    African Journals Online (AJOL)

    feasible and sustainable options. ... strategies, specific to the benefits of exclusive breastfeeding as a mechanism to reduce the risk of HIV transmission is urgently needed ... Joyce Beatrice Ballidawa is a lecturer in the Department of Behavioural Sciences at Moi University School of Medicine, Eldoret, a position she has held.

  11. Abstracts

    International Nuclear Information System (INIS)

    2013-01-01

    The PowerPoint presentation covers: hazard identification, characterization, exposure evaluation, risk (CAC, 1997; FAO, 2007), the European Food Safety Authority, the foodrisk organization, pathogen risk ranking, risk reduction, and governmental responsibility

  12. ABSTRACT

    Indian Academy of Sciences (India)

    Efforts have also been successfully made to include the study of rock art in the school/ college curriculum so as to help develop awareness amongst the students and general public about the need to preserve this cultural heritage for the posterity and also to highlight its importance in tourism industry. rock art and their ...

  13. ABSTRACT

    Indian Academy of Sciences (India)

    ... school/ college curriculum so as to help develop awareness amongst the students and general public about the need to preserve this cultural heritage for the posterity and also to highlight its importance in tourism industry. rock art and their chronological sequences to more applied aspects like scientific methods of dating ...

  14. Abstract

    African Journals Online (AJOL)

    Francis

    Ficus species. Journal of Ethnopharmacology 41: 71-76. Nadkarni KM (1976) Indian Materia Medica. Third edition, Vol I. Popular Prakhasan, Bombay. NCCLS (National Committee for Clinical Laboratory Standards) (1999). Performance standards for antimicrobial susceptibility testing. 9th International Supplement M100- ...

  15. Abstract

    African Journals Online (AJOL)

    Getachew

    realistic distribution of no-show data in modeling the cost function was considered using data collected from the .... the paper models the cost function based on a realistic probability distributions based on the historical data is a .... Plot of Revenue generated vs. overbooking for two class case (at $500. Compensation Cost ...

  16. Abstract

    African Journals Online (AJOL)

    Dr Osondu

    2011-10-26

    Oct 26, 2011 ... Keywords: Municipal solid waste; Geographic information system; waste bin; optimal location; developing city. Introduction. Over the years, the spatial organization and existing infrastructure of cities in developing countries pose challenges for sustainable solid waste management programs. Much of the ...

  17. Abstract

    African Journals Online (AJOL)

    ATTAMAH C. O

    Differences in Climate Change Effects and Adaptation Strategies between Male and Female Livestock Entrepreneurs in ... differed from females in the adaptation strategies used in combating climate change and also on their view on ..... also make use of the same farm road whether in good or bad shape. This is in line with.

  18. ABSTRACT

    African Journals Online (AJOL)

    Dr Obe

    inner forces (bending moments, shearing forces etc) are usually redistributed. Cracks that often appear within the walls of tall buildings during constructions point to this phenomenon. It has also been recognized that foundation engineering is complicated. (1). Also settlement has been accepted as stress induced and time ...

  19. Abstract

    African Journals Online (AJOL)

    viral activity has been observed for halofantrine, amodiaquine and mepacrine. The clinical significance of these findings is uncertain. There is some evidence that HIV protease inhibitors may alter disease outcomes of coinfected patients.

  20. Abstract

    African Journals Online (AJOL)

    was to determine how the natural aerial connections affected the viability of colonies of O. longinoda in the tree crowns, in the presence of inimical ants, P. megacephala, on the ... counteracting worker ants from adjacent trees and observing whether fighting took place (Yar~ra, 1992). Ants of different colonies fight aggres...

  1. abstract

    Directory of Open Access Journals (Sweden)

    user

    2016-02-01

    Full Text Available Introduction: One of the microbiological preparations used for this study was Effective Microorganisms (EM), a commercial mixture of photosynthesizing bacteria, Actinomycetes, lactic acid bacteria, yeasts and fermenting fungi. The microbiological composition of the EM concentrate includes Streptomyces albus, Propionibacterium freudenreichii, Streptococcus lactis, Aspergillus oryzae, Mucor hiemalis, Saccharomyces cerevisiae and Candida utilis. Moreover, EM also contains an unspecified amount of Lactobacillus sp., Rhodopseudomonas sp. and Streptomyces griseus. Effective Microorganisms have a positive effect on the decomposition of organic matter, limiting putrefaction, increasing nitrogen and phosphorus content in the root medium of plants, improving soil fertility and, as a result, contributing to the growth and development of the root systems of plants. Selection of almond vegetative rootstocks for water stress tolerance is important for almond crop production in arid and semi-arid regions. The study of the eco-morphological characteristics that determine the success of a rootstock in a particular environment is a powerful tool for both agricultural management and breeding purposes. The aim of this work was to select new rootstocks for water shortage tolerance and to assess the impact of water stress as well as Effective Microorganisms (EM) on morphological characteristics of almond rootstocks. Materials and Methods: In order to select new rootstocks for water shortage tolerance, the impact of water stress as well as EM on morphological characteristics of almond rootstocks was studied in the Department of Horticulture, Ferdowsi University of Mashhad, in 2011-2012. The experiment was carried out with four replications in a completely randomized block design to study the effects of two concentrations of EM (0 and 1%), three irrigation levels (normal irrigation, 100%, as control, and irrigation after depletion of 33 and 66% of available water), and four almond rootstocks including GF677, selected natural hybrids of peach × almond (H1 and H2), and an almond vegetative rootstock (local control). In this study, EM treatments were applied for 60 days before the stress treatments, so that at each irrigation the EM solution at a concentration of one percent was given to half of the experimental pots. The other pots were irrigated equally with normal water. Stress levels were applied from July as follows: full irrigation, and watering after depletion of 33% and 66% of available soil moisture. In order to evaluate performance, seedling survival, plant growth, number of leaves, leaf area, root fresh and dry weight, and leaf and root length were measured. Results and Discussion: Analysis of variance showed that differences between rootstocks across all treatments were significant at the 0.01 level of probability. Comparison of means showed that the highest fresh and dry weight and leaf area were observed for GF677 and H1. Rootstock annual growth rate was also different; most of the growth was related to the H1 rootstock. The survival rate differed significantly among the rootstocks, and GF677 and H1 showed the highest percentage of survival. The degree of adaptation to drought differs among almond varieties. The results showed that changes in growth parameters in GF677 and H1 were observed less often than in other rootstocks. Because of their strong roots, GF677 and H1 continue to take up more minerals under stress conditions. 
Analysis of variance showed that differences between irrigation levels for all treatments were significant at the 0.01 level of probability. Comparison of means showed that, among the studied traits, the highest values were obtained under complete irrigation, while irrigation at 66 percent depletion gave the least. Water stress may affect photosynthesis directly, through the leaf photochemical process, or indirectly, by closing stomata and reducing leaf area and growth. The results showed that the levels of EM had significantly different effects on leaf surface, leaf number, annual growth, and root dry weight and volume (p
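
    The factorial analysis of variance described above (rootstock x irrigation x EM, with blocks) could be run along the following lines. This is a minimal sketch; the file name and column names are illustrative assumptions, not the study's actual dataset.

```python
# Hypothetical sketch of a three-factor ANOVA with blocks for a trait such as
# leaf area. The CSV file and its columns (rootstock, irrigation, em, block,
# leaf_area) are assumptions made for illustration.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("almond_trial.csv")

model = smf.ols(
    "leaf_area ~ C(rootstock) * C(irrigation) * C(em) + C(block)",
    data=df,
).fit()

# Type-II F-tests for the main effects and their interactions.
print(anova_lm(model, typ=2))
```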

  2. ABSTRACT

    African Journals Online (AJOL)

    BSN

    The preservative effect of cowpea pods, seeds, husk, and water and ethanol extracts of the seeds and ... preservation of "kindirmo" with water and ethanol extracts of seeds and husk of cowpea for most of the ... Perhaps the same may apply in the area of preservatives: plant products may be safer and biologically friendlier.

  3. Abstract

    Indian Academy of Sciences (India)

    2017-03-10

    Mar 10, 2017 ... TaqMan allelic discrimination assay. .... women). All patients fulfilled the 1987 American College of Rheumatology criteria for RA. (Arnett et al. 1988). A rheumatology university fellow reviewed all clinical data. ... The rs6457617 and rs13192471 were genotyped with a TaqMan 5' allelic discrimination.

  4. abstract

    Directory of Open Access Journals (Sweden)


    2016-07-01

    Full Text Available Introduction: Strawberry (Fragaria × ananassa Duch.) fruit is characterized by a short storage life, often estimated to last less than one week even under optimum conditions at 8°C. The loss of fruit quality is often caused by gray mold (Botrytis cinerea), the most frequently reported postharvest disease of strawberry during storage (6). In recent years, considerable attention has been given to eliminating the application of synthetic chemicals and fungicides and to developing various alternative strategies for controlling fruit and vegetable diseases (2). One strategy is the use of natural products of plant origin such as essential oils and methyl salicylate (MeSA). Essential oils are volatile, natural and complex compounds characterized by a strong odor, formed by aromatic plants as secondary metabolites. In nature, essential oils such as those extracted from lavender (Lavandula angustifolia) play an important role in the protection of plants against pathogen incidence and can replace synthetic fungicides (1, 4 and 14). MeSA is also a volatile natural compound synthesized from salicylic acid which has an important role in the plant defense mechanism, as well as in plant growth and development (5, 19 and 20). Therefore, the main objective of this research was to study the effects of MeSA and lavender essential oil (LEO) on the control of decay caused by Botrytis cinerea as well as on postharvest quality indices of strawberry fruits during cold storage. Material and Methods: First, antifungal activity was studied using a contact assay (in vitro), which measures hyphal growth inhibition. Briefly, potato dextrose agar (PDA) plates were prepared using 8 cm diameter glass petri dishes and the inhibition percentage was determined. For in-vivo assessment of LEO and MeSA effects on the control of Botrytis-caused fungal disease, the experiment was conducted as a factorial in a completely randomized design (CRD) with 3 replicates. The treatments were 3 concentrations of LEO (0, 500 and 1000 µl L-1) and 3 levels of MeSA (0, 0.1 and 0.2 mM). After treatment, the fruits were inoculated with a Botrytis suspension and transferred to storage, and quality parameters were evaluated after 7, 14 and 21 days. At each sampling time, disease incidence, weight loss, titratable acidity, pH, soluble solids content, vitamin C and antioxidant activity were measured. Results and Discussion: The results showed that both LEO and MeSA treatments had significant effects on the inhibition of mycelium growth under in-vitro conditions (p < 0.05). The inhibition rate of mycelium growth improved significantly with increasing LEO and MeSA concentration (Table 1). In the in-vivo assessment, disease incidence of fruits treated with 500 µl L-1 LEO and 0.1 mM MeSA was 32% and 64% lower than in untreated fruits, respectively (Fig. 1 and 2). During the storage period, the percentage of infected fruits increased. In addition, LEO and MeSA treatments affected quality parameters of strawberry fruits including titratable acidity, soluble solids content, vitamin C and antioxidant activity. Treated fruits had a higher content of soluble solids, vitamin C and antioxidant activity in comparison to untreated fruits (Tables 3 and 4). Ascorbic acid probably decreased through fungal infection due to cell wall breakdown during storage. Any factor, such as essential oil or salicylate, that inhibits fungal growth can help preserve vitamin C in stored products. 
High levels of vitamin C and antioxidant activity were observed in fruits treated with 0.1 mM MeSA and 500 µl L-1 LEO. In controlling fruit weight loss, 0.2 mM MeSA and 500 µl L-1 LEO had significant effects, although MeSA was more effective than the LEO treatments, possibly due to suppression of respiration and fungal infection (Table 4). Therefore, LEO and MeSA, with their fungicidal effects, could replace synthetic fungicides in controlling fungal diseases of strawberry and maintaining fruit quality during storage. Conclusion: Our results showed that LEO and MeSA treatments are safe and can be used to prevent infection of strawberry during storage, although LEO was more effective than the MeSA treatments. Concentrations of 500 μl L-1 LEO and 0.1 mM MeSA could control fungal infection of fruits during storage. LEO and MeSA treatments can also extend shelf life beyond the minimum period required to transport strawberries to foreign markets without adversely affecting quality. However, future studies are necessary to fully understand the mechanisms by which LEO and MeSA treatments act as fungicides and increase postharvest life.
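
    The in-vitro contact assay above reports a percentage inhibition of mycelial growth. The record does not state the exact formula used, but a commonly applied definition (stated here as an assumption) is

```latex
\[
I(\%) = \frac{d_c - d_t}{d_c} \times 100
\]
```

    where $d_c$ and $d_t$ are the mean colony diameters on the control and treated PDA plates, respectively.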

  5. Abstract

    Indian Academy of Sciences (India)

    This talk deals with the geometry of Banach spaces. A non-reflexive Banach space embeds canonically in its second dual and the process continues, giving rise to a strictly increasing chain of Banach spaces. A well-known example of a geometric phenomenon that is preserved in this chain is that of being (isometric) a ...

  6. Abstract

    African Journals Online (AJOL)

    PROF. O. E. OSUAGWU

    This software template is also of immense benefit to students of different ... connection, there is the potential to track a learner's actions in a ..... intelligence. This software will be a direct application of artificial intelligence to develop a special authoring system for e-learning that will have the ability to learn. Intelligent authoring ...

  7. Abstract

    DEFF Research Database (Denmark)

    Tafdrup, Oliver

    2013-01-01

    Published as part of Tidskrift's special issue on Adorno. http://tidskrift.dk/data/50/Aforismesamling.pdf

  8. Abstract

    African Journals Online (AJOL)

    Maru Shete

    of voice, power and representation. To avert this situation, poor women in the research area require equal participation in resource sharing and in the power of decision making, better employment, housing, education, health care and other social service opportunities through savings and credit cooperatives.

  9. ABSTRACT

    African Journals Online (AJOL)

    Chylous leakage is an unusual complication following anterior spinal surgery. This leakage can occur as a result of traumatic injury to the thoracic duct, the cisterna chyli, or the retroperitoneal lymphatic vessels. We report a case of a 56-year-old female with thoracic spine disc prolapse with cord compression. She.

  10. Abstract

    Indian Academy of Sciences (India)

    2017-03-10

    Mar 10, 2017 ... Significant p-values were corrected (pc) by the number of alleles tested or subgroups analysed according to Bonferroni's ... LD in healthy controls between both rs13192471/rs6457617 with a value of D'=0.99 and ..... Radstake T.R., Gorlova O., Rueda B., Martin J.E., Alizadeh B.Z., Palomino-Morales R. et al.

  11. ABSTRACT

    African Journals Online (AJOL)

    [Garbled table extract: anion levels (Cl⁻, SO₄²⁻) determined for individual minerals by quantitative analysis; blank spaces imply metal levels below detection (see Table 1).]

  12. Abstract

    African Journals Online (AJOL)

    Getachew

    request made by a customer for a reservation of a certain class at time T. Although dynamic .... to both customer reaction upon denied boarding and profit loss. .... Sabanci University. http://www.optimization-online.org. Bailey, J. 2007. Bumped fliers and no plan B. The New York Times. Beckman, M.J. & Bobkoski, F. 1985.
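
    The overbooking model referred to in this record balances denied-boarding compensation against revenue lost on empty seats. Purely as an illustration (not the paper's exact formulation), the expected cost of a booking limit can be written against a binomial show-up distribution and minimized numerically; all parameter values below are assumptions.

```python
# Illustrative sketch: expected cost of accepting `bookings` reservations when
# each booked passenger shows up independently with probability p_show,
# capacity is `capacity`, each bumped passenger costs `comp`, and each empty
# seat forgoes `fare` in revenue.
from scipy.stats import binom

def expected_cost(bookings, capacity, p_show, comp, fare):
    cost = 0.0
    for shows in range(bookings + 1):
        prob = binom.pmf(shows, bookings, p_show)
        bumped = max(shows - capacity, 0)
        empty = max(capacity - shows, 0)
        cost += prob * (bumped * comp + empty * fare)
    return cost

# Pick the booking limit that minimizes expected cost for a 100-seat cabin.
best = min(range(100, 121), key=lambda b: expected_cost(b, 100, 0.9, 500.0, 300.0))
print(best)
```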

  13. ABSTRACT

    Directory of Open Access Journals (Sweden)

    Michelle de Stefano Sabino

    2011-12-01

    Full Text Available This paper aims to describe and analyze the integration observed in the Sintonia project with respect to the comparison of project management processes with the Stage-Gate® model. The literature addresses these issues conceptually but lacks an alignment between them that is evident in practice. A single case study was used as the method. The case reported is the Sintonia project, developed by PRODESP - Data Processing Company of São Paulo. The results show the integration of project management processes with the Stage-Gate model developed during the project life cycle. The formalization of the project was defined in stages, which allowed the exploitation of economies of repetition and recombination in the development of new projects. This study contributes to the technical vision of dealing with the integration of project management processes. It was concluded that this system represents an attractive way, in terms of creating economic value and technological innovation, for the organization.

  14. Abstract

    African Journals Online (AJOL)

    UDS-CAPTURED

    The high cost of delivering financial services to small and widely dispersed customers as well as difficult financial terrain in rural settings characterized by high covariant risks, missing markets for risk management instruments and ... Improving the extent of access to credit for low income households is a vital part of any rural ...

  15. Abstract

    African Journals Online (AJOL)

    dell

    the curriculum in higher education. In a similar way, major advances in biological, health sciences, social sciences, physical and life sciences, business and economics, and technology lead to revision of courses in the field. In line with the everlasting explosion of knowledge and increasing sophistication of technology ...

  16. ABSTRACT

    African Journals Online (AJOL)

    production alone cannot provide the animal protein needs of about 100 million Nigerians. This, therefore calls for ... Fish contributes about 12 percent of the total animal protein supply of the World population (Borgstorm, ..... motivation for extension personnel (7.5%), inadequate transport facilities (5.5%), absence of strong ...

  17. Abstracts

    OpenAIRE

    Revista, Innovar

    2011-01-01

    New approaches towards Efficiency, Productivity and Quality in Management Theory / The new paradigm regarding Science and Management Theory / Game Theory as applied to Administration / A Systemic approach to Territorial Diagnosis / A profile of Technological Capacity in the Graphical Art, Printing and Publishing Industry / Colombian Industrialisation: a Heterodox Vision / Determinant factors in environmental po...

  18. Abstract

    African Journals Online (AJOL)

    Implementing this collaborative e-learning environment on a Linux thin-client system makes it possible for this environment to be available in most schools and companies because the Linux thin-clients are less expensive than other conventional computing systems. Developing a. Collaborative E-Learning Environment on ...

  19. abstract

    African Journals Online (AJOL)

    communication and institutional activities in removing constraints which impede the acceptance and continued usage .... With farmers' feedback, scientists cannot misinterpret a problem or attribute wrong causes to it. ..... Protection and Environmental Management, University of Ibadan, Ibadan. Ashby, J. (1990): Small-farmer ...

  20. Abstract

    African Journals Online (AJOL)

    Government to educate dairy farmers, milk vendors and consumers on the importance of producing, selling and consuming, respectively, un-adulterated milk. Key words: Milk, water adulteration, Morogoro Municipality. Introduction. Total annual milk production in Tanzania is estimated at 724,000 metric tons (FAO,

  1. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of the experience of the Spitak earthquake (Armenia, December 1988), it is found that an earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open space of the close-to-earth atmosphere, are contrastingly displayed in covered premises (dwellings, schools, kindergartens) even if they are at a considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes spans the period from the first fore-shock to the last after-shock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured - 25,000 people; radiation-induced diseases - over 300,000 people). The influence of radiation directly correlates with the earthquake force. Such a conclusion is underpinned by indoor radon monitoring data for Yerevan since 1987 (120 km from the epicenter; 5,450 measurements) and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of room radon concentrations, the effective equivalent dose of radiation, the impact of the radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia during the period of more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  2. a Collaborative Cyberinfrastructure for Earthquake Seismology

    Science.gov (United States)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure benefits from the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from the real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS, to collect ground motion records performed by volunteers, and are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...) not only to distribute earthquake information, but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this
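
    The flashsourcing idea described above can be illustrated with a minimal sketch: flag minutes where the count of website visitors jumps far above its recent baseline. The threshold, window length and toy data below are assumptions, not EMSC's operational parameters.

```python
# Minimal sketch: flag a felt event when the per-minute visitor count rises
# well above the mean of the preceding baseline window. Threshold and window
# lengths are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

def detect_spikes(visitors_per_minute, baseline_len=60, n_sigma=5.0):
    baseline = deque(maxlen=baseline_len)
    alerts = []
    for minute, count in enumerate(visitors_per_minute):
        if len(baseline) == baseline.maxlen:
            mu, sigma = mean(baseline), stdev(baseline)
            if count > mu + n_sigma * max(sigma, 1.0):
                alerts.append(minute)
        baseline.append(count)
    return alerts

# Toy traffic: quiet background, then a sudden surge of visitors.
print(detect_spikes([20] * 120 + [400, 600, 500] + [25] * 60))
```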

  3. Ionospheric earthquake effects detection based on Total Electron Content (TEC) GPS Correlation

    Science.gov (United States)

    Sunardi, Bambang; Muslim, Buldan; Eka Sakya, Andi; Rohadi, Supriyanto; Sulastri; Murjaya, Jaya

    2018-03-01

    Advances in science and technology have shown that ground-based GPS receivers are able to detect ionospheric Total Electron Content (TEC) disturbances caused by various natural phenomena such as earthquakes. One study of the Tohoku (Japan) earthquake of March 11, 2011, magnitude M 9.0, showed TEC fluctuations observed from a GPS observation network spread around the disaster area. This paper discusses the detection of ionospheric earthquake effects using TEC GPS data. The case studies taken were the Kebumen earthquake, January 25, 2014, magnitude M 6.2; the Sumba earthquake, February 12, 2016, M 6.2; and the Halmahera earthquake, February 17, 2016, M 6.1. TEC-GIM (Global Ionosphere Map) correlation methods over 31 days were used to monitor TEC anomalies in the ionosphere. To rule out geomagnetic disturbances due to solar activity, we also compared with the Dst index in the same time window. The results showed an anomalous ratio of the correlation coefficient deviation to its standard deviation upon the occurrence of the Kebumen and Sumba earthquakes, but no similar anomaly was detected for the Halmahera earthquake. Continuous monitoring of TEC GPS data is needed to detect earthquake effects in the ionosphere. This study gives hope for strengthening earthquake-effect early warning systems using TEC GPS data. The development of a method for continuous TEC GPS observation derived from the GPS observation network that already exists in Indonesia is needed to support earthquake-effect early warning systems.
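
    A hedged sketch of the correlation measure described above: compute one correlation coefficient per day between station-derived TEC and the corresponding GIM values, then flag days whose coefficient falls well below the mean over the analysis window. The array layout and the threshold are assumptions, not the authors' exact processing.

```python
# Assumed inputs: tec_gps and tec_gim are arrays of shape (n_days, n_epochs)
# holding vertical TEC sampled at the same epochs each day.
import numpy as np

def daily_correlations(tec_gps, tec_gim):
    # One Pearson correlation coefficient per day between GPS TEC and GIM TEC.
    return np.array([np.corrcoef(g, m)[0, 1] for g, m in zip(tec_gps, tec_gim)])

def anomalous_days(corr, k=2.0):
    # Flag days whose correlation drops more than k standard deviations below
    # the mean over the (e.g. 31-day) analysis window; k is an assumption.
    mu, sigma = corr.mean(), corr.std()
    return np.where((mu - corr) / sigma > k)[0]
```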

  4. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  5. Modal abstractions of concurrent behavior

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nanz, Sebastian; Nielson, Hanne Riis

    2011-01-01

    We present an effective algorithm for the automatic construction of finite modal transition systems as abstractions of potentially infinite concurrent processes. Modal transition systems are recognized as valuable abstractions for model checking because they allow for the validation as well as re...

  6. Pattern-Based Graph Abstraction

    NARCIS (Netherlands)

    Rensink, Arend; Zambon, Eduardo; Ehrig, H; Engels, G.; Kreowski, H.J.; Rozenberg, G.

    We present a new abstraction technique for the exploration of graph transformation systems with infinite state spaces. This technique is based on patterns, simple graphs describing structures of interest that should be preserved by the abstraction. Patterns are collected into pattern graphs, layered

  7. Abstraction by Set-Membership

    DEFF Research Database (Denmark)

    Mödersheim, Sebastian Alexander

    2010-01-01

    that the set of true facts does not monotonically grow with the transitions. We extend the scope of these over-approximation methods by defining a new way of abstraction that can handle such databases, and we formally prove that the abstraction is sound. We realize a translator from a convenient specification...

  8. Abstract concepts in grounded cognition

    NARCIS (Netherlands)

    Lakens, D.

    2010-01-01

    When people think about highly abstract concepts, they draw upon concrete experiences to structure their thoughts. For example, black knights in fairytales are evil, and knights in shining armor are good. The sensory experiences black and white are used to represent the abstract concepts of good and

  9. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  10. Technical abstracts: Mechanical engineering, 1990

    International Nuclear Information System (INIS)

    Broesius, J.Y.

    1991-01-01

    This document is a compilation of the published, unclassified abstracts produced by mechanical engineers at Lawrence Livermore National Laboratory (LLNL) during the calendar year 1990. Many abstracts summarize work completed and published in report form. These are UCRL-JC series documents, which include the full text of articles to be published in journals and of papers to be presented at meetings, and UCID reports, which are informal documents. Not all UCIDs contain abstracts: short summaries were generated when abstracts were not included. Technical Abstracts also provides descriptions of those documents assigned to the UCRL-MI (miscellaneous) category. These are generally viewgraphs or photographs presented at meetings. An author index is provided at the back of this volume for cross referencing

  11. Metaphor: Bridging embodiment to abstraction.

    Science.gov (United States)

    Jamrozik, Anja; McQuire, Marguerite; Cardillo, Eileen R; Chatterjee, Anjan

    2016-08-01

    Embodied cognition accounts posit that concepts are grounded in our sensory and motor systems. An important challenge for these accounts is explaining how abstract concepts, which do not directly call upon sensory or motor information, can be informed by experience. We propose that metaphor is one important vehicle guiding the development and use of abstract concepts. Metaphors allow us to draw on concrete, familiar domains to acquire and reason about abstract concepts. Additionally, repeated metaphoric use drawing on particular aspects of concrete experience can result in the development of new abstract representations. These abstractions, which are derived from embodied experience but lack much of the sensorimotor information associated with it, can then be flexibly applied to understand new situations.

  12. Technical abstracts: Mechanical engineering, 1990

    Energy Technology Data Exchange (ETDEWEB)

    Broesius, J.Y. (comp.)

    1991-03-01

    This document is a compilation of the published, unclassified abstracts produced by mechanical engineers at Lawrence Livermore National Laboratory (LLNL) during the calendar year 1990. Many abstracts summarize work completed and published in report form. These are UCRL-JC series documents, which include the full text of articles to be published in journals and of papers to be presented at meetings, and UCID reports, which are informal documents. Not all UCIDs contain abstracts: short summaries were generated when abstracts were not included. Technical Abstracts also provides descriptions of those documents assigned to the UCRL-MI (miscellaneous) category. These are generally viewgraphs or photographs presented at meetings. An author index is provided at the back of this volume for cross referencing.

  13. Abstract Interpretation and Attribute Grammars

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    The objective of this thesis is to explore the connections between abstract interpretation and attribute grammars as frameworks in program analysis. Abstract interpretation is a semantics-based program analysis method. A large class of data flow analysis problems can be expressed as non-standard semantics where the ``meaning'' contains information about the runtime behaviour of programs. In an abstract interpretation the analysis is proved correct by relating it to the usual semantics for the language. Attribute grammars provide a method and notation to specify code generation and program analysis...

  14. Least-Total-Cost Analysis for Earthquake Design Levels.

    Science.gov (United States)

    1980-06-01

    Keywords: Earthquakes, Structural design, Costs, Damage, Least cost, Optimal design, Seismic risk. ... Oct 2-3, 1975, University of Illinois, Urbana, Ill. 16. University of California. EERC 75-27: Identification of research needs for improving aseismic

  15. The September 2011 Sikkim Himalaya earthquake Mw 6.9: is it a plane of detachment earthquake?

    Directory of Open Access Journals (Sweden)

    Santanu Baruah

    2016-01-01

    Full Text Available The 18 September 2011 Sikkim Himalaya earthquake of Mw 6.9 (focal depth 50 km, NEIC report), with a maximum intensity of VII on the MM scale (www.usgs.gov), occurred in the Himalayan seismic belt (HSB), to the north of the main central thrust. Neither this thrust nor the plane of detachment envisaged in the HSB model, however, caused this strong devastating earthquake. The Engdahl–Hilst–Buland (EHB) relocated past earthquakes recorded during 1965–2007 and the available global centroid moment tensor solutions are critically examined to identify the source zone and stress regime of the September 2011 earthquake. The depth section plot of these earthquakes shows that a deeper (10–50 km) vertical fault zone caused the main shock in the Sikkim Himalaya. The NW (north-west) and NE (north-east) trending transverse fault zones cutting across the eastern Himalaya are the source zones of the earthquakes. Stress inversion shows that the region is dominated by horizontal NNW-SSE (north-north-west to south-south-east) compressional stress and low-angle or near-horizontal ENE-WSW (east-north-east to west-south-west) tensional stress; this stress regime is conducive to strike-slip faulting earthquakes in the Sikkim Himalaya and its vicinity. The Coulomb stress transfer analysis indicates positive values of Coulomb stress change for failure in the intersecting deeper fault zone that produced the four immediate felt aftershocks (M ≥ 4.0).
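
    The Coulomb stress transfer analysis mentioned above is commonly based on the standard Coulomb failure stress change, quoted here for reference (the abstract does not give the paper's exact parameter choices, e.g. the effective friction coefficient):

```latex
\[
\Delta \mathrm{CFS} \;=\; \Delta\tau \;+\; \mu' \, \Delta\sigma_n
\]
```

    where $\Delta\tau$ is the change in shear stress resolved in the slip direction of the receiver fault, $\Delta\sigma_n$ is the change in normal stress (positive for unclamping), and $\mu'$ is the effective friction coefficient; positive $\Delta \mathrm{CFS}$ brings the receiver fault closer to failure.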

  16. Is there a relationship between solar activity and earthquakes?

    Science.gov (United States)

    L'Huissier, P.; Dominguez, M.; Gallo, N.; Tapia, M.; Pinto, V. A.; Moya, P. S.; Stepanova, M. V.; Munoz, V.; Rogan, J.; Valdivia, J. A.

    2012-12-01

    Several statistical studies have suggested a connection between solar and geomagnetic activity and seismicity. Some studies claim there are global effects, relating solar activity, for instance, with earthquake occurrence on the Earth. Other studies intend to find effects on a local scale, where perturbations in geomagnetic activity are followed by seismic events. We investigate this issue by means of a surrogates method. First, we analyze the statistical validity of reported correlations between the number of sunspots and the annual number of earthquakes during the last century. On the other hand, in relation to local geomagnetic variations prior to an important earthquake, we carry out a study of the magnetic field fluctuations using the SAMBA array in a window of two years centered on the February 27th, 2010 M = 8.8 earthquake in Chile. We expect these studies to be useful in order to find measurable precursors before an important seismic event.
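
    A minimal sketch of the surrogates approach mentioned above for the sunspot/earthquake-count correlation: compare the observed correlation with the distribution obtained after repeatedly shuffling one of the series. The input arrays and the number of surrogates are assumptions.

```python
# Permutation (surrogate) test: how often does a shuffled earthquake-count
# series produce a correlation with the sunspot series at least as strong as
# the observed one?
import numpy as np

def surrogate_p_value(sunspots, quake_counts, n_surrogates=10000, seed=0):
    rng = np.random.default_rng(seed)
    observed = abs(np.corrcoef(sunspots, quake_counts)[0, 1])
    exceed = 0
    for _ in range(n_surrogates):
        shuffled = rng.permutation(quake_counts)
        if abs(np.corrcoef(sunspots, shuffled)[0, 1]) >= observed:
            exceed += 1
    # Add-one correction keeps the p-value away from exactly zero.
    return (exceed + 1) / (n_surrogates + 1)
```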

  17. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  18. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  19. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  20. The SMART CLUSTER METHOD - adaptive earthquake cluster analysis and declustering

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2016-04-01

    the dataset receives a unique cluster-ID which links it to its respective cluster. Utilising the knowledge of cluster classification, the method has been adjusted to provide an earthquake declustering algorithm, which is compared to common existing methods used for declustering in PSHA in detail. The SCM provides a significantly more reliable declustering method in comparison to existing approaches. Instead of simply removing events from the dataset, it first identifies clusters and analyses them to avoid artefacts and non-physical implications. Furthermore, the SCM is the foundation of a more detailed analysis of global earthquake cluster activity. Using the knowledge of identified clusters, a detailed analysis of cluster properties can be undertaken.
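
    The SCM itself is adaptive and cluster-based; purely as a point of comparison, the sketch below shows a much simpler window-based declustering in the spirit of classic Gardner-Knopoff approaches. The window formulas and event format are illustrative assumptions, not the SCM algorithm or the published window tables.

```python
# Simplified window-based declustering: process events from largest to
# smallest magnitude and keep an event as a mainshock only if it falls outside
# the space-time window of every larger mainshock already accepted.
import math

def haversine_km(a, b):
    r = 6371.0
    p1, p2 = math.radians(a["lat"]), math.radians(b["lat"])
    dp, dl = p2 - p1, math.radians(b["lon"] - a["lon"])
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))

def decluster(events):
    # events: list of dicts with keys t (days), lat, lon, mag.
    def window(mag):
        # Illustrative space-time window (km, days); not the published tables.
        return 10 ** (0.1238 * mag + 0.983), 10 ** (0.032 * mag + 2.7389)

    mainshocks = []
    for ev in sorted(events, key=lambda e: -e["mag"]):
        is_aftershock = False
        for m in mainshocks:
            dist_km, dt_days = window(m["mag"])
            if haversine_km(ev, m) <= dist_km and 0 <= ev["t"] - m["t"] <= dt_days:
                is_aftershock = True
                break
        if not is_aftershock:
            mainshocks.append(ev)
    return mainshocks
```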

  1. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt. 

  2. People’s Republic of China Scientific Abstracts, Number 195.

    Science.gov (United States)

    1978-08-01

    CHIH-WU HSUEH-PAO [ACTA BOTANICA SINICA] No 1, March 1978; TI-CHEN CHAN-HSIEN [EARTHQUAKE FRONT] No 2, April 1978; ... ACTA BOTANICA SINICA. AUTHOR: CHANG Te-i ... "of Cultured Tobacco Cells". SOURCE: Peking CHIH-WU HSUEH-PAO [ACTA BOTANICA SINICA] in Chinese, No 1, Mar 78, pp 1-5. TEXT OF ENGLISH ABSTRACT

  3. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over a period of more than a century. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  4. Selected Translated Abstracts of Chinese-Language Climate Change Publications

    Energy Technology Data Exchange (ETDEWEB)

    Cushman, R.M.; Burtis, M.D.

    1999-05-01

    This report contains English-translated abstracts of important Chinese-language literature concerning global climate change for the years 1995-1998. This body of literature includes the topics of adaptation, ancient climate change, climate variation, the East Asia monsoon, historical climate change, impacts, modeling, and radiation and trace-gas emissions. In addition to the bibliographic citations and abstracts translated into English, this report presents the original citations and abstracts in Chinese. Author and title indexes are included to assist the reader in locating abstracts of particular interest.

  5. Nuclear energy and environment: abstracts

    International Nuclear Information System (INIS)

    1999-01-01

    In this meeting on nuclear energy and the environment, abstracts on the following subjects were presented: nuclear fuels; materials; radioisotopes and their applications; reactors and nuclear power plants; regulations, energy and environment; radioactive wastes; and analytical techniques

  6. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results upon rocks, however, shows a large discrepancy with measurement, a fact that has been dubbed “the heat flow paradox”. For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth’s crust: Without taking the tectonic force into account, according to the rheological principle of “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time; thus no differential stress in the original crustal rocks is to be expected. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge are different from those of rocks as it consists of granular matter. The elastic moduli of fault gouges are much less than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force increasing with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  7. Stable Continental Region Earthquakes in South China

    Science.gov (United States)

    Liu, L.

    This paper reviews some remarkable characteristics of earthquakes in a Stable Continental Region (SCR) of the South China Block (SCB). The kernel of the SCB is the Yangtze platform solidified in late Proterozoic time, with continental growth to the southeast by a series of fold belts in Paleozoic time. The facts that the deviatoric stress is low, the orientations of the major tectonic features in the SCB are substantially normal to the maximum horizontal principal stress, and a relatively uniform crust, seem to be the major reasons for lack of significant seismicity in most regions of the SCB. Earthquakes in this region are mainly associated with three seismic zones: (1) the Southeast China Coast seismic zone related to Guangdong-Fujian coastal folding belt (associated with Eurasia-Philippine Sea plate collision); (2) the Southern Yellow Sea seismic zone associated with continental shelf rifts and basins; and (3) the Downstream Yangtze River seismic zone spatially coinciding with Tertiary rifts and basin development. All three seismic zones are close to one or two major economic and population centers in the SCB so that they pose significant seismic hazards. Earthquake focal mechanisms in the SCB are consistent with strike-slip to normal faulting stress regimes. Because of the global and national economic significance of the SCB and its dense population, the seismic hazard of the region is of outstanding importance. Comparing the SCB with another less developed region, a pending earthquake with the same size and tectonic setting would cause substantially more severe social and economic losses in the SCB. This paper also compiles an inventory of historic moderate to great earthquakes in the SCB; most of the data are not widely available in English literature.

  8. Homogeneous catalogs of earthquakes.

    Science.gov (United States)

    Knopoff, L; Gardner, J K

    1969-08-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967.

  9. HOMOGENEOUS CATALOGS OF EARTHQUAKES*

    Science.gov (United States)

    Knopoff, Leon; Gardner, J. K.

    1969-01-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967. PMID:16578700
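
    Not Knopoff and Gardner's exact statistic, but an illustrative check in the same spirit: above a suitable magnitude cutoff, successive magnitudes in an unbiased catalog should be mutually independent, so their lag-1 correlation should be indistinguishable from that of a randomly shuffled catalog. The candidate cutoffs, sample-size floor and significance level below are assumptions.

```python
# Scan trial cutoffs and return the smallest one at which successive
# magnitudes look random (lag-1 correlation consistent with shuffled data).
import numpy as np

def lag1_corr(mags):
    return np.corrcoef(mags[:-1], mags[1:])[0, 1]

def completeness_cutoff(mags, cutoffs=(2.5, 3.0, 3.5, 4.0, 4.5),
                        n_perm=2000, seed=0):
    rng = np.random.default_rng(seed)
    for m_c in cutoffs:
        sub = np.asarray([m for m in mags if m >= m_c])
        if len(sub) < 50:          # too few events to test reliably
            break
        obs = abs(lag1_corr(sub))
        null = [abs(lag1_corr(rng.permutation(sub))) for _ in range(n_perm)]
        p = (np.sum(np.asarray(null) >= obs) + 1) / (n_perm + 1)
        if p > 0.05:               # consistent with randomness
            return m_c
    return None
```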

  10. Earthquake in Haiti

    DEFF Research Database (Denmark)

    Holm, Isak Winkel

    2012-01-01

    In the vocabulary of modern disaster research, Heinrich von Kleist's seminal short story "The Earthquake in Chile" from 1806 is a tale of disaster vulnerability. The story is not just about a natural disaster destroying the innocent city of Santiago but also about the ensuing social disaster...

  11. Earthquake-proof plants

    International Nuclear Information System (INIS)

    Francescutti, P.

    2008-01-01

    In the wake of the damage suffered by the Kashiwazaki-Kariwa nuclear power plant as a result of an earthquake last July, this article looks at the seismic risk affecting the Spanish plants and the safety measures in place to prevent it. (Author)

  12. Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities between the features of the time series for the overlap of two Cantor sets (model earthquakes) when one set moves with uniform relative velocity over the other, and the time series of stock prices. An anticipation method for some of the crashes has been proposed here, based on these observations.

  13. Systematic Detection of Remotely Triggered Seismicity in Africa Following Recent Large Earthquakes

    Science.gov (United States)

    Ayorinde, A. O.; Peng, Z.; Yao, D.; Bansal, A. R.

    2016-12-01

    It is well known that large distant earthquakes can trigger micro-earthquakes and tectonic tremor during or immediately following their surface waves. Globally, triggered earthquakes have mostly been found in active plate boundary regions. It is not clear whether they could also occur within stable intraplate regions in Africa as well as in the active East African Rift Zone. In this study we conduct a systematic search for remote triggering in Africa following recent large earthquakes, including the 2004 Mw 9.1 Sumatra and 2012 Mw 8.6 Indian Ocean earthquakes. In particular, the 2012 Indian Ocean earthquake is the largest known strike-slip earthquake and triggered a global increase of earthquakes larger than magnitude 5.5 as well as numerous micro-earthquakes and tectonic tremors around the world. The entire African region was examined for possible remotely triggered seismicity using seismic data downloaded from the Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) and the GFZ German Research Center for Geosciences. We apply a 5 Hz high-pass filter to the continuous waveforms and visually identify high-frequency signals during and immediately after the large-amplitude surface waves. Spectrograms are computed as additional tools to identify triggered seismicity, and we further confirm it by statistical analysis comparing the high-frequency signals before and after the distant mainshocks. So far we have identified possible triggered seismicity in Botswana and northern Madagascar. This study could help to understand dynamic triggering in the diverse tectonic settings of the African continent.
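
    A hedged sketch of the pre-processing described above: high-pass filter a continuous waveform at 5 Hz and compare the high-frequency energy in equal-length windows before and after the teleseismic surface-wave arrival. The sampling rate, window length and arrival index are inputs assumed to be supplied by the caller.

```python
# 5 Hz high-pass filter plus a before/after energy ratio; a ratio well above 1
# hints at locally triggered high-frequency seismicity.
import numpy as np
from scipy.signal import butter, filtfilt

def highpass(trace, fs, corner=5.0, order=4):
    b, a = butter(order, corner / (0.5 * fs), btype="highpass")
    return filtfilt(b, a, trace)

def energy_ratio(trace, fs, arrival_idx, window_s=600):
    # arrival_idx must leave at least window_s seconds on each side.
    hf = highpass(np.asarray(trace, dtype=float), fs)
    n = int(window_s * fs)
    before = hf[arrival_idx - n:arrival_idx]
    after = hf[arrival_idx:arrival_idx + n]
    return np.sum(after ** 2) / np.sum(before ** 2)
```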

  14. 2002 Industry Analysis Research Paper: Global Environment, Global Industry, and Global Security: Managing the Crossroads

    Science.gov (United States)

    2002-01-01

    2002 Industry Analysis Research Paper: Global Environment, Global Industry, and Global Security: Managing the Crossroads. Abstract. The events of...

  15. A review on remotely sensed land surface temperature anomaly as an earthquake precursor

    Science.gov (United States)

    Bhardwaj, Anshuman; Singh, Shaktiman; Sam, Lydia; Joshi, P. K.; Bhardwaj, Akanksha; Martín-Torres, F. Javier; Kumar, Rajesh

    2017-12-01

    The low predictability of earthquakes and the high uncertainty associated with their forecasts make earthquakes one of the worst natural calamities, capable of causing instant loss of life and property. Here, we discuss the studies reporting the observed anomalies in the satellite-derived Land Surface Temperature (LST) before an earthquake. We compile the conclusions of these studies and evaluate the use of remotely sensed LST anomalies as precursors of earthquakes. The arrival times and the amplitudes of the anomalies vary widely, thus making it difficult to consider them as universal markers to issue earthquake warnings. Based on the randomness in the observations of these precursors, we support employing a global-scale monitoring system to detect statistically robust anomalous geophysical signals prior to earthquakes before considering them as definite precursors.

  16. Rupture evolution of the 2006 Java tsunami earthquake and the possible role of splay faults

    Science.gov (United States)

    Fan, Wenyuan; Bassett, Dan; Jiang, Junle; Shearer, Peter M.; Ji, Chen

    2017-11-01

    The 2006 Mw 7.8 Java earthquake was a tsunami earthquake, exhibiting frequency-dependent seismic radiation along strike. High-frequency global back-projection results suggest two distinct rupture stages. The first stage lasted ∼65 s with a rupture speed of ∼1.2 km/s, while the second stage lasted from ∼65 to 150 s with a rupture speed of ∼2.7 km/s. High-frequency radiators resolved with back-projection during the second stage spatially correlate with splay fault traces mapped from residual free-air gravity anomalies. These splay faults also colocate with a major tsunami source associated with the earthquake inferred from tsunami first-crest back-propagation simulation. These correlations suggest that the splay faults may have been reactivated during the Java earthquake, as has been proposed for other tsunamigenic earthquakes, such as the 1944 Mw 8.1 Tonankai earthquake in the Nankai Trough.

  17. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Science.gov (United States)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally-distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
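
    A minimal sketch of the detector idea described above: a Short-Term-Average / Long-Term-Average ratio applied to a per-minute count of tweets containing the word "earthquake". The window lengths, threshold and toy data are illustrative assumptions, not the tuned USGS parameters.

```python
# STA/LTA trigger on a tweet-count time series: declare a trigger when the
# short-term average count greatly exceeds the long-term background average.
import numpy as np

def sta_lta_triggers(counts, sta_len=2, lta_len=60, threshold=8.0):
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = counts[i - sta_len + 1:i + 1].mean()
        lta = counts[i - lta_len:i].mean()
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Toy series: steady background chatter, then a burst after a felt event.
print(sta_lta_triggers([3] * 120 + [50, 80, 60] + [4] * 30))
```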

  18. 88 hours: the U.S. Geological Survey National Earthquake Information Center response to the March 11, 2011 Mw 9.0 Tohoku earthquake

    Science.gov (United States)

    Wald, David J.; Hayes, Gavin P.; Benz, Harley M.; Earle, Paul S.; Briggs, Richard W.

    2011-01-01

    The M 9.0 11 March 2011 Tohoku, Japan, earthquake and associated tsunami near the east coast of the island of Honshu caused tens of thousands of deaths and potentially over one trillion dollars in damage, resulting in one of the worst natural disasters ever recorded. The U.S. Geological Survey National Earthquake Information Center (USGS NEIC), through its responsibility to respond to all significant global earthquakes as part of the National Earthquake Hazards Reduction Program, quickly produced and distributed a suite of earthquake information products to inform emergency responders, the public, the media, and the academic community of the earthquake's potential impact and to provide scientific background for the interpretation of the event's tectonic context and potential for future hazard. Here we present a timeline of the NEIC response to this devastating earthquake in the context of rapidly evolving information emanating from the global earthquake-response community. The timeline includes both internal and publicly distributed products, the relative timing of which highlights the inherent tradeoffs between the requirement to provide timely alerts and the necessity for accurate, authoritative information. The timeline also documents the iterative and evolutionary nature of the standard products produced by the NEIC and includes a behind-the-scenes look at the decisions, data, and analysis tools that drive our rapid product distribution.

  19. The HayWired earthquake scenario—Earthquake hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake, The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  1. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra
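
    The superposition model described above can be made concrete with a short sketch: amplitudes derived from an assumed power spectral density, random phases, and a deterministic intensity envelope. This is only an illustration of the general procedure, not SIMQKE itself, and it omits the iterative adjustment of amplitudes needed to match a prescribed design response spectrum; the flat PSD and trapezoidal envelope below are placeholder assumptions.

```python
import numpy as np

def synthetic_accelerogram(duration=20.0, dt=0.01, f_min=0.2, f_max=25.0,
                           n_freq=200, seed=0):
    """Toy spectral-representation simulation: a sum of sinusoids with
    random phases, shaped by a trapezoidal intensity envelope.
    The flat one-sided PSD below is a placeholder, not a design spectrum."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    freqs = np.linspace(f_min, f_max, n_freq)
    df = freqs[1] - freqs[0]
    psd = np.ones_like(freqs)                 # placeholder one-sided PSD
    amps = np.sqrt(2.0 * psd * df)            # sinusoid amplitudes from the PSD
    phases = rng.uniform(0.0, 2.0 * np.pi, n_freq)

    # Stationary Gaussian motion as a superposition of sinusoids.
    a = np.sum(amps[:, None] * np.cos(2.0 * np.pi * freqs[:, None] * t
                                      + phases[:, None]), axis=0)

    # Trapezoidal envelope: ramp up, hold, decay (placeholder shape).
    env = np.interp(t, [0.0, 0.15 * duration, 0.6 * duration, duration],
                    [0.0, 1.0, 1.0, 0.0])
    return t, env * a

t, acc = synthetic_accelerogram()
print(f"peak |a| = {np.abs(acc).max():.3f} (arbitrary units)")
```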

  2. Use of GPS and InSAR Technology and its Further Development in Earthquake Modeling

    Science.gov (United States)

    Donnellan, A.; Lyzenga, G.; Argus, D.; Peltzer, G.; Parker, J.; Webb, F.; Heflin, M.; Zumberge, J.

    1999-01-01

    Global Positioning System (GPS) data are useful for understanding both interseismic and postseismic deformation. Models of GPS data suggest that the lower crust, lateral heterogeneity, and fault slip all play a role in the earthquake cycle.

  3. On the reported magnetic precursor of the 1993 guam earthquake

    Science.gov (United States)

    Thomas, J.N.; Love, J.J.; Johnston, M.J.S.; Yumoto, K.

    2009-01-01

    Using 1-second magnetometer data recorded 67 km from the epicenter of the 1993 Mw 7.7 Guam earthquake, Hayakawa et al. (1996) and Miyahara et al. (1999) identify anomalous precursory changes in ultra-low frequency magnetic polarization (the ratio of vertical to horizontal field components). In a check of their results, we compare their data (GAM) with 1-second data from the Kakioka observatory (KAK) in Japan and the global magnetic activity index Kp. We also examine log books kept by USGS staff working on the Guam magnetic observatory. We find (1) analysis problems with both Hayakawa et al. and Miyahara et al., (2) significant correlation between the GAM, KAK, and Kp data, and (3) an absence of identifiable localized anomalous signals occurring prior to the earthquake. The changes we do find in polarization are part of normal global magnetic activity; they are unrelated to the earthquake. Copyright 2009 by the American Geophysical Union.
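
    The polarization parameter examined in these studies is the ratio of vertical to horizontal spectral power in an ultra-low-frequency band. A generic sketch of how such a ratio might be computed from 1-second magnetometer samples is given below; the frequency band, segment length, and Welch estimator are illustrative assumptions rather than the exact processing of the cited papers.

```python
import numpy as np
from scipy.signal import welch

def ulf_polarization_ratio(bz, bh, fs=1.0, band=(0.01, 0.05)):
    """Ratio of vertical (Z) to horizontal (H) ULF spectral power.
    fs is the sampling rate in Hz; band is an illustrative ULF band in Hz."""
    f, pz = welch(bz, fs=fs, nperseg=4096)
    _, ph = welch(bh, fs=fs, nperseg=4096)
    sel = (f >= band[0]) & (f <= band[1])
    return pz[sel].sum() / ph[sel].sum()

# Synthetic example: one day of uncorrelated noise gives a ratio near 1.
rng = np.random.default_rng(1)
print(ulf_polarization_ratio(rng.normal(size=86400), rng.normal(size=86400)))
```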

  4. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Science.gov (United States)

    Jackson, D. D.

    2017-12-01

    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic- and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include temporal behavior of seismic and tectonic moment rates; shape of the earthquake magnitude distribution; upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; value of crustal rigidity; and relation between faults at depth and their surface fault traces, to name just a few. In this report I'll estimate the quantitative implications for estimating large earthquake rates. Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best show the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes
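
    One way to make the assumed equivalence between tectonic and seismic moment rates concrete is to spread a given moment rate over a truncated Gutenberg-Richter magnitude distribution and read off the implied rate of large events. The sketch below does exactly that; the b-value, magnitude bounds, and moment rate are illustrative assumptions, not estimates from this study.

```python
import numpy as np

def moment_nm(m):
    """Seismic moment in N*m from moment magnitude (Hanks-Kanamori relation)."""
    return 10.0 ** (1.5 * m + 9.05)

def gr_rates_from_moment_rate(moment_rate, b=1.0, m_min=5.0, m_max=8.0):
    """Annual earthquake rates implied by a seismic moment rate (N*m/yr)
    under a truncated Gutenberg-Richter magnitude distribution."""
    beta = b * np.log(10.0)
    m = np.linspace(m_min, m_max, 4001)
    norm = 1.0 - np.exp(-beta * (m_max - m_min))
    pdf = beta * np.exp(-beta * (m - m_min)) / norm
    mean_moment = np.sum(pdf * moment_nm(m)) * (m[1] - m[0])   # E[moment] per event
    total_rate = moment_rate / mean_moment                     # events/yr with m >= m_min

    def rate_above(mag):
        return total_rate * (np.exp(-beta * (mag - m_min))
                             - np.exp(-beta * (m_max - m_min))) / norm

    return total_rate, rate_above

# Illustrative numbers only: 1e19 N*m/yr of moment released seismically.
total, rate_above = gr_rates_from_moment_rate(1.0e19)
print(f"all m>=5: {total:.1f}/yr, m>=7: {rate_above(7.0):.3f}/yr")
```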

  5. Mechanical Engineering Department technical abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1984-07-01

    The Mechanical Engineering Department publishes abstracts twice a year to inform readers of the broad range of technical activities in the Department, and to promote an exchange of ideas. Details of the work covered by an abstract may be obtained by contacting the author(s). General information about the current role and activities of each of the Department's seven divisions precedes the technical abstracts. Further information about a division's work may be obtained from the division leader, whose name is given at the end of each divisional summary. The Department's seven divisions are as follows: Nuclear Test Engineering Division, Nuclear Explosives Engineering Division, Weapons Engineering Division, Energy Systems Engineering Division, Engineering Sciences Division, Magnetic Fusion Engineering Division and Materials Fabrication Division.

  6. Mechanical Engineering Department technical abstracts

    International Nuclear Information System (INIS)

    1984-01-01

    The Mechanical Engineering Department publishes abstracts twice a year to inform readers of the broad range of technical activities in the Department, and to promote an exchange of ideas. Details of the work covered by an abstract may be obtained by contacting the author(s). General information about the current role and activities of each of the Department's seven divisions precedes the technical abstracts. Further information about a division's work may be obtained from the division leader, whose name is given at the end of each divisional summary. The Department's seven divisions are as follows: Nuclear Test Engineering Division, Nuclear Explosives Engineering Division, Weapons Engineering Division, Energy Systems Engineering Division, Engineering Sciences Division, Magnetic Fusion Engineering Division and Materials Fabrication Division

  7. Elements of abstract harmonic analysis

    CERN Document Server

    Bachman, George

    2013-01-01

    Elements of Abstract Harmonic Analysis provides an introduction to the fundamental concepts and basic theorems of abstract harmonic analysis. In order to give a reasonably complete and self-contained introduction to the subject, most of the proofs have been presented in great detail thereby making the development understandable to a very wide audience. Exercises have been supplied at the end of each chapter. Some of these are meant to extend the theory slightly while others should serve to test the reader's understanding of the material presented. The first chapter and part of the second give

  8. Abstracts from Rambam Research Day

    Directory of Open Access Journals (Sweden)

    Shraga Blazer

    2015-01-01

    Full Text Available [Extract] This Supplement of Rambam Maimonides Medical Journal presents the abstracts from the Eleventh Rambam Research Day. These abstracts represent the newest basic and clinical research coming out of Rambam Health Care Campus—research that is the oxygen for education and development of today’s generation of physicians. Hence, the research presented on Rambam Research Day is a foundation for future generations to understand patient needs and improve treatment modalities. Bringing research from the bench to the bedside and from the bedside to the community is at the heart of Maimonides’ scholarly and ethical legacy.

  9. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. Past interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  10. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
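
    The completeness intervals quoted above feed directly into rate estimation: events above each intensity threshold are counted only back to the year since which reporting is judged complete. A minimal sketch of that bookkeeping follows, with the completeness years taken from the abstract and a small invented toy catalog; intensities are coded as integers for simplicity.

```python
# Completeness periods from the abstract: (minimum epicentral intensity,
# complete since year).  "Greater than IV" is coded as 5, and so on.
COMPLETENESS = [(5, 1878), (7, 1750), (9, 1600), (10, 1300)]

def exceedance_rates(catalog, present=1995):
    """catalog: list of (year, epicentral_intensity) tuples, intensity as int.
    Returns annual rates of events at or above each intensity threshold,
    counting only within the period judged complete for that threshold."""
    rates = {}
    for threshold, since in COMPLETENESS:
        n = sum(1 for year, i0 in catalog if i0 >= threshold and year >= since)
        rates[threshold] = n / (present - since)
    return rates

# Invented toy catalog, for illustration only.
toy = [(1905, 5), (1946, 7), (1855, 9), (1356, 10), (1601, 9), (1880, 6)]
print(exceedance_rates(toy))
```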

  11. Iranian earthquakes, a uniform catalog with moment magnitudes

    Science.gov (United States)

    Karimiparidari, Sepideh; Zaré, Mehdi; Memarian, Hossein; Kijko, Andrzej

    2013-07-01

    A uniform earthquake catalog is an essential tool in any seismic hazard analysis. In this study, an earthquake catalog of Iran and adjacent areas was compiled, using international and national databanks. The following priorities were applied in selecting magnitude and earthquake location: (a) local catalogs were given higher priority for establishing the location of an earthquake and (b) global catalogs were preferred for determining earthquake magnitudes. Earthquakes that have occurred within the bounds between 23-42° N and 42-65° E, with a magnitude range of Mw 3.5-7.9, from the third millennium BC until April 2010 were included. In an effort to avoid the "boundary effect," since the newly compiled catalog will be mainly used for seismic hazard assessment, the study area includes the areas adjacent to Iran. The standardization of the catalog in terms of magnitude was achieved by the conversion of all types of magnitude into moment magnitude, Mw, by using the orthogonal regression technique. In the newly compiled catalog, all aftershocks were detected, based on the procedure described by Gardner and Knopoff (Bull Seismol Soc Am 64:1363-1367, 1974). The seismicity parameters were calculated for the six main tectonic seismic zones of Iran, i.e., the Zagros Mountain Range, the Alborz Mountain Range, Central Iran, Kope Dagh, Azerbaijan, and Makran.
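
    The Gardner and Knopoff (1974) procedure cited above removes aftershocks using magnitude-dependent space and time windows around each larger event. The sketch below uses one commonly quoted parameterization of those windows and a simplified symmetric time test; the coefficients and the toy catalog are assumptions for illustration and may differ from the exact implementation used for the Iranian catalog.

```python
import numpy as np

def gk_windows(m):
    """Gardner-Knopoff space (km) and time (days) windows; one commonly
    used fit to the 1974 tables."""
    d_km = 10.0 ** (0.1238 * m + 0.983)
    t_days = np.where(m >= 6.5,
                      10.0 ** (0.032 * m + 2.7389),
                      10.0 ** (0.5409 * m - 0.547))
    return d_km, t_days

def decluster(times_days, lats, lons, mags):
    """Flag events inside the window of a larger nearby shock.
    Simplified: uses |dt|, so both foreshocks and aftershocks are removed.
    Returns a boolean array: True = kept (mainshock), False = removed."""
    times_days, lats, lons, mags = map(np.asarray, (times_days, lats, lons, mags))
    keep = np.ones(len(mags), dtype=bool)
    for i in np.argsort(mags)[::-1]:          # process largest events first
        if not keep[i]:
            continue
        d_km, t_days = gk_windows(mags[i])
        # rough epicentral distance in km (small-angle approximation)
        dx = (lons - lons[i]) * 111.32 * np.cos(np.radians(lats[i]))
        dy = (lats - lats[i]) * 110.57
        inside = (np.hypot(dx, dy) <= d_km) \
                 & (np.abs(times_days - times_days[i]) <= t_days) \
                 & (mags < mags[i])
        keep[inside] = False
    return keep

# Toy: an M 6 mainshock with a smaller event nearby two days later.
print(decluster([0.0, 2.0, 400.0], [35.0, 35.05, 36.0],
                [51.0, 51.05, 52.0], [6.0, 4.0, 5.0]))   # [True, False, True]
```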

  12. 107 SPOUSAL RAPE IN A GLOBALIZED WORLD Abstract Sexual ...

    African Journals Online (AJOL)

    Fr. Ikenga

    comradeship, the ideal union between the sexes that shall result in the highest development of the race'1. From time ... Justice in England in his book, History of the Pleas of the Crown 1736 and often quoted that: 'The .... posttraumatic stress disorder (PTSD), depression, substance abuse, she may have suicidal impulses,.

  13. Page | 107 SPOUSAL RAPE IN A GLOBALIZED WORLD Abstract ...

    African Journals Online (AJOL)

    Fr. Ikenga

    After the sexual violation, such woman may suffer insomnia, posttraumatic stress disorder (PTSD), depression, substance abuse, she may have suicidal impulses, chronic physical health problems and be exposed to varying degrees of victimization, she may be disorganized at first because of the rape and later reorganize ...

  14. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
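
    In their simplest form, the likelihood tests described above score a forecast of expected earthquake counts per space-magnitude bin against the observed counts with a Poisson log-likelihood. The sketch below shows that core computation only, not the full RELM consistency and comparison test suite; the toy rates and counts are invented.

```python
import numpy as np
from scipy.stats import poisson

def joint_log_likelihood(forecast_rates, observed_counts):
    """Sum of Poisson log-likelihoods over all space-magnitude bins.
    forecast_rates: expected counts per bin for the test period.
    observed_counts: integer counts actually observed in each bin."""
    forecast_rates = np.asarray(forecast_rates, dtype=float)
    observed_counts = np.asarray(observed_counts)
    return poisson.logpmf(observed_counts, forecast_rates).sum()

# Toy comparison of two forecasts on the same observation vector.
obs = np.array([0, 1, 0, 2, 0, 0, 1])
model_a = np.array([0.1, 0.8, 0.2, 1.5, 0.1, 0.1, 0.9])
model_b = np.full(7, 0.5)
print("A:", joint_log_likelihood(model_a, obs))
print("B:", joint_log_likelihood(model_b, obs))
```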

  15. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Science.gov (United States)

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, we have lost about 120 people a year to natural hazards in this decade. Above all, earthquakes are noteworthy, since they can kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, fasten heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and this results in the loss of many lives. Only the victims learn something from an earthquake, and the lessons have never become the common lore of the nation. One of the most essential ways to reduce the damage is to educate the general public so that they can make sound decisions about what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore called for public applications to choose several model areas in which scientific education would be brought into local elementary schools. This presentation reports on the year and a half of courses that we held at the model elementary school in the Tokyo metropolitan area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates are subducting beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a 70% probability of occurring within 30 years. This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, which may cause great global

  16. Abstract Interpretation of Mobile Ambients

    DEFF Research Database (Denmark)

    Hansen, René Rydhof; Jensen, J. G.; Nielson, Flemming

    1999-01-01

    We demonstrate that abstract interpretation is useful for analysing calculi of computation such as the ambient calculus (which is based on the p-calculus); more importantly, we show that the entire development can be expressed in a constraint-based formalism that is becoming exceedingly popular...

  17. Abstract Résumé

    African Journals Online (AJOL)

    Abstract. To describe the infant feeding practices in the general population in Uganda, and to assess the impact of maternal HIV status on these ... to-child transmission of HIV should re-enforce counselling activities to address the issue of early weaning by HIV-infected women, ..... A study in Zimbabwe yielded similar results,.

  18. Biocards and Level of Abstraction

    DEFF Research Database (Denmark)

    Lenau, Torben Anker; Keshwani, Sonal; Chakrabarti, Amaresh

    2015-01-01

    Biocards are formal descriptions of biological phenomena and their underlying functional principles. They are used in bioinspired design to document search results and to communicate the findings for use in the further design process. The present study explored the effect of abstraction level use...

  19. IRAP 2006, Book of Abstracts

    International Nuclear Information System (INIS)

    2006-01-01

    This publication relates to the meeting organized with Hacettepe University, the Turkish Atomic Energy Authority, The Scientific and Technological Research Council of Turkey, the International Atomic Energy Agency, CEA-Saclay, CEA-Saclay Drecam, the ANKAmall Shopping Center, and Ion Beam Applications Industrial, held in Antalya, Turkey, 23-28 September 2006. A separate abstract was prepared for each paper

  20. Abstract Résumé

    African Journals Online (AJOL)

    1 July 2013 ... Abstract. In Senegal, where HIV prevalence is less than 1% and stigma remains significant, 40% of marriages are polygamous. The purpose of this article is to describe and analyze the motivations, benefits, and constraints related to HIV disclosure, and to explore specific situations related to polygamy.

  1. Metaphoric Images from Abstract Concepts.

    Science.gov (United States)

    Vizmuller-Zocco, Jana

    1992-01-01

    Discusses children's use of metaphors to create meaning, using as an example the pragmatic and "scientific" ways in which preschool children explain thunder and lightning to themselves. Argues that children are being shortchanged by modern scientific notions of abstractness and that they should be encouraged to create their own explanations of…

  2. The Complexity of Abstract Machines

    Directory of Open Access Journals (Sweden)

    Beniamino Accattoli

    2017-01-01

    Full Text Available The lambda-calculus is a peculiar computational model whose definition does not come with a notion of machine. Unsurprisingly, implementations of the lambda-calculus have been studied for decades. Abstract machines are implementation schemas for fixed evaluation strategies that are a compromise between theory and practice: they are concrete enough to provide a notion of machine and abstract enough to avoid the many intricacies of actual implementations. There is an extensive literature about abstract machines for the lambda-calculus, and yet—quite mysteriously—the efficiency of these machines with respect to the strategy that they implement has almost never been studied. This paper provides an unusual introduction to abstract machines, based on the complexity of their overhead with respect to the length of the implemented strategies. It is conceived to be a tutorial, focusing on the case study of implementing the weak head (call-by-name) strategy, and yet it is an original re-elaboration of known results. Moreover, some of the observations contained here have never appeared in print before.
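
    As a concrete point of reference for the weak head (call-by-name) strategy discussed above, the classic Krivine abstract machine evaluates a term using a code pointer, an environment, and a stack of delayed arguments (closures). The sketch below, written with de Bruijn indices, is the generic textbook machine, not one of the machines analyzed in the paper.

```python
from dataclasses import dataclass
from typing import Union

# Lambda terms with de Bruijn indices.
@dataclass
class Var:
    n: int

@dataclass
class Lam:
    body: "Term"

@dataclass
class App:
    fun: "Term"
    arg: "Term"

Term = Union[Var, Lam, App]

def krivine(term):
    """Weak head (call-by-name) evaluation of a closed term.
    State = (code, environment, stack); env and stack hold closures (term, env)."""
    code, env, stack = term, [], []
    while True:
        if isinstance(code, App):               # push the argument as a closure
            stack.append((code.arg, env))
            code = code.fun
        elif isinstance(code, Lam) and stack:   # consume one argument
            env = [stack.pop()] + env
            code = code.body
        elif isinstance(code, Var):             # enter the closure bound to the index
            code, env = env[code.n]
        else:                                   # Lam with empty stack: weak head normal form
            return code, env

# (\x. \y. x) applied to (\z. z) reduces to a closure over \y. x
identity = Lam(Var(0))
k_comb = Lam(Lam(Var(1)))
print(krivine(App(k_comb, identity)))
```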

  3. Rolloff Roof Observatory Construction (Abstract)

    Science.gov (United States)

    Ulowetz, J. H.

    2015-12-01

    (Abstract only) Lessons learned about building an observatory by someone with limited construction experience, and the advantages of having one for imaging and variable star studies. Sample results shown of composite light curves for cataclysmic variables UX UMa and V1101 Aql with data from my observatory combined with data from others around the world.

  4. Abstract Expressionism. Clip and Save.

    Science.gov (United States)

    Hubbard, Guy

    2002-01-01

    Provides information on the art movement, Abstract Expressionism, and includes learning activities. Focuses on the artist Jackson Pollock, offering a reproduction of his artwork, "Convergence: Number 10." Includes background information on the life and career of Pollock and a description of the included artwork. (CMK)

  5. Original Abstracts - Supplementary | Conference Contributors ...

    African Journals Online (AJOL)

    All abstracts from the Annual Medical Research Day (AMRD) held at the University of Zimbabwe.

  6. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  7. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still lies in what exactly is forecastable and in which direction the EM investigation should proceed. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals, occurring with associated phases, have been detected in the field using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for

  8. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  9. The global aftershock zone

    Science.gov (United States)

    Parsons, Thomas E.; Margaret Segou,; Warner Marzocchi,

    2014-01-01

    The aftershock zone of each large (M ≥ 7) earthquake extends throughout the shallows of planet Earth. Most aftershocks cluster near the mainshock rupture, but earthquakes send out shivers in the form of seismic waves, and these temporary distortions are large enough to trigger other earthquakes at global range. The aftershocks that happen at great distance from their mainshock are often superposed onto already seismically active regions, making them difficult to detect and understand. From a hazard perspective we are concerned that this dynamic process might encourage other high magnitude earthquakes, and wonder if a global alarm state is warranted after every large mainshock. From an earthquake process perspective we are curious about the physics of earthquake triggering across the magnitude spectrum. In this review we build upon past studies that examined the combined global response to mainshocks. Such compilations demonstrate significant rate increases during, and immediately after (~ 45 min) M > 7.0 mainshocks in all tectonic settings and ranges. However, it is difficult to find strong evidence for M > 5 rate increases during the passage of surface waves in combined global catalogs. On the other hand, recently published studies of individual large mainshocks associate M > 5 triggering at global range that is delayed by hours to days after surface wave arrivals. The longer the delay between mainshock and global aftershock, the more difficult it is to establish causation. To address these questions, we review the response to 260 M ≥ 7.0 shallow (Z ≤ 50 km) mainshocks in 21 global regions with local seismograph networks. In this way we can examine the detailed temporal and spatial response, or lack thereof, during passing seismic waves, and over the 24 h period after their passing. We see an array of responses that can involve immediate and widespread seismicity outbreaks, delayed and localized earthquake clusters, to no response at all. About 50% of the

  10. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and the analysis of earthquake damage, the equivalent static analysis method and its application, dynamic analysis methods such as time-history analysis by mode superposition and by direct integration, and design spectrum analysis for earthquake-resistant design in Korea, including the analysis model and vibration modes, the calculation of base shear, the calculation of story seismic loads, and the combination of analysis results.
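
    As a concrete illustration of the equivalent static analysis method listed above, the sketch below computes a base shear as a seismic coefficient times the total weight and distributes it over the stories in proportion to weight times height. This is a generic textbook formulation; the coefficient, exponent, and story data are placeholder assumptions rather than values from any particular design code.

```python
import numpy as np

def story_seismic_forces(story_weights, story_heights, seismic_coeff=0.1, k=1.0):
    """Equivalent static analysis sketch.
    Base shear V = Cs * W; story forces F_i = V * w_i * h_i**k / sum(w_j * h_j**k)."""
    w = np.asarray(story_weights, dtype=float)
    h = np.asarray(story_heights, dtype=float)
    base_shear = seismic_coeff * w.sum()
    dist = w * h ** k
    return base_shear, base_shear * dist / dist.sum()

# Toy 4-story frame: weights in kN, heights in m (illustrative values).
V, F = story_seismic_forces([2000, 2000, 2000, 1500], [3.5, 7.0, 10.5, 14.0])
print(f"base shear = {V:.0f} kN, story forces = {np.round(F, 1)} kN")
```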

  11. Education for Earthquake Disaster Prevention in the Tokyo Metropolitan Area

    Science.gov (United States)

    Oki, S.; Tsuji, H.; Koketsu, K.; Yazaki, Y.

    2008-12-01

    Japan frequently suffers from all types of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. In the first half of this year, we already had three big earthquakes and heavy rainfall, which killed more than 30 people. This is not unique to Japan: Asia is the most disaster-afflicted region in the world, accounting for about 90% of all those affected by disasters and more than 50% of the total fatalities and economic losses. One of the most essential ways to reduce the damage of natural disasters is to educate the general public so that they understand what is going on during those disasters. This enables individuals to make sound decisions about what to do to prevent or reduce the damage. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore called for public applications to choose several model areas in which scientific education would be brought into local elementary schools, and ERI, the Earthquake Research Institute, was selected to develop education for earthquake disaster prevention in the Tokyo metropolitan area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates are subducting beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1703 Genroku earthquake (M 8.0) and the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to have a 70% probability of occurring within 30 years. This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, which may cause great global economic repercussions. To better understand earthquakes in this region, the "Special Project for Earthquake Disaster Mitigation in Tokyo Metropolitan Area" has been conducted mainly by ERI. It is a 4-year

  12. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    Science.gov (United States)

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    , and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  13. Earthquake Forecasting System in Italy

    Science.gov (United States)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is the time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system that is developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) and the Short-Term Earthquake Probabilities (STEP). Here, we report the results from OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).
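
    For reference, the ETAS model named above writes the conditional earthquake rate as a constant background plus Omori-type power-law contributions from all previous events. The sketch below is a minimal temporal-only version; the parameter values (mu, K, alpha, c, p, m0) are placeholder assumptions, not the values calibrated for Italy.

```python
import numpy as np

def etas_rate(t, event_times, event_mags, mu=0.2, K=0.05,
              alpha=1.0, c=0.01, p=1.1, m0=3.0):
    """Temporal ETAS conditional intensity (events/day) at time t (days):
    lambda(t) = mu + sum_i K * 10**(alpha*(m_i - m0)) / (t - t_i + c)**p
    summed over events with t_i < t.  Parameters are illustrative only."""
    times = np.asarray(event_times, dtype=float)
    mags = np.asarray(event_mags, dtype=float)
    past = times < t
    contrib = K * 10.0 ** (alpha * (mags[past] - m0)) / (t - times[past] + c) ** p
    return mu + contrib.sum()

# Rate one day after an M 6.0 mainshock followed by an M 4.5 aftershock.
print(etas_rate(1.0, event_times=[0.0, 0.3], event_mags=[6.0, 4.5]))
```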

  14. Earthquake forecasting and its verification

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2005-01-01

    Full Text Available No proven method is currently available for the reliable short-time prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ('hotspots') where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary--an earthquake is forecast either to occur or to not occur. The standard approach to the evaluation of a binary forecast is the use of the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
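
    The ROC evaluation of a binary forecast reduces to counting hits and false alarms over the forecast cells as the alarm threshold is varied. The following is a minimal sketch of that bookkeeping; the toy scores and cell layout are invented for illustration and are not the PI or RI maps themselves.

```python
import numpy as np

def roc_curve(scores, occurred):
    """Hit rate vs. false alarm rate as the alarm threshold is swept.
    scores: forecast score per cell (e.g., a PI or RI value).
    occurred: boolean per cell, True if a target earthquake occurred there."""
    scores = np.asarray(scores, dtype=float)
    occurred = np.asarray(occurred, dtype=bool)
    thresholds = np.unique(scores)[::-1]          # strictest alarm first
    hit_rate, false_rate = [], []
    for th in thresholds:
        alarm = scores >= th
        hit_rate.append((alarm & occurred).sum() / max(occurred.sum(), 1))
        false_rate.append((alarm & ~occurred).sum() / max((~occurred).sum(), 1))
    return np.array(false_rate), np.array(hit_rate)

# Toy example with 8 cells.
f, h = roc_curve([0.9, 0.7, 0.4, 0.1, 0.8, 0.2, 0.05, 0.3],
                 [True, False, False, False, True, False, False, True])
print(np.c_[f, h])
```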

  15. Abstract Cauchy problems three approaches

    CERN Document Server

    Melnikova, Irina V

    2001-01-01

    Although the theory of well-posed Cauchy problems is reasonably understood, ill-posed problems, which arise in numerous mathematical models in physics, engineering, and finance, can be approached in a variety of ways. Historically, there have been three major strategies for dealing with such problems: semigroup, abstract distribution, and regularization methods. Semigroup and distribution methods restore well-posedness, in a modern weak sense. Regularization methods provide approximate solutions to ill-posed problems. Although these approaches were extensively developed over the last decades by many researchers, nowhere could one find a comprehensive treatment of all three approaches. Abstract Cauchy Problems: Three Approaches provides an innovative, self-contained account of these methods and, furthermore, demonstrates and studies some of the profound connections between them. The authors discuss the application of different methods not only to the Cauchy problem that is not well-posed in the classical sense, b...
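
    For orientation, the abstract Cauchy problem the book is concerned with can be written compactly as below. This is the standard formulation on a Banach space, stated here as background rather than quoted from the book.

```latex
% Abstract Cauchy problem on a Banach space X, with A a linear operator:
\[
  u'(t) = A\,u(t) \quad (t \ge 0), \qquad u(0) = u_0 \in X .
\]
% In the well-posed case the semigroup approach gives the (mild) solution
\[
  u(t) = T(t)\,u_0 ,
\]
% where $(T(t))_{t \ge 0}$ is the $C_0$-semigroup generated by $A$.
```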

  16. Learning abstract algebra with ISETL

    CERN Document Server

    Dubinsky, Ed

    1994-01-01

    Most students in abstract algebra classes have great difficulty making sense of what the instructor is saying. Moreover, this seems to remain true almost independently of the quality of the lecture. This book is based on the constructivist belief that, before students can make sense of any presentation of abstract mathematics, they need to be engaged in mental activities which will establish an experiential base for any future verbal explanation. No less, they need to have the opportunity to reflect on their activities. This approach is based on extensive theoretical and empirical studies as well as on the substantial experience of the authors in teaching abstract algebra. The main source of activities in this course is computer constructions, specifically, small programs written in the mathlike programming language ISETL; the main tool for reflections is work in teams of 2-4 students, where the activities are discussed and debated. Because of the similarity of ISETL expressions to standard written mathematics...

  17. Abstract Interpretation Using Attribute Grammar

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1990-01-01

    This paper deals with the correctness proofs of attribute grammars using methods from abstract interpretation. The technique will be described by defining a live-variable analysis for a small flow-chart language and proving it correct with respect to a continuation-style semantics. The proof technique is based on fixpoint induction and introduces an extended class of attribute grammars so as to express a standard semantics.

  18. Norddesign 2012 - Book of Abstract

    DEFF Research Database (Denmark)

    has been organized in line with the original ideas. The topics mentioned in the call for abstracts were: Product Development: Integrated, Multidisciplinary, Product life oriented and Distributed. Multi-product Development. Innovation and Business Models. Engineering Design and Industrial Design. Conceptualisation and Innovative thinking. Research approaches and topics: Human Behaviour and Cognition. Cooperation and Multidisciplinary Design. Staging and Management of Design. Communication in Design. Design education and teaching: Programmes and Syllabuses. New Courses. Integrated and Multi-disciplinary. We...

  19. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to analyze earthquake activity and scope across many streams must write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
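
    Setting the Spark implementation aside, the core field-stream query described above, the maximum displacement per sensor over the latest window followed by a threshold test, can be sketched in plain Python as follows. The station names, 30 s window, and 20 mm threshold are hypothetical values chosen for illustration.

```python
from collections import defaultdict, deque

WINDOW_S = 30.0          # sliding window length in seconds (illustrative)
THRESHOLD_MM = 20.0      # displacement counted as anomalous (illustrative)

class DisplacementWindows:
    """Keep the latest window of displacement samples per GPS station and
    report stations whose peak displacement exceeds a threshold."""
    def __init__(self):
        self.buffers = defaultdict(deque)   # station -> deque of (t, value_mm)

    def ingest(self, station, t, value_mm):
        buf = self.buffers[station]
        buf.append((t, value_mm))
        while buf and t - buf[0][0] > WINDOW_S:   # drop samples older than the window
            buf.popleft()

    def anomalous_stations(self):
        return {s: max(v for _, v in buf)
                for s, buf in self.buffers.items()
                if buf and max(v for _, v in buf) >= THRESHOLD_MM}

# Toy stream: two hypothetical stations, one showing a large offset.
w = DisplacementWindows()
for t in range(60):
    w.ingest("P001", float(t), 2.0)
    w.ingest("P002", float(t), 35.0 if t > 40 else 1.0)
print(w.anomalous_stations())
```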

  20. Searching for evidence of a preferred rupture direction in small earthquakes at Parkfield

    Science.gov (United States)

    Kane, D. L.; Shearer, P. M.; Allmann, B.; Vernon, F. L.

    2009-12-01

    Theoretical modeling of strike-slip ruptures along a bimaterial interface suggests that the interface will have a preferred rupture direction and will produce asymmetric ground motion (Shi and Ben-Zion, 2006). This could have widespread implications for earthquake source physics and for hazard analysis on mature faults because larger ground motions would be expected in the direction of rupture propagation. Studies have shown that many large global earthquakes exhibit unilateral rupture, but a consistently preferred rupture direction along faults has not been observed. Some researchers have argued that the bimaterial interface model does not apply to natural faults, noting that the rupture of the M 6 2004 Parkfield earthquake propagated in the opposite direction from previous M 6 earthquakes along that section of the San Andreas Fault (Harris and Day, 2005). We analyze earthquake spectra from the Parkfield area to look for evidence of consistent rupture directivity along the San Andreas Fault. We separate the earthquakes into spatially defined clusters and quantify the differences in high-frequency energy among earthquakes recorded at each station. Propagation path effects are minimized in this analysis because we compare earthquakes located within a small volume and recorded by the same stations. By considering a number of potential end-member models, we seek to determine if a preferred rupture direction is present among small earthquakes at Parkfield.

  1. The U.S. Geological Survey's Earthquake Summary Posters: A GIS-based Education and Communication Product for Presenting Consolidated Post-Earthquake Information

    Science.gov (United States)

    Tarr, A.; Benz, H.; Earle, P.; Wald, D. J.

    2003-12-01

    are generated, the poster-size format is the most popular for display, outreach, and use as a working map for project scientists (JPEG format for web; PDF for download, editing, and printing) whereas the other (smaller) formats are suitable for briefing packages. We will soon make both GIS and PDF files of individual elements of the posters available online. ESP's provide an unprecedented opportunity for college earth-science faculty to take advantage of current events for timely lessons in global tectonics. They are also invaluable to communicate with the media and with government officials. ESP's will be used as a vehicle to present other products now under development under the auspices of NEIC and the ANSS, including rapid finite-fault models, global predictive ShakeMaps, "Did You Feel It?", and Rapid Assessments of Global Earthquakes (RAGE, Earle and others, this meeting).

  2. Distribution of scientific raw data of human-caused earthquakes

    Science.gov (United States)

    Klose, C. D.

    2012-12-01

    The second edition of a catalog of earthquakes caused by humans, recently published in the Journal of Seismology, is presented. Earthquakes with seismic magnitudes of up to M=7.9 have been documented and published since the first half of the 20th century. They were caused by geomechanical pollution due to artificial water reservoir impoundments, underground and open-pit mining, coastal management, hydrocarbon production, and fluid injections/extractions. The overall data set contains physical properties that were collected and initially summarized in an unpublished earthquake catalog presented at a meeting of the Seismological Society of America in 2006. The earthquakes were compiled from a larger set of more than 500 scientific research papers, conference proceedings, and abstracts. The overall data set is made available to the public at www.cdklose.com, which includes actual properties and features, figures that explain physical relationships, and statistical correlation and regression tests. The data set can be used by students, educators, the general public, or scientists from other disciplines who are interested in the environmental hazard of human-caused earthquakes.

  3. Seismo-Ionospheric Precursor in the GIM TEC of the 24 August 2014 M6 Napa Earthquake

    Science.gov (United States)

    Wu, T. Y.; Liu, T. J. Y.; Liu, J. Y.

    2015-12-01

    This study examines seismo-ionospheric precursors (SIPs) in the global ionosphere map (GIM) of the total electron content (TEC) associated with the 24 August 2014 M6 South Napa earthquake, together with statistical evidence of SIPs in the GPS TEC of the western USA during 2000-2014. The temporal SIP in the GIM TEC around the epicenter appears as a significant decrease (negative anomaly) on 22 August. To discriminate between global effects, such as solar flares and magnetic storms, and local effects, such as earthquakes, 5183 lattices on the GIM are employed to conduct a global search of the SIP distribution. Anomalies of both the GIM TEC and the associated gradients specifically and continuously appearing over the epicenter suggest that the SIP is related to the 2014 South Napa earthquake. A simulation is further carried out to reproduce the SIP in the GIM TEC before the earthquake. Results indicate that an eastward electric field generated over the epicenter area during the earthquake preparation period is essential.
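
    Anomaly identification in studies of this kind is commonly done by comparing each TEC value with quartile-based bounds built from the preceding days at the same hour. The sketch below implements that generic running-median test; the 15-day window and the 1.5 x IQR bounds are assumptions taken from common practice, not necessarily the exact criteria of this paper.

```python
import numpy as np

def tec_anomalies(tec, window_days=15, k=1.5):
    """Flag negative/positive TEC anomalies relative to a running median.
    tec: 2-D array of shape (n_days, 24), one value per UT hour.
    For each day d and hour h, bounds are built from the previous
    `window_days` days at the same hour: median +/- k * interquartile range."""
    n_days, _ = tec.shape
    flags = np.zeros_like(tec, dtype=int)           # -1, 0, +1
    for d in range(window_days, n_days):
        past = tec[d - window_days:d, :]
        med = np.median(past, axis=0)
        iqr = np.percentile(past, 75, axis=0) - np.percentile(past, 25, axis=0)
        flags[d, :] = np.where(tec[d] < med - k * iqr, -1,
                               np.where(tec[d] > med + k * iqr, 1, 0))
    return flags

# Toy data: 30 days of smooth diurnal TEC with a depressed day 28.
rng = np.random.default_rng(0)
hours = np.arange(24)
tec = 20 + 10 * np.sin(2 * np.pi * (hours - 6) / 24) + rng.normal(0, 0.5, (30, 24))
tec[28] -= 8.0
print(np.where(tec_anomalies(tec)[28] == -1)[0])    # hours flagged as negative anomaly
```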

  4. High-resolution earthquake relocation in the Fort Worth and Permian Basins using regional seismic stations

    Science.gov (United States)

    Ogwari, P.; DeShon, H. R.; Hornbach, M.

    2017-12-01

    Post-2008 earthquake rate increases in the Central United States have been associated with large-scale subsurface disposal of waste fluids from oil and gas operations. The beginnings of various earthquake sequences in the Fort Worth and Permian basins occurred in the absence of seismic stations at local distances to record and accurately locate hypocenters. Most typically, the initial earthquakes have been located using regional seismic network stations (>100 km epicentral distance) and global 1D velocity models, which usually results in large location uncertainty, especially in depth; matched filtering and regional relative relocation become feasible once local data are available. We use the local-distance data for high-resolution earthquake location, identifying earthquake templates and accurate source-station raypath velocities for the Pg and Lg phases at regional stations. A matched-filter analysis is then applied to seismograms recorded at US network stations and at adopted TA stations that recorded the earthquakes before and during the local network deployment period. Positive detections are declared based on manual review of the associated P and S arrivals on local stations. We apply hierarchical clustering to distinguish earthquakes that are spatially clustered from those that are spatially separated. Finally, we conduct relative earthquake and earthquake-cluster location using regional station differential times. Initial analysis applied to the 2008-2009 DFW airport sequence in north Texas results in time-continuous imaging of epicenters extending into 2014. Seventeen earthquakes in the USGS earthquake catalog scattered across a 10 km2 area near DFW airport are relocated onto a single fault using these approaches. These techniques will also be applied toward imaging recent earthquakes in the Permian Basin near Pecos, TX.
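
    Matched-filter detection of the kind described above slides a template waveform along the continuous record, computes a normalized cross-correlation, and declares detections where the correlation exceeds a threshold, commonly expressed as a multiple of the median absolute deviation (MAD) of the correlation trace. The single-channel sketch below illustrates the idea; the template, threshold, and test data are synthetic assumptions, not the authors' processing parameters.

```python
import numpy as np

def matched_filter_detect(continuous, template, mad_mult=8.0):
    """Slide a normalized cross-correlation of `template` over `continuous`
    and return sample indices where it exceeds mad_mult * MAD of the CC trace."""
    continuous = np.asarray(continuous, dtype=float)
    template = (template - np.mean(template)) / np.std(template)
    n = len(template)
    cc = np.empty(len(continuous) - n + 1)
    for i in range(len(cc)):                      # plain loop: clarity over speed
        win = continuous[i:i + n]
        win = (win - win.mean()) / (win.std() + 1e-12)
        cc[i] = np.dot(win, template) / n
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.where(cc >= mad_mult * mad)[0], cc

# Toy test: hide a scaled copy of the template in random noise.
rng = np.random.default_rng(0)
tmpl = np.sin(np.linspace(0, 6 * np.pi, 200)) * np.hanning(200)
data = rng.normal(0, 1.0, 5000)
data[3000:3200] += 5.0 * tmpl
picks, _ = matched_filter_detect(data, tmpl)
print(picks[:5])    # detections cluster near sample 3000
```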

  5. Seismic imaging under the 2013 Ms 7.0 Lushan Earthquake, China

    Science.gov (United States)

    Wang, Z.

    2013-12-01

    On 20 April 2013, a large earthquake (Ms 7.0) occurred at the southern end of the Longmen-Shan fault zone. More than 200 people were killed and about 14,000 people were injured by the earthquake. The earthquake has some distinct features: 1) the hypocenter of the Lushan earthquake is located close to that of the devastating 2008 M 8.0 Wenchuan earthquake on the Longmen-Shan fault zone; 2) it occurred no more than five years after the M 8.0 earthquake; 3) its magnitude is as large as 7.0 despite occurring so soon after the Wenchuan earthquake; and 4) its hypocenter appears to be located in the southern part of the Longmen-Shan fault zone, about 70 km away from the Wenchuan source. These features have led a number of researchers to wonder about its nucleation mechanism, its rupture process, and its relation to the Wenchuan earthquake. Analysis of global seismic waveform data shows where the rupture initiated and how it expanded for the 2013 Ms 7.0 Lushan earthquake. Our seismic imaging and crustal stress analysis indicate that the hypocenter of the Lushan earthquake lies in a strong high-velocity (Vp, Vs), low-Poisson's-ratio zone with high crustal stress. A similar high-velocity (Vp, Vs), low-Poisson's-ratio zone with high crustal stress is revealed under the 2008 Wenchuan earthquake (Ms 8.0) source area. However, a sharply contrasting gap zone with low-velocity, high-Poisson's-ratio anomalies is clearly imaged beneath the junction area between the two earthquake sources. We suggest that the strong structural variation and high crustal stress, together with the high coseismic stress imposed by the Wenchuan earthquake, triggered the 2013 Lushan earthquake (Ms 7.0) and controlled its rupture process. We believe that rapid seismic imaging, together with crustal stress analysis, could help us to understand the generation of the Lushan earthquake and to evaluate the possibility of

  6. IEEE conference record--Abstracts

    Energy Technology Data Exchange (ETDEWEB)

    1992-01-01

    The following topics were covered in this meeting: basic plasma phenomena and plasma waves; plasma diagnostics; space plasma diagnostics; magnetic fusion; electron, ion and plasma sources; intense electron and ion beams; intense beam microwaves; fast wave M/W devices; microwave plasma interactions; plasma focus; ultrafast Z-pinches; plasma processing; electrical gas discharges; fast opening switches; magnetohydrodynamics; electromagnetic and electrothermal launchers; x-ray lasers; computational plasma science; solid state plasmas and switches; environmental/energy issues in plasma science; vacuum electronics; plasmas for lighting; gaseous electronics; and ball lightning and other spherical plasmas. Separate abstracts were prepared for 278 papers of this conference.

  7. National Physics Conference. Paper Abstracts

    International Nuclear Information System (INIS)

    Marinela Dumitriu, Editorial Coordination.

    1995-01-01

    This book contains the abstracts of the proceedings of the annual Romanian Physics Conference organized by the Romanian Physics Society. The conference was held from November 30 to December 2, 1995 in the city of Baia Mare. It was organized in the following nine sections: 1 - Astrophysics, Particle Physics, Nuclear Physics, Molecular and Atomic Physics; 2 - Plasma Physics; 3 - Biophysics; 4 - Technical Physics; 5 - Theoretical Physics; 6 - The Physics of Energy; 7 - The Physics of the Environment; 8 - Solid State Physics; 9 - Optical and Quantum Electronics. The full texts can be obtained on request from the Romanian Physical Society or directly from the authors.

  8. WIPR-2010 Book of abstracts

    International Nuclear Information System (INIS)

    2015-01-01

    The main objective of the workshop was to review advanced and preclinical studies on innovative positron emitting radionuclides to assess their usefulness and potentials. Presentations were organized around 4 issues: 1) preclinical and clinical point of view, 2) production of innovative PET radionuclides, 3) from complexation chemistry to PET imaging and 4) from research to clinic. Emphasis has been put on 64 Cu, 68 Ga, 89 Zr, 44 Sc but specific aspects such as production or purification have been considered for 66 Ga, 67 Ga, 52 Fe, 86 Y, and 68 Ge radionuclides. This document gathers the abstracts of most contributions

  9. In-Package Chemistry Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    P.S. Domski

    2003-07-21

    The work associated with the development of this model report was performed in accordance with the requirements established in ''Technical Work Plan for Waste Form Degradation Modeling, Testing, and Analyses in Support of SR and LA'' (BSC 2002a). The in-package chemistry model and in-package chemistry model abstraction are developed to predict the bulk chemistry inside a failed waste package and to provide simplified expressions of that chemistry. The purpose of this work is to provide the abstraction model to the Performance Assessment Project and the Waste Form Department for development of geochemical models of the waste package interior. The scope of this model report is to describe the development and validation of the in-package chemistry model and the in-package chemistry model abstraction. The in-package chemistry model considers chemical interactions of water with the waste package materials and the waste form for commercial spent nuclear fuel (CSNF) and codisposed high-level waste glass (HLWG) and N Reactor spent fuel (CDNR). The in-package chemistry model includes two sub-models. The first is a water vapor condensation (WVC) model, in which water enters a waste package as vapor and forms a film on the waste package components, with subsequent film reactions with the waste package materials and waste form; this is a no-flow model, and the reacted fluids do not exit the waste package via advection. The second sub-model is the seepage dripping model (SDM), in which water that may have seeped into the repository from the surrounding rock enters a failed waste package, reacts with the waste package components and waste form, and then exits the waste package with no accumulation of reacted water in the waste package. Both of the sub-models of the in-package chemistry model are film models, in contrast to past in-package chemistry models in which all of the waste package pore space was filled with water. The

  10. Operating System Abstraction Layer (OSAL)

    Science.gov (United States)

    Yanchik, Nicholas J.

    2007-01-01

    This viewgraph presentation reviews the concept of the Operating System Abstraction Layer (OSAL) and its benefits. The OSAL is a small layer of software that allows programs to run on many different operating systems and hardware platforms. It runs independently of the underlying OS and hardware and is self-contained. The benefits of the OSAL are that it removes dependencies on any one operating system and promotes portable, reusable flight software. It allows Core Flight Software (FSW) to be built for multiple processors and operating systems. The presentation discusses the functionality, the various OSAL releases, and describes the specifications.
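    As an illustration of the abstraction-layer idea only (NASA's actual OSAL is a C library with its own API; none of the names below come from it), the sketch shows the pattern in Python: application code calls one small interface, and platform differences are hidden behind it.

```python
# Hypothetical abstraction-layer sketch; not NASA's OSAL API.
import platform
import tempfile
import threading


class OSAbstraction:
    """Hide platform-specific details behind one small interface."""

    def create_task(self, fn, *args):
        # Here a "task" is mapped onto a thread on every platform; a real OSAL
        # would dispatch to RTOS-specific primitives instead.
        t = threading.Thread(target=fn, args=args, daemon=True)
        t.start()
        return t

    def temp_dir(self):
        # Platform-appropriate scratch space without hard-coding paths.
        return tempfile.gettempdir()

    def os_name(self):
        return platform.system()


osal = OSAbstraction()
task = osal.create_task(print, "application code runs unchanged on", osal.os_name())
task.join()
```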

  11. Shoestring Budget Radio Astronomy (Abstract)

    Science.gov (United States)

    Hoot, J. E.

    2017-12-01

    (Abstract only) The commercial exploitation of microwave frequencies for cellular, WiFi, Bluetooth, HDTV, and satellite digital media transmission has brought down the cost of the components required to build an effective radio telescope to the point where, for the cost of a good eyepiece, you can construct and operate a radio telescope. This paper sets forth a family of designs for 1421 MHz telescopes. It also proposes a method by which operators of such instruments can aggregate and archive data via the Internet. With 90 or so instruments it will be possible to survey the entire radio sky for transients with a 24 hour cadence.

  12. Abstract decomposition theorem and applications

    CERN Document Server

    Grossberg, R; Grossberg, Rami; Lessmann, Olivier

    2005-01-01

    Let K be an Abstract Elementary Class. Under the assumptions that K has a nicely behaved forking-like notion, regular types, and the existence of some prime models, we establish a decomposition theorem for such classes. The decomposition implies a main gap result for the class K. The setting is general enough to cover \aleph_0-stable first-order theories (proved by Shelah in 1982), excellent classes of atomic models of a first-order theory (proved by Grossberg and Hart in 1987), and the class of submodels of a large sequentially homogeneous \aleph_0-stable model (which is new).

  13. Indico CONFERENCE: Define the Call for Abstracts

    CERN Multimedia

    CERN. Geneva; Ferreira, Pedro

    2017-01-01

    In this tutorial, you will learn how to define and open a call for abstracts. When defining a call for abstracts, you will be able to define settings related to the type of questions asked during a review of an abstract, select the users who will review the abstracts, decide when to open the call for abstracts, and more.

  14. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  15. Earthquake Preparedness Checklist for Schools.

    Science.gov (United States)

    1999

    A brochure provides a checklist highlighting the important questions and activities that should be addressed and undertaken as part of a school safety and preparedness program for earthquakes. It reminds administrators and other interested parties on what not to forget in preparing schools for earthquakes, such as staff knowledge needs, evacuation…

  16. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  17. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes the generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
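    A minimal sketch of the kind of procedure the report describes: band-pass filtering Gaussian white noise and shaping it with an envelope to produce an earthquake-like acceleration history. The sampling rate, pass band, and envelope shape are assumptions for illustration, not values from the report.

```python
# Generate an earthquake-like acceleration time history from filtered white noise.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 200.0                      # sampling rate [Hz] (assumed)
duration = 20.0                 # record length [s] (assumed)
t = np.arange(0.0, duration, 1.0 / fs)

rng = np.random.default_rng(0)
white = rng.standard_normal(t.size)

# Digital Butterworth band-pass, e.g. 1-10 Hz (hypothetical band).
b, a = butter(4, [1.0, 10.0], btype="bandpass", fs=fs)
filtered = filtfilt(b, a, white)

# Simple build-up / strong-motion / decay envelope.
envelope = np.minimum(t / 2.0, 1.0) * np.exp(-np.maximum(t - 10.0, 0.0) / 4.0)
accel = filtered * envelope
accel /= np.max(np.abs(accel))  # normalize to unit peak amplitude

print("rms of normalized record:", np.sqrt(np.mean(accel ** 2)))
```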

  18. Tectonic tremor activity associated with teleseismic and nearby earthquakes

    Science.gov (United States)

    Chao, K.; Obara, K.; Peng, Z.; Pu, H. C.; Frank, W.; Prieto, G. A.; Wech, A.; Hsu, Y. J.; Yu, C.; Van der Lee, S.; Apley, D. W.

    2016-12-01

    Tectonic tremor is an extremely stress-sensitive seismic phenomenon located in the brittle-ductile transition section of a fault. To better understand the stress interaction between tremor and earthquakes, we conduct the following studies: (1) search for triggered tremor globally, (2) examine ambient tremor activity associated with distant earthquakes, and (3) quantify the temporal variation of ambient tremor activity before and after nearby earthquakes. First, we developed a Matlab toolbox to facilitate the global search for triggered tremor. We have discovered new tremor sources on inland faults in Kyushu, Kanto, and Hokkaido in Japan; in southern Chile, Ecuador, and central Colombia in South America; and in southern Italy. Our findings suggest that tremor is more common than previously believed and indicate the potential existence of ambient tremor in regions where triggered tremor is active. Second, we apply statistical analysis to examine whether the long-term ambient tremor rate may be affected by the dynamic stress of teleseismic earthquakes. We analyzed data from Nankai, Hokkaido, Cascadia, and Taiwan. Our preliminary results did not show an apparent increase in the ambient tremor rate after the passage of surface waves. Third, we quantify temporal changes in ambient tremor activity before and after the occurrence of local earthquakes of magnitude >=5.5 under the southern Central Range of Taiwan from 2004 to 2016. For a particular case, we found a temporal variation of the tremor rate before and after the 2010/03/04 Mw 6.3 earthquake, located about 20 km away from the active tremor source. The long-term increase in the tremor rate after the earthquake could have been caused by an increase in static stress following the mainshock. For comparison, clear evidence from seismic and GPS observations indicates a short-term increase in the tremor rate a few weeks before the mainshock. The increase in the tremor rate before the mainshock could correlate with stress changes

  19. Neoliberalism and criticisms of earthquake insurance arrangements in New Zealand.

    Science.gov (United States)

    Hay, I

    1996-03-01

    Global collapse of the Fordist-Keynesian regime of accumulation and an attendant philosophical shift in New Zealand politics to neoliberalism have prompted criticisms of, and changes to, the Earthquake and War Damage Commission. Earthquake insurance arrangements made 50 years ago in an era of collectivist, welfarist political action are now set in an environment in which emphasis is given to competitive relations and individualism. Six specific criticisms of the Commission are identified, each of which is founded in the rhetoric and ideology of a neoliberal political project which has underpinned radical social and economic changes in New Zealand since the early 1980s. On the basis of those criticisms, and in terms of the Earthquake Commission Act 1993, the Commission has been restructured. The new Commission is withdrawing from its primary position as the nation's non-residential property hazards insurer and is restricting its coverage of residential properties.

  20. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms~7.0, Io=9), the Lechkhumi-Svaneti earthquake of 1350 (Ms~7.0, Io=9), and the Alaverdi earthquake of 1742 (Ms~6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms~6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms~6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation include Gori 1920, Tabatskuri 1940, Chkhalta 1963, the Racha earthquake of 1991 (Ms=7.0), which is the largest event ever recorded in the region, the Barisakho earthquake of 1992 (M=6.5), and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia ~25 stations, Azerbaijan ~35 stations, Armenia ~14 stations). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved the locations of the events and recalculated moment magnitudes in order to obtain a unified magnitude

  1. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, the assumed fault geometry and velocity structure, and the chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  2. Early Earthquakes of the Americas

    Science.gov (United States)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks: did indigenous native cultures (Indians of the Pacific Northwest, Aztecs, Mayas, and Incas) document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  3. Are Earthquakes a Critical Phenomenon?

    Science.gov (United States)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which implies that they are also unpredictable. This work revisits these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the pdf of the distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, one single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
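    The Olami-Feder-Christensen model mentioned above can be made concrete with a toy simulation. The sketch below is a minimal, assumption-laden implementation (small lattice, arbitrary conservation parameter alpha = 0.2, open boundaries) meant only to show the kind of avalanche statistics such models produce, not to reproduce the paper's analysis.

```python
# Minimal Olami-Feder-Christensen (OFC) earthquake model sketch (illustration only).
import numpy as np

L = 32                # lattice size (arbitrary)
alpha = 0.2           # conservation parameter (< 0.25 means non-conservative)
threshold = 1.0
n_events = 2000

rng = np.random.default_rng(1)
stress = rng.uniform(0.0, threshold, size=(L, L))
sizes = []

for _ in range(n_events):
    # Uniform slow drive: bring the most loaded site exactly to threshold.
    stress += threshold - stress.max()
    size = 0
    unstable = np.argwhere(stress >= threshold)
    while unstable.size:
        for i, j in unstable:
            release = stress[i, j]
            stress[i, j] = 0.0
            size += 1
            # Redistribute a fraction alpha to each nearest neighbor (open boundaries).
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < L and 0 <= nj < L:
                    stress[ni, nj] += alpha * release
        unstable = np.argwhere(stress >= threshold)
    sizes.append(size)

print("largest avalanche:", max(sizes), "mean size:", sum(sizes) / len(sizes))
```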

  4. An introduction to abstract algebra

    CERN Document Server

    Robinson, Derek JS

    2003-01-01

    This is a high-level introduction to abstract algebra which is aimed at readers whose interests lie in mathematics and in the information and physical sciences. In addition to introducing the main concepts of modern algebra, the book contains numerous applications, which are intended to illustrate the concepts and to convince the reader of the utility and relevance of algebra today. In particular, applications to Polya coloring theory, latin squares, Steiner systems and error-correcting codes are described. Another feature of the book is that group theory and ring theory are carried further than is often done at this level. There is ample material here for a two-semester course in abstract algebra. The importance of proof is stressed and rigorous proofs of almost all results are given. But care has been taken to lead the reader through the proofs by gentle stages. There are nearly 400 problems, of varying degrees of difficulty, to test the reader's skill and progress. The book should be suitable for students ...

  5. Reconstruction of abstract quantum theory

    International Nuclear Information System (INIS)

    Drieschner, M.; Goernitz, T.; von Weizsaecker, C.F.

    1988-01-01

    Understanding quantum theory as a general theory of prediction, we reconstruct abstract quantum theory. Abstract means the general frame of quantum theory, without reference to a three-dimensional position space, to concepts like particle or field, or to special laws of dynamics. Reconstruction is the attempt to do this by formulating simple and plausible postulates on prediction in order to derive the basic concepts of quantum theory from them. Thereby no law of classical physics is presupposed which would then have to be quantized. We briefly discuss the relationship of theory and interpretation in physics and the fundamental role of time as a basic concept for physics. Then a number of assertions are given, formulated as succinctly as possible in order to make them easily quotable and comparable. The assertions are arranged in four groups: heuristic principles, verbal definitions of some terms, three basic postulates, and consequences. The three postulates of separable alternatives, indeterminism, and kinematics are the central points of this work. These brief assertions are commented upon, and their relationship with the interpretation of quantum theory is discussed. Also given are an outlook on the further development into concrete quantum theory and some philosophical reflections.

  6. Regional dependence in earthquake early warning and real time seismology

    International Nuclear Information System (INIS)

    Caprio, M.

    2013-01-01

    An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is provide the maximum available information as soon as possible. This can help reduce the impact of the quake on the population and better organize rescue operations. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in real-time seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient EEW system can inform numerous people and thus potentially reduce economic and human losses by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, the common communication channels can be overloaded or broken in the first minutes to days after the main shock. In such cases, precise knowledge of the macroseismic intensity distribution represents a decisive contribution to emergency management and to the evaluation of losses. In this work, I focus on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, in chapter 1 we present an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve based on the part of the spectrum constrained by the available data. Our method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection and potentially stabilize within 10 seconds of the initial earthquake detection
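    The displacement-spectrum route to magnitude can be illustrated with the textbook point-source relations; this is not the thesis's Spectrum Inversion algorithm, only the standard formulas such methods build on. Every numeric value below (density, wave speed, distance, radiation coefficient, plateau level) is an assumption for illustration.

```python
# From a low-frequency displacement plateau to moment magnitude (Brune-type assumptions).
import numpy as np

rho = 2700.0        # crustal density [kg/m^3] (assumed)
beta = 3500.0       # S-wave speed [m/s] (assumed)
R = 50e3            # hypocentral distance [m] (assumed)
radiation = 0.55    # average S-wave radiation-pattern coefficient (assumed)
free_surface = 2.0  # free-surface amplification (assumed)

omega0 = 2.0e-4     # low-frequency displacement plateau [m*s] (hypothetical measurement)

# Seismic moment from the spectral plateau (far-field point-source relation).
M0 = 4.0 * np.pi * rho * beta**3 * R * omega0 / (radiation * free_surface)

# Moment magnitude (Hanks & Kanamori relation, SI units).
Mw = (2.0 / 3.0) * (np.log10(M0) - 9.1)
print(f"M0 = {M0:.3e} N*m  ->  Mw = {Mw:.2f}")
```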

  7. Hepatitis E virus seroepidemiology: a post-earthquake study among blood donors in Nepal

    Directory of Open Access Journals (Sweden)

    Ashish C. Shrestha

    2016-11-01

    Background: As one of the causative agents of viral hepatitis, hepatitis E virus (HEV) has gained public health attention globally. HEV epidemics occur in developing countries, associated with faecal contamination of water and poor sanitation. In industrialised nations, HEV infections are associated with travel to countries endemic for HEV; however, autochthonous infections, mainly through zoonotic transmission, are increasingly being reported. HEV can also be transmitted by blood transfusion. Nepal has experienced a number of HEV outbreaks, and recent earthquakes led to predictions that the risk of an HEV outbreak was very high. This study aimed to measure HEV exposure in Nepalese blood donors after the large earthquakes. Methods: Samples (n = 1,845) were collected from blood donors from Kathmandu, Chitwan, Bhaktapur and Kavre. Demographic details, including age and sex, along with possible risk factors associated with HEV exposure were collected via a study-specific questionnaire. Samples were tested for HEV IgM, IgG and antigen. The proportion of donors positive for HEV IgM or IgG was calculated overall, and for each of the variables studied. Chi-square and regression analyses were performed to identify factors associated with HEV exposure. Results: Of the donors residing in earthquake-affected regions (Kathmandu, Bhaktapur and Kavre), 3.2% (54/1,686; 95% CI 2.7–4.0%) were HEV IgM positive, and two donors were positive for HEV antigen. Overall, 41.9% (773/1,845; 95% CI 39.7–44.2%) of donors were HEV IgG positive, with regional variation observed. Higher HEV IgG and IgM prevalence was observed in donors who reported eating pork, likely an indicator of zoonotic transmission. Previous exposure to HEV in Nepalese blood donors is relatively high. Conclusion: Detection of recent markers of HEV infection in healthy donors suggests recent asymptomatic HEV infection, and therefore transfusion-transmission in vulnerable patients is a risk in

  8. Analyses of surface motions caused by the magnitude 9.0 2004 Sumatra earthquake

    DEFF Research Database (Denmark)

    Khan, Shfaqat Abbas; Gudmundsson, Ó.

    The Sumatra, Indonesia, earthquake on December 26th was one of the most devastating earthquakes in history. With a magnitude of Mw = 9.0 it is the fourth largest earthquake recorded since 1900. It occurred about one hundred kilometers off the west coast of northern Sumatra, where the relatively thin...... of years. The result was a devastating tsunami hitting coastlines across the Indian Ocean, killing more than 225,000 people in Sri Lanka, India, Indonesia, Thailand and Malaysia. An earthquake of this magnitude is expected to involve a displacement on the fault on the order of 10 meters. But, what...... was the actual amplitude of the surface motions that triggered the tsunami? This can be constrained using the amplitudes of elastic waves radiated from the earthquake, or by direct measurements of deformation. Here we present estimates of the deformation based on continuous Global Positioning System (GPS...

  9. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    Science.gov (United States)

    Delorey, Andrew; Van Der Elst, Nicholas; Johnson, Paul

    2017-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding a fault's stress state and poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest success in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified, and thus we developed a technique to improve the identification of very small magnitude events. We identify events by applying a method known as inter-station seismic coherence, in which we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find that the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress-threshold and critical-state behavior. Our findings reveal earthquake nucleation processes and pore pressure conditions, properties of faults that are difficult to measure yet extremely important for characterizing earthquake physics and seismic hazards.

  10. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    International Nuclear Information System (INIS)

    Delorey, Andrew A.; Elst, Nicholas J. van der; Johnson, Paul Allan

    2016-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding a fault's stress state and poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest success in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified, and thus we developed a technique to improve the identification of very small magnitude events. We identify events by applying a method known as inter-station seismic coherence, in which we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find that the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress-threshold and critical-state behavior. Lastly, our findings reveal earthquake nucleation processes and pore pressure conditions, properties of faults that are difficult to measure yet extremely important for characterizing earthquake physics and seismic hazards.

  11. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip.

    Science.gov (United States)

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 M w (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes.

  12. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip

    Science.gov (United States)

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  13. Assessment of earthquake effects - contribution from online communication

    Science.gov (United States)

    D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline

    2014-05-01

    The rapid increase of social media and online newspapers in recent years has made it possible to carry out a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday 24 April 2011 at 13:02 GMT. The earthquake was preceded and followed by a series of smaller-magnitude quakes throughout the day, most of which were felt by locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta (http://seismic.research.um.edu.mt/). The results yield interesting information about the demographics of the island and the different felt experiences, possibly relating to geological settings and to structurally diverse, age-classified buildings. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using Global Positioning Systems can be incorporated to provide a 'real-time' intensity map that can be used by the Civil Protection Department.

  14. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    event response products and by expanded use of geodetic imaging data to assess fault rupture and source parameters. Uncertainties in the NSHM, and in regional earthquake models, are reduced by fully incorporating geodetic data into earthquake probability calculations. Geodetic networks and data are integrated into the operations and earthquake information products of the Advanced National Seismic System (ANSS). Earthquake early warnings are improved by more rapidly assessing ground displacement and the dynamic faulting process for the largest earthquakes using real-time geodetic data. Methodology for probabilistic earthquake forecasting is refined by including geodetic data when calculating evolving moment release during aftershock sequences and by better understanding the implications of transient deformation for earthquake likelihood. A geodesy program that encompasses a balanced mix of activities to sustain mission-critical capabilities, grows new competencies through the continuum of fundamental to applied research, and ensures sufficient resources for these endeavors provides a foundation by which the EHP can be a leader in the application of geodesy to earthquake science. With this in mind, the following objectives provide a framework to guide EHP efforts: fully utilize geodetic information to improve key products, such as the NSHM and EEW, and to address new ventures like the USGS Subduction Zone Science Plan; expand the variety, accuracy, and timeliness of post-earthquake information products, such as PAGER (Prompt Assessment of Global Earthquakes for Response), through incorporation of geodetic observations; determine if geodetic measurements of transient deformation can significantly improve estimates of earthquake probability; maintain an observational strategy aligned with the target outcomes of this document that includes continuous monitoring, recording of ephemeral observations, focused data collection for use in research, and application-driven data processing and

  15. Abstract algebra an introductory course

    CERN Document Server

    Lee, Gregory T

    2018-01-01

    This carefully written textbook offers a thorough introduction to abstract algebra, covering the fundamentals of groups, rings and fields. The first two chapters present preliminary topics such as properties of the integers and equivalence relations. The author then explores the first major algebraic structure, the group, progressing as far as the Sylow theorems and the classification of finite abelian groups. An introduction to ring theory follows, leading to a discussion of fields and polynomials that includes sections on splitting fields and the construction of finite fields. The final part contains applications to public key cryptography as well as classical straightedge and compass constructions. Explaining key topics at a gentle pace, this book is aimed at undergraduate students. It assumes no prior knowledge of the subject and contains over 500 exercises, half of which have detailed solutions provided.

  16. Abstract Expression Grammar Symbolic Regression

    Science.gov (United States)

    Korns, Michael F.

    This chapter examines the use of Abstract Expression Grammars to perform the entire Symbolic Regression process without the use of Genetic Programming per se. The techniques explored produce a symbolic regression engine which has absolutely no bloat, which allows total user control of the search space and output formulas, and which is faster and more accurate than the engines produced in our previous papers using Genetic Programming. The genome is an all-vector structure with four chromosomes plus additional epigenetic and constraint vectors, allowing total user control of the search space and the final output formulas. A combination of specialized compiler techniques, genetic algorithms, particle swarm, age-layered populations, plus discrete and continuous differential evolution is used to produce an improved symbolic regression system. Nine base test cases from the literature are used to test the improvement in speed and accuracy. The improved results indicate that these techniques move us a big step closer toward future industrial-strength symbolic regression systems.

  17. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  18. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principle of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for nuclear power plants designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  19. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)]

    1997-03-01

    This paper describes the principle of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for nuclear power plants designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  20. Statistical Evaluations of Variations in Dairy Cows’ Milk Yields as a Precursor of Earthquakes

    Science.gov (United States)

    Yamauchi, Hiroyuki; Hayakawa, Masashi; Asano, Tomokazu; Ohtani, Nobuyo; Ohta, Mitsuaki

    2017-01-01

    Simple Summary There are many reports of abnormal changes occurring in various natural systems prior to earthquakes. Unusual animal behavior is one of these abnormalities; however, there are few objective indicators and to date, reliability has remained uncertain. We found that milk yields of dairy cows decreased prior to an earthquake in our previous case study. In this study, we examined the reliability of decreases in milk yields as a precursor for earthquakes using long-term observation data. In the results, milk yields decreased approximately three weeks before earthquakes. We have come to the conclusion that dairy cow milk yields have applicability as an objectively observable unusual animal behavior prior to earthquakes, and dairy cows respond to some physical or chemical precursors of earthquakes. Abstract Previous studies have provided quantitative data regarding unusual animal behavior prior to earthquakes; however, few studies include long-term, observational data. Our previous study revealed that the milk yields of dairy cows decreased prior to an extremely large earthquake. To clarify whether the milk yields decrease prior to earthquakes, we examined the relationship between earthquakes of various magnitudes and daily milk yields. The observation period was one year. In the results, cross-correlation analyses revealed a significant negative correlation between earthquake occurrence and milk yields approximately three weeks beforehand. Approximately a week and a half beforehand, a positive correlation was revealed, and the correlation gradually receded to zero as the day of the earthquake approached. Future studies that use data from a longer observation period are needed because this study only considered ten earthquakes and therefore does not have strong statistical power. Additionally, we compared the milk yields with the subionospheric very low frequency/low frequency (VLF/LF) propagation data indicating ionospheric perturbations. The results showed
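    The cross-correlation analysis mentioned above can be sketched generically as follows. The series here are synthetic placeholders, not the study's data, and the lag convention (positive lag asks whether the yield anomaly at day t relates to earthquakes lag days later) is spelled out in the code.

```python
# Lagged cross-correlation between a daily milk-yield anomaly series and a daily
# earthquake-occurrence indicator; all data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(42)
n_days = 365

# Hypothetical daily series: yield anomaly (demeaned) and a 0/1 indicator of
# whether an earthquake above some magnitude occurred that day.
yield_anom = rng.normal(0.0, 1.0, n_days)
quake_flag = (rng.random(n_days) < 10 / 365).astype(float)   # ~10 events/year

def lagged_corr(x, y, lag):
    """Pearson correlation of x(t) with y(t + lag)."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

# Positive lags probe a possible precursor window (yield anomaly leading the quakes).
for lag in range(0, 29, 7):
    print(f"lag {lag:+3d} days  r = {lagged_corr(yield_anom, quake_flag, lag):+.3f}")
```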

  1. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China.

    Science.gov (United States)

    Liu, Xu; Tang, Bihan; Yang, Hongyang; Liu, Yuan; Xue, Chen; Zhang, Lulu

    2015-12-04

    Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.
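    The generic technique named in the abstract, data envelopment analysis, can be sketched with a small input-oriented CCR model solved by linear programming. The "teams", their inputs, and their outputs below are invented numbers purely for illustration, not data from the Yushu response.

```python
# Input-oriented CCR data envelopment analysis via linear programming (toy data).
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (teams); inputs = (staff, beds); output = patients treated
X = np.array([[30.0, 20.0], [45.0, 25.0], [28.0, 30.0], [60.0, 40.0], [35.0, 22.0]])
Y = np.array([[220.0], [260.0], [240.0], [400.0], [230.0]])
n, m = X.shape
s = Y.shape[1]

def ccr_efficiency(o):
    """Technical efficiency of DMU o under constant returns to scale."""
    # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.r_[1.0, np.zeros(n)]
    # Inputs:  sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    b_in = np.zeros(m)
    # Outputs: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])
    b_out = -Y[o]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=np.r_[b_in, b_out],
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```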

  2. Designing plants to withstand earthquakes

    International Nuclear Information System (INIS)

    Nedderman, J.

    1995-01-01

    The approach used in Japan to design nuclear plants capable of withstanding earthquakes is described. Earthquakes are classified into two types, S1 and S2. In an S1 earthquake a nuclear plant must be capable of surviving essentially undamaged. In the more severe S2 earthquake, some damage may occur but there should be no release of radioactivity to the outside world. The starting point for the designer is the ground response spectrum of the earthquake, which shows both the ground acceleration and the frequencies of the vibrations. From the ground response spectra, synthetic seismic waves for S1 and S2 earthquakes are developed, which can then be used to analyse a ''lumped-mass'' model of the reactor building to arrive at floor response spectra. These spectra are then used in further analyses of the design of reactor equipment, piping systems, and instrument racks and supports. When a plant is constructed, results from tests with a vibration exciter are used to verify the floor response spectra and principal building resonances. Much of the equipment can be tested on vibrating tables. One large table, with a maximum loading capacity of 1000 t, is used to test large-scale models of containment vessels, pressure vessels and steam generators. Such tests have shown that the plants have considerable safety margins in their ability to withstand the design basis earthquakes. (UK)
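    The ground response spectrum that anchors this design process can be illustrated with a toy computation: the peak absolute acceleration of damped single-degree-of-freedom oscillators across a range of natural periods. The input motion, the 5% damping ratio, and the periods below are placeholders, not values from any design code.

```python
# Toy response-spectrum computation for a synthetic ground motion.
import numpy as np
from scipy.signal import lsim

dt = 0.01
t = np.arange(0.0, 20.0, dt)
rng = np.random.default_rng(0)
ag = rng.standard_normal(t.size) * np.exp(-((t - 6.0) / 4.0) ** 2)  # toy ground accel.

def sdof_peak_abs_accel(period, zeta=0.05):
    """Peak absolute acceleration of a damped SDOF oscillator under base motion ag."""
    wn = 2.0 * np.pi / period
    # States: relative displacement u and velocity v;  u'' + 2*zeta*wn*u' + wn^2*u = -ag
    A = [[0.0, 1.0], [-wn**2, -2.0 * zeta * wn]]
    B = [[0.0], [-1.0]]
    C = [[-wn**2, -2.0 * zeta * wn]]      # output = absolute acceleration u'' + ag
    D = [[0.0]]
    _, y, _ = lsim((A, B, C, D), U=ag, T=t)
    return np.max(np.abs(y))

for T in (0.1, 0.2, 0.5, 1.0, 2.0):
    print(f"T = {T:3.1f} s  Sa = {sdof_peak_abs_accel(T):.3f} (toy units)")
```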

  3. Coseismic and postseismic slip of the 2011 magnitude-9 Tohoku-Oki earthquake.

    Science.gov (United States)

    Ozawa, Shinzaburo; Nishimura, Takuya; Suito, Hisashi; Kobayashi, Tomokazu; Tobita, Mikio; Imakiire, Tetsuro

    2011-06-15

    Most large earthquakes occur along an oceanic trench, where an oceanic plate subducts beneath a continental plate. Massive earthquakes with a moment magnitude, Mw, of nine have been known to occur in only a few areas, including Chile, Alaska, Kamchatka and Sumatra. No historical records exist of an Mw = 9 earthquake along the Japan trench, where the Pacific plate subducts beneath the Okhotsk plate, with the possible exception of the AD 869 Jogan earthquake, the magnitude of which has not been well constrained. However, the strain accumulation rate estimated there from recent geodetic observations is much higher than the average strain rate released in previous interplate earthquakes. This finding raises the question of how such areas release the accumulated strain. A megathrust earthquake with Mw = 9.0 (hereafter referred to as the Tohoku-Oki earthquake) occurred on 11 March 2011, rupturing the plate boundary off the Pacific coast of northeastern Japan. Here we report the distributions of the coseismic slip and postseismic slip as determined from ground displacement detected using a network based on the Global Positioning System. The coseismic slip area extends approximately 400 km along the Japan trench, matching the area of the pre-seismic locked zone. The afterslip has begun to overlap the coseismic slip area and extends into the surrounding region. In particular, the afterslip area reached a depth of approximately 100 km, with Mw = 8.3, on 25 March 2011. Because the Tohoku-Oki earthquake released the strain accumulated for several hundred years, the paradox of the strain budget imbalance may be partly resolved. This earthquake reminds us of the potential for Mw ≈ 9 earthquakes to occur along other trench systems, even if no past evidence of such events exists. Therefore, it is imperative that strain accumulation be monitored using a space geodetic technique to assess earthquake potential.

  4. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5, located less than 500 km from an ionospheric vertical sounding station, is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data from nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs, and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  5. Fracking, wastewater disposal, and earthquakes

    Science.gov (United States)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities also can induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly-controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories for which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks in the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  6. An Experimental Study of a Midbroken 2-Bay 6-Storey Reinforced Concrete Frame subject to Earthquakes

    DEFF Research Database (Denmark)

    Skjærbæk, P. S.; Taskin, B.; Kirkegaard, Poul Henning

    1997-01-01

    A 2-bay, 6-storey model test reinforced concrete frame (scale 1:5) subjected to sequential earthquakes of increasing magnitude is considered in this paper. The frame was designed with a weak storey, in which the columns are weakened by using thinner and weaker reinforcement bars. The aim of the work is to study the global response of such buildings to a damaging strong-motion earthquake event. Special emphasis is put on examining to what extent damage in the weak storey can be identified from global response measurements during an earthquake where the structure survives, and what level...

  7. Development of fragility functions to estimate homelessness after an earthquake

    Science.gov (United States)

    Brink, Susan A.; Daniell, James; Khazai, Bijan; Wenzel, Friedemann

    2014-05-01

    used to estimate homelessness as a function of information that is readily available immediately after an earthquake. These fragility functions could be used by relief agencies and governments to provide an initial assessment of the need for allocation of emergency shelter immediately after an earthquake. Daniell JE (2014) The development of socio-economic fragility functions for use in worldwide rapid earthquake loss estimation procedures, Ph.D. Thesis (in publishing), Karlsruhe, Germany. Daniell, J. E., Khazai, B., Wenzel, F., & Vervaeck, A. (2011). The CATDAT damaging earthquakes database. Natural Hazards and Earth System Science, 11(8), 2235-2251. doi:10.5194/nhess-11-2235-2011 Daniell, J.E., Wenzel, F. and Vervaeck, A. (2012). "The Normalisation of socio-economic losses from historic worldwide earthquakes from 1900 to 2012", 15th WCEE, Lisbon, Portugal, Paper No. 2027. Jaiswal, K., & Wald, D. (2010). An Empirical Model for Global Earthquake Fatality Estimation. Earthquake Spectra, 26(4), 1017-1037. doi:10.1193/1.3480331
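    One common way such fragility functions are parameterized (not necessarily the form fitted in this study) is a lognormal curve of shaking intensity. The sketch below uses invented median, dispersion, and exposure values purely for illustration.

```python
# Illustrative lognormal "fragility-style" curve for post-earthquake homelessness.
import numpy as np
from scipy.stats import norm

def homelessness_fraction(intensity, median=9.5, beta=0.3):
    """Expected homeless fraction as a lognormal CDF of macroseismic intensity (toy parameters)."""
    return norm.cdf(np.log(intensity / median) / beta)

population_exposed = 500_000          # hypothetical exposed population
for mmi in (6.0, 7.0, 8.0, 9.0):
    frac = homelessness_fraction(mmi)
    print(f"MMI {mmi:.0f}: {frac:5.1%} homeless  (~{frac * population_exposed:,.0f} people)")
```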

  8. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    Science.gov (United States)

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  9. On the reported ionospheric precursor of the Hector Mine, California earthquake

    Science.gov (United States)

    Thomas, J.N.; Love, J.J.; Komjathy, A.; Verkhoglyadova, O.P.; Butala, M.; Rivera, N.

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identified as being anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  10. Ionospheric disturbances associated with the 2015 M7.8 Nepal earthquake

    Directory of Open Access Journals (Sweden)

    Yiyan Zhou

    2017-07-01

    Based on the total electron content (TEC) derived from Global Positioning System (GPS) observations of the Crustal Movement Observation Network of China (CMONOC) and the Global Ionosphere Map (GIM) from the Center for Orbit Determination in Europe (CODE), we detected and analyzed the ionospheric variations during the 2015 M7.8 Nepal earthquake, including the pre-earthquake ionospheric anomalies and the coseismic ionospheric disturbances (CIDs) following the main shock. The analysis of vertical total electron content (VTEC) time series shows that large-scale ionospheric anomalies appeared near the epicenter two days prior to the earthquake. Moreover, the pre-earthquake ionospheric anomalies were also observed in the geomagnetically conjugate region. In view of the solar-terrestrial environment, the pre-earthquake ionospheric anomalies could be associated with the Nepal earthquake. In addition, we also detected the CIDs through high-frequency GPS observation stations. The CIDs had clear oscillatory waveforms with peak-to-peak disturbance amplitudes of about 1 TECu and 0.4 TECu, which propagated with approximate horizontal velocities of 877 ± 75 m/s and 319 ± 30 m/s, respectively. The former is triggered directly by the acoustic waves that originated from the energy release of the earthquake near the epicenter, while the latter could be excited by acoustic-gravity waves arising from partial transformation of the acoustic waves.

  11. Earthquakes, July-August 1992

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. One earthquake occurred in Kyrgyzstan on August 19, and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake of June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  12. Earthquake Zoning Maps of Turkey

    International Nuclear Information System (INIS)

    Pampal, S.

    2007-01-01

    Earthquake Zoning Maps (1945, 1947, 1963, 1972 and 1996) and Specifications for Construction in Disaster Areas (1947, 1953, 1962, 1968, 1975, 1996, 1997 and 2006) have been changed many times following developments in engineering seismology, tectonic and seismo-tectonic investigation, and improved earthquake data collection. The aim of this study is to give information about these maps, which came into force at different dates since the introduction of the first official Earthquake Zoning Map published in 1945, and to assist in a better understanding of the development phases of these maps.

  13. Seismology: dynamic triggering of earthquakes.

    Science.gov (United States)

    Gomberg, Joan; Johnson, Paul

    2005-10-06

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar.
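
    A rough way to apply the several-microstrain threshold reported here is to approximate peak dynamic strain as peak ground velocity divided by the phase velocity of the carrying wave. The sketch below uses that plane-wave approximation together with a generic log-linear PGV attenuation whose coefficients are placeholders, not values taken from the paper.

```python
# Sketch: is the peak dynamic strain at a given distance above a triggering
# threshold of roughly a microstrain?  The PGV attenuation coefficients are
# illustrative placeholders, not values from Gomberg & Johnson (2005).
import math

def peak_dynamic_strain(magnitude: float, distance_km: float,
                        phase_velocity_m_s: float = 3500.0) -> float:
    # Hypothetical attenuation: log10(PGV in cm/s) = a*M - b*log10(r) + c
    a, b, c = 0.5, 1.5, -0.2
    pgv_m_s = 10 ** (a * magnitude - b * math.log10(distance_km) + c) / 100.0
    # Plane-wave approximation: strain ~ particle velocity / phase velocity.
    return pgv_m_s / phase_velocity_m_s

for r in (10, 100, 1000):
    eps = peak_dynamic_strain(7.0, r)
    flag = "above" if eps > 1e-6 else "below"
    print(f"M7.0 at {r:4d} km: {eps * 1e6:6.2f} microstrain ({flag} a 1-microstrain threshold)")
```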

  14. Retrospective evaluation of the five-year and ten-year CSEP-Italy earthquake forecasts

    Directory of Open Access Journals (Sweden)

    Stefan Wiemer

    2010-11-01

    Full Text Available On August 1, 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP launched a prospective and comparative earthquake predictability experiment in Italy. The goal of this CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented 18 five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We have considered here the twelve time-independent earthquake forecasts among this set, and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. We present the results of the tests that measure the consistencies of the forecasts according to past observations. As well as being an evaluation of the time-independent forecasts submitted, this exercise provides insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between robustness of results and experiment duration. We conclude with suggestions for the design of future earthquake predictability experiments.
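
    One of the consistency tests referred to above, the number or N-test, asks whether the total count of observed target earthquakes is plausible under a forecast that specifies Poisson rates per space-magnitude bin. A minimal sketch in which the forecast rates and the observed count are invented:

```python
# Sketch of a Poisson N-test: is the observed number of target earthquakes
# consistent with the total rate predicted by a gridded forecast?
# The forecast rates and the observed count below are illustrative only.
from scipy.stats import poisson

forecast_rates = [0.02, 0.15, 0.07, 0.30, 0.01]  # expected events per bin over the test period
n_expected = sum(forecast_rates)
n_observed = 2

delta1 = 1.0 - poisson.cdf(n_observed - 1, n_expected)  # P(N >= observed): forecast too low?
delta2 = poisson.cdf(n_observed, n_expected)            # P(N <= observed): forecast too high?
print(f"expected {n_expected:.2f} events, observed {n_observed}")
print(f"delta1 = {delta1:.3f}, delta2 = {delta2:.3f} (reject the forecast if either is very small)")
```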

  15. Macroseismic manifestations of the Romanian earthquake (September 24, 2016) on the territory of Ukraine.

    Science.gov (United States)

    Kulynych, Anna; Illyenko, Volodymyr

    2017-04-01

    presents the results of seismic interpretation of the earthquake that occurred on September 24, 2016 at 02h. 11min (local time) in Romania with magnitude 5.6 and was felt in different regions in Ukraine. The essence of the work was, based on the spatial distribution of instrumental and macroseismic manifestation, to study the nature of investigated earthquake, its origin, parameters of focus and local attenuation laws of seismic shaking energy, which are necessary factors for predicting the magnitude and intensity of the future earthquakes. The seismotectonic situation in the epicenter of the earthquake was analyzed. The registered records of seismic waves and their spectrogram on the territory of Ukraine and global seismic networks were thoroughly processed. A scientific expedition collected data of macroseismic survey to study the distribution of seismic displays and detect the tangible effects of the earthquake. The maps of earthquake`s manifestations in Kyiv region were plotted by visual observation and processing of information, which was obtained by a population survey and field studies. The research results are important for seismic protection of industrial facilities and people`s lives against potential earthquakes from Vrancea area that can have a significant impact on Ukraine and border areas.

  16. The finite, kinematic rupture properties of great-sized earthquakes since 1990

    Science.gov (United States)

    Hayes, Gavin P.

    2017-06-01

    Here, I present a database of >160 finite fault models for all earthquakes of M 7.5 and above since 1990, created using a consistent modeling approach. The use of a common approach facilitates easier comparisons between models, and reduces uncertainties that arise when comparing models generated by different authors, data sets and modeling techniques. I use this database to verify published scaling relationships, and for the first time show a clear and intriguing relationship between maximum potency (the product of slip and area) and average potency for a given earthquake. This relationship implies that earthquakes do not reach the potential size given by the tectonic load of a fault (sometimes called "moment deficit," calculated via a plate rate over time since the last earthquake, multiplied by geodetic fault coupling). Instead, average potency (or slip) scales with but is less than maximum potency (dictated by tectonic loading). Importantly, this relationship facilitates a more accurate assessment of maximum earthquake size for a given fault segment, and thus has implications for long-term hazard assessments. The relationship also suggests earthquake cycles may not completely reset after a large earthquake, and thus repeat rates of such events may appear shorter than is expected from tectonic loading. This in turn may help explain the phenomenon of "earthquake super-cycles" observed in some global subduction zones.
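
    Potency here is slip integrated over rupture area, so for a gridded finite-fault model the average potency is the mean slip times the total rupture area. The toy calculation below also computes a maximum potency as the peak subfault slip times the same area; that reading of "maximum potency," like the slip grid itself, is an assumption made for illustration.

```python
# Toy comparison of average versus maximum potency for a gridded slip model.
# The slip grid is invented, and "maximum potency" is read here as peak
# subfault slip times total rupture area (an assumption for illustration).
import numpy as np

slip = np.array([[0.5, 1.2, 2.0],
                 [0.8, 3.5, 1.5],
                 [0.3, 1.0, 0.6]])     # slip on each subfault (m)
subfault_area_km2 = 15.0 * 10.0        # along-strike x down-dip subfault size (km^2)

area_m2 = slip.size * subfault_area_km2 * 1e6
average_potency = slip.mean() * area_m2   # m^3
maximum_potency = slip.max() * area_m2    # m^3
print(f"average potency {average_potency:.3e} m^3, maximum potency {maximum_potency:.3e} m^3, "
      f"ratio {average_potency / maximum_potency:.2f}")
```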

  17. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  18. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    Science.gov (United States)

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  19. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    Science.gov (United States)

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the US Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies. Exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically-derived set of ShakeMap hazard outputs. We illustrate two example uses for EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes, and (2) the influence of time-of-day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time-of-day is a significant factor in earthquake mortality. Moreover, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge each of these maps and find the maximum estimated peak ground acceleration at any grid point in the world for the past 35 years. We subsequently compare this "composite ShakeMap" with existing global
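
    The exposure calculation described here amounts to overlaying a ShakeMap intensity grid on a co-registered population grid and summing population within discrete intensity bins. A minimal sketch, assuming the two grids are already on the same lattice (the values are toy data):

```python
# Sketch: tabulate population exposed to each Modified Mercalli Intensity (MMI)
# level by overlaying a ShakeMap grid on a co-registered population grid.
import numpy as np

mmi = np.array([[4.2, 5.1, 6.0],
                [5.5, 7.3, 8.1],
                [4.9, 6.6, 7.8]])            # ShakeMap MMI per grid cell (toy values)
population = np.array([[1200,  300, 4500],
                       [2200, 9800,  600],
                       [ 150, 3100, 7200]])  # people per grid cell (toy values)

for level in range(4, 10):
    in_bin = (mmi >= level) & (mmi < level + 1)
    print(f"MMI {level}: {int(population[in_bin].sum()):6d} people exposed")
print(f"MMI VIII+: {int(population[mmi >= 8].sum())} people exposed to severe shaking")
```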

  20. A Modal-Logic Based Graph Abstraction

    NARCIS (Netherlands)

    Bauer, J.; Boneva, I.B.; Kurban, M.E.; Rensink, Arend; Ehrig, H; Heckel, R.; Rozenberg, G.; Taentzer, G.

    2008-01-01

    Infinite or very large state spaces often prohibit the successful verification of graph transformation systems. Abstract graph transformation is an approach that tackles this problem by abstracting graphs to abstract graphs of bounded size and by lifting application of productions to abstract

  1. Argonne Code Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.

    1976-08-01

    This publication is the tenth supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the document are as follows: preface; history and acknowledgements; abstract format; recommended program package contents; program classification guide and thesaurus; and abstract collection

  2. Argonne Code Center: compilation of program abstracts

    International Nuclear Information System (INIS)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.; Harrison, C. Jr.; Hughes, C.E.; Jorgensen, R.; Legan, M.; Menozzi, T.; Ranzini, L.; Strecok, A.J.

    1977-08-01

    This publication is the eleventh supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the complete document ANL-7411 are as follows: preface, history and acknowledgements, abstract format, recommended program package contents, program classification guide and thesaurus, and the abstract collection

  3. Efficient abstractions for visualization and interaction

    NARCIS (Netherlands)

    van der Ploeg, A.J.

    2015-01-01

    Abstractions, such as functions and methods, are an essential tool for any programmer. Abstractions encapsulate the details of a computation: the programmer only needs to know what the abstraction achieves, not how it achieves it. However, using abstractions can come at a cost: the resulting program

  4. Argonne Code Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.

    1976-08-01

    This publication is the tenth supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the document are as follows: preface; history and acknowledgements; abstract format; recommended program package contents; program classification guide and thesaurus; and abstract collection. (RWR)

  5. Argonne Code Center: compilation of program abstracts

    Energy Technology Data Exchange (ETDEWEB)

    Butler, M.K.; DeBruler, M.; Edwards, H.S.; Harrison, C. Jr.; Hughes, C.E.; Jorgensen, R.; Legan, M.; Menozzi, T.; Ranzini, L.; Strecok, A.J.

    1977-08-01

    This publication is the eleventh supplement to, and revision of, ANL-7411. It contains additional abstracts and revisions to some earlier abstracts and other pages. Sections of the complete document ANL-7411 are as follows: preface, history and acknowledgements, abstract format, recommended program package contents, program classification guide and thesaurus, and the abstract collection. (RWR)

  6. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    Science.gov (United States)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
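
    The detector described above looks for a significant jump in the rate of "earthquake" tweets relative to the recent background. One common way to express such a jump is a short-term versus long-term rate comparison; the sketch below uses that idea with made-up window lengths and a made-up trigger ratio, which should not be read as the operational TED parameters.

```python
# Sketch: flag a possible felt earthquake when the short-term rate of
# "earthquake" tweets jumps well above the long-term background rate.
# Window lengths and the trigger ratio are illustrative, not TED's settings.
from collections import deque

class TweetRateDetector:
    def __init__(self, short_s=60, long_s=3600, ratio=5.0, min_count=10):
        self.short_s, self.long_s = short_s, long_s
        self.ratio, self.min_count = ratio, min_count
        self.times = deque()  # timestamps (s) of tweets containing "earthquake"

    def add_tweet(self, t: float) -> bool:
        """Register a matching tweet at time t; return True if a detection fires."""
        self.times.append(t)
        while self.times and self.times[0] < t - self.long_s:
            self.times.popleft()
        short_count = sum(1 for x in self.times if x > t - self.short_s)
        background_rate = len(self.times) / self.long_s            # tweets per second
        expected_short = background_rate * self.short_s
        return (short_count >= self.min_count and
                short_count > self.ratio * max(expected_short, 1.0))
```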

  7. EURORIB 2010, Book of abstracts

    International Nuclear Information System (INIS)

    Tsoneva, N.; Lenske, H.; Casten, R.

    2012-01-01

    The second international EURORIB conference 'EURORIB'10' will be held from June 6 to June 11, 2010 in Lamoura (France). Our nuclear physics community is eagerly awaiting the construction of the next generation of Radioactive Ion Beam (RIB) facilities in Europe: HIE-ISOLDE at CERN, NUSTAR at FAIR, SPES at LNL, SPIRAL2 at GANIL and the future EURISOL. The collaborations built around these facilities are exploring new experimental and theoretical ideas that will advance our understanding of nuclear structure through studies of exotic nuclei. Following in the spirit of the conference held in Giens in 2008, EURORIB'10 will provide the opportunity for the different collaborations to come together and present these ideas, and to explore the synergy between the research programmes based around these projects. The main topics to be discussed at the conference are: 1) At and beyond the drip line, 2) Shell structure far from stability, 3) Fusion reactions and synthesis of heavy and superheavy nuclei, 4) Dynamics and thermodynamics of exotic nuclear systems, 5) Radioactive ion beams in nuclear astrophysics, 6) New modes of radioactivity, 7) Fundamental interactions, 8) Applications in other fields, 9) Future RIB facilities, 10) Production and manipulation of RIB, and 11) Working group meetings on synergy in instrumentation and data acquisition. This document gathers only the abstracts of the papers. (authors)

  8. WD1145+017 (Abstract)

    Science.gov (United States)

    Motta, M.

    2017-12-01

    (Abstract only) WD1145 is a 17th-magnitude white dwarf star 570 light years away in Virgo that was discovered to have a disintegrating planetoid in close orbit by Andrew Vanderburg, a graduate student at Harvard CfA, while data mining. To elucidate the nature of its rather bizarre transit light curves, I obtained multiple observations of WD1145 over the course of a year and found a series of complex transit light curves that could only be interpreted as a ring complex or torus in close orbit around WD1145. Combined with data from other amateur astronomers, professional observations, and satellite data, it became clear that WD1145 has a small planetoid in close orbit at the Roche limit that is breaking apart, forming a ring of debris material that is then raining down on the white dwarf. The surface of the star is "polluted" by heavy metals, as determined by spectroscopic data. Given that in the intense gravitational field of a white dwarf any heavy metals could not last for long on the surface, this confirms that we are tracking, in real time, the destruction of a small planet by its host star.

  9. ABSTRACT MODELS FOR SYSTEM VIRTUALIZATION

    Directory of Open Access Journals (Sweden)

    M. G. Koveshnikov

    2015-05-01

    Full Text Available The paper is dedicated to issues of securing system objects (system files and user, system, or application configuration files) against unauthorized access, including denial-of-service attacks. We have suggested the method and developed abstract system virtualization models, which are used to research attack scenarios for different virtualization modes. An estimate of the effectiveness of the system virtualization technology is given. The suggested technology is based on redirecting access requests to system objects shared among access subjects. Whole and partial system virtualization modes have been modeled. The difference between them is the following: in the whole virtualization mode, copies of all accessed system objects (including the corresponding application objects) are created and subjects' requests are redirected to these copies; in the partial virtualization mode, copies are created only for part of the system, for example, only for the system objects of applications. The effectiveness of the alternative solutions is evaluated for different attack scenarios. We consider a proprietary, approved technical solution that implements the system virtualization method for the Microsoft Windows OS family. The administrative simplicity and capabilities of the correspondingly designed system-object security tools are illustrated with this example. The practical significance of the suggested security method has been confirmed.
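
    The redirection idea described above, serving each access subject a private copy of a shared system object instead of the original, can be sketched as a simple path-mapping layer; whole and partial virtualization then differ only in which paths are mapped. This is an abstract illustration in the spirit of the models, not the authors' Windows implementation.

```python
# Abstract sketch of request redirection for system-object virtualization.
# In "whole" mode every object is redirected to a per-subject copy; in
# "partial" mode only objects under designated system prefixes are.
import os
import shutil

def redirect(subject: str, path: str, mode: str = "partial",
             system_prefixes=("/etc", "/usr/lib")) -> str:
    virtualize = mode == "whole" or any(path.startswith(p) for p in system_prefixes)
    if not virtualize:
        return path                                    # direct access to the shared object
    copy_path = os.path.join("/virt", subject, path.lstrip("/"))
    if not os.path.exists(copy_path):                  # lazily create the per-subject copy
        os.makedirs(os.path.dirname(copy_path), exist_ok=True)
        shutil.copy(path, copy_path)
    return copy_path                                   # the subject's request lands here
```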

  10. An abstract approach to music.

    Energy Technology Data Exchange (ETDEWEB)

    Kaper, H. G.; Tipei, S.

    1999-04-19

    In this article we have outlined a formal framework for an abstract approach to music and music composition. The model is formulated in terms of objects that have attributes, obey relationships, and are subject to certain well-defined operations. The motivation for this approach uses traditional terms and concepts of music theory, but the approach itself is formal and uses the language of mathematics. The universal object is an audio wave; partials, sounds, and compositions are special objects, which are placed in a hierarchical order based on time scales. The objects have both static and dynamic attributes. When we realize a composition, we assign values to each of its attributes: a (scalar) value to a static attribute, an envelope and a size to a dynamic attribute. A composition is then a trajectory in the space of aural events, and the complex audio wave is its formal representation. Sounds are fibers in the space of aural events, from which the composer weaves the trajectory of a composition. Each sound object in turn is made up of partials, which are the elementary building blocks of any music composition. The partials evolve on the fastest time scale in the hierarchy of partials, sounds, and compositions. The ideas outlined in this article are being implemented in a digital instrument for additive sound synthesis and in software for music composition. A demonstration of some preliminary results has been submitted by the authors for presentation at the conference.

  11. Real-time earthquake monitoring at the Indian Tsunami Early Warning System for tsunami advisories in the Indian Ocean

    Directory of Open Access Journals (Sweden)

    E Uma Devi

    2016-04-01

    Full Text Available The Indian Tsunami Early Warning System situated at the Indian National Center for Ocean Information Services, Hyderabad, India, monitors real-time earthquake activity throughout the Indian Ocean to evaluate potentially tsunamigenic earthquakes. The functions of the Indian Tsunami Early Warning System earthquake monitoring system include detection, location and determination of the magnitude of potentially tsunamigenic earthquakes occurring in the Indian Ocean. The real-time seismic monitoring network comprises 17 broadband Indian seismic stations transmitting real-time earthquake data through VSAT communication to the central receiving stations located at the Indian Meteorological Department, New Delhi, and the Indian National Center for Ocean Information Services, Hyderabad, simultaneously for processing and interpretation. In addition to this, earthquake data from around 300 global seismic stations are also received at the Indian National Center for Ocean Information Services in near-real-time. Most of these data are provided by the IRIS Global Seismographic Network and the GEOFON Extended Virtual Network through the Internet. The Indian National Center for Ocean Information Services uses SeisComP3 software for auto-location of earthquake parameters (location, magnitude, focal depth and origin time). All earthquakes of Mw >5.0 are auto-located within 5–10 minutes of the occurrence of the earthquake. Since its inception in October 2007 to date, the warning centre has monitored and reported 55 tsunamigenic earthquakes (under-sea and near-coast earthquakes of magnitude ⩾6.5) in the Indian Ocean region. Comparison of the earthquake parameters (elapsed time, magnitude, focal depth and location) estimated by the Indian Tsunami Early Warning System with those of the US Geological Survey suggests that the Indian Tsunami Early Warning System is performing well and has achieved the target set up by the Intergovernmental Oceanographic Commission.

  12. Plate-boundary deformation associated with the great Sumatra–Andaman earthquake

    OpenAIRE

    Subarya, Cecep; Chlieh, Mohamed; Prawirodirdjo, Linette; Avouac, Jean-Philippe; Bock, Yehuda; Sieh, Kerry; Meltzner, Aron J.; Natawidjaja, Danny H.; McCaffrey, Robert

    2006-01-01

    The Sumatra–Andaman earthquake of 26 December 2004 is the first giant earthquake (moment magnitude M_w > 9.0) to have occurred since the advent of modern space-based geodesy and broadband seismology. It therefore provides an unprecedented opportunity to investigate the characteristics of one of these enormous and rare events. Here we report estimates of the ground displacement associated with this event, using near-field Global Positioning System (GPS) surveys in northwestern Sumatra combined...

  13. Change of Japanese risk perception after Tohoku earthquake March 11, 2011

    International Nuclear Information System (INIS)

    Nakajima, Reiko

    2011-01-01

    The present study reports changes in Japanese risk perception based on the results of national surveys carried out in 2010 and 2011. Major earthquake and nuclear power plant risk items were perceived as much more serious, while other risks such as global warming, illicit drugs and terrorism were perceived as less serious, after the Tohoku earthquake. Anxiety about radiological material differed according to the distance from the Fukushima No. 1 nuclear power plant. (author)

  14. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  15. Automated Determination of Magnitude and Source Length of Large Earthquakes

    Science.gov (United States)

    Wang, D.; Kawakatsu, H.; Zhuang, J.; Mori, J. J.; Maeda, T.; Tsuruoka, H.; Zhao, X.

    2017-12-01

    Rapid determination of earthquake magnitude is of importance for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of the origin time is still a challenge. Mw is an accurate estimate for large earthquakes. However, calculating Mw requires the whole wave trains including P, S, and surface phases, which take tens of minutes to reach stations at tele-seismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for fast estimation of earthquake size. Besides these methods that involve Green's functions and inversions, there are other approaches that use empirically calibrated relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely applied at many institutions such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach that originated from Hara [2007], estimating magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. Firstly, the source duration can be accurately determined by the seismic array. Secondly, the results can be calculated more rapidly, and data from farther stations are not required. We propose to develop an automated system for determining fast and reliable source information of large shallow seismic events based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°- 85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extensions of large earthquakes in 6 to 13 min (plus

  16. Automated Determination of Magnitude and Source Extent of Large Earthquakes

    Science.gov (United States)

    Wang, Dun

    2017-04-01

    Rapid determination of earthquake magnitude is of importance for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of the origin time is still a challenge. Mw is an accurate estimate for large earthquakes. However, calculating Mw requires the whole wave trains including P, S, and surface phases, which take tens of minutes to reach stations at tele-seismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for fast estimation of earthquake size. Besides these methods that involve Green's functions and inversions, there are other approaches that use empirically calibrated relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely applied at many institutions such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach that originated from Hara [2007], estimating magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. Firstly, the source duration can be accurately determined by the seismic array. Secondly, the results can be calculated more rapidly, and data from farther stations are not required. We propose to develop an automated system for determining fast and reliable source information of large shallow seismic events based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°- 85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus
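
    The Hara-style estimate described in the two records above combines the maximum P-wave displacement, the measured source duration, and the epicentral distance in a log-linear regression. The sketch below shows only the form of such an estimate; the coefficients a through d are placeholders that would have to be calibrated, not the values used by the authors.

```python
# Sketch of a Hara (2007)-style magnitude estimate from peak P-wave
# displacement, source duration, and epicentral distance.  The regression
# coefficients a..d are placeholders, NOT calibrated values from the paper.
import math

def estimate_magnitude(p_disp_m: float, duration_s: float, distance_deg: float,
                       a: float = 0.8, b: float = 0.8, c: float = 0.7, d: float = 6.0) -> float:
    return (a * math.log10(p_disp_m) +
            b * math.log10(duration_s) +
            c * math.log10(distance_deg) + d)

# Back-projection of array data (as in Wang et al., 2016) would supply
# duration_s; a station's peak P displacement and distance supply the rest.
print(f"Mw estimate ~ {estimate_magnitude(p_disp_m=0.02, duration_s=90.0, distance_deg=60.0):.1f}")
```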

  17. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of the GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, the abnormal damage distribution of the earthquake is identified, and the relationship of this abnormal distribution to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the abnormal damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.

  18. Haiti Earthquake: Crisis and Response

    Science.gov (United States)

    2010-02-19

    years ago, in 1860. Haitian ministries are addressing issues such as long-term housing for those left homeless by the earthquake as they operate out... CRS Report for Congress, prepared for Members and Committees of Congress: Haiti Earthquake: Crisis and Response, Rhoda Margesson...

  19. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extended > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress
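
    The "within two rupture lengths" screen used above can be made concrete with a magnitude-to-rupture-length scaling plus a distance check. The sketch below uses the generic Wells and Coppersmith (1994) form log10(L) = a + b*M; treat the coefficients as approximate, and the distances as illustrative.

```python
# Sketch of the "within two rupture lengths" aftershock screen.
# The scaling log10(L_km) = a + b*M follows the generic Wells & Coppersmith
# (1994) form; the coefficients should be treated as approximate.
def rupture_length_km(magnitude: float, a: float = -2.44, b: float = 0.59) -> float:
    return 10 ** (a + b * magnitude)

def is_traditional_aftershock(mainshock_mag: float, distance_km: float) -> bool:
    return distance_km <= 2.0 * rupture_length_km(mainshock_mag)

L = rupture_length_km(8.2)
print(f"M8.2 rupture length ~ {L:.0f} km, traditional aftershock zone ~ {2 * L:.0f} km")
print("event at 150 km:", is_traditional_aftershock(8.2, 150.0))  # inside the zone
print("event at 600 km:", is_traditional_aftershock(8.2, 600.0))  # outside the zone
```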

  20. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the strongest known earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we find the epicentral acceleration of this earthquake to be 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  1. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  2. Earthquake damage to underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  3. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  4. Nuclear works. Book of abstracts

    International Nuclear Information System (INIS)

    Candel, Danielle; Calberg-Challot, Marie; Alexander, Catherine; Bergsman, Anne; Meyer, Morgan; Taebi, Behnam; Kloosterman, Jan Leen; Kelfaoui, Mahdi; Gingras, Yves; Laborie, Leonard; Beltran, Alain; Bouvier, Yves; Raineau, Laurence; Poirot-Delpech, Sophie; Ollivon, Franck; Mueller, Birgit; Lemarchand, Frederick; Rivat, Emmanuel; Mormont, Marc; Aparicio, Luis; Fassert, Christine; Lehtonen, Markku; Billet, Philippe; Girard, Berenice; Fournier, Pierre; Marion, Richard; Lot, Nicolas

    2012-01-01

    the conception of a LILW repository (Marc Mormont, Anne Bergmans), The contribution of Social Sciences and Humanities to the scientific program for radioactive waste management of Andra (Luis Aparicio), The public expert and the nuclear catastrophe (Christine Fassert); 5 - Nuclear governance: Did Fukushima put an end to nuclear revival? A post-Fukushima debates analysis in Finnish, French and British media (Markku Lehtonen), Nuclear secrecy at the test of the right to participation (Philippe Billet), Nuclear science, politics and national construction: what remains from Nehru's India in these times of uncertainty? (Berenice Girard); 6 - Working in the nuclear industry. Training and work collectives: Nuclear industry: a workers-less world? (Pierre Fournier), Ambiguity dynamics at NPPs, a pluri-disciplinary approach (Nicolas Lot), Sino-French nuclear engineering curriculums: what kind of innovation configuration? (Richard Marion). This document brings together the French and English abstracts of the different talks

  5. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
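
    In natural time the "clock" is the count of small earthquakes since the last large one, and the NTW model evaluates a Weibull distribution in that variable. The sketch below shows the general shape of such a calculation; the scale and shape parameters, and the way the small-event count would be read from the ANSS catalog, are schematic assumptions rather than the calibrated openhazards.com implementation.

```python
# Schematic natural-time Weibull (NTW)-style probability: the probability of
# a large earthquake grows with the number n of small earthquakes counted in
# a region since the last large event.  Parameters are illustrative only.
import math

def ntw_probability(n_small: int, scale: float = 400.0, shape: float = 1.4) -> float:
    """Cumulative Weibull evaluated in natural time (small-event count)."""
    return 1.0 - math.exp(-((n_small / scale) ** shape))

for n in (50, 200, 400, 800):
    print(f"{n:4d} small events since the last large quake -> P = {ntw_probability(n):.2f}")
```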

  6. Impaired psychological recovery in the elderly after the Niigata-Chuetsu Earthquake in Japan:a population-based study

    Directory of Open Access Journals (Sweden)

    Someya Toshiyuki

    2006-09-01

    Full Text Available Abstract Background An earthquake measuring 6.8 on the Richter scale struck the Niigata-Chuetsu region of Japan at 5:56 P.M. on the 23rd of October, 2004. The earthquake was followed by sustained occurrence of numerous aftershocks, which delayed reconstruction of community lifelines. Even one year after the earthquake, 9,160 people were living in temporary housing. Such a devastating earthquake and life after the earthquake in an unfamiliar environment should cause psychological distress, especially among the elderly. Methods Psychological distress was measured using the 12-item General Health Questionnaire (GHQ-12) in 2,083 subjects (69% response rate) who were living in transient housing five months after the earthquake. The GHQ-12 was scored using the original method, Likert scoring and the corrected method. The subjects were asked to assess their psychological status before the earthquake, their psychological status at the most stressful time after the earthquake and their psychological status at five months after the earthquake. Exploratory and confirmatory factor analysis was used to reveal the factor structure of the GHQ-12. Multiple regression analysis was performed to analyze the relationship between various background factors and the GHQ-12 score and its subscales. Results GHQ-12 scores were significantly elevated at the most stressful time and remained significantly high even at five months after the earthquake. Factor analysis revealed that a model consisting of two factors (social dysfunction and dysphoria) using corrected GHQ scoring showed a high level of goodness-of-fit. Multiple regression analysis revealed that the age of subjects affected GHQ-12 scores: both the GHQ-12 score and its 'social dysfunction' subscale increased with increasing age of subjects at five months after the earthquake. Conclusion Impaired psychological recovery was observed in the elderly even at five months after the Niigata-Chuetsu Earthquake. The elderly were more

  7. Ecological Management and the Cosmogenic Mechanism of Earthquakes

    Directory of Open Access Journals (Sweden)

    Mogiljuk Zhanna

    2016-01-01

    Full Text Available A critical issue in ecological risk management for urban areas is predicting the evolution of the intensity of dangerous natural processes. A special place among these risks is taken by the earthquake threat and by emergency stress fluctuations in the geological environment of the foundations of buildings and structures. This article is devoted to one of the main problems of earthquake engineering: verification of the dominant mechanisms and causality of the dangerous evolution of earthquake intensity. It discusses the results of a comparative analysis of the amplitudes of variations in the Earth's gravitational interaction energy with the Sun, the Moon and the planets of the solar system. It also presents comparative evaluations of the amplitudes of gravitational perturbations of the Earth's geospheres against the solar radiation energy received by the Earth and the energy of the Earth's own heat. It is shown that the energy of the Earth's own heat and of solar exposure is much less than the energy of gravitational perturbations in near-Earth space. The article presents the results of a spectral analysis of the global daily earthquake energy on Earth before and after the impact of comet Shoemaker-Levy 9 on Jupiter. It is shown that the number of seismic events on Earth with magnitude greater than 2.5 on the Richter scale increased by a factor of 10 after the comet impact. The spectrum of global daily earthquake energy shows spectral manifestations of the gravitational resonances of the solar system planets. The results on the power of cosmogenic sources of natural disasters allow the authors to argue that effective ecological risk management is impossible without a forecast of the evolution of the intensity of cosmogenic effects on natural processes for sustainable urban development.

  8. Preliminary Results from SCEC Earthquake Simulator Comparison Project

    Science.gov (United States)

    Tullis, T. E.; Barall, M.; Richards-Dinger, K. B.; Ward, S. N.; Heien, E.; Zielke, O.; Pollitz, F. F.; Dieterich, J. H.; Rundle, J. B.; Yikilmaz, M. B.; Turcotte, D. L.; Kellogg, L. H.; Field, E. H.

    2010-12-01

    Earthquake simulators are computer programs that simulate long sequences of earthquakes. If such simulators could be shown to produce synthetic earthquake histories that are good approximations to actual earthquake histories they could be of great value in helping to anticipate the probabilities of future earthquakes and so could play an important role in helping to make public policy decisions. Consequently it is important to discover how realistic are the earthquake histories that result from these simulators. One way to do this is to compare their behavior with the limited knowledge we have from the instrumental, historic, and paleoseismic records of past earthquakes. Another, but slow process for large events, is to use them to make predictions about future earthquake occurrence and to evaluate how well the predictions match what occurs. A final approach is to compare the results of many varied earthquake simulators to determine the extent to which the results depend on the details of the approaches and assumptions made by each simulator. Five independently developed simulators, capable of running simulations on complicated geometries containing multiple faults, are in use by some of the authors of this abstract. Although similar in their overall purpose and design, these simulators differ from one another widely in their details in many important ways. They require as input for each fault element a value for the average slip rate as well as a value for friction parameters or stress reduction due to slip. They share the use of the boundary element method to compute stress transfer between elements. None use dynamic stress transfer by seismic waves. A notable difference is the assumption different simulators make about the constitutive properties of the faults. The earthquake simulator comparison project is designed to allow comparisons among the simulators and between the simulators and past earthquake history. The project uses sets of increasingly detailed

  9. Source finiteness of large earthquakes measured from long-period Rayleigh waves

    Science.gov (United States)

    Zhang, Jiajun; Kanamori, Hiroo

    1988-10-01

    The source-finiteness parameters of 11 large shallow earthquakes were determined from long-period Rayleigh waves recorded by the Global Digital Seismograph Network and International Deployment of Accelerometers Networks. The basic data sets are the seismic spectra of periods from 150 to 300 s. In the determination of source-process times, we used Furumoto's phase method and a linear inversion method, in which we simultaneously inverted the spectra and determined the source-process time that minimizes the error in the inversion. These two methods yielded consistent results. The source-process times of the Sumbawa (Indonesia), Colombia-Ecuador, Valparaiso (Chile) and Michoacan (Mexico) earthquakes were estimated to be 79, 118, 69 and 77 s, respectively, from the linear inversion method. The source-process times determined from long-period surface waves were in general longer than those obtained from body waves. Source finiteness of large shallow earthquakes with rupture on a fault plane with a large aspect ratio was modeled with the source-finiteness function introduced by Ben-Menahem. The spectra were inverted to find the extent and direction of the rupture of the earthquake that minimize the error in the inversion. For a rupture velocity of 2.5 km s -1, the estimated rupture was unilateral, 100 km long and along the strike, N26°W, for the May 26, 1983 Akita-Oki, Japan earthquake; 165 km and S57°E for the September 19, 1985 Michoacan, Mexico earthquake; 256 km and N31°E for the December 12, 1979 Colombia-Ecuador earthquake; and 149 km and S15°W for the March 3, 1985 Valparaiso, Chile earthquake. The results for the August 19, 1977 Sumbawa, Indonesia earthquake showed that the rupture was bilateral and in the direction N60°E. These results are, in general, consistent with the rupture extent inferred from the aftershock area of these earthquakes.
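
    The source-finiteness function referred to here is, for a unilateral rupture of length L propagating at rupture velocity v, commonly written as a sinc-type directivity factor; the expression below is the standard Ben-Menahem form, given for orientation, with c the phase velocity of the Rayleigh wave at angular frequency ω and θ the azimuth measured from the rupture direction:

    $$ F(\omega,\theta) = \frac{\sin X}{X}\, e^{-iX}, \qquad X = \frac{\omega L}{2}\left(\frac{1}{v} - \frac{\cos\theta}{c}\right). $$

    Fitting the azimuthal variation of |F| to the observed long-period spectra is what yields the rupture extent and rupture direction quoted in the abstract.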

  10. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction
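
    The shuffle test described here can be reproduced schematically: compute the first-return-time distribution for events above a magnitude threshold, randomly permute the magnitude sequence while keeping the origin times fixed, and recompute. This is one simple reading of the rearrangement, with a toy catalog standing in for the Southern California Earthquake Catalog.

```python
# Schematic shuffle test: compare the first-return-time distribution of
# magnitude >= M events before and after randomly permuting the magnitude
# sequence (origin times fixed).  The catalog below is synthetic toy data.
import numpy as np

def return_times(times: np.ndarray, mags: np.ndarray, m_min: float) -> np.ndarray:
    t = np.sort(times[mags >= m_min])
    return np.diff(t)

rng = np.random.default_rng(0)
times = np.sort(rng.uniform(0.0, 3650.0, 5000))   # toy catalog: days
mags = 2.0 + rng.exponential(0.5, 5000)           # toy Gutenberg-Richter-like magnitudes

observed = return_times(times, mags, m_min=4.0)
shuffled = return_times(times, rng.permutation(mags), m_min=4.0)
# Under SOC-like invariance the two empirical distributions should agree.
print("observed 50th/90th percentile return times:", np.percentile(observed, [50, 90]))
print("shuffled 50th/90th percentile return times:", np.percentile(shuffled, [50, 90]))
```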

  11. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  12. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started just about 15 years ago has already pushed the global information storage capacity beyond 5000 Exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task and implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regrettably, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and short-term earthquake forecasting (StEF), the claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the Error Diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null hypothesis as a metric of the alerted space, are evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of the SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. These and other kinds of information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
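
    The Error Diagram mentioned here plots, for a given alarm strategy, the rate of failures-to-predict against the fraction of space-time kept in alarm; unskilled random guessing falls on the diagonal where the two rates sum to one. A minimal sketch of how the two coordinates are tallied, using toy alarm and event lists:

```python
# Sketch of one point on a Molchan-style error diagram: miss rate (nu)
# versus the alerted fraction of space-time (tau).  Toy data only.
def molchan_point(alarm_cells, event_cells, total_cells):
    misses = sum(1 for e in event_cells if e not in alarm_cells)
    nu = misses / len(event_cells)        # rate of failures to predict
    tau = len(alarm_cells) / total_cells  # alerted fraction of space-time
    return nu, tau

alarms = {(3, 7), (3, 8), (4, 7), (9, 2)}   # (cell, time-window) pairs kept in alarm
events = [(3, 7), (5, 5), (9, 2)]           # where/when the target earthquakes occurred
nu, tau = molchan_point(alarms, events, total_cells=100)
print(f"nu = {nu:.2f}, tau = {tau:.2f}; random guessing would give nu ~ {1 - tau:.2f}")
```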

  13. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies worldwide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably toward personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
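
    The abstract does not give the NTW model's exact parameterization, so the following is only a minimal sketch of the general idea it states: convert the count of small earthquakes recorded in a region since the last large event into a conditional probability of a large event, here using a Weibull hazard in "natural time" (the small-event count). The function name, shape parameter, and numbers are illustrative assumptions.

```python
import math

def weibull_conditional_probability(n_small, n_future, k_expected, beta=1.5):
    """P(next large quake within the next `n_future` small events), given that
    `n_small` small events have already elapsed since the last large one.

    k_expected : average number of small events between large events (from the catalog)
    beta       : Weibull shape parameter (assumed here, not taken from the paper)
    """
    def survival(n):
        # Weibull survival function in natural time (event count).
        return math.exp(-((n / k_expected) ** beta))
    return 1.0 - survival(n_small + n_future) / survival(n_small)

if __name__ == "__main__":
    # Hypothetical grid cell: 180 small quakes since the last large event, with the
    # catalog suggesting ~250 small events per large event on average.
    p = weibull_conditional_probability(n_small=180, n_future=30, k_expected=250)
    print(f"Probability of a large event within the next 30 small events: {p:.1%}")
```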

  14. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
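
    As a hedged illustration of the two fits described above, the sketch below fits exceedances over a high magnitude threshold to a generalized Pareto tail and block maxima to a generalized extreme value (GEV) distribution, whose heavy-tailed branch is the Fréchet distribution. The synthetic catalog, threshold, and block size are assumptions for demonstration only; they are not the Chinese catalogue used in the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic "catalog": exponentially distributed magnitudes mimic a Gutenberg-Richter
# law with b ~ 1 (illustrative data, not the catalogue analyzed in the paper).
mags = 4.0 + rng.exponential(scale=1.0 / np.log(10), size=20_000)

# Peaks over threshold: fit exceedances above a high threshold to a generalized Pareto tail.
threshold = 6.0
exceedances = mags[mags > threshold] - threshold
c_gp, loc_gp, scale_gp = stats.genpareto.fit(exceedances, floc=0.0)
print(f"GPD shape={c_gp:.3f}, scale={scale_gp:.3f}")

# Block maxima: fit per-block maxima to a GEV. In SciPy's sign convention, shape c < 0
# is the heavy-tailed (Frechet) branch and c > 0 the bounded (Weibull) branch.
block_max = mags.reshape(100, 200).max(axis=1)   # pretend 100 "years" of 200 events each
c_gev, loc_gev, scale_gev = stats.genextreme.fit(block_max)
print(f"GEV shape={c_gev:.3f}, loc={loc_gev:.3f}, scale={scale_gev:.3f}")
```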

  15. Abstract methods in partial differential equations

    CERN Document Server

    Carroll, Robert W

    2012-01-01

    Detailed, self-contained treatment examines modern abstract methods in partial differential equations, especially abstract evolution equations. Suitable for graduate students with some previous exposure to classical partial differential equations. 1969 edition.

  16. Conference Abstracts | Manda Taylor | Malawi Medical Journal

    African Journals Online (AJOL)

    Abstract. Award-winning abstracts from the first Paediatric and Child Health Association of Malawi Conference. Theme: Using a multidisciplinary team approach to improve child health outcomes throughout Malawi ...

  17. Earthquake fault superhighways

    Science.gov (United States)

    Robinson, D. P.; Das, S.; Searle, M. P.

    2010-10-01

    Motivated by the observation that the rare earthquakes which propagated for significant distances at supershear speeds occurred on very long straight segments of faults, we examine every known major active strike-slip fault system on land worldwide and identify those with long (> 100 km) straight portions capable not only of sustaining supershear rupture speeds but also of reaching compressional wave speeds over significant distances; we call these "fault superhighways". The criteria used for identifying them are discussed. These superhighways include portions of the 1000 km long Red River fault in China and Vietnam passing through Hanoi, the 1050 km long San Andreas fault in California passing close to Los Angeles, Santa Barbara and San Francisco, the 1100 km long Chaman fault system in Pakistan north of Karachi, the 700 km long Sagaing fault connecting the first and second cities of Burma, Rangoon and Mandalay, the 1600 km Great Sumatra fault, and the 1000 km Dead Sea fault. Of the 11 faults so classified, nine are in Asia and two in North America, with seven located near areas of very dense population. Based on the current population distribution within 50 km of each fault superhighway, we find that more than 60 million people today face increased seismic hazard due to these faults.
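
    The exposure figure at the end rests on summing gridded population within 50 km of each fault trace. The sketch below shows one way such a calculation could be set up, assuming a coarse population grid and a fault given as a densely sampled polyline; the grid, the fault coordinates, and the nearest-vertex approximation are illustrative assumptions, not the authors' data or method.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees (vectorized)."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat1, lon1, lat2, lon2))
    a = (np.sin((lat2 - lat1) / 2) ** 2
         + np.cos(lat1) * np.cos(lat2) * np.sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def population_near_fault(grid_lat, grid_lon, grid_pop, fault_lat, fault_lon, radius_km=50.0):
    """Sum population of grid cells whose centers lie within radius_km of any fault vertex
    (a densely sampled trace makes this a reasonable stand-in for distance to the polyline)."""
    near = np.zeros(grid_pop.shape, dtype=bool)
    for flat, flon in zip(fault_lat, fault_lon):
        near |= haversine_km(grid_lat, grid_lon, flat, flon) <= radius_km
    return grid_pop[near].sum()

if __name__ == "__main__":
    # Toy 1-degree population grid and a made-up north-south fault trace.
    lons, lats = np.meshgrid(np.arange(95.0, 106.0), np.arange(15.0, 26.0))
    pop = np.full(lats.shape, 1e5)              # hypothetical 100,000 people per cell
    fault_lat = np.linspace(16.0, 24.0, 200)
    fault_lon = np.full(200, 96.0)
    total = population_near_fault(lats, lons, pop, fault_lat, fault_lon)
    print(f"Population within 50 km of the trace: {total:,.0f}")
```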

  18. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than by characteristics of the experimental apparatus. The large size of the experimental apparatus, high-fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separate this study from traditional acoustic emission analyses and allow these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double-couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics, such as stress drop (1–10 MPa), appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
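
    The scaling-law consistency mentioned above can be made concrete with the standard relations: seismic moment from moment magnitude, and the Eshelby circular-crack estimate relating moment, source radius, and stress drop. The sketch below uses these textbook formulas with illustrative radii; the specific numbers are not measurements from the experiment beyond the ranges quoted in the abstract.

```python
import math

def seismic_moment_Nm(mw):
    """Seismic moment in N*m from moment magnitude: Mw = (2/3) * (log10(M0) - 9.1)."""
    return 10 ** (1.5 * mw + 9.1)

def stress_drop_Pa(moment_Nm, source_radius_m):
    """Eshelby circular-crack estimate: delta_sigma = (7/16) * M0 / r^3."""
    return (7.0 / 16.0) * moment_Nm / source_radius_m ** 3

if __name__ == "__main__":
    # An M -6 event on a mm-scale patch; radii are illustrative and chosen to show that
    # such sizes land in the 1-10 MPa stress-drop range quoted in the abstract.
    mw = -6.0
    m0 = seismic_moment_Nm(mw)
    for r_mm in (4.0, 6.0, 8.0):
        ds = stress_drop_Pa(m0, r_mm / 1000.0)
        print(f"Mw={mw}: M0={m0:.2e} N*m, r={r_mm} mm -> stress drop {ds / 1e6:.1f} MPa")
```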

  19. Abstract Objects in a Metaphysical Perspective

    Directory of Open Access Journals (Sweden)

    Aleksandr Kulieshov

    2018-02-01

    Full Text Available The article presents an unconventional, although not absolutely unprecedented, view of abstract objects, defending the position of metaphysical realism. It is argued that abstract objects, taken in a purely ontological sense, are the forms of objects. The forms possess some common characteristics of abstract objects: they can exist outside physical space and time, and they play a grounding role in their relation to concrete objects. It is stated that commonly discussed abstract objects – properties, kinds, mathematical objects – are forms.

  20. 2013 SYR Accepted Poster Abstracts.

    Science.gov (United States)

    2013-01-01

    SYR 2013 Accepted Poster abstracts: 1. Benefits of Yoga as a Wellness Practice in a Veterans Affairs (VA) Health Care Setting: If You Build It, Will They Come? 2. Yoga-based Psychotherapy Group With Urban Youth Exposed to Trauma. 3. Embodied Health: The Effects of a Mind–Body Course for Medical Students. 4. Interoceptive Awareness and Vegetable Intake After a Yoga and Stress Management Intervention. 5. Yoga Reduces Performance Anxiety in Adolescent Musicians. 6. Designing and Implementing a Therapeutic Yoga Program for Older Women With Knee Osteoarthritis. 7. Yoga and Life Skills Eating Disorder Prevention Among 5th Grade Females: A Controlled Trial. 8. A Randomized, Controlled Trial Comparing the Impact of Yoga and Physical Education on the Emotional and Behavioral Functioning of Middle School Children. 9. Feasibility of a Multisite, Community based Randomized Study of Yoga and Wellness Education for Women With Breast Cancer Undergoing Chemotherapy. 10. A Delphi Study for the Development of Protocol Guidelines for Yoga Interventions in Mental Health. 11. Impact Investigation of Breathwalk Daily Practice: Canada–India Collaborative Study. 12. Yoga Improves Distress, Fatigue, and Insomnia in Older Veteran Cancer Survivors: Results of a Pilot Study. 13. Assessment of Kundalini Mantra and Meditation as an Adjunctive Treatment With Mental Health Consumers. 14. Kundalini Yoga Therapy Versus Cognitive Behavior Therapy for Generalized Anxiety Disorder and Co-Occurring Mood Disorder. 15. Baseline Differences in Women Versus Men Initiating Yoga Programs to Aid Smoking Cessation: Quitting in Balance Versus QuitStrong. 16. Pranayam Practice: Impact on Focus and Everyday Life of Work and Relationships. 17. Participation in a Tailored Yoga Program is Associated With Improved Physical Health in Persons With Arthritis. 18. Effects of Yoga on Blood Pressure: Systematic Review and Meta-analysis. 19. A Quasi-experimental Trial of a Yoga based Intervention to Reduce Stress and