WorldWideScience

Sample records for global earthquake proportional

  1. Global earthquake fatalities and population

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.

  2. Global Drought Proportional Economic Loss Risk Deciles

    National Aeronautics and Space Administration — Global Drought Proportional Economic Loss Risk Deciles is a 2.5 minute grid of drought hazard economic loss as proportions of Gross Domestic Product (GDP) per...

  3. Global Earthquake Hazard Frequency and Distribution

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  4. GEM - The Global Earthquake Model

    Smolka, A.

    2009-04-01

Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Co-operation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scales. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  5. Global Significant Earthquake Database, 2150 BC to present

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  6. Crowd-Sourced Global Earthquake Early Warning

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
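The Kalman-filter fusion of noisy GNSS displacements with consumer-grade acceleration data described above can be illustrated with a textbook one-dimensional filter. This is a minimal sketch with made-up noise variances, not the authors' actual implementation:

```python
import numpy as np

def fuse_gnss_accel(gnss_disp, accel, dt, r_gnss=0.25, q_acc=0.1):
    """Toy 1-D Kalman filter fusing a noisy GNSS displacement series with
    acceleration treated as a control input. State: [displacement, velocity].
    r_gnss and q_acc are placeholder noise variances, not calibrated values."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # how acceleration enters the state
    H = np.array([[1.0, 0.0]])              # GNSS observes displacement only
    Q = q_acc * np.outer(B, B)              # process noise from accel error
    x = np.zeros(2)                         # initial state estimate
    P = np.eye(2)                           # initial state covariance
    est = []
    for z, a in zip(gnss_disp, accel):
        # predict using the accelerometer sample
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # update using the GNSS displacement sample
        S = H @ P @ H.T + r_gnss            # innovation variance
        K = (P @ H.T) / S                   # Kalman gain
        x = x + (K * (z - H @ x)).ravel()
        P = (np.eye(2) - K @ H) @ P
        est.append(x[0])
    return np.array(est)
```

The point of the combination is exactly what the abstract reports: the accelerometer constrains short-term motion while the (noisy) GNSS measurement prevents the integrated acceleration from drifting.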

  7. Conflicts between domestic inequality and global poverty: lexicality versus proportionality

    Francisco García Gibson

    2016-12-01

Current views on global justice often hold that affluent states are under at least two duties: a duty to reduce socioeconomic inequalities at home and a duty to reduce extreme poverty abroad. Potential duty conflicts deriving from resource scarcity can be solved in broadly two principled ways. The ‘lexical’ principle requires all disputed resources to be allocated to the weightiest duty. The ‘proportionality’ principle requires resources to be distributed between the two duties according to their relative weight (the weightiest duty receives the largest resource share, but the less weighty duty receives a share too). I argue that the proportionality principle is morally preferable. I show that it is sensitive to a number of factors that are intuitively relevant when solving duty conflicts: the number of affected individuals, the size of the benefits each individual could get, and the time it could take to eventually comply with the less weighty duty. Some argue that the lexical principle should nevertheless be preferred because domestic egalitarian duties are duties of justice, and they are therefore lexically prior to mere humanitarian duties to reduce global poverty. I reject this view by showing that duties of justice are not necessarily lexically prior to humanitarian duties, and that (even if they were) duties to reduce global poverty can be regarded as duties of justice too.

  8. Substantial proportion of global streamflow less than three months old

    Jasechko, Scott; Kirchner, James W.; Welker, Jeffrey M.; McDonnell, Jeffrey J.

    2016-02-01

    Biogeochemical cycles, contaminant transport and chemical weathering are regulated by the speed at which precipitation travels through landscapes and reaches streams. Streamflow is a mixture of young and old precipitation, but the global proportions of these young and old components are not known. Here we analyse seasonal cycles of oxygen isotope ratios in rain, snow and streamflow compiled from 254 watersheds around the world, and calculate the fraction of streamflow that is derived from precipitation that fell within the past two or three months. This young streamflow accounts for about a third of global river discharge, and comprises at least 5% of discharge in about 90% of the catchments we investigated. We conclude that, although typical catchments have mean transit times of years or even decades, they nonetheless can rapidly transmit substantial fractions of soluble contaminant inputs to streams. Young streamflow is less prevalent in steeper landscapes, which suggests they are characterized by deeper vertical infiltration. Because young streamflow is derived from less than 0.1% of global groundwater storage, we conclude that this thin veneer of aquifer storage will have a disproportionate influence on stream water quality.

  9. Global risk of big earthquakes has not recently increased.

    Shearer, Peter M; Stark, Philip B

    2012-01-17

    The recent elevated rate of large earthquakes has fueled concern that the underlying global rate of earthquake activity has increased, which would have important implications for assessments of seismic hazard and our understanding of how faults interact. We examine the timing of large (magnitude M≥7) earthquakes from 1900 to the present, after removing local clustering related to aftershocks. The global rate of M≥8 earthquakes has been at a record high roughly since 2004, but rates have been almost as high before, and the rate of smaller earthquakes is close to its historical average. Some features of the global catalog are improbable in retrospect, but so are some features of most random sequences--if the features are selected after looking at the data. For a variety of magnitude cutoffs and three statistical tests, the global catalog, with local clusters removed, is not distinguishable from a homogeneous Poisson process. Moreover, no plausible physical mechanism predicts real changes in the underlying global rate of large events. Together these facts suggest that the global risk of large earthquakes is no higher today than it has been in the past.
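One simple way to check whether a declustered catalog is distinguishable from a homogeneous Poisson process is a chi-square dispersion test on annual event counts. The abstract does not specify which three tests the authors applied; this is a generic illustration, not their procedure:

```python
import numpy as np
from scipy import stats

def poisson_dispersion_test(annual_counts):
    """Chi-square dispersion test: under a homogeneous Poisson process,
    annual counts share a common rate, and (n - 1) * var / mean follows
    approximately a chi-square distribution with n - 1 degrees of freedom."""
    counts = np.asarray(annual_counts, dtype=float)
    n = counts.size
    stat = (n - 1) * counts.var(ddof=1) / counts.mean()
    # two-sided p-value: small stat => under-dispersed, large => clustered
    p = 2 * min(stats.chi2.sf(stat, df=n - 1), stats.chi2.cdf(stat, df=n - 1))
    return stat, p
```

A clustered catalog (bursts of large earthquakes separated by quiet years) inflates the variance relative to the mean and drives the p-value down; counts consistent with a constant underlying rate do not.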

  10. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

Di Giacomo, Domenico; Engdahl, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product to use for seismic hazard studies. The new catalogue was necessary as improved seismic hazard studies necessitate that earthquake catalogues are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue both for earthquakes that occurred beyond 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed in both the ISC-GEM global catalogue and regional catalogues.

  11. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.

  12. Global variations of large megathrust earthquake rupture characteristics

    Kanamori, Hiroo

    2018-01-01

    Despite the surge of great earthquakes along subduction zones over the last decade and advances in observations and analysis techniques, it remains unclear whether earthquake complexity is primarily controlled by persistent fault properties or by dynamics of the failure process. We introduce the radiated energy enhancement factor (REEF), given by the ratio of an event’s directly measured radiated energy to the calculated minimum radiated energy for a source with the same seismic moment and duration, to quantify the rupture complexity. The REEF measurements for 119 large [moment magnitude (Mw) 7.0 to 9.2] megathrust earthquakes distributed globally show marked systematic regional patterns, suggesting that the rupture complexity is strongly influenced by persistent geological factors. We characterize this as the existence of smooth and rough rupture patches with varying interpatch separation, along with failure dynamics producing triggering interactions that augment the regional influences on large events. We present an improved asperity scenario incorporating both effects and categorize global subduction zones and great earthquakes based on their REEF values and slip patterns. Giant earthquakes rupturing over several hundred kilometers can occur in regions with low-REEF patches and small interpatch spacing, such as for the 1960 Chile, 1964 Alaska, and 2011 Tohoku earthquakes, or in regions with high-REEF patches and large interpatch spacing as in the case for the 2004 Sumatra and 1906 Ecuador-Colombia earthquakes. Thus, combining seismic magnitude Mw and REEF, we provide a quantitative framework to better represent the span of rupture characteristics of great earthquakes and to understand global seismicity. PMID:29750186

  13. Global building inventory for earthquake loss estimation and risk management

    Jaiswal, Kishor; Wald, David; Porter, Keith

    2010-01-01

We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat’s demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature.

  14. Rapid estimation of the economic consequences of global earthquakes

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is

  15. How complete is the ISC-GEM Global Earthquake Catalog?

    Michael, Andrew J.

    2014-01-01

The International Seismological Centre, in collaboration with the Global Earthquake Model effort, has released a new global earthquake catalog covering the time period from 1900 through the end of 2009. In order to use this catalog for global earthquake studies, I determined the magnitude of completeness (Mc) as a function of time by dividing the earthquakes shallower than 60 km into 7 time periods, based on major changes in catalog processing and data availability, and applying 4 objective methods to determine Mc, with uncertainties determined by non-parametric bootstrapping. Deeper events were divided into 2 time periods. Due to differences between the 4 methods, the final Mc was determined subjectively by examining the features that each method focused on in both the cumulative and binned magnitude frequency distributions. The time periods and Mc values for shallow events are: 1900-1917, Mc=7.7; 1918-1939, Mc=7.0; 1940-1954, Mc=6.8; 1955-1963, Mc=6.5; 1964-1975, Mc=6.0; 1976-2003, Mc=5.8; and 2004-2009, Mc=5.7. Using these Mc values for the longest time periods they are valid for (e.g. 1918-2009, 1940-2009, …), the shallow data fit a Gutenberg-Richter distribution with b=1.05 and a=8.3, within 1 standard deviation, with no declustering. The exception is for time periods that include 1900-1917, in which there are only 33 events with M ≥ Mc, and for those few data b=2.15±0.46. That result calls for further investigation of this time period, ideally with a larger number of earthquakes. For deep events, the results are Mc=7.1 for 1900-1963, although the early data are problematic, and Mc=5.7 for 1964-2009. For that latter time period, b=0.99 and a=7.3.
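The Gutenberg-Richter fit mentioned above (log10 N = a - bM) is commonly estimated with Aki's (1965) maximum-likelihood b-value. A sketch for continuous magnitudes above a completeness cutoff Mc, without the binning correction used in careful catalog work:

```python
import numpy as np

def b_value_mle(mags, mc):
    """Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter
    b-value for magnitudes at or above the completeness magnitude mc
    (continuous magnitudes, no magnitude-binning correction)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    b = np.log10(np.e) / (m.mean() - mc)
    b_err = b / np.sqrt(m.size)   # Aki's asymptotic standard error
    return b, b_err
```

With b near 1, as in the catalog fit quoted above, event counts drop roughly tenfold per unit increase in magnitude, which is why only 33 events exceed the high 1900-1917 completeness threshold.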

  16. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of M_λ ≥ 7.0 and M_λ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be M_σ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with M_λ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with M_λ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 M_λ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 M_λ ≥ 7.0 in each catalog and
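Extracting the shape exponent β from interevent counts can be sketched with a generic Weibull maximum-likelihood fit with the location pinned at zero. This is an assumption about the fitting procedure, not the authors' exact method:

```python
import numpy as np
from scipy import stats

def weibull_beta(interevent_counts):
    """Fit a Weibull distribution (location fixed at zero) to natural-time
    interevent counts and return the shape exponent beta. beta = 1
    corresponds to a random (exponential) sequence; beta < 1 indicates
    temporal clustering of the large events."""
    counts = np.asarray(interevent_counts, dtype=float)
    beta, _loc, _scale = stats.weibull_min.fit(counts, floc=0)
    return beta
```

Applied to an exponentially distributed sample (the natural-time analogue of a random catalog), the fitted β comes out near 1, matching the synthetic-catalog result of β = 0.99 ± 0.01 quoted above.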

  17. Global assessment of human losses due to earthquakes

    Silva, Vitor; Jaiswal, Kishor; Weatherill, Graeme; Crowley, Helen

    2014-01-01

    Current studies have demonstrated a sharp increase in human losses due to earthquakes. These alarming levels of casualties suggest the need for large-scale investment in seismic risk mitigation, which, in turn, requires an adequate understanding of the extent of the losses, and location of the most affected regions. Recent developments in global and uniform datasets such as instrumental and historical earthquake catalogues, population spatial distribution and country-based vulnerability functions, have opened an unprecedented possibility for a reliable assessment of earthquake consequences at a global scale. In this study, a uniform probabilistic seismic hazard assessment (PSHA) model was employed to derive a set of global seismic hazard curves, using the open-source software OpenQuake for seismic hazard and risk analysis. These results were combined with a collection of empirical fatality vulnerability functions and a population dataset to calculate average annual human losses at the country level. The results from this study highlight the regions/countries in the world with a higher seismic risk, and thus where risk reduction measures should be prioritized.

  18. Global volcanic earthquake swarm database and preliminary analysis of volcanic earthquake swarm duration

    S. R. McNutt

    1996-06-01

Global data from 1979 to 1989 pertaining to volcanic earthquake swarms have been compiled into a custom-designed relational database. The database is composed of three sections: (1) a section containing general information on volcanoes, (2) a section containing earthquake swarm data (such as dates of swarm occurrence and durations), and (3) a section containing eruption information. The most abundant and reliable parameter, duration of volcanic earthquake swarms, was chosen for preliminary analysis. The distribution of all swarm durations was found to have a geometric mean of 5.5 days. Precursory swarms were then separated from those not associated with eruptions. The geometric mean precursory swarm duration was 8 days, whereas the geometric mean duration of swarms not associated with eruptive activity was 3.5 days. Two groups of precursory swarms are apparent when duration is compared with the eruption repose time. Swarms with durations shorter than 4 months showed no clear relationship with the eruption repose time. However, the second group, lasting longer than 4 months, showed a significant positive correlation with the log10 of the eruption repose period. The two groups suggest that different suites of physical processes are involved in the generation of volcanic earthquake swarms.
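The geometric mean used above is the standard choice for strongly right-skewed data such as swarm durations; it is simply the exponential of the mean log duration:

```python
import numpy as np

def geometric_mean(durations_days):
    """Geometric mean of positive durations: exp of the mean of the logs.
    Less sensitive than the arithmetic mean to a few very long swarms."""
    d = np.asarray(durations_days, dtype=float)
    return float(np.exp(np.log(d).mean()))
```

For example, durations of 1, 10, and 100 days have an arithmetic mean of 37 days but a geometric mean of 10 days, which better represents the typical swarm.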

  19. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    W. F. Peng

    2012-03-01

The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0–2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions deviating from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region. However, the probability of appearance and the extent of the anomalies in the magnetically conjugated region are smaller than for the anomalies near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  20. A new reference global instrumental earthquake catalogue (1900-2009)

    Di Giacomo, D.; Engdahl, B.; Bondar, I.; Storchak, D. A.; Villasenor, A.; Bormann, P.; Lee, W.; Dando, B.; Harris, J.

    2011-12-01

For seismic hazard studies on a global and/or regional scale, accurate knowledge of the spatial distribution of seismicity, the magnitude-frequency relation and the maximum magnitudes is of fundamental importance. However, such information is normally not homogeneous (or not available) for the various seismically active regions of the Earth. To achieve the GEM objectives (www.globalquakemodel.org) of calculating and communicating earthquake risk worldwide, an improved reference global instrumental catalogue for large earthquakes spanning the entire 100+ year period of instrumental seismology is an absolute necessity. To accomplish this task, we apply the most up-to-date techniques and standard observatory practices for computing the earthquake location and magnitude. In particular, the re-location procedure benefits both from the depth determination according to Engdahl and Villaseñor (2002), and the advanced technique recently implemented at the ISC (Bondár and Storchak, 2011) to account for correlated error structure. With regard to magnitude, starting from the re-located hypocenters, the classical surface and body-wave magnitudes are determined following the new IASPEI standards and by using amplitude-period data of phases collected from historical station bulletins (up to 1970), which were not available in digital format before the beginning of this work. Finally, the catalogue will provide moment magnitude values (including uncertainty) for each seismic event via seismic moment, via surface wave magnitude or via other magnitude types using empirical relationships. References Engdahl, E.R., and A. Villaseñor (2002). Global seismicity: 1900-1999. In: International Handbook of Earthquake and Engineering Seismology, eds. W.H.K. Lee, H. Kanamori, J.C. Jennings, and C. Kisslinger, Part A, 665-690, Academic Press, San Diego. Bondár, I., and D. Storchak (2011). Improved location procedures at the International Seismological Centre, Geophys. J. Int., doi:10.1111/j

  1. A global probabilistic tsunami hazard assessment from earthquake sources

    Davies, Gareth; Griffin, Jonathan; Lovholt, Finn; Glimsdal, Sylfest; Harbitz, Carl; Thio, Hong Kie; Lorito, Stefano; Basili, Roberto; Selva, Jacopo; Geist, Eric L.; Baptista, Maria Ana

    2017-01-01

    Large tsunamis occur infrequently but have the capacity to cause enormous numbers of casualties, damage to the built environment and critical infrastructure, and economic losses. A sound understanding of tsunami hazard is required to underpin management of these risks, and while tsunami hazard assessments are typically conducted at regional or local scales, globally consistent assessments are required to support international disaster risk reduction efforts, and can serve as a reference for local and regional studies. This study presents a global-scale probabilistic tsunami hazard assessment (PTHA), extending previous global-scale assessments based largely on scenario analysis. Only earthquake sources are considered, as they represent about 80% of the recorded damaging tsunami events. Globally extensive estimates of tsunami run-up height are derived at various exceedance rates, and the associated uncertainties are quantified. Epistemic uncertainties in the exceedance rates of large earthquakes often lead to large uncertainties in tsunami run-up. Deviations between modelled tsunami run-up and event observations are quantified, and found to be larger than suggested in previous studies. Accounting for these deviations in PTHA is important, as it leads to a pronounced increase in predicted tsunami run-up for a given exceedance rate.

  2. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    Zschau, J.

    2009-04-01

Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations and a growing globalisation of the world's economies, we are presently facing a dramatic increase of our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that are no longer exclusively aiming at the best possible quantification of the present risks but also keep an eye on their changes with time and allow these changes to be projected into the future. This does not apply to the vulnerability component of earthquake risk alone, but also to its hazard component, which has been realized to be time-dependent too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD-GSF), which initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  3. Global Asymptotic Stability of Impulsive CNNs with Proportional Delays and Partially Lipschitz Activation Functions

    Xueli Song

    2014-01-01

    This paper studies the global asymptotic stability of impulsive cellular neural networks with proportional delays and partially Lipschitz activation functions. First, by means of the transformation v_i(t) = u_i(e^t), the impulsive cellular neural networks with proportional delays are transformed into impulsive cellular neural networks with variable coefficients and constant delays. Second, we provide novel criteria for the uniqueness and exponential stability of the equilibrium point of the latter via a relative nonlinear measure, and prove that exponential stability of the equilibrium point of the latter implies asymptotic stability of that of the former. We furthermore obtain a sufficient condition for the uniqueness and global asymptotic stability of the equilibrium point of the former. Our method does not require the conventional assumptions of global Lipschitz continuity, boundedness, and monotonicity of the activation functions. Our results generalize and improve some existing ones. Finally, an example and its simulations are provided to illustrate the correctness of our analysis.
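Written out, the transformation converts each proportional delay q_i t (with 0 < q_i < 1) in the original network into a constant delay τ_i = -ln q_i in the transformed one:

```latex
v_i(t) \;=\; u_i\!\left(e^{t}\right)
\quad\Longrightarrow\quad
u_i\!\left(q_i\,e^{t}\right) \;=\; v_i\!\left(t-\tau_i\right),
\qquad \tau_i \;=\; -\ln q_i \;>\; 0,
```

so stability results for networks with constant delays (now with time-varying coefficients) carry back to the proportional-delay network.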

  4. Global observation of Omori-law decay in the rate of triggered earthquakes

    Parsons, T.

    2001-12-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 events in El Salvador. In this study, earthquakes with M greater than 7.0 from the Harvard CMT catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near the main shocks are associated with calculated shear stress increases, while ~39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, triggered earthquakes obey an Omori-law rate decay that lasts between ~7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main-shock centroid. Earthquakes triggered by smaller quakes (foreshocks) also obey Omori's law, which is one of the few time-predictable patterns evident in the global occurrence of earthquakes. These observations indicate that earthquake probability calculations which include interactions from previous shocks should incorporate a transient Omori-law decay with time. In addition, a very simple model using the observed global rate change with time and spatial distribution of triggered earthquakes can be applied to immediately assess the likelihood of triggered earthquakes following large events, and can be in place until more sophisticated analyses are conducted.
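The modified Omori relation underlying this rate decay can be sketched numerically. The constants K, c and p below are illustrative placeholders, not values fitted in the study; the rate of triggered events is n(t) = K/(t + c)^p, and integrating it gives the expected event count in a time window.

```python
import math

def omori_rate(t, K=50.0, c=0.1, p=1.0):
    """Modified Omori law: rate of triggered events at time t (years)
    after the main shock.  K, c, p are illustrative, not fitted values."""
    return K / (t + c) ** p

def expected_events(t1, t2, K=50.0, c=0.1, p=1.0):
    """Expected number of triggered events between times t1 and t2 (years),
    i.e. the integral of the Omori rate (p == 1 needs the logarithmic form)."""
    if abs(p - 1.0) < 1e-12:
        return K * (math.log(t2 + c) - math.log(t1 + c))
    return K / (1 - p) * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p))

# The rate decays with time, so most triggered events occur early:
first_year = expected_events(0.0, 1.0)
years_7_to_11 = expected_events(7.0, 11.0)
```

This is the kind of "very simple model" the abstract refers to: given the observed decay parameters and the spatial reach of triggering, the expected number of triggered events after a large main shock can be estimated immediately.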

  5. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.

  6. The Global Earthquake Model and Disaster Risk Reduction

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito/Ecuador. In agreement with GEM's collaborative approach, all

  7. Earthquakes

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  8. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1997-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  9. Prediction of Global and Localized Damage and Future Reliability for RC Structures subject to Earthquakes

    Köyluoglu, H.U.; Nielsen, Søren R.K.; Cakmak, A.S.

    1994-01-01

    the arrival of the first earthquake from non-destructive vibration tests or via structural analysis. The previous excitation and displacement response time series is employed for the identification of the instantaneous softening using an ARMA model. The hysteresis parameters are updated after each earthquake....... The proposed model is next generalized for the MDOF system. Using the adapted models for the structure and the global damage state, the global damage in a future earthquake can then be estimated when a suitable earthquake model is applied. The performance of the model is illustrated on RC frames which were...

  10. Global asymptotic stabilization of large-scale hydraulic networks using positive proportional controls

    Jensen, Tom Nørgaard; Wisniewski, Rafal

    2014-01-01

    An industrial case study involving a large-scale hydraulic network underlying a district heating system subject to structural changes is considered. The problem of controlling the pressure drop across the so-called end-user valves in the network to a designated vector of reference values under...... directional actuator constraints is addressed. The proposed solution consists of a set of decentralized positively constrained proportional control actions. The results show that the closed-loop system always has a globally asymptotically stable equilibrium point independently on the number of end......-users. Furthermore, by a proper design of controller gains the closed-loop equilibrium point can be designed to belong to an arbitrarily small neighborhood of the desired equilibrium point. Since there exists a globally asymptotically stable equilibrium point independently on the number of end-users in the system...

  11. Global Omori law decay of triggered earthquakes: Large aftershocks outside the classical aftershock zone

    Parsons, Tom

    2002-09-01

    Triggered earthquakes can be large, damaging, and lethal as evidenced by the 1999 shocks in Turkey and the 2001 earthquakes in El Salvador. In this study, earthquakes with Ms ≥ 7.0 from the Harvard centroid moment tensor (CMT) catalog are modeled as dislocations to calculate shear stress changes on subsequent earthquake rupture planes near enough to be affected. About 61% of earthquakes that occurred near (defined as having shear stress change ∣Δτ∣ ≥ 0.01 MPa) the Ms ≥ 7.0 shocks are associated with calculated shear stress increases, while ˜39% are associated with shear stress decreases. If earthquakes associated with calculated shear stress increases are interpreted as triggered, then such events make up at least 8% of the CMT catalog. Globally, these triggered earthquakes obey an Omori law rate decay that lasts ˜7-11 years after the main shock. Earthquakes associated with calculated shear stress increases occur at higher rates than background up to 240 km away from the main shock centroid. Omori's law is one of the few time-predictable patterns evident in the global occurrence of earthquakes. If large triggered earthquakes habitually obey Omori's law, then their hazard can be more readily assessed. The characteristic rate change with time and spatial distribution can be used to rapidly assess the likelihood of triggered earthquakes following events of Ms ≥ 7.0. I show an example application to the M = 7.7 13 January 2001 El Salvador earthquake, where use of global statistics appears to provide a better rapid hazard estimate than Coulomb stress change calculations.

  12. Global error estimation based on the tolerance proportionality for some adaptive Runge-Kutta codes

    Calvo, M.; González-Pinto, S.; Montijano, J. I.

    2008-09-01

    Modern codes for the numerical solution of Initial Value Problems (IVPs) in ODEs are based on adaptive methods that, for a user-supplied tolerance δ, attempt to advance the integration by selecting the size of each step so that some measure of the local error is ≈ δ. Although this policy does not ensure that the global errors are under the prescribed tolerance, after the early studies of Stetter [Considerations concerning a theory for ODE-solvers, in: R. Bulirsch, R.D. Grigorieff, J. Schröder (Eds.), Numerical Treatment of Differential Equations, Proceedings of Oberwolfach, 1976, Lecture Notes in Mathematics, vol. 631, Springer, Berlin, 1978, pp. 188-200; Tolerance proportionality in ODE codes, in: R. März (Ed.), Proceedings of the Second Conference on Numerical Treatment of Ordinary Differential Equations, Humboldt University, Berlin, 1980, pp. 109-123] and the extensions of Higham [Global error versus tolerance for explicit Runge-Kutta methods, IMA J. Numer. Anal. 11 (1991) 457-480; The tolerance proportionality of adaptive ODE solvers, J. Comput. Appl. Math. 45 (1993) 227-236; The reliability of standard local error control algorithms for initial value ordinary differential equations, in: Proceedings: The Quality of Numerical Software: Assessment and Enhancement, IFIP Series, Springer, Berlin, 1997], it has been proved that in many existing explicit Runge-Kutta codes the global errors behave asymptotically as some rational power of δ. This step-size policy, for a given IVP, determines at each grid point tn a new step-size hn+1 = h(tn; δ) so that h(t; δ) is a continuous function of t. In this paper, a study of the tolerance proportionality property is carried out under a discontinuous step-size policy that does not allow the step size to change if the step-size ratio between two consecutive steps is close to unity. This theory is applied to obtain global error estimations in a few problems that have been solved with
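Tolerance proportionality can be illustrated with a minimal adaptive integrator built on an embedded Euler/Heun pair. The controller below is a sketch under simple assumptions, not one of the codes analysed above: the step size is chosen so the local error estimate stays near the tolerance, and the global error then shrinks roughly in proportion to the tolerance.

```python
import math

def solve_adaptive(f, y0, t0, t1, tol):
    """Adaptive integrator with an embedded Euler/Heun pair (orders 1/2).
    Steps are accepted when the local error estimate is <= tol; the step
    size is rescaled by the standard (tol/err)^(1/2) controller.
    Illustrative sketch, not a production step-size policy."""
    t, y, h = t0, y0, (t1 - t0) / 100.0
    while t < t1:
        h = min(h, t1 - t)
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y_euler = y + h * k1                 # order-1 solution
        y_heun = y + 0.5 * h * (k1 + k2)     # order-2 solution (advanced)
        err = abs(y_heun - y_euler)          # local error estimate
        if err <= tol or h < 1e-12:
            t, y = t + h, y_heun             # accept the step
        h *= min(5.0, max(0.1, 0.9 * math.sqrt(tol / max(err, 1e-16))))
    return y

# Global error vs tolerance for y' = y, y(0) = 1 on [0, 1] (exact: e).
f = lambda t, y: y
errors = {tol: abs(solve_adaptive(f, 1.0, 0.0, 1.0, tol) - math.e)
          for tol in (1e-3, 1e-5, 1e-7)}
```

For this smooth problem the observed global error decreases steadily as the tolerance is tightened, which is the proportionality behaviour the paper studies under a discontinuous step-size policy.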

  13. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992) and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings, which have a high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the

  14. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are, under what conditions, and in which regions. Thus, which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  15. A global building inventory for earthquake loss estimation and risk management

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  16. Quantitative Earthquake Prediction on Global and Regional Scales

    Kossobokov, Vladimir G.

    2006-01-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolating its trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of this understanding of the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term, middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  17. Quantitative Earthquake Prediction on Global and Regional Scales

    Kossobokov, Vladimir G.

    2006-03-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system, in the sense of extrapolating its trajectory into the future, is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities of prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of this understanding of the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison and optimization. The approach has already led to the design of a reproducible intermediate-term, middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at a cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  18. Estimating shaking-induced casualties and building damage for global earthquake events: a proposed modelling approach

    So, Emily; Spence, Robin

    2013-01-01

    Recent earthquakes such as the Haiti earthquake of 12 January 2010 and the Qinghai earthquake of 14 April 2010 have highlighted the importance of rapid estimation of casualties after the event for humanitarian response. Both of these events resulted in surprisingly high death tolls, numbers of casualties, and numbers of survivors made homeless. In the Mw = 7.0 Haiti earthquake, over 200,000 people perished, with more than 300,000 reported injuries and 2 million made homeless. The Mw = 6.9 earthquake in Qinghai resulted in over 2,000 deaths, a further 11,000 people with serious or moderate injuries, and 100,000 people left homeless in this mountainous region of China. In such events, relief efforts benefit significantly from the availability of rapid estimation and mapping of expected casualties. This paper contributes to ongoing global efforts to estimate probable earthquake casualties very rapidly after an earthquake has taken place. The analysis uses the assembled empirical damage and casualty data in the Cambridge Earthquake Impacts Database (CEQID) and explores data by event and across events to test the relationships of building and fatality distributions to the main explanatory variables of building type, building damage level and earthquake intensity. The prototype global casualty estimation model described here uses a semi-empirical approach that estimates damage rates for different classes of buildings present in the local building stock, and then relates fatality rates to the damage rates of each class of buildings. This approach accounts for the effect on casualties of the very different types of buildings (by climatic zone, urban or rural location, culture, income level, etc.). The resulting casualty parameters were tested against the overall casualty data from several historical earthquakes in CEQID; a reasonable fit was found.
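The semi-empirical chain described above (building stock → damage rates per class → fatality rates per damaged class) can be sketched as follows. Every building class and rate below is a hypothetical placeholder for illustration, not a CEQID-derived value.

```python
# Illustrative sketch of a semi-empirical casualty model.
# All classes and rates are hypothetical placeholders, not fitted values.

damage_rate = {               # fraction of each class collapsing at a
    "adobe": 0.30,            # given shaking intensity
    "unreinforced_masonry": 0.15,
    "reinforced_concrete": 0.04,
}

fatality_rate = {             # fraction of occupants killed when a
    "adobe": 0.06,            # building of that class collapses
    "unreinforced_masonry": 0.10,
    "reinforced_concrete": 0.20,
}

exposure = {                  # exposed occupants by building class
    "adobe": 50_000,
    "unreinforced_masonry": 120_000,
    "reinforced_concrete": 200_000,
}

def expected_fatalities(exposure, damage_rate, fatality_rate):
    """Chain the two empirical relations: exposure -> damaged buildings
    -> fatalities, summed over building classes."""
    return sum(exposure[c] * damage_rate[c] * fatality_rate[c]
               for c in exposure)

total = expected_fatalities(exposure, damage_rate, fatality_rate)
```

Splitting the calculation by building class is what lets the model account for very different building stocks (by climatic zone, location, culture and income level) in different regions.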

  19. The GED4GEM project: development of a Global Exposure Database for the Global Earthquake Model initiative

    Gamba, P.; Cavalca, D.; Jaiswal, K.S.; Huyck, C.; Crowley, H.

    2012-01-01

    In order to quantify earthquake risk of any selected region or a country of the world within the Global Earthquake Model (GEM) framework (www.globalquakemodel.org/), a systematic compilation of building inventory and population exposure is indispensable. Through the consortium of leading institutions and by engaging the domain-experts from multiple countries, the GED4GEM project has been working towards the development of a first comprehensive publicly available Global Exposure Database (GED). This geospatial exposure database will eventually facilitate global earthquake risk and loss estimation through GEM’s OpenQuake platform. This paper provides an overview of the GED concepts, aims, datasets, and inference methodology, as well as the current implementation scheme, status and way forward.

  20. Characteristics of global strong earthquakes and their implications ...


    …as important sources for describing the present-day stress field and regime … happened there will indicate relative movements between the Pacific plate and Australia … time, and (b) earthquake slip occurs in the direction of maximum shear stress … circum-Pacific seismic belt and the Himalaya collision boundary as shown in …

  1. Earthquake and nuclear explosion location using the global seismic network

    Lopez, L.M.

    1983-01-01

    The relocation of nuclear explosions, aftershock sequence and regional seismicity is addressed by using joint hypocenter determination, Lomnitz' distance domain location, and origin time and earthquake depth determination with local observations. Distance domain and joint hypocenter location are used for a stepwise relocation of nuclear explosions in the USSR. The resulting origin times are 2.5 seconds earlier than those obtained by ISC. Local travel times from the relocated explosions are compared to Jeffreys-Bullen tables. P times are found to be faster at 9-30° distances, the largest deviation being around 10 seconds at 13-18°. At these distances S travel times also are faster by approximately 20 seconds. The 1977 Sumba earthquake sequence is relocated by iterative joint hypocenter determination of events with most station reports. Simultaneously determined station corrections are utilized for the relocation of smaller aftershocks. The relocated hypocenters indicate that the aftershocks were initially concentrated along the deep trench. Origin times and depths are recalculated for intermediate depth and deep earthquakes using local observations in and around the Japanese Islands. It is found that origin time and depth differ systematically from ISC values for intermediate depth events. Origin times obtained for events below the crust down to 100 km depth are earlier, whereas no general bias seem to exist for origin times of events in the 100-400 km depth range. The recalculated depths for earthquakes shallower than 100 km are shallower than ISC depths. The depth estimates for earthquakes deeper than 100 km were increased by the recalculations.

  2. Earthquake and nuclear explosion location using the global seismic network

    Lopez, L.M.

    1983-01-01

    The relocation of nuclear explosions, aftershock sequence and regional seismicity is addressed by using joint hypocenter determination, Lomnitz' distance domain location, and origin time and earthquake depth determination with local observations. Distance domain and joint hypocenter location are used for a stepwise relocation of nuclear explosions in the USSR. The resulting origin times are 2.5 seconds earlier than those obtained by ISC. Local travel times from the relocated explosions are compared to Jeffreys-Bullen tables. P times are found to be faster at 9-30° distances, the largest deviation being around 10 seconds at 13-18°. At these distances S travel times also are faster by approximately 20 seconds. The 1977 Sumba earthquake sequence is relocated by iterative joint hypocenter determination of events with most station reports. Simultaneously determined station corrections are utilized for the relocation of smaller aftershocks. The relocated hypocenters indicate that the aftershocks were initially concentrated along the deep trench. Origin times and depths are recalculated for intermediate depth and deep earthquakes using local observations in and around the Japanese Islands. It is found that origin time and depth differ systematically from ISC values for intermediate depth events. Origin times obtained for events below the crust down to 100 km depth are earlier, whereas no general bias seem to exist for origin times of events in the 100-400 km depth range. The recalculated depths for earthquakes shallower than 100 km are shallower than ISC depths. The depth estimates for earthquakes deeper than 100 km were increased by the recalculations.

  3. A global earthquake discrimination scheme to optimize ground-motion prediction equation selection

    Garcia, Daniel; Wald, David J.; Hearne, Michael

    2012-01-01

    We present a new automatic earthquake discrimination procedure to determine in near-real time the tectonic regime and seismotectonic domain of an earthquake, its most likely source type, and the corresponding ground-motion prediction equation (GMPE) class to be used in the U.S. Geological Survey (USGS) Global ShakeMap system. This method makes use of the Flinn–Engdahl regionalization scheme, seismotectonic information (plate boundaries, global geology, seismicity catalogs, and regional and local studies), and the source parameters available from the USGS National Earthquake Information Center in the minutes following an earthquake to give the best estimation of the setting and mechanism of the event. Depending on the tectonic setting, additional criteria based on hypocentral depth, style of faulting, and regional seismicity may be applied. For subduction zones, these criteria include the use of focal mechanism information and detailed interface models to discriminate among outer-rise, upper-plate, interface, and intraslab seismicity. The scheme is validated against a large database of recent historical earthquakes. Though developed to assess GMPE selection in Global ShakeMap operations, we anticipate a variety of uses for this strategy, from real-time processing systems to any analysis involving tectonic classification of sources from seismic catalogs.
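A schematic version of such a discrimination cascade might look as follows. The regime labels and the 50 km depth threshold are illustrative assumptions for the sketch, not the USGS operational criteria.

```python
def gmpe_class(tectonic_regime, depth_km, near_interface=False):
    """Schematic GMPE-class selection in the spirit of the scheme above.
    Regime labels and the 50 km depth cutoff are hypothetical placeholders,
    not the actual Global ShakeMap decision rules."""
    if tectonic_regime == "stable_continental":
        return "stable-continental GMPE"
    if tectonic_regime == "active_shallow_crustal":
        return "active-crustal GMPE"
    if tectonic_regime == "subduction":
        if depth_km >= 50:
            return "subduction intraslab GMPE"
        if near_interface:
            return "subduction interface GMPE"
        return "active-crustal GMPE"   # e.g. outer-rise / upper-plate events
    raise ValueError(f"unknown regime: {tectonic_regime}")
```

The real scheme layers additional evidence (Flinn-Engdahl region, focal mechanism, interface geometry models) on top of this kind of cascade, but the control flow is the same: regime first, then depth and source-type criteria.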

  4. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.

    2014-12-01

    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, building stock inventory and related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations on earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (the flashsourcing technique), and more recently the deployment of QCN (Quake Catcher Network) low-cost sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of the collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (Quakebot) and smartphone applications (iOS and Android), which only report earthquakes that matter for the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  5. Calculation of earthquake rupture histories using a hybrid global search algorithm: Application to the 1992 Landers, California, earthquake

    Hartzell, S.; Liu, P.

    1996-01-01

    A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method we use combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of the two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a gridwork of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear least-squares inversion for only slip amplitudes. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
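As a sketch of the hybrid idea, not the authors' implementation, the example below couples a simulated-annealing exploration stage (which can escape local minima) with a downhill-simplex (Nelder-Mead) polish of the best model found. The cooling schedule, step sizes, and test function are invented for illustration.

```python
import math
import random

def nelder_mead(f, x0, iters=200, step=0.5):
    """Minimal downhill-simplex (Nelder-Mead) local polish."""
    n = len(x0)
    simplex = [list(x0)] + [[x0[j] + (step if j == i else 0.0) for j in range(n)]
                            for i in range(n)]
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        centroid = [sum(p[j] for p in simplex[:-1]) / n for j in range(n)]
        refl = [2.0 * centroid[j] - worst[j] for j in range(n)]   # reflect worst vertex
        if f(refl) < f(best):
            expd = [3.0 * centroid[j] - 2.0 * worst[j] for j in range(n)]  # expand
            simplex[-1] = expd if f(expd) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [0.5 * (centroid[j] + worst[j]) for j in range(n)]     # contract
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink all vertices toward the best one
                simplex = [best] + [[0.5 * (p[j] + best[j]) for j in range(n)]
                                    for p in simplex[1:]]
    return min(simplex, key=f)

def hybrid_search(f, x0, temps=(1.0, 0.5, 0.1, 0.01), sweeps=500, scale=1.0, seed=0):
    """Simulated annealing explores model space and can escape local minima;
    the best model found is then polished with the downhill simplex."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    for temp in temps:                     # simple staged cooling schedule
        for _ in range(sweeps):
            cand = [xi + rng.gauss(0.0, scale * temp) for xi in x]
            fc = f(cand)
            # Metropolis rule: always accept improvements, sometimes accept worse
            if fc < fx or rng.random() < math.exp((fx - fc) / temp):
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = list(x), fx
    return nelder_mead(f, best)
```

On a standard multimodal test function such as Himmelblau's, the annealing stage locates a promising basin and the simplex stage converges within it, illustrating why the combination is more efficient than either part alone.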

  6. Assessment of the impact of strong earthquakes on the global economy by the example of the Tohoku event

    Tatiana, Skufina; Peter, Skuf'in; Sergey, Baranov; Vera, Samarina; Taisiya, Shatalova

    2016-04-01

    We examine the economic consequences of strong earthquakes using the example of the M9 Tohoku earthquake that occurred on March 11, 2011 off the northeast coast of Honshu, Japan. This earthquake was the strongest in the whole history of seismological observations in this part of the planet. The generated tsunami killed more than 15,700 people, damaged 332,395 buildings and 2,126 roads. The total economic loss in Japan was estimated at US$309 billion. The catastrophe in Japan also impacted the global economy. To estimate its impact, we used regional and global stock indexes, production indexes, stock prices of the main Japanese, European and US companies, import and export dynamics, as well as the data provided by the customs service of Japan. We also demonstrated that the catastrophe substantially affected the markets and that, in the short run, some indicators even exceeded the effect of the global financial crisis of 2008. The recent strong earthquakes in Nepal (25.04.2015, M7.8) and Chile (16.09.2015, M8.3) have both renewed the relevance of research into cost assessments of the overall economic impact of seismic hazard. We conclude that strong earthquakes must be treated as one very important factor that affects the world economy, depending on their location. The research was supported by the Russian Foundation for Basic Research (Project 16-06-00056A).

  7. Facilitating open global data use in earthquake source modelling to improve geodetic and seismological approaches

    Sudhaus, Henriette; Heimann, Sebastian; Steinberg, Andreas; Isken, Marius; Vasyura-Bathke, Hannes

    2017-04-01

    In the last few years, impressive achievements have been made in improving inferences about earthquake sources by using InSAR (Interferometric Synthetic Aperture Radar) data. Several factors aided these developments. The open data basis of earthquake observations has expanded vastly with the two powerful Sentinel-1 SAR sensors up in space. Increasing computer power allows processing of large data sets for more detailed source models. Moreover, data inversion approaches for earthquake source inferences are becoming more advanced. By now, data error propagation is widely implemented and the estimation of model uncertainties is a regular feature of reported optimum earthquake source models. InSAR-derived surface displacements and seismological waveforms are also combined more regularly, which requires finite rupture models instead of point-source approximations and layered medium models instead of homogeneous half-spaces. In other words, the disciplinary differences in geodetic and seismological earthquake source modelling shrink towards common source-medium descriptions and a source near-field/far-field data point of view. We explore and facilitate the combination of InSAR-derived near-field static surface displacement maps and dynamic far-field seismological waveform data for global earthquake source inferences. We join the community efforts with the particular goal of improving crustal earthquake source inferences in generally poorly instrumented areas, where often only the global backbone observations of earthquakes are available, provided by seismological broadband sensor networks and, more recently, by Sentinel-1 SAR acquisitions. We present our work on modelling standards for the combination of static and dynamic surface displacements in the source's near-field and far-field, e.g. on data and prediction error estimation as well as model uncertainty estimation. Rectangular dislocations and moment-tensor point sources are replaced by simple planar finite

  8. Development of the Global Earthquake Model’s neotectonic fault database

    Christophersen, Annemarie; Litchfield, Nicola; Berryman, Kelvin; Thomas, Richard; Basili, Roberto; Wallace, Laura; Ries, William; Hayes, Gavin P.; Haller, Kathleen M.; Yoshioka, Toshikazu; Koehler, Richard D.; Clark, Dan; Wolfson-Schwehr, Monica; Boettcher, Margaret S.; Villamor, Pilar; Horspool, Nick; Ornthammarath, Teraphan; Zuñiga, Ramon; Langridge, Robert M.; Stirling, Mark W.; Goded, Tatiana; Costa, Carlos; Yeats, Robert

    2015-01-01

    The Global Earthquake Model (GEM) aims to develop uniform, openly available standards, datasets and tools for worldwide seismic risk assessment through global collaboration, transparent communication and adapting state-of-the-art science. GEM Faulted Earth (GFE) is one of GEM’s global hazard module projects. This paper describes GFE’s development of a modern neotectonic fault database and a unique graphical interface for the compilation of new fault data. A key design principle is that of an electronic field notebook for capturing the observations a geologist would make about a fault. The database is designed to accommodate abundant as well as sparse fault observations. It features two layers, one for capturing neotectonic fault and fold observations, and the other to calculate potential earthquake fault sources from the observations. In order to test the flexibility of the database structure and to start a global compilation, five preexisting databases have been uploaded to the first layer and two to the second. In addition, the GFE project has characterised the world’s approximately 55,000 km of subduction interfaces in a globally consistent manner as a basis for generating earthquake event sets for inclusion in earthquake hazard and risk modelling. Following the subduction interface fault schema and including the trace attributes of the GFE database schema, the 2500-km-long frontal thrust fault system of the Himalaya has also been characterised. We propose the database structure to be used widely, so that neotectonic fault data can make a more complete and beneficial contribution to seismic hazard and risk characterisation globally.

  9. Seismic waves and earthquakes in a global monolithic model

    Roubíček, Tomáš

    2018-03-01

    The philosophy that a single "monolithic" model can "asymptotically" replace and couple, in a simple and elegant way, several specialized models relevant on various Earth layers is presented and, in special situations, also rigorously justified. In particular, global seismicity and tectonics are coupled to capture, e.g. (here by a simplified model), ruptures of lithospheric faults generating seismic waves which then propagate through the solid-like mantle and inner core both as shear (S) and pressure (P) waves, while S-waves are suppressed in the fluidic outer core and also in the oceans. The "monolithic-type" models have the capacity to describe all the mentioned features globally in a unified way, together with the corresponding interfacial conditions implicitly involved, only when their parameters are scaled appropriately in the different Earth layers. Coupling of seismic waves with seismic sources due to tectonic events is thus an automatic side effect. The global ansatz is here based, rather for illustration, only on a relatively simple Jeffreys' viscoelastic damageable material at small strains, whose various scalings (limits) can lead to Boger's viscoelastic fluid or even to a purely elastic (inviscid) fluid. The self-induced gravity field and Coriolis, centrifugal, and tidal forces are counted in our global model as well. The rigorous mathematical analysis concerning the existence of solutions, convergence of the mentioned scalings, and energy conservation is briefly presented.

  10. Characteristics of global strong earthquakes and their implications ...

    Ju Wei

    2017-10-06

    [Only a fragment of the source is indexed: a table of event parameters (origin date-time, latitude, longitude, depth, magnitude, faulting type) and notes on the average misfit angle, stress regime, and quality ranking within the present global tectonic framework.]

  11. Global stabilisation of large-scale hydraulic networks with quantised and positive proportional controls

    Jensen, Tom Nørgaard; Wisniewski, Rafal

    2013-01-01

    a set of decentralised, logarithmic quantised and constrained control actions with properly designed quantisation parameters. That is, an attractor set with a compact basin of attraction exists. Subsequently, the basin can be increased by increasing the control gains. In our work, this result...... is extended by showing that an attractor set with a global basin of attraction exists for arbitrary values of positive control gains, given that the upper level of the quantiser is properly designed. Furthermore, the proof is given for general monotone quantisation maps. Since the basin of attraction...

  12. Proportionality lost - proportionality regained?

    Werlauff, Erik

    2010-01-01

    In recent years, the European Court of Justice (the ECJ) seems to have accepted restrictions on the freedom of establishment and other basic freedoms, despite the fact that a more thorough proportionality test would have revealed that the restriction in question did not pass the 'rule of reason' ...

  13. Statistical characteristics of seismo-ionospheric GPS TEC disturbances prior to global Mw ≥ 5.0 earthquakes (1998-2014)

    Shah, Munawar; Jin, Shuanggen

    2015-12-01

    Pre-earthquake ionospheric anomalies remain challenging to detect and understand, particularly across different earthquake magnitudes, focal depths, and fault types. In this paper, the seismo-ionospheric disturbances (SID) related to 1492 global earthquakes with Mw ≥ 5.0 from 1998 to 2014 are investigated using the total electron content (TEC) of GPS global ionosphere maps (GIM). Statistical analysis of 10-day TEC data before global Mw ≥ 5.0 earthquakes shows significant enhancement 5 days before earthquakes of Mw ≥ 6.0 at a 95% confidence level. Earthquakes with a focal depth of less than 60 km and Mw ≥ 6.0 are presumably the root of deviation in the ionospheric TEC, because earthquake breeding zones have gigantic quantities of energy at shallower focal depths. Increased anomalous TEC is recorded in cumulative percentages beyond Mw = 5.5, and a sharpening of the cumulative percentages is evident in seismo-ionospheric disturbances prior to Mw ≥ 6.0 earthquakes. Seismo-ionospheric disturbances related to strike-slip and thrust earthquakes are noticeable for Mw 6.0-7.0 earthquakes. The relative values reveal high ratios (up to 2) and low ratios (down to -0.5) within 5 days prior to global earthquakes for positive and negative anomalies, respectively. The anomalous patterns in TEC related to earthquakes are possibly due to the coupling of high amounts of energy from earthquake breeding zones of higher magnitude and shallower focal depth.
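Anomaly tests of this kind are commonly implemented as a sliding-window bound check on the TEC time series: a sample is flagged when it leaves the median ± k·IQR band of the preceding window. The sketch below is a generic illustration of that idea; the 15-sample window and the factor 1.5 are assumptions, not the authors' exact procedure.

```python
import statistics

def tec_anomaly_flags(tec, window=15, k=1.5):
    """Flag each TEC sample against median ± k·IQR of the trailing window:
    +1 = positive anomaly, -1 = negative anomaly, 0 = within bounds."""
    flags = []
    for i, value in enumerate(tec):
        past = tec[max(0, i - window):i]
        if len(past) < window:        # not enough history yet
            flags.append(0)
            continue
        q1, med, q3 = statistics.quantiles(past, n=4)
        iqr = q3 - q1
        upper, lower = med + k * iqr, med - k * iqr
        flags.append(1 if value > upper else (-1 if value < lower else 0))
    return flags
```

A sudden TEC enhancement then shows up as an isolated +1 flag against an otherwise quiet background.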

  14. GEM1: First-year modeling and IT activities for the Global Earthquake Model

    Anderson, G.; Giardini, D.; Wiemer, S.

    2009-04-01

    GEM is a public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) to build an independent standard for modeling and communicating earthquake risk worldwide. GEM is aimed at providing authoritative, open information about seismic risk and decision tools to support mitigation. GEM will also raise risk awareness and help post-disaster economic development, with the ultimate goal of reducing the toll of future earthquakes. GEM will provide a unified set of seismic hazard, risk, and loss modeling tools based on a common global IT infrastructure and consensus standards. These tools, systems, and standards will be developed in partnership with organizations around the world, with coordination by the GEM Secretariat and its Secretary General. GEM partners will develop a variety of global components, including a unified earthquake catalog, fault database, and ground motion prediction equations. To ensure broad representation and community acceptance, GEM will include local knowledge in all modeling activities, incorporate existing detailed models where possible, and independently test all resulting tools and models. When completed in five years, GEM will have a versatile, openly accessible modeling environment that can be updated as necessary, and will provide the global standard for seismic hazard, risk, and loss models to government ministers, scientists and engineers, financial institutions, and the public worldwide. GEM is now underway with key support provided by private sponsors (Munich Reinsurance Company, Zurich Financial Services, AIR Worldwide Corporation, and Willis Group Holdings); countries including Belgium, Germany, Italy, Singapore, Switzerland, and Turkey; and groups such as the European Commission. The GEM Secretariat has been selected by the OECD and will be hosted at the Eucentre at the University of Pavia in Italy; the Secretariat is now formalizing the creation of the GEM Foundation.
Some of GEM's global

  15. Prediction of Global Damage and Reliability Based Upon Sequential Identification and Updating of RC Structures Subject to Earthquakes

    Nielsen, Søren R.K.; Skjærbæk, P. S.; Köylüoglu, H. U.

    The paper deals with the prediction of global damage and future structural reliability with special emphasis on the sensitivity, bias and uncertainty of these predictions dependent on the statistically equivalent realizations of the future earthquake. The predictions are based on a modified Clough-Johnston single-degree-of-freedom (SDOF) oscillator with three parameters which are calibrated to fit the displacement response and the damage development in the past earthquake.

  16. The Benefits and Limitations of Crowdsourced Information for Rapid Damage Assessment of Global Earthquakes

    Bossu, R.; Landès, M.; Roussel, F.

    2017-12-01

    The Internet has hastened the collection of felt reports and macroseismic data after global earthquakes. At the European-Mediterranean Seismological Centre (EMSC), where the traditional online questionnaires have been replaced by thumbnail-based questionnaires, on average half of the reports are collected within 10 minutes of an earthquake's occurrence. In regions where the EMSC is well identified, this goes down to 5 minutes. The user simply selects the thumbnail corresponding to the observed effects, erasing language barriers and improving collection via small smartphone screens. A previous study has shown that EMSC data are well correlated with "Did You Feel It" (DYFI) data and three independent, manually collected datasets. The efficiency and rapidity of felt report collection through thumbnail-based questionnaires do not necessarily mean that they offer a complete picture of the situation for all intensity values, especially the higher ones. There are several potential limitations. Demographics probably play a role, but so might eyewitnesses' behavior: it is probably not their priority to report when their own safety and that of their loved ones is at stake. We propose to test this hypothesis on EMSC felt reports and to extend the study to LastQuake smartphone application use. LastQuake is a free smartphone app providing very rapid information on felt earthquakes. There are currently 210,000 active users around the world, covering almost every country except a few in Sub-Saharan Africa. Along with felt reports, we also analyze the characteristics of LastQuake app launches. For both composite datasets, created from 108 earthquakes, we analyze the rapidity of eyewitnesses' reactions, how this changes with intensity values, and surmise how they reflect different types of behavior. We will show the intrinsic limitations of crowdsourced information for rapid situation awareness.
More importantly, we will show in which cases the lack of crowdsourced information could

  17. Global earthquake catalogs and long-range correlation of seismic activity (Invited)

    Ogata, Y.

    2009-12-01

    In view of the long-term seismic activity in the world, homogeneity of a global catalog is indispensable. Engdahl and Villaseñor (2002) compiled a global earthquake catalog of magnitude (M) 7.0 or larger for the last century (1900-1999). This catalog is based on various existing catalogs such as the Abe catalog (Abe, 1981, 1984; Abe and Noguchi, 1983a, b) for world seismicity (1894-1980), its modified catalogs by Perez and Scholz (1984) and by Pacheco and Sykes (1992), and also the Harvard University catalog since 1975. However, the original surface wave magnitudes of the Abe catalog were systematically changed by Perez and Scholz (1984) and Pacheco and Sykes (1992). They suspected inhomogeneity of the Abe catalog and claimed that the two seeming changes in the occurrence rate around 1922 and 1948 resulted from magnitude shifts for some instrument-related reasons. They used a statistical test assuming that such a series of large earthquakes in the world should behave as a stationary Poisson process (uniform occurrence). Obviously, their claim strongly depends on the a priori assumption of independent or short-range-dependent earthquake occurrence. We question this assumption from the viewpoint of long-range dependence of seismicity. We make statistical analyses of the spectrum, dispersion-time diagrams, and R/S statistics to estimate and test long-range correlations. We also attempt to show that the apparent rate change in global seismicity can be simulated by a certain long-range-correlated process. Further, if we divide the globe into two regions of high and low latitudes, for example, the cumulative curves of the two regions have different shapes from each other, and the above-mentioned apparent change points disappear from both regions. This suggests that the Abe catalog shows genuine seismic activity rather than the artifact of the suspected magnitude shifts, which should appear in any wide enough regions
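The R/S (rescaled-range) diagnostic mentioned above estimates a Hurst exponent from the scaling of the rescaled range with window length: H ≈ 0.5 indicates Poisson-like (short-memory) occurrence, while H > 0.5 indicates long-range positive correlation. Below is a generic sketch of that estimator, not the author's code; window sizes and the simple dyadic scheme are assumptions.

```python
import math

def rescaled_range(window):
    """R/S of one window: range of cumulative mean-deviations over std."""
    n = len(window)
    mean = sum(window) / n
    cum, dev = 0.0, []
    for v in window:
        cum += v - mean
        dev.append(cum)
    spread = max(dev) - min(dev)
    sd = math.sqrt(sum((v - mean) ** 2 for v in window) / n)
    return spread / sd if sd > 0.0 else 0.0

def hurst_exponent(series, min_win=8):
    """Slope of log(mean R/S) versus log(window length) over dyadic windows."""
    n = len(series)
    xs, ys = [], []
    win = min_win
    while win <= n // 2:
        rs = [rescaled_range(series[i:i + win]) for i in range(0, n - win + 1, win)]
        rs = [v for v in rs if v > 0.0]
        if rs:
            xs.append(math.log(win))
            ys.append(math.log(sum(rs) / len(rs)))
        win *= 2
    # least-squares slope of the log-log scaling relation
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))
```

Applied to an uncorrelated series the estimate clusters around 0.5 (with a known small-sample upward bias), which is exactly the stationary-Poisson baseline the tests in this abstract call into question for real catalogs.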

  18. The orientation of disaster donations: differences in the global response to five major earthquakes.

    Wei, Jiuchang; Marinova, Dora

    2016-07-01

    This study analyses the influence of gift giving, geographical location, political regime, and trade openness on disaster donation decisions, using five severe earthquakes that occurred between 2008 and 2012 as case studies. The results show that global disaster donation is not dominated by only philanthropy or trade interests, and that the determinants of donation decisions vary with the scale of the natural disaster and the characteristics of the disaster-affected countries. While gift giving exists in the case of middle-size earthquakes, political regimes play a very important part in the overall donation process. Countries with higher perceived corruption may donate more frequently, but those that are more democratic may be more generous in their donations. Generosity based on geographical proximity to the calamity is significant in the decision-making process for most natural disasters, yet it may have a negative effect on donations in Latin America and the Caribbean. © 2016 The Author(s). Disasters © Overseas Development Institute, 2016.

  19. Development of the U.S. Geological Survey's PAGER system (Prompt Assessment of Global Earthquakes for Response)

    Wald, D.J.; Earle, P.S.; Allen, T.I.; Jaiswal, K.; Porter, K.; Hearne, M.

    2008-01-01

    The Prompt Assessment of Global Earthquakes for Response (PAGER) System plays a primary alerting role for global earthquake disasters as part of the U.S. Geological Survey’s (USGS) response protocol. We provide an overview of the PAGER system, both its current capabilities and our ongoing research and development. PAGER monitors the USGS’s near real-time U.S. and global earthquake origins and automatically identifies events that are of societal importance, well in advance of ground-truth or news accounts. Current PAGER notifications and Web pages estimate the population exposed to each seismic intensity level. In addition to being a useful indicator of potential impact, PAGER’s intensity/exposure display provides a new standard in the dissemination of rapid earthquake information. We are currently developing and testing a more comprehensive alert system that will include casualty estimates. This is motivated by the idea that an estimated range of the possible number of deaths will aid decisions regarding humanitarian response. Underlying the PAGER exposure and loss models are global earthquake ShakeMap shaking estimates, constrained as quickly as possible by finite-fault modeling and observed ground motions and intensities, when available. Loss modeling is being developed comprehensively with a suite of candidate models that range from fully empirical to largely analytical approaches. Which of these models is most appropriate for a particular earthquake depends on how much is known about local building stocks and their vulnerabilities. A first-order country-specific global building inventory has been developed, as have corresponding vulnerability functions. For calibrating PAGER loss models, we have systematically generated an Atlas of 5,000 ShakeMaps for significant global earthquakes during the last 36 years. For many of these, auxiliary earthquake source and shaking intensity data are also available. Refinements to the loss models are ongoing

  20. Estimating Fallout Building Attributes from Architectural Features and Global Earthquake Model (GEM) Building Descriptions

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Staci R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-03-01

    A nuclear explosion has the potential to injure or kill tens to hundreds of thousands (or more) of people through exposure to fallout (external gamma) radiation. Existing buildings can protect their occupants (reducing fallout radiation exposures) by placing material and distance between fallout particles and individuals indoors. Prior efforts have determined an initial set of building attributes suitable to reasonably assess a given building’s protection against fallout radiation. The current work provides methods to determine the quantitative values for these attributes from (a) common architectural features and data and (b) buildings described using the Global Earthquake Model (GEM) taxonomy. These methods will be used to improve estimates of fallout protection for operational US Department of Defense (DoD) and US Department of Energy (DOE) consequence assessment models.

  1. Teamwork tools and activities within the hazard component of the Global Earthquake Model

    Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.

    2013-05-01

    The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis, ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called the OpenQuake-engine (http://globalquakemodel.org). In this communication we will provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of ongoing initiatives like the creation of a suite of tools for the preparation of PSHA input models. Discussion, comments and criticism from colleagues in the audience will be highly appreciated.

  2. Global exponential convergence of neutral-type Hopfield neural networks with multi-proportional delays and leakage delays

    Xu, Changjin; Li, Peiluan

    2017-01-01

    This paper is concerned with a class of neutral-type Hopfield neural networks with multi-proportional delays and leakage delays. Using the differential inequality theory, a set of sufficient conditions which guarantee that all solutions of neutral-type Hopfield neural networks with multi-proportional delays and leakage delays converge exponentially to zero vector are derived. Computer simulations are carried out to verify our theoretical findings. The obtained results of this paper are new and complement some previous studies.

  3. A global search inversion for earthquake kinematic rupture history: Application to the 2000 western Tottori, Japan earthquake

    Piatanesi, A.; Cirella, A.; Spudich, P.; Cocco, M.

    2007-01-01

    We present a two-stage nonlinear technique to invert strong-motion records and geodetic data to retrieve the rupture history of an earthquake on a finite fault. To account for the actual rupture complexity, the fault parameters are spatially variable: peak slip velocity, slip direction, rupture time, and rise time. The unknown parameters are given at the nodes of the subfaults, whereas the parameters within a subfault are allowed to vary through a bilinear interpolation of the nodal values. The forward modeling is performed with a discrete wave number technique, whose Green's functions include the complete response of the vertically varying Earth structure. During the first stage, an algorithm based on heat-bath simulated annealing generates an ensemble of models that efficiently samples the good data-fitting regions of parameter space. In the second stage (appraisal), the algorithm performs a statistical analysis of the model ensemble and computes a weighted mean model and its standard deviation. This technique, rather than simply looking at the best model, extracts the most stable features of the earthquake rupture that are consistent with the data and gives an estimate of the variability of each model parameter. We present some synthetic tests to show the effectiveness of the method and its robustness to uncertainty in the adopted crustal model. Finally, we apply this inverse technique to the well recorded 2000 western Tottori, Japan, earthquake (Mw 6.6); we confirm that the rupture process is characterized by large slip (3-4 m) at very shallow depths but, in contrast to previous studies, we imaged a new slip patch (2-2.5 m) located deeper, between 14 and 18 km depth. Copyright 2007 by the American Geophysical Union.
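The appraisal stage described above can be caricatured as importance-weighting the sampled models by their data misfit and reading off a weighted mean and standard deviation per parameter. The exponential weighting form and the temperature parameter below are assumptions for illustration, not the authors' exact scheme.

```python
import math

def appraise_ensemble(models, misfits, temperature=1.0):
    """Weighted mean model and per-parameter standard deviation from an
    ensemble of sampled models, weighting each by exp(-misfit / T)."""
    m0 = min(misfits)                  # subtract best misfit for numerical stability
    w = [math.exp(-(e - m0) / temperature) for e in misfits]
    wsum = sum(w)
    nparam = len(models[0])
    mean = [sum(wi * m[j] for wi, m in zip(w, models)) / wsum
            for j in range(nparam)]
    std = [math.sqrt(sum(wi * (m[j] - mean[j]) ** 2
                         for wi, m in zip(w, models)) / wsum)
           for j in range(nparam)]
    return mean, std
```

With strongly peaked misfits the mean collapses onto the best-fitting model, while a flat misfit distribution spreads the weight and the standard deviation reports the resulting parameter variability.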

  4. Proportional reasoning

    Dole, Shelley; Hilton, Annette; Hilton, Geoff

    2015-01-01

    Proportional reasoning is widely acknowledged as a key to success in school mathematics, yet students’ continual difficulties with proportion-related tasks are well documented. This paper draws on a large research study that aimed to support 4th to 9th grade teachers to design and implement tasks...

  5. Comparison of earthquake source parameters and interseismic plate coupling variations in global subduction zones (Invited)

    Bilek, S. L.; Moyer, P. A.; Stankova-Pursley, J.

    2010-12-01

    Geodetically determined interseismic coupling variations have been found in subduction zones worldwide. These coupling variations have been linked to heterogeneities in interplate fault frictional conditions. These connections to fault friction imply that the observed coupling variations are also important in influencing details of earthquake rupture behavior. Because of the wealth of newly available geodetic models along many subduction zones, it is now possible to examine detailed variations in coupling and compare them to seismicity characteristics. Here we use a large catalog of earthquake source time functions and slip models for moderate to large magnitude earthquakes to explore these connections, comparing earthquake source parameters with available models of geodetic coupling along segments of the Japan, Kurile, Kamchatka, Peru, Chile, and Alaska subduction zones. In addition, we use published geodetic results along the Costa Rica margin to compare with source parameters of small magnitude earthquakes recorded with an onshore-offshore network of seismometers. For the moderate to large magnitude earthquakes, preliminary results suggest a complex relationship between earthquake parameters and estimates of strongly and weakly coupled segments of the plate interface. For example, along the Kamchatka subduction zone, these earthquakes occur primarily along the transition between strong and weak coupling, with significant heterogeneity in the pattern of moment-scaled duration with respect to the coupling estimates. The longest scaled-duration event in this catalog occurred in a region of strong coupling. Earthquakes along the transition between strongly and weakly coupled regions exhibited the most complexity in the source time functions. Use of small magnitude (0.5 earthquake spectra, with higher corner frequencies and higher mean apparent stress for earthquakes that occur along the Osa Peninsula relative to the Nicoya Peninsula, mimicking the along-strike variations in

  6. Imaging 2015 Mw 7.8 Gorkha Earthquake and Its Aftershock Sequence Combining Multiple Calibrated Global Seismic Arrays

    LI, B.; Ghosh, A.

    2016-12-01

    The 2015 Mw 7.8 Gorkha earthquake provides a good opportunity to study the tectonics and earthquake hazards of the Himalayas, one of the most seismically active plate boundaries. Details of the seismicity patterns and associated structures in the Himalayas are poorly understood, mainly due to limited instrumentation. Here, we apply a back-projection method to study the mainshock rupture and the following aftershock sequence using four large-aperture global seismic arrays. All the arrays show eastward rupture propagation of about 130 km and reveal a similar evolution of seismic energy radiation, with a strong high-frequency energy burst about 50 km north of Kathmandu. Each single array, however, is typically limited by a large azimuthal gap, low resolution, and artifacts due to unmodeled velocity structures. Therefore, we use a self-consistent empirical calibration method to combine the four different arrays to image the Gorkha event. This greatly improves the resolution, can better track the rupture, and reveals details that cannot be resolved by any individual array. In addition, we also use the same arrays at teleseismic distances and apply a back-projection technique to detect and locate the aftershocks immediately following the Gorkha earthquake. We detect about 2.5 times the aftershocks recorded by the Advanced National Seismic System comprehensive earthquake catalog during the 19 days following the mainshock. The aftershocks detected by the arrays show an east-west trend in general, with the majority of the aftershocks located in the eastern part of the rupture patch and surrounding the rupture zone of the largest Mw 7.3 aftershock. The overall spatiotemporal aftershock pattern agrees well with the global catalog, with our catalog showing more details relative to the standard global catalog. The improved aftershock catalog enables us to better study the aftershock dynamics and stress evolution in this region.
Moreover, rapid and better imaging of aftershock distribution may aid rapid response
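The back-projection imaging described in this record amounts to shifting array waveforms by predicted travel times for a trial source point and stacking them; high stacked power marks a likely radiator. The following toy sketch illustrates the principle only: the station count, travel times, and traces are all hypothetical, and real implementations add the empirical array calibration terms the abstract mentions.

```python
import numpy as np

def backproject(traces, travel_times, dt, window):
    """Stack array traces aligned on predicted travel times.

    traces: (n_stations, n_samples) waveforms
    travel_times: predicted arrival time (s) per station for the trial source
    dt: sample interval (s)
    window: number of samples to stack after each predicted arrival
    returns: stacked beam power (a scalar brightness for this trial point)
    """
    n_sta = traces.shape[0]
    beam = np.zeros(window)
    for i in range(n_sta):
        start = int(round(travel_times[i] / dt))
        beam += traces[i, start:start + window]
    beam /= n_sta
    return float(np.sum(beam ** 2))

# Toy example: 3 stations, each with a unit pulse at its predicted time.
dt = 0.1
traces = np.zeros((3, 400))
tt = np.array([5.0, 8.0, 12.0])       # hypothetical predicted times (s)
for i, t in enumerate(tt):
    traces[i, int(t / dt)] = 1.0

aligned = backproject(traces, tt, dt, window=5)
misaligned = backproject(traces, tt + np.array([0.0, 1.0, -1.0]), dt, window=5)
# Correct alignment stacks coherently and yields higher beam power.
```

Scanning a grid of trial source points and repeating this stack per time step is what produces the rupture-propagation images described above.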

  7. Global catalog of earthquake rupture velocities shows anticorrelation between stress drop and rupture velocity

    Chounet, Agnès; Vallée, Martin; Causse, Mathieu; Courboulex, Françoise

    2018-05-01

    Application of the SCARDEC method provides apparent source time functions together with seismic moment, depth, and focal mechanism for most recent earthquakes with magnitude larger than 5.6-6. Using this large dataset, we have developed a method to systematically invert for the rupture direction and average rupture velocity Vr when unilateral rupture propagation dominates. The approach is applied to all the shallow earthquakes of the catalog over the 1992-2015 time period. After a careful validation process, rupture properties for a catalog of 96 earthquakes are obtained. The subsequent analysis of this catalog provides several insights into the seismic rupture process. We first report that up-dip ruptures are more abundant than down-dip ruptures for shallow subduction interface earthquakes, which can be understood as a consequence of the material contrast between the slab and the overriding crust. Rupture velocities, which are searched without any a priori constraint up to the maximal P-wave velocity (6000-8000 m/s), are found between 1200 m/s and 4500 m/s. This observation indicates that no earthquakes propagate over long distances with rupture velocity approaching the P-wave velocity. Among the 23 ruptures faster than 3100 m/s, we observe both documented supershear ruptures (e.g., the 2001 Kunlun earthquake) and undocumented ruptures that very likely include a supershear phase. We also find that the correlation of Vr with the source duration scaled to the seismic moment (Ts) is very weak. This directly implies that both Ts and Vr are anticorrelated with the stress drop Δσ. This result has implications for the assessment of peak ground acceleration (PGA) variability. As shown by Causse and Song (2015), an anticorrelation between Δσ and Vr significantly reduces the predicted PGA variability and brings it closer to the observed variability.
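The link between scaled duration, rupture velocity, and stress drop can be sketched with a simple kinematic scaling (an assumption for illustration, not this paper's exact relations): with rupture length L ≈ Vr·T and M0 ≈ C·Δσ·L³, one gets Δσ ≈ 1/(C·(Vr·Ts)³) where Ts = T/M0^(1/3), so at fixed C, larger Vr or Ts implies smaller Δσ.

```python
def stress_drop_proxy(m0_nm, duration_s, vr_ms, c=1.0):
    """Crude stress-drop proxy (Pa) under the scaling above.

    Assumes M0 ~ c * stress_drop * (Vr * T)**3; c is a hypothetical
    shape constant of order 1, not a calibrated value.
    """
    ts = duration_s / m0_nm ** (1.0 / 3.0)   # moment-scaled duration
    return 1.0 / (c * (vr_ms * ts) ** 3)

# Same moment and duration, faster rupture -> lower inferred stress drop,
# illustrating the anticorrelation the catalog analysis reports.
slow = stress_drop_proxy(1e19, 10.0, 1500.0)
fast = stress_drop_proxy(1e19, 10.0, 4000.0)
```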

  8. Natural Time, Nowcasting and the Physics of Earthquakes: Estimation of Seismic Risk to Global Megacities

    Rundle, John B.; Luginbuhl, Molly; Giguere, Alexis; Turcotte, Donald L.

    2018-02-01

    Natural Time ("NT") refers to the concept of using small-earthquake counts, for example of M > 3 events, to mark the intervals between large earthquakes, for example M > 6 events. The term was first used by Varotsos et al. (2005) and later by Holliday et al. (2006) in their studies of earthquakes. In this paper, we discuss ideas and applications arising from the use of NT to understand earthquake dynamics, in particular through the idea of nowcasting. Nowcasting differs from forecasting in that the goal of nowcasting is to estimate the current state of the system, rather than the probability of a future event. Rather than focusing on individual earthquake faults, we focus on a defined local geographic region surrounding a particular location. This local region is considered to be embedded in a larger regional setting from which we accumulate the relevant statistics. We apply the nowcasting idea to the practical development of methods to estimate the current state of risk for dozens of the world's seismically exposed megacities, defined as cities having populations of over 1 million persons. We compute a ranking of these cities based on their current nowcast value and discuss the advantages and limitations of this approach. We note explicitly that the nowcast method is not a model, in that there are no free parameters to be fit to data. Rather, the method is simply a presentation of statistical data, which the user can interpret. Among other results, we find, for example, that the current nowcast ranking of the Los Angeles region is comparable to its ranking just prior to the January 17, 1994 Northridge earthquake.
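Because the nowcast is a presentation of statistics rather than a fitted model, its core can be sketched in a few lines: count small earthquakes since the last large one, then rank that count against the historical distribution of small-event counts between large events. The catalog below is synthetic and the thresholds are illustrative.

```python
import numpy as np

def nowcast_score(counts_between_large, current_count):
    """Earthquake potential score: fraction of historical inter-event
    small-quake counts that were <= the current count (an empirical CDF)."""
    counts = np.asarray(counts_between_large, dtype=float)
    return float(np.mean(counts <= current_count))

# Hypothetical history: number of M > 3 events observed between successive
# M > 6 events in the surrounding regional setting.
history = [120, 85, 200, 150, 95, 170, 60, 140]

# 130 small events since the last large one: half of past cycles were shorter,
# so the region is mid-way through a typical cycle by this measure.
score = nowcast_score(history, 130)
```

Ranking megacities then amounts to computing this score for each city's local region against its own regional statistics.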

  9. Thumbnail‐based questionnaires for the rapid and efficient collection of macroseismic data from global earthquakes

    Bossu, Remy; Landes, Matthieu; Roussel, Frederic; Steed, Robert; Mazet-Roux, Gilles; Martin, Stacey S.; Hough, Susan E.

    2017-01-01

    The collection of earthquake testimonies (i.e., qualitative descriptions of felt shaking) is essential for macroseismic studies (i.e., studies gathering information on how strongly an earthquake was felt in different places), and when done rapidly and systematically, improves situational awareness and in turn can contribute to efficient emergency response. In this study, we present advances made in the collection of testimonies following earthquakes around the world using a thumbnail‐based questionnaire implemented on the European‐Mediterranean Seismological Centre (EMSC) smartphone app and its mobile-compatible website. In both instances, the questionnaire consists of a selection of thumbnails, each representing an intensity level of the European Macroseismic Scale 1998. We find that testimonies are collected faster, and in larger numbers, by way of thumbnail‐based questionnaires than by more traditional online questionnaires. Responses were received from all seismically active regions of our planet, suggesting that thumbnails overcome language barriers. We also observed that the app is not sufficient on its own, because the websites are the main source of testimonies when an earthquake strikes a region for the first time in a while; it is only for subsequent shocks that the app is widely used. Notably though, the speed of the collection of testimonies increases significantly when the app is used. We find that automated EMSC intensities as assigned by user‐specified thumbnails are, on average, well correlated with “Did You Feel It?” (DYFI) responses and with three independently and manually derived macroseismic datasets, but there is a tendency for EMSC to be biased low with respect to DYFI at moderate and large intensities. We address this by proposing a simple adjustment that will be verified in future earthquakes.

  10. The global historical and future economic loss and cost of earthquakes during the production of adaptive worldwide economic fragility functions

    Daniell, James; Wenzel, Friedemann

    2014-05-01

    macroseismic intensity, capital stock estimate, GDP estimate, year, and the combined seismic building index (a created combination of the global seismic code index, building practice factor, building age, and infrastructure vulnerability). The analysis provided three key results: a) The production of economic fragility functions from the 1900-2008 events showed very good correlation with the economic loss and cost of earthquakes from 2009-2013, in real time. This methodology has been extended to other natural disaster types (typhoon, flood, drought). b) Historical earthquake events were reanalysed in order to check associated historical losses and costs against the expected exposure in terms of intensities. The 1939 Chillan, 1948 Turkmenistan, 1950 Iran, 1972 Managua, 1980 Western Nepal, and 1992 Erzincan earthquakes were seen as huge outliers compared with the modelled capital stock and GDP, and thus additional studies were undertaken to check the original loss results. c) A worldwide GIS layer database of capital stock (gross and net), GDP, infrastructure age, and economic indices over the period 1900-2013 has been created in conjunction with the CATDAT database in order to define correct economic losses and costs.

  11. Building Capacity for Earthquake Monitoring: Linking Regional Networks with the Global Community

    Willemann, R. J.; Lerner-Lam, A.

    2006-12-01

    Installing or upgrading a seismic monitoring network is often among the mitigation efforts after earthquake disasters, and this is happening in response to the events both in Sumatra during December 2004 and in Pakistan during October 2005. These networks can yield improved hazard assessment, more resilient buildings where they are most needed, and emergency relief directed more quickly to the worst hit areas after the next large earthquake. Several commercial organizations are well prepared for the fleeting opportunity to provide the instruments that comprise a seismic network, including sensors, data loggers, telemetry stations, and the computers and software required for the network center. But seismic monitoring requires more than hardware and software, no matter how advanced. A well-trained staff is required to select appropriate and mutually compatible components, install and maintain telemetered stations, manage and archive data, and perform the analyses that actually yield the intended benefits. Monitoring is more effective when network operators cooperate with a larger community through free and open exchange of data, sharing information about working practices, and international collaboration in research. As an academic consortium, a facility operator and a founding member of the International Federation of Digital Seismographic Networks, IRIS has access to a broad range of expertise with the skills that are required to help design, install, and operate a seismic network and earthquake analysis center, and stimulate the core training for the professional teams required to establish and maintain these facilities. 
But delivering expertise quickly when and where it is unexpectedly in demand requires advance planning and coordination in order to respond to the needs of organizations that are building a seismic network, either with tight time constraints imposed by the budget cycles of aid agencies following a disastrous earthquake, or as part of more informed

  12. A global outer-rise/outer-trench-slope (OR/OTS) earthquake study

    Wartman, J. M.; Kita, S.; Kirby, S. H.; Choy, G. L.

    2009-12-01

    Using improved seismic, bathymetric, satellite-gravity and other geophysical data, we investigated the seismicity patterns and focal mechanisms of earthquakes in oceanic lithosphere off the trenches of the world that are large enough to be well recorded at teleseismic distances. A number of prominent trends are apparent, some of which have been previously recognized based on more limited data [1], and some of which are largely new [2-5]: (1) The largest events and the highest seismicity rates tend to occur where Mesozoic incoming plates are subducting at high rates (e.g., those in the western Pacific and the Banda segment of Indonesia). The largest events are predominantly shallow normal-faulting (SNF) earthquakes. Less common are reverse-faulting (RF) events, which tend to be deeper and to be present along with SNF events where nearby seamounts, seamount chains, and other volcanic features are subducting [Seno and Yamanaka, 1996]. Blooms of SNF OR/OTS events usually occur just after, and seaward of, great interplate thrust (IPT) earthquakes, but are far less common after smaller IPT events. (2) Plates subducting at slow rates (Chile, the Ninety East Ridge in Sumatra, and the d'Entrecasteaux Ridge in Vanuatu).

  13. The profound reach of the 11 April 2012 M 8.6 Indian Ocean earthquake: Short‐term global triggering followed by a longer‐term global shadow

    Pollitz, Fred; Burgmann, Roland; Stein, Ross S.; Sevilgen, Volkan

    2014-01-01

    The 11 April 2012 M 8.6 Indian Ocean earthquake was an unusually large intraoceanic strike‐slip event. For several days, the global M≥4.5 and M≥6.5 seismicity rate at remote distances (i.e., thousands of kilometers from the mainshock) was elevated. The strike‐slip mainshock appears, through its Love waves, to have triggered a global burst of strike‐slip aftershocks over several days. But the M≥6.5 rate subsequently dropped to zero for the succeeding 95 days, although the M≤6.0 global rate was close to background during this period. Such an extended period without an M≥6.5 event has happened rarely over the past century, and never after a large mainshock. Quiescent periods following previous large (M≥8) mainshocks over the past century are either much shorter or begin so long after a given mainshock that no physical interpretation is warranted. The 2012 mainshock is unique in terms of both the short‐lived global increase and the subsequent long quiescent period. We believe that the two components are linked and interpret this pattern as the product of dynamic stressing of a global system of faults. Transient dynamic stresses can encourage short‐term triggering, but, paradoxically, they can also inhibit rupture temporarily until background tectonic loading restores the system to its premainshock stress levels.

  14. Global Positioning System data collection, processing, and analysis conducted by the U.S. Geological Survey Earthquake Hazards Program

    Murray, Jessica R.; Svarc, Jerry L.

    2017-01-01

    The U.S. Geological Survey Earthquake Science Center collects and processes Global Positioning System (GPS) data throughout the western United States to measure crustal deformation related to earthquakes and tectonic processes as part of a long‐term program of research and monitoring. Here, we outline data collection procedures and present the GPS dataset built through repeated temporary deployments since 1992. This dataset consists of observations at ∼1950 locations. In addition, this article details our data processing and analysis procedures, which consist of the following. We process the raw data collected through temporary deployments, in addition to data from continuously operating western U.S. GPS stations operated by multiple agencies, using the GIPSY software package to obtain position time series. Subsequently, we align the positions to a common reference frame, determine the optimal parameters for a temporally correlated noise model, and apply this noise model when carrying out time‐series analysis to derive deformation measures, including constant interseismic velocities, coseismic offsets, and transient postseismic motion.
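The time-series analysis step described above (constant interseismic velocity plus coseismic offsets) can be sketched as a small least-squares problem. The sketch below uses synthetic data, white noise instead of the temporally correlated noise model the article describes, and a single known earthquake epoch; all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(1992.0, 2017.0, 200)           # decimal years
t_eq = 2004.0                                   # known earthquake epoch
true_v, true_step = 5.0, 30.0                   # mm/yr, mm
pos = 10.0 + true_v * (t - t[0]) + true_step * (t >= t_eq)
pos += rng.normal(0.0, 1.0, t.size)             # white noise (mm)

# Design matrix: reference position, velocity, Heaviside step at t_eq.
G = np.column_stack([np.ones_like(t), t - t[0], (t >= t_eq).astype(float)])
m, *_ = np.linalg.lstsq(G, pos, rcond=None)
intercept, velocity, coseismic = m
# Recovers velocity ~ 5 mm/yr and coseismic offset ~ 30 mm despite the noise.
```

In practice the noise model matters: temporally correlated noise inflates velocity uncertainties well beyond what a white-noise fit suggests, which is why the article determines noise-model parameters before deriving deformation measures.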

  15. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology

    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı

    2016-01-01

    Real-time seismology is a newly developing alternative approach in seismology to mitigate earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications, and computer systems to quickly transform data into earthquake information in real time, reducing earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  16. Global correlations between maximum magnitudes of subduction zone interface thrust earthquakes and physical parameters of subduction zones

    Schellart, W. P.; Rawlinson, N.

    2013-01-01

    The maximum earthquake magnitude recorded for subduction zone plate boundaries varies considerably on Earth, with some subduction zone segments producing giant subduction zone thrust earthquakes (e.g. Chile, Alaska, Sumatra-Andaman, Japan) and others producing relatively small earthquakes (e.g.

  17. Crowdsourced earthquake early warning

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
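A crowdsourced EEW client needs a cheap onboard trigger to decide when a phone's accelerometer is seeing shaking worth reporting. The sketch below uses a classic trailing STA/LTA ratio as a stand-in; the window lengths, threshold, and synthetic data are all hypothetical, and the real MyShake-style detectors described in the literature use trained classifiers to reject everyday phone motion rather than a bare threshold.

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Trailing STA/LTA ratio, defined wherever a full LTA window exists.

    Returns (sample_indices, ratio)."""
    x = np.abs(np.asarray(x, dtype=float))
    c = np.concatenate([[0.0], np.cumsum(x)])
    k = np.arange(n_lta, x.size + 1)            # exclusive window end
    sta = (c[k] - c[k - n_sta]) / n_sta         # short-term mean
    lta = (c[k] - c[k - n_lta]) / n_lta         # long-term mean
    return k - 1, sta / np.maximum(lta, 1e-12)

rng = np.random.default_rng(1)
acc = rng.normal(0.0, 0.01, 2000)               # quiet background (g)
acc[1200:1400] += rng.normal(0.0, 0.2, 200)     # strong shaking window

idx, ratio = sta_lta(acc, n_sta=50, n_lta=500)
triggered = bool(np.max(ratio) > 4.0)           # declare a detection
```

Aggregating such single-device triggers server-side, and requiring spatially coherent triggers from many nearby phones, is what turns noisy consumer sensors into a usable warning signal.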

  18. Global 3-D FDTD Maxwell's-Equations Modeling of Ionospheric Disturbances Associated with Earthquakes Using an Optimized Geodesic Grid

    Simpson, J. J.; Taflove, A.

    2005-12-01

    We report a finite-difference time-domain (FDTD) computational solution of Maxwell's equations [1] that models the possibility of detecting and characterizing ionospheric disturbances above seismic regions. Specifically, we study anomalies in Schumann resonance spectra in the extremely low frequency (ELF) range below 30 Hz, as observed in Japan, caused by a hypothetical cylindrical ionospheric disturbance above Taiwan. We consider excitation of the global Earth-ionosphere waveguide by lightning in three major thunderstorm regions of the world: Southeast Asia, South America (Amazon region), and Africa. Furthermore, we investigate varying geometries and characteristics of the ionospheric disturbance above Taiwan. The FDTD technique used in this study enables a direct, full-vector, three-dimensional (3-D) time-domain Maxwell's equations calculation of round-the-world ELF propagation accounting for arbitrary horizontal as well as vertical geometrical and electrical inhomogeneities and anisotropies of the excitation, ionosphere, lithosphere, and oceans. Our entire-Earth model grids the annular lithosphere-atmosphere volume within 100 km of sea level, and contains over 6,500,000 grid points (63 km laterally between adjacent grid points, 5 km radial resolution). We use our recently developed spherical geodesic gridding technique having a spatial discretization best described as resembling the surface of a soccer ball [2]. The grid is composed entirely of hexagonal cells except for a small fixed number of pentagonal cells needed for completion. Grid-cell areas and locations are optimized to yield a smoothly varying area difference between adjacent cells, thereby maximizing numerical convergence. We compare our calculated results with measured data prior to the Chi-Chi earthquake in Taiwan as reported by Hayakawa et al. [3]. Acknowledgement This work was suggested by Dr. Masashi Hayakawa, University of Electro-Communications, Chofugaoka, Chofu Tokyo. References [1] A
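For orientation, the ideal (lossless, uniform-cavity) Schumann resonance frequencies of the Earth-ionosphere waveguide follow f_n = (c / 2πa)·√(n(n+1)), with a the Earth radius. Observed peaks sit lower (near 7.8, 14.3, and 20.8 Hz) because the real cavity is lossy; capturing that shift for a perturbed ionosphere is precisely what the full FDTD solution above does numerically rather than analytically.

```python
import math

C = 299_792_458.0        # speed of light, m/s
A = 6.371e6              # mean Earth radius, m

def schumann_ideal(n):
    """Ideal lossless-cavity Schumann resonance frequency (Hz) for mode n."""
    return C / (2.0 * math.pi * A) * math.sqrt(n * (n + 1))

freqs = [schumann_ideal(n) for n in range(1, 4)]
# First three ideal modes come out near 10.6, 18.3, and 25.9 Hz,
# noticeably above the observed lossy-cavity peaks.
```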

  19. Global Compilation of InSAR Earthquake Source Models: Comparisons with Seismic Catalogues and the Effects of 3D Earth Structure

    Weston, J. M.; Ferreira, A. M.; Funning, G. J.

    2010-12-01

    While past progress in seismology has led to extensive earthquake catalogues such as the Global Centroid Moment Tensor (GCMT) catalogue, recent advances in space geodesy have enabled earthquake parameter estimation from measurements of the deformation of the Earth’s surface, notably using InSAR data. Many earthquakes have now been studied using InSAR, but a full assessment of the quality and added value of these source parameters compared to traditional seismological techniques is still lacking. In this study we present results of systematic comparisons, on a global scale, between earthquake CMT parameters determined using InSAR and seismic data. We compiled a large database of source parameters obtained using InSAR data from the literature and estimated the corresponding CMT parameters, forming an ICMT compilation. We present results from the analysis of 58 earthquakes that occurred between 1992 and 2007, drawn from about 80 published InSAR studies. Multiple studies of the same earthquake are included in the archive, as they are valuable for assessing uncertainties. Where faults are segmented, with changes in width along-strike, a weighted average based on the seismic moment of each fault segment has been used to determine overall earthquake parameters. For variable slip models, we have calculated source parameters taking the spatial distribution of slip into account. The parameters in our ICMT compilation are compared with those taken from the Global CMT (GCMT), ISC, EHB, and NEIC catalogues. We find that earthquake fault strike, dip, and rake values in the GCMT and ICMT archives are generally compatible with each other. Likewise, the differences in seismic moment between these two archives are relatively small. However, the locations of the centroid epicentres show substantial discrepancies, which are larger when comparing with GCMT locations (10-30 km differences) than with EHB, ISC, and NEIC locations (5-15 km differences). 
Since InSAR data have a high spatial resolution, and thus
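The moment-weighted averaging across fault segments mentioned above can be sketched as follows; the segment values are hypothetical. Note that strike is a circular quantity, so the sketch averages moment-weighted unit vectors rather than raw angles (350° and 10° should combine to 0°, not 180°).

```python
import math

def moment_weighted_strike(strikes_deg, moments):
    """Average strike across segments, weighted by segment seismic moment."""
    x = sum(m * math.cos(math.radians(s)) for s, m in zip(strikes_deg, moments))
    y = sum(m * math.sin(math.radians(s)) for s, m in zip(strikes_deg, moments))
    return math.degrees(math.atan2(y, x)) % 360.0

def moment_weighted_mean(values, moments):
    """Plain moment-weighted mean for non-circular parameters (dip, rake)."""
    return sum(v * m for v, m in zip(values, moments)) / sum(moments)

# Two hypothetical segments; the larger-moment segment dominates the average.
strikes = [350.0, 10.0]
dips = [12.0, 18.0]
m0 = [3e19, 1e19]        # N*m per segment

strike = moment_weighted_strike(strikes, m0)   # near 355, on the 3e19 side
dip = moment_weighted_mean(dips, m0)           # 13.5
```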

  20. Earthquake Monitoring with the MyShake Global Smartphone Seismic Network

    Inbal, A.; Kong, Q.; Allen, R. M.; Savran, W. H.

    2017-12-01

    Smartphone arrays have the potential to significantly improve seismic monitoring in sparsely instrumented urban areas. This approach benefits from the dense spatial coverage of users, as well as from the communication and computational capabilities built into smartphones, which facilitate big seismic data transfer and analysis. Advantages in data acquisition with smartphones trade off against factors such as the low-quality sensors installed in phones, high noise levels, and strong network heterogeneity, all of which limit effective seismic monitoring. Here we utilize network and array-processing schemes to assess event detectability with the MyShake global smartphone network. We examine the benefits of using this network in either triggered or continuous modes of operation. A global database of ground motions measured on stationary phones triggered by M2-6 events is used to establish detection probabilities. We find that the probability of detecting an M=3 event with a single phone located 20 nearby phones closely match the regional catalog locations. We use simulated broadband seismic data to examine how location uncertainties vary with user distribution and noise levels. To this end, we have developed an empirical noise model for the metropolitan Los Angeles (LA) area. We find that densities larger than 100 stationary phones/km2 are required to accurately locate M 2 events in the LA basin. Given the projected MyShake user distribution, that condition may be met within the next few years.

  1. A test of a global seismic system for monitoring earthquakes and underground nuclear explosions

    Bowman, J.R.; Muirhead, K.; Spiliopoulos, S.; Jepsen, D.; Leonard, M.

    1993-01-01

    Australia is a member of the Group of Scientific Experts (GSE), an ad hoc group of the United Nations Conference on Disarmament formed to consider international cooperative measures to detect and identify seismic events. The GSE conducted a large-scale technical test (GSETT-2) from 22 April to 9 June 1991 that focused on the exchange and analysis of seismic parameter and waveform data. Thirty-four countries participated in GSETT-2, and data were contributed from 60 stations on all continents. GSETT-2 demonstrated the feasibility of collecting and transmitting large volumes (around 1 gigabyte) of digital data around the world, and of producing a preliminary bulletin of global seismicity within 48 hours and a final bulletin within 7 days. However, the experiment also revealed the difficulty of keeping up with the flow of data and analysis with existing resources. The Final Event Bulletins listed 3715 events for the 42 recording days of the test, about twice the number reported routinely by another international agency 5 months later. The quality of the Final Event Bulletin was limited by the uneven spatial distribution of seismic stations that contributed to GSETT-2 and by the ambiguity of associating phases detected by widely separated stations to form seismic events. A monitoring system similar to that used in GSETT-2 could provide timely and accurate reporting of global seismicity. It would need an improved distribution of stations, application of more conservative event-formation rules, and further development of analysis software. 8 refs., 9 figs

  2. Investigation of Ionospheric Anomalies related to moderate Romanian earthquakes occurred during last decade using VLF/LF INFREP and GNSS Global Networks

    Moldovan, Iren-Adelina; Oikonomou, Christina; Haralambous, Haris; Nastase, Eduard; Emilian Toader, Victorin; Biagi, Pier Francesco; Colella, Roberto; Toma-Danila, Dragos

    2017-04-01

    Ionospheric TEC (Total Electron Content) variations and Low Frequency (LF) signal amplitude data prior to five moderate earthquakes (Mw≥5) that occurred in Romania, in the Vrancea crustal and subcrustal seismic zones, during the last decade were analyzed using observations from the Global Navigation Satellite System (GNSS) and the European INFREP (International Network for Frontier Research on Earthquake Precursors) networks, respectively, aiming to detect potential ionospheric anomalies related to these events and describe their characteristics. To this end, spectral analysis was applied to the TEC data and the terminator time method to the VLF/LF data. It was found that TEC perturbations appeared a few days (1-7) up to a few hours before the events, lasting around 2-3 hours, with periods of 20 and 3-5 minutes, which could be associated with the impending earthquakes. In addition, in all three events the sunrise terminator times were delayed by approximately 20-40 min a few days prior to and during the earthquake day. Acknowledgments This work was partially supported by the Partnership in Priority Areas Program - PNII, under MEN-UEFISCDI, DARING Project no. 69/2014 and the Nucleu Program - PN 16-35, Project no. 03 01

  3. Global Earthquake and Volcanic Eruption Economic losses and costs from 1900-2014: 115 years of the CATDAT database - Trends, Normalisation and Visualisation

    Daniell, James; Skapski, Jens-Udo; Vervaeck, Armand; Wenzel, Friedemann; Schaefer, Andreas

    2015-04-01

    tolls from historic events is discussed. The CATDAT socioeconomic databases of parameters like disaggregated population, GDP, capital stock, building typologies, food security, and inter-country export interactions are used to create a current exposure view of the world. The potential for losses globally is discussed with a re-creation of each damaging event since 1900, with well in excess of 10 trillion USD in normalised losses being seen from the 115 years of events. Potential worst-case events for volcanoes and earthquakes around the globe are discussed in terms of their potential for damage and huge economic loss today, and over the next century using SSP projections adjusted on a country basis, including inter-country effects.

  4. Defeating Earthquakes

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  5. Reaching the global community during disasters: findings from a content analysis of the organizational use of Twitter after the 2010 Haiti earthquake.

    Gurman, Tilly A; Ellenberger, Nicole

    2015-01-01

    Social networking sites provide virtual environments in which individuals and organizations exchange real-time information on a multitude of topics, including health promotion and disease prevention. The January 2010 earthquake in Haiti has been posited as a turning point in the way in which organizations use social media, such as Twitter, for crisis communication. The purpose of this content analysis was to explore whether organizations' use of Twitter changed after the 2010 Haiti earthquake. A team of 13 coders analyzed all English-language tweets (N = 2,616) during the 3 months before and after the earthquake from 6 leading organizations in the Haiti disaster relief efforts. Study findings indicate that the ways in which organizations used Twitter changed over time. Chi-square analyses demonstrated that organizations decreased their use of certain strategies to disseminate information through Twitter, such as the use of links. Organizations did not change in their use of techniques to involve users (e.g., retweet, call to action), with the exception of using tweets as a fundraising mechanism. Study findings highlight missed opportunities among organizations to maximize Twitter in order to encourage more interactive and immediate communication with the global community.

  6. Source Parameter Inversion for Recent Great Earthquakes from a Decade-long Observation of Global Gravity Fields

    Han, Shin-Chan; Riva, Ricccardo; Sauber, Jeanne; Okal, Emile

    2013-01-01

    We quantify gravity changes after great earthquakes present within the 10-year-long time series of monthly Gravity Recovery and Climate Experiment (GRACE) gravity fields. Using a spherical harmonic normal-mode formulation, the respective source parameters of moment tensor and double couple were estimated. For the 2004 Sumatra-Andaman earthquake, the gravity data indicate a composite moment of 1.2×10^23 N·m with a dip of 10°, in agreement with the estimate obtained at ultralong seismic periods. For the 2010 Maule earthquake, the GRACE solutions range from 2.0 to 2.7×10^22 N·m for dips of 12°-24° and centroid depths within the lower crust. For the 2011 Tohoku-Oki earthquake, the estimated scalar moments range from 4.1 to 6.1×10^22 N·m, with dips of 9°-19° and centroid depths within the lower crust. For the 2012 Indian Ocean strike-slip earthquakes, the gravity data delineate a composite moment of 1.9×10^22 N·m regardless of the centroid depth, comparing favorably with the total moment of the main ruptures and aftershocks. The smallest event we successfully analyzed with GRACE was the 2007 Bengkulu earthquake with M0 ≈ 5.0×10^21 N·m. We found that the gravity data constrain the focal mechanism with the centroid only within the upper and lower crustal layers for thrust events. Deeper sources (i.e., in the upper mantle) could not reproduce the gravity observation, as the larger rigidity and bulk modulus at mantle depths inhibit the interior from changing its volume, thus reducing the negative gravity component. Focal mechanisms and seismic moments obtained in this study represent the behavior of the sources on temporal and spatial scales exceeding the seismic and geodetic spectrum.
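As a sanity check on the scalar moments quoted above, they can be converted to moment magnitude via the standard relation Mw = (2/3)(log10 M0[N·m] − 9.1):

```python
import math

def moment_magnitude(m0_nm):
    """Moment magnitude from scalar seismic moment in N*m."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

mw_sumatra = moment_magnitude(1.2e23)    # 2004 Sumatra-Andaman composite
mw_maule_lo = moment_magnitude(2.0e22)   # 2010 Maule, lower bound
mw_tohoku_lo = moment_magnitude(4.1e22)  # 2011 Tohoku-Oki, lower bound
# -> approximately Mw 9.3, 8.8, and 9.0 respectively, consistent with
#    the seismically determined magnitudes of these events.
```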

  7. Proportionality for Military Leaders

    Brown, Gary D

    2000-01-01

    .... Especially lacking is a realization that there are four distinct types of proportionality. In determining whether a particular resort to war is just, national leaders must consider the proportionality of the conflict (i.e...

  8. The economic costs of natural disasters globally from 1900-2015: historical and normalised floods, storms, earthquakes, volcanoes, bushfires, drought and other disasters

    Daniell, James; Wenzel, Friedemann; Schaefer, Andreas

    2016-04-01

    For the first time, a breakdown of natural disaster losses from 1900-2015 is given, based on over 30,000 event economic losses globally and increased analysis within the CATDAT Damaging Natural Disaster databases. Using country-CPI and GDP deflator adjustments, over $7 trillion (2015-adjusted) in losses have occurred; over 40% due to flood/rainfall, 26% due to earthquake, 19% due to storm effects, 12% due to drought, 2% due to wildfire and under 1% due to volcano. Using construction cost indices, higher percentages of flood losses are seen. Depending on how the adjustment of dollars to 2015 terms is made (CPI vs. construction cost indices), between 6.5 and 14.0 trillion USD (2015-adjusted) of natural disaster losses have been seen from 1900-2015 globally. Significant reductions in economic losses have been seen in China and Japan from 1950 onwards. An AAL (average annual loss) of around $200 billion has been seen in the last 16 years, equating to around 0.25% of global GDP or around 0.1% of net capital stock per year. Normalised losses have also been calculated to examine the trends in vulnerability through time for economic losses. The global normalisation methodology, using the exposure databases within CATDAT and previously applied in papers on the earthquake and volcano databases, is used for this study. When the original event-year losses are adjusted directly by capital stock change, very high losses are observed for floods over time (albeit mitigated by improved flood control structures). This shows clear trends in the improvement of building stock towards natural disasters and a decreasing trend in most perils for most countries.
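The CPI-style adjustment to 2015 terms described above amounts to scaling each nominal event-year loss by an index ratio. A minimal sketch with placeholder index values (these are illustrative numbers, not the CATDAT series):

```python
# Sketch of a CPI-style adjustment: a nominal event-year loss is scaled by
# the ratio of the 2015 index to the event-year index. Index values below
# are illustrative placeholders, not the actual CATDAT country series.

def adjust_to_2015(nominal_loss, cpi_event_year, cpi_2015):
    return nominal_loss * cpi_2015 / cpi_event_year

# Hypothetical: a $1 billion nominal loss in a year when the index stood
# at 24, adjusted with a 2015 index of 237.
adjusted = adjust_to_2015(1e9, 24.0, 237.0)
print(round(adjusted / 1e9, 2))  # ~9.88 billion in 2015 dollars
```

Construction-cost-index adjustment follows the same pattern with a different index series, which is why the two methods bracket the 6.5-14.0 trillion USD range quoted above.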

  9. Do Earthquakes Shake Stock Markets?

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  10. Earthquake prediction

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  11. The CATDAT damaging earthquakes database

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared with the 2011 Tohoku (>$300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  12. The CATDAT damaging earthquakes database

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the increasing concern for economic loss in urban areas, as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  13. The Principle of Proportionality

    Bennedsen, Morten; Meisner Nielsen, Kasper

    2005-01-01

    Recent policy initiatives within the harmonization of European company laws have promoted a so-called "principle of proportionality" through proposals that regulate mechanisms opposing a proportional distribution of ownership and control. We scrutinize the foundation for these initiatives in relation to the process of harmonization of the European capital markets. JEL classifications: G30, G32, G34 and G38. Keywords: Ownership Structure, Dual Class Shares, Pyramids, EU company laws.

  14. Ground motion attenuation during M 7.1 Darfield and M 6.2 Christchurch, New Zealand, earthquakes and performance of global predictive models

    Segou, Margaret; Kalkan, Erol

    2011-01-01

    The M 7.1 Darfield earthquake occurred 40 km west of Christchurch (New Zealand) on 4 September 2010. Six months later, the city was struck again by an M 6.2 event on 22 February local time (21 February UTC). These events resulted in significant damage to infrastructure in the city and its suburbs. The purpose of this study is to evaluate the performance of global predictive models (GMPEs) using the strong motion data obtained from these two events to improve future seismic hazard assessment and building code provisions for the Canterbury region. The Canterbury region is located on the boundary between the Pacific and Australian plates; its surface expression is the active right-lateral Alpine fault (Berryman et al. 1993). Beneath the North Island and the north South Island, the Pacific plate subducts obliquely under the Australian plate, while at the southwestern part of the South Island, a reverse process takes place. Although New Zealand has experienced several major earthquakes in the past as a result of its complex seismotectonic environment (e.g., M 7.1 1888 North Canterbury, M 7.0 1929 Arthur's Pass, and M 6.2 1995 Cass), there was no evidence of prior seismic activity in Christchurch and its surroundings before the September event. The Darfield and Christchurch earthquakes occurred along the previously unmapped Greendale fault in the Canterbury basin, which is covered by Quaternary alluvial deposits (Forsyth et al. 2008). In Figure 1, site conditions of the Canterbury epicentral area are depicted on a VS30 map. This map was determined on the basis of topographic slope calculated from a 1-km grid using the method of Allen and Wald (2007). Also shown are the locations of strong motion stations. The Darfield event was generated as a result of a complex rupture mechanism; the recordings and geodetic data reveal that the earthquake consisted of three sub-events (Barnhart et al. 2011, page 815 of this issue). The first event was due to rupturing of a blind reverse

  15. Nowcasting Earthquakes and Tsunamis

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
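The counting idea behind the nowcast can be sketched in a few lines: the small region's current count of small earthquakes since its last large event is ranked against the interevent counts observed in the large surrounding region. All counts below are invented for illustration, not drawn from the global catalog:

```python
# Minimal sketch of the nowcast idea described above: the "earthquake
# potential score" of a small region is the empirical percentile of its
# current small-earthquake count (since the last large event) within the
# distribution of such counts collected from a much larger region.

def earthquake_potential_score(current_count, interevent_counts):
    """Fraction of large-region interevent counts <= the current count."""
    hits = sum(1 for c in interevent_counts if c <= current_count)
    return hits / len(interevent_counts)

# Hypothetical: counts of "small" shocks between successive "large" events
# in the large region, and 120 small shocks since the last large event
# in the small region (e.g., a 150 km circle around a city).
large_region_counts = [40, 55, 70, 90, 110, 130, 160, 200, 260, 400]
eps = earthquake_potential_score(120, large_region_counts)
print(eps)  # 0.5 -> the small region is about halfway through its cycle
```

Ranking cities by this score gives the kind of relative seismic risk ordering described in the abstract, without fitting any model parameters.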

  16. Fault model of the 2017 Jiuzhaigou Mw 6.5 earthquake estimated from coseismic deformation observed using Global Positioning System and Interferometric Synthetic Aperture Radar data

    Nie, Zhaosheng; Wang, Di-Jin; Jia, Zhige; Yu, Pengfei; Li, Liangfa

    2018-04-01

    On August 8, 2017, the Jiuzhaigou Mw 6.5 earthquake occurred in Sichuan province, southwestern China, along the eastern margin of the Tibetan Plateau. The epicenter is surrounded by the Minjiang, Huya, and Tazang Faults. As the seismic activity and tectonics are very complicated, there is controversy regarding the accurate location of the epicenter and the seismic fault of the Jiuzhaigou earthquake. To investigate these aspects, first, the coseismic deformation field was derived from Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) measurements. Second, the fault geometry, coseismic slip model, and Coulomb stress changes around the seismic region were calculated using a homogeneous elastic half-space model. The coseismic deformation field derived from InSAR measurements shows that this event was dominated by left-lateral strike-slip faulting. The maximal and minimal displacements were approximately 0.15 m and -0.21 m, respectively, along the line-of-sight direction. The whole deformation field follows a northwest-trending direction and is mainly concentrated west of the fault. The coseismic slip extends 28 km along strike and 18 km along dip and is dominated by left-lateral strike slip. The average and maximal fault slips are 0.18 and 0.85 m, respectively. The rupture did not fully reach the ground surface. The focal mechanism derived from GPS and InSAR data is consistent with the kinematics and geometry of the Huya Fault. Therefore, we conclude that the northern section, or the Shuzheng segment, of the Huya Fault is the seismogenic fault. The maximal fault slip is located at 33.25°N and 103.82°E at a depth of 11 km, and the released moment is approximately 6.635 × 10^18 N m, corresponding to a magnitude of Mw 6.49, which is consistent with results reported by the US Geological Survey, Global Centroid Moment Tensor, and other researchers. The coseismic Coulomb stress changes enhanced the stress on the northwest and

  17. Multiwire proportional chamber development

    Doolittle, R. F.; Pollvogt, U.; Eskovitz, A. J.

    1973-01-01

    The development of large-area multiwire proportional chambers (MWPCs), to be used as high-resolution spatial detectors in cosmic ray experiments, is described. A readout system was developed which uses a directly coupled, lumped-element delay line whose characteristics are independent of the MWPC design. A complete analysis of the delay line and the readout electronics shows that a spatial resolution of about 0.1 mm can be reached with the MWPC operating in the strictly proportional region. This was confirmed by measurements with a small MWPC and Fe-55 X-rays. A simplified analysis was carried out to estimate the theoretical limit of spatial resolution due to delta rays, spread of the discharge along the anode wire, and inclined trajectories. To calculate the gas gain of MWPCs of different geometrical configurations, a method was developed based on knowledge of the first Townsend coefficient of the chamber gas.
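A delay-line readout locates a hit from the difference in arrival times at the two ends of the line. A geometry-only sketch; the line length and propagation velocity are assumed values for illustration, not the parameters of the chamber described:

```python
# Delay-line position readout: a pulse induced at position x travels to
# both ends of the line, so the arrival-time difference encodes x:
#   t_left = x / v,  t_right = (L - x) / v  =>  x = (L + v*(t_left - t_right)) / 2
# Values below are illustrative (assumed line length and velocity).

def hit_position(t_left_ns, t_right_ns, line_length_mm, velocity_mm_per_ns):
    """Position measured from the left end of the delay line."""
    dt = t_left_ns - t_right_ns
    return (line_length_mm + velocity_mm_per_ns * dt) / 2.0

# A hit at 75 mm on a 100 mm line with v = 0.5 mm/ns gives
# t_left = 150 ns and t_right = 50 ns:
print(hit_position(150.0, 50.0, 100.0, 0.5))  # 75.0
```

The achievable resolution then depends on how precisely the two arrival times can be measured, which is where the differentiator-integrator shaping and zero detection in the described electronics come in.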

  18. Analog earthquakes

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  19. Earthquake number forecasts testing

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. 
The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
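The overdispersion argument above can be illustrated with a method-of-moments check: for Poisson counts the variance tracks the mean, while the negative binomial's second parameter absorbs the excess (variance = mean + mean^2/r). The counts below are synthetic, not the GCMT or PDE data:

```python
# Moment-based check of overdispersion in earthquake counts. For a Poisson
# process variance ~ mean; clustered (aftershock-rich) counts inflate the
# variance, which the NBD size parameter r absorbs:
#   variance = mean + mean^2 / r  =>  r = mean^2 / (variance - mean)

def mean_var(counts):
    n = len(counts)
    m = sum(counts) / n
    v = sum((c - m) ** 2 for c in counts) / (n - 1)  # sample variance
    return m, v

def nbd_size_from_moments(m, v):
    """Method-of-moments estimate of the NBD size parameter r."""
    if v <= m:
        raise ValueError("no overdispersion: Poisson is adequate")
    return m * m / (v - m)

# Synthetic annual counts: aftershock-like bursts in two years.
counts = [3, 5, 4, 30, 6, 5, 4, 28, 5, 6]
m, v = mean_var(counts)
print(round(m, 1), round(v, 1), v > m)  # variance far above mean
print(round(nbd_size_from_moments(m, v), 2))
```

Here the variance is roughly ten times the mean, the kind of mismatch that lets the Poisson hypothesis be rejected in favour of the NBD, as the abstract reports for most catalogue subdivisions.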

  20. Restrictions and Proportionality

    Werlauff, Erik

    2009-01-01

    The article discusses three central aspects of the freedoms under European Community law, namely 1) the prohibition against restrictions as an important extension of the prohibition against discrimination, 2) a prohibition against exit restrictions which is just as important as the prohibition against host country restrictions, but which is often not recognised to the same extent by national law, and 3) the importance of also identifying and recognising an exit restriction, so that it is possible to achieve the required test of appropriateness and proportionality in relation to the rule...

  1. The divine proportion

    Huntley, H E

    1970-01-01

    Using simple mathematical formulas, most as basic as Pythagoras's theorem and requiring only a very limited knowledge of mathematics, Professor Huntley explores the fascinating relationship between geometry and aesthetics. Poetry, patterns like Pascal's triangle, philosophy, psychology, music, and dozens of simple mathematical figures are enlisted to show that the "divine proportion" or "golden ratio" is a feature of geometry and analysis which awakes answering echoes in the human psyche. When we judge a work of art aesthetically satisfying, according to his formulation, we are making it c
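The "divine proportion" itself is easy to exhibit numerically: phi is the positive root of x^2 = x + 1, and ratios of consecutive Fibonacci numbers converge to it:

```python
import math

# The golden ratio: phi = (1 + sqrt(5)) / 2, the positive root of
# x^2 = x + 1. Ratios of consecutive Fibonacci numbers converge to phi.

phi = (1 + math.sqrt(5)) / 2

def fib_ratio(n):
    """Ratio of consecutive Fibonacci numbers after n iterations."""
    a, b = 1, 1
    for _ in range(n):
        a, b = b, a + b
    return b / a

print(round(phi, 6))            # 1.618034
print(round(fib_ratio(20), 6))  # already agrees with phi to 6 decimals
```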

  2. A nuclear proportional counter

    1973-01-01

    The invention relates to a nuclear proportional counter comprising, in a bulb filled with a low-pressure gas, a wire forming an anode and a cathode, characterized in that said cathode is constituted by two plane plates parallel to each other and to the anode wire, and in that two branches of a circuit are connected to the anode wire end-portions, each branch comprising a pre-amplifier, a measuring circuit consisting of a differentiator-integrator-differentiator amplifier and a zero detector, one of the branches comprising an adjustable delay circuit, both branches jointly feeding a conversion circuit for converting the pulse duration into amplitudes, said conversion circuit being followed by a multi-channel analyzer, optionally provided with a recorder.

  3. Load proportional safety brake

    Cacciola, M. J.

    1979-01-01

    This brake is a self-energizing mechanical friction brake and is intended for use in a rotary drive system. It incorporates a torque sensor which cuts power to the power unit on any overload condition. The brake is capable of driving against an opposing load or driving, paying-out, an aiding load in either direction of rotation. The brake also acts as a no-back device when torque is applied to the output shaft. The advantages of using this type of device are: (1) low frictional drag when driving; (2) smooth paying-out of an aiding load with no runaway danger; (3) energy absorption proportional to load; (4) no-back activates within a few degrees of output shaft rotation and resets automatically; and (5) built-in overload protection.

  4. Macroeconomic Proportions and Correlations

    Constantin Anghelache

    2006-02-01

    Full Text Available The work is focusing on the main proportions and correlations which are being set up between the major macroeconomic indicators. This is the general frame for the analysis of the relations between the Gross Domestic Product growth rate and the unemployment rate; the interaction between the inflation rate and the unemployment rate; the connection between the GDP growth rate and the inflation rate. Within the analysis being performed, a particular attention is paid to “the basic relationship of the economic growth” by emphasizing the possibilities as to a factorial analysis of the macroeconomic development, mainly as far as the Gross Domestic Product is concerned. At this point, the authors are introducing the mathematical relations, which are used for modeling the macroeconomic correlations, hence the strictness of the analysis being performed.

  5. Macroeconomic Proportions and Correlations

    Constantin Mitrut

    2006-04-01

    Full Text Available The work is focusing on the main proportions and correlations which are being set up between the major macroeconomic indicators. This is the general frame for the analysis of the relations between the Gross Domestic Product growth rate and the unemployment rate; the interaction between the inflation rate and the unemployment rate; the connection between the GDP growth rate and the inflation rate. Within the analysis being performed, a particular attention is paid to “the basic relationship of the economic growth” by emphasizing the possibilities as to a factorial analysis of the macroeconomic development, mainly as far as the Gross Domestic Product is concerned. At this point, the authors are introducing the mathematical relations, which are used for modeling the macroeconomic correlations, hence the strictness of the analysis being performed.
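One of the correlations the article models, the link between GDP growth and unemployment, can be sketched as a simple least-squares fit (an Okun's-law-style regression). The annual observations below are invented for illustration, not data from the article:

```python
# Minimal least-squares sketch of a GDP-growth vs. unemployment-change
# relation (Okun's-law style). The data points are hypothetical.

def ols_slope_intercept(xs, ys):
    """Ordinary least-squares slope and intercept for y ~ a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical annual observations: (GDP growth %, change in unemployment %).
growth = [1.0, 2.0, 3.0, 4.0, 5.0]
du = [0.5, 0.0, -0.5, -1.0, -1.5]
slope, intercept = ols_slope_intercept(growth, du)
print(slope, intercept)  # -0.5 1.0 -> faster growth, falling unemployment
```

A negative slope captures the qualitative proportion discussed above: unemployment falls as GDP growth exceeds some threshold (here, 2% for the invented series).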

  6. Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER) project and a next-generation real-time volcano hazard assessment system

    Takarada, S.

    2012-12-01

    The first Workshop of Asia-Pacific Region Global Earthquake and Volcanic Eruption Risk Management (G-EVER1) was held in Tsukuba, Ibaraki Prefecture, Japan from February 23 to 24, 2012. The workshop focused on the formulation of strategies to reduce the risks of disasters worldwide caused by the occurrence of earthquakes, tsunamis, and volcanic eruptions. More than 150 participants attended the workshop. During the workshop, the G-EVER1 accord was approved by the participants. The Accord consists of 10 recommendations, such as enhancing collaboration, sharing resources, and making information about the risks of earthquakes and volcanic eruptions freely available and understandable. The G-EVER Hub website (http://g-ever.org) was established to promote the exchange of information and knowledge among the Asia-Pacific countries. Several G-EVER Working Groups and Task Forces were proposed. One of the working groups was tasked with developing the next-generation real-time volcano hazard assessment system. The next-generation volcano hazard assessment system is useful for volcanic eruption prediction, risk assessment, and evacuation at various eruption stages. The assessment system is planned to be developed based on volcanic eruption scenario datasets, a volcanic eruption database, and numerical simulations. Defining volcanic eruption scenarios based on precursor phenomena leading up to major eruptions of active volcanoes is quite important for the future prediction of volcanic eruptions. Compiling volcanic eruption scenarios after a major eruption is also important. A high quality volcanic eruption database, which contains compilations of eruption dates, volumes, and styles, is important for the next-generation volcano hazard assessment system. The volcanic eruption database is developed based on past eruption results, which only represent a subset of possible future scenarios. 
Hence, different distributions from the previous deposits are mainly observed due to the differences in

  7. Connecting slow earthquakes to huge earthquakes

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  8. Ionospheric phenomena before strong earthquakes

    A. S. Silina

    2001-01-01

    Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.
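The proportionality invoked above is the standard plasma-frequency relation, f ≈ 8.98·sqrt(Ne) with f in Hz and Ne in m^-3, so a measured critical frequency converts directly to a peak electron density. A small sketch:

```python
# Standard plasma-frequency relation behind "foF2 is proportional to the
# square root of the ionization density":
#   f_p [Hz] ~= 8.98 * sqrt(Ne [m^-3])  =>  Ne = (f / 8.98)^2

def electron_density(fo_hz):
    """Peak electron density (m^-3) from a critical frequency in Hz."""
    return (fo_hz / 8.98) ** 2

# Illustrative: a nighttime foF2 of 5 MHz.
ne = electron_density(5.0e6)
print(f"{ne:.2e}")  # ~3.1e11 electrons per cubic metre
```

A pre-earthquake drop in foF2, as reported above, thus corresponds to a quadratic drop in peak F2-layer electron density.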

  9. Twitter earthquake detection: Earthquake monitoring in a social world

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
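The short-term-average/long-term-average trigger applied to the tweet-frequency series can be sketched directly. Window lengths, the threshold, and the per-minute counts below are illustrative, not the USGS-tuned values:

```python
# Sketch of a short-term-average / long-term-average (STA/LTA) trigger on
# a per-minute count of tweets containing the word "earthquake". Window
# lengths and threshold are illustrative choices.

def sta_lta_triggers(series, n_sta=2, n_lta=10, threshold=4.0):
    """Indices where the STA/LTA ratio meets or exceeds the threshold."""
    triggers = []
    for i in range(n_lta, len(series)):
        sta = sum(series[i - n_sta:i]) / n_sta
        lta = sum(series[i - n_lta:i]) / n_lta
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Quiet background chatter, then a burst of felt-event tweets at index 11.
tweets_per_minute = [1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 1, 80, 120, 60, 20, 5]
print(sta_lta_triggers(tweets_per_minute))  # [12, 13] -> flags the burst
```

The short window reacts to the burst while the long window tracks the background rate, so sudden spikes trigger within a minute or two of onset, mirroring the fast detections reported above.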

  10. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatran earthquake, the 2010 Maule earthquake in Chile, and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. This suggests that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
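The b-value tracked in this study is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). A minimal sketch with synthetic magnitudes; the magnitude-binning correction is omitted for brevity:

```python
import math

# Aki's maximum-likelihood estimate of the Gutenberg-Richter b-value:
#   b = log10(e) / (mean(M) - Mc)
# where Mc is the completeness magnitude of the catalogue. The binning
# correction (Mc - dM/2) is omitted here for brevity.

def b_value(magnitudes, mc):
    above = [m for m in magnitudes if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# Synthetic catalogue: mean magnitude ~0.434 units above Mc gives b ~= 1,
# the canonical value; a mean further above Mc (relatively more large
# events) would lower b, as the tidal-stress result above describes.
mags = [4.0, 4.2, 4.4, 4.6, 4.97]
print(round(b_value(mags, 4.0), 2))
```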

  11. The mechanism of earthquake

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    strength of crust rocks: Gravitational pressure can initiate the elasticity-plasticity transition in crustal rocks. By calculating the depth dependence of the elasticity-plasticity transition and analyzing the actual conditions, the behavior of crustal rocks can be categorized into three typical zones: elastic, partially plastic, and fully plastic. As the plastic proportion reaches about 10% in the partially plastic zone, plastic interconnection may occur and the variation of shear strength in rocks is mainly characterized by plastic behavior. The equivalent coefficient of friction for plastic slip is smaller than that for brittle fracture by an order of magnitude or more; thus the shear strength of rocks under plastic sliding is much less than that under brittle breaking. Moreover, with increasing depth a number of other factors can further reduce the shear yield strength of rocks. On the other hand, since an earthquake is large-scale damage, rock breaking must occur along the weakest path. Therefore, the actual fracture strength of rocks in a shallow earthquake is assuredly lower than the average shear strength of rocks as generally observed. The typical depth distributions of the average strength and the actual fracture strength of crustal rocks are schematically illustrated. (3) The conditions and mechanisms of earthquake occurrence: An earthquake leads to volume expansion, and volume expansion must break through the obstacles. The condition for an earthquake to occur is that the tectonic force exceeds the sum of the fracture strength of the rock, the friction force at the fault boundary, and the resistance from obstacles. The shallow earthquake is therefore characterized by plastic sliding of rocks that break through the obstacles. Accordingly, four possible patterns for shallow earthquakes are put forward. Deep-focus earthquakes are believed to result from a wide-range rock flow that breaks the jam.
Both shallow

  12. Earthquake Facts

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  13. Understanding Earthquakes

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  14. Globalization

    Tulio Rosembuj

    2006-12-01

    Full Text Available There is no single globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles; the subjects who perform it differ, as do its objectives. The global is an invisible invasion of materials and immediate effects.

  15. Globalization

    Tulio Rosembuj

    2006-01-01

    There is no single globalization, nor is it the result of an individual agent. We could start by saying that global action has different angles; the subjects who perform it differ, as do its objectives. The global is an invisible invasion of materials and immediate effects.

  16. Twitter earthquake detection: earthquake monitoring in a social world

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
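
    The detection scheme described above (a short-term-average / long-term-average trigger on a tweet-frequency time series) can be sketched as follows. This is a hedged illustration, not the USGS implementation: the window lengths, threshold, and per-minute binning are invented for the example.

```python
# STA/LTA trigger on a per-minute count of tweets containing "earthquake".
def sta_lta_triggers(counts, sta_len=2, lta_len=20, threshold=5.0):
    """Return indices where the STA/LTA ratio first exceeds the threshold."""
    triggers = []
    armed = True  # report only the onset of each excursion above threshold
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len   # short-term average
        lta = sum(counts[i - lta_len:i]) / lta_len   # long-term average
        ratio = sta / lta if lta > 0 else 0.0
        if ratio >= threshold and armed:
            triggers.append(i)
            armed = False
        elif ratio < threshold:
            armed = True
    return triggers

# A quiet background of ~1 "earthquake" tweet per minute with a sudden burst:
series = [1] * 30 + [40, 80, 60] + [5] * 10
print(sta_lta_triggers(series))  # → [31], the minute after the burst begins
```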

  17. Globalization

    Andrușcă Maria Carmen

    2013-01-01

    Globalization has highlighted an interdependence among nations, fostered by their daily interaction, the promotion of peace, and efforts to streamline and improve the effectiveness of the global economy. For globalization to function, the developing countries must be involved and can be helped by the developed ones. The international community can contribute to the institution of the development environment of the gl...

  18. Earthquake data base for Romania

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete, up-to-date earthquake information that is user-friendly and rapidly accessible. One main component of the database is the catalog of earthquakes that have occurred in Romania from 984 to the present. The catalog contains information on locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Parameters characterizing strong-motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and recorded in this database. Information on the recording seismic stations is also included: maps of their positions, photographs of the instruments, and site conditions (free-field or on buildings). Through the volume and quality of the gathered data, as well as its friendly user interface, the Romanian earthquake database provides a very useful tool for the geosciences and for civil engineering in the effort to reduce seismic risk in Romania. (authors)

  19. Connecting slow earthquakes to huge earthquakes.

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  20. Earthquake Early Warning Systems

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  1. Proportioning of light weight concrete

    Palmus, Lars

    1996-01-01

    Development of a method to determine the proportions of the raw materials in lightweight concrete made with light expanded clay aggregate. The method is based on composite theory.

  2. A Decade of Giant Earthquakes - What does it mean?

    Wallace, Terry C. Jr. [Los Alamos National Laboratory

    2012-07-16

    On December 26, 2004 the largest earthquake since 1964 occurred near Aceh, Indonesia. The magnitude 9.2 earthquake and subsequent tsunami killed a quarter of a million people; it also marked the beginning of a period of extraordinary seismicity. Since the Aceh earthquake there have been 16 magnitude-8 earthquakes globally, including two this last April. For the 100 years prior to 2004 there was an average of one magnitude-8 earthquake every 2.2 years; since 2004 there have been two per year. Since magnitude-8 earthquakes dominate global seismic energy release, this period of seismicity has seismologists rethinking what they understand about plate tectonics and the connectivity between giant earthquakes. This talk will explore this remarkable period of time and its possible implications.

  3. Proportional Symbol Mapping in R

    Susumu Tanimura

    2006-01-01

    Full Text Available Visualization of spatial data on a map aids not only in data exploration but also in communication, to impart spatial concepts or ideas to others. Although recent cartographic functions in R are rapidly becoming richer, proportional symbol mapping, which is one of the common mapping approaches, has not been packaged thus far. Based on the theories of proportional symbol mapping developed in cartography, the authors developed some functions for proportional symbol mapping using R, including mathematical and perceptual scaling. An example of these functions demonstrates the new expressive power and options available in R, particularly for the visualization of conceptual point data.
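
    The two scalings named in the abstract can be illustrated directly. The paper's functions are in R; the sketch below is a language-neutral restatement in Python using Flannery's conventional perceptual exponent of 0.57 (the exact constants and function names used by the authors are not given here).

```python
import math

def radius_mathematical(value, ref_value, ref_radius):
    """Area-true scaling: radius grows with the square root of the data value."""
    return ref_radius * math.sqrt(value / ref_value)

def radius_perceptual(value, ref_value, ref_radius, exponent=0.57):
    """Flannery compensation: viewers underestimate symbol area, so radii are
    exaggerated with an exponent slightly above the mathematical 0.5."""
    return ref_radius * (value / ref_value) ** exponent

# A value 4x the reference doubles the mathematically scaled radius, while the
# perceptually scaled radius grows a little more to look twice as large.
print(radius_mathematical(400, 100, 10))           # → 20.0
print(round(radius_perceptual(400, 100, 10), 2))   # → 22.04
```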

  4. Optical fusions and proportional syntheses

    Albert-Vanel, Michel

    2002-06-01

    A tragic error is being made in the literature concerning matters of color when dealing with optical fusions. They are still considered to be of additive nature, whereas experience shows us somewhat different results. The goal of this presentation is to show that fusions are, in fact, of 'proportional' nature, tending to be additive or subtractive depending on each individual case. Pointillist paintings done in the manner of Seurat, or the spinning-disc experiment, can highlight this intermediate sector of the proportional. So let us try to examine more closely what in fact occurs, by reviewing additive, subtractive and proportional syntheses.

  5. Proportional counter end effects eliminator

    Meekins, J.F.

    1976-01-01

    An improved gas-filled proportional counter which includes a resistor network connected between the anode and cathode at the ends of the counter in order to eliminate 'end effects' is described. 3 Claims, 2 Drawing Figures

  6. Electronics for proportional drift tubes

    Fremont, G.; Friend, B.; Mess, K.H.; Schmidt-Parzefall, W.; Tarle, J.C.; Verweij, H.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration); Geske, K.; Riege, H.; Schuett, J.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration); Semenov, Y.; CERN-Hamburg-Amsterdam-Rome-Moscow Collaboration)

    1980-01-01

    An electronic system for the read-out of a large number of proportional drift tubes (16,000) has been designed. This system measures deposited charge and drift-time of the charge of a particle traversing a proportional drift tube. A second event can be accepted during the read-out of the system. Up to 40 typical events can be collected and buffered before a data transfer to a computer is necessary. (orig.)

  7. Globalization

    Plum, Maja

    Globalization is often referred to as external to education - a state of affairs facing the modern curriculum with numerous challenges. In this paper it is examined as internal to curriculum; analysed as a problematization in a Foucaultian sense. That is, as a complex of attentions, worries, ways of reasoning, producing curricular variables. The analysis is made through an example of early childhood curriculum in Danish Pre-school, and the way the curricular variable of the pre-school child comes into being through globalization as a problematization, carried forth by the comparative practices of PISA...

  8. Globalization

    F. Gerard Adams

    2008-01-01

    The rapid globalization of the world economy is causing fundamental changes in patterns of trade and finance. Some economists have argued that globalization has arrived and that the world is “flat”. While the geographic scope of markets has increased, the author argues that new patterns of trade and finance are a result of the discrepancies between “old” countries and “new”. As the differences are gradually wiped out, particularly if knowledge and technology spread worldwide, the t...

  9. Analogical proportions: another logical view

    Prade, Henri; Richard, Gilles

    This paper investigates the logical formalization of a restricted form of analogical reasoning based on analogical proportions, i.e. statements of the form a is to b as c is to d. Starting from a naive set-theoretic interpretation, we highlight the existence of two noticeable companion proportions: one states that a is to b the converse of what c is to d (reverse analogy), while the other, called paralogical proportion, expresses that what a and b have in common, c and d have also. We identify the characteristic postulates of the three types of proportions and examine their consequences from an abstract viewpoint. We further study the properties of the set-theoretic interpretation and of the Boolean logic interpretation, and we shed further light on the role of permutations in the modeling of the three types of proportions. Finally, we address the use of these proportions as a basis for inference in a propositional setting, and relate it to more general schemes of analogical reasoning. The differences between analogy, reverse analogy, and paralogy are further emphasized in a three-valued setting, which is also briefly presented.
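
    The three Boolean proportions described above can be checked by truth-table enumeration. The encodings below follow the standard logical definitions of analogical proportion and its companions; the function names are mine, and this is a sketch rather than the paper's own formalization.

```python
from itertools import product

def analogy(a, b, c, d):
    """a : b :: c : d  --  a differs from b exactly as c differs from d."""
    return (a and not b) == (c and not d) and (not a and b) == (not c and d)

def reverse_analogy(a, b, c, d):
    """a is to b the converse of what c is to d."""
    return analogy(a, b, d, c)

def paralogy(a, b, c, d):
    """What a and b have in common (positively or negatively), c and d have also."""
    return (a and b) == (c and d) and (not a and not b) == (not c and not d)

# Each proportion is satisfied by exactly 6 of the 16 Boolean 4-tuples:
for rel in (analogy, reverse_analogy, paralogy):
    valid = [t for t in product([False, True], repeat=4) if rel(*t)]
    print(rel.__name__, len(valid))  # each line reports a count of 6
```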

  10. Napa earthquake: An earthquake in a highly connected world

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate to study what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake occurrence to the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question to evaluate is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare with the publication time of messages on Twitter. Finally, we check whether the number of tweets and the number of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking as observed through Twitter and the EMSC website is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  11. Seismicity map tools for earthquake studies

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based online platform (developed with PHP, Javascript, MySQL) and its new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data using Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.

  12. Development of multiwire proportional chambers

    Charpak, G

    1969-01-01

    It has happened quite often in the history of science that theoreticians, confronted with some major difficulty, have successfully gone back thirty years to look at ideas that had then been thrown overboard. But it is rare that experimentalists go back thirty years to look again at equipment which had become out-dated. This is what Charpak and his colleagues did to emerge with the 'multiwire proportional chamber', which has several new features making it a very useful addition to the armoury of particle detectors. In the 1930s, ion chambers, Geiger-Müller counters and proportional counters were vital pieces of equipment in nuclear physics research. Other types of detectors have since largely replaced them, but now the proportional counter, in a new array, is making a comeback.

  13. Bayesian inference on proportional elections.

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seat distribution. More specifically, a methodology was developed to answer the probability that a given party will have representation in the chamber of deputies. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate is also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
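
    The core idea (Monte Carlo simulation of uncertain vote shares feeding a seat-allocation rule) can be sketched as follows. The original work uses R and the actual Brazilian seat-distribution law; this Python sketch substitutes a plain D'Hondt largest-averages rule and invented vote shares, so it illustrates the method, not the paper's results.

```python
import random

def dhondt(votes, seats):
    """Allocate seats by repeatedly awarding the largest votes/(seats_won+1) quotient."""
    won = [0] * len(votes)
    for _ in range(seats):
        quotients = [v / (w + 1) for v, w in zip(votes, won)]
        won[quotients.index(max(quotients))] += 1
    return won

def prob_any_seat(shares, seats, party, n_sim=2000, noise=0.02, seed=42):
    """Estimate P(party wins at least one seat) under Gaussian polling noise."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        sampled = [max(1e-9, s + rng.gauss(0, noise)) for s in shares]
        if dhondt(sampled, seats)[party] > 0:
            hits += 1
    return hits / n_sim

print(dhondt([450, 300, 150, 100], 10))  # deterministic allocation → [5, 3, 1, 1]
shares = [0.45, 0.30, 0.15, 0.10]        # hypothetical polled vote shares
print(prob_any_seat(shares, 10, party=3))
```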

  14. How fault geometry controls earthquake magnitude

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.

  15. Saving Money Using Proportional Reasoning

    de la Cruz, Jessica A.; Garney, Sandra

    2016-01-01

    It is beneficial for students to discover intuitive strategies, as opposed to the teacher presenting strategies to them. Certain proportional reasoning tasks are more likely to elicit intuitive strategies than other tasks. The strategies that students are apt to use when approaching a task, as well as the likelihood of a student's success or…

  16. Earthquake location in island arcs

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high

  17. Fault geometry and earthquake mechanics

    D. J. Andrews

    1994-06-01

    Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to the confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the

  18. Magnitudes and frequencies of earthquakes in relation to seismic risk

    Sharma, R.D.

    1989-01-01

    Estimating the frequencies of occurrence of earthquakes of different magnitudes on a regional basis is an important task in estimating seismic risk at a construction site. Analysis of global earthquake data provides an insight into the magnitude-frequency relationship in a statistical manner. It turns out that, whereas a linear relationship between the logarithm of earthquake occurrence rates and the corresponding earthquake magnitudes fits well in the magnitude range between 5 and 7, a second-degree polynomial in M, the earthquake magnitude, provides a better description of the frequencies of earthquakes over a much wider range of magnitudes. It may be possible to adapt the magnitude-frequency relation to regions for which adequate earthquake data are not available, in order to carry out seismic risk calculations. (author). 32 refs., 8 tabs., 7 figs
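
    The linear part of the magnitude-frequency relationship discussed above is the Gutenberg-Richter law, log10 N = a - bM, and can be fit by least squares; the synthetic rates below are generated from a = 4, b = 1, so the fit recovers those values exactly. A second-degree polynomial in M, as the abstract recommends for wider magnitude ranges, would add an M² term in the same way.

```python
import math

def fit_gutenberg_richter(mags, rates):
    """Return (a, b) minimizing squared error of log10(rate) = a - b*mag."""
    ys = [math.log10(r) for r in rates]
    n = len(mags)
    mean_m = sum(mags) / n
    mean_y = sum(ys) / n
    cov = sum((m - mean_m) * (y - mean_y) for m, y in zip(mags, ys))
    var = sum((m - mean_m) ** 2 for m in mags)
    slope = cov / var              # fitted slope equals -b
    a = mean_y - slope * mean_m
    return a, -slope

mags = [5.0, 5.5, 6.0, 6.5, 7.0]
rates = [10 ** (4.0 - 1.0 * m) for m in mags]   # exact a=4, b=1 synthetic data
a, b = fit_gutenberg_richter(mags, rates)
print(round(a, 2), round(b, 2))  # → 4.0 1.0
```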

  19. Disease proportions attributable to environment

    Vineis Paolo

    2007-11-01

    Full Text Available Population disease proportions attributable to various causal agents are popular because they present a simplified view of the contribution of each agent to the disease load. However, they are only summary figures that may easily be misinterpreted or over-interpreted, even when the causal link between an exposure and an effect is well established. This commentary discusses several issues surrounding the estimation of attributable proportions, particularly with reference to environmental causes of cancers, and critically examines two recently published papers. These issues encompass potential biases as well as the very definition of environment and of an environmental agent. The latter aspect is not just a semantic question but carries implications for the focus of preventive actions, whether centred on the material and social environment or on single individuals.
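
    For reference, the population attributable fraction behind such "disease proportions" is commonly computed with Levin's formula; the prevalence and relative-risk numbers below are illustrative, not taken from the commentary.

```python
# Levin's formula: PAF = p*(RR - 1) / (1 + p*(RR - 1)), where p is the
# prevalence of exposure in the population and RR the relative risk.
def attributable_fraction(prevalence, relative_risk):
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# 20% of a population exposed to an agent that doubles disease risk:
print(round(attributable_fraction(0.20, 2.0), 3))  # → 0.167
```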

  20. PEP quark search proportional chambers

    Parker, S I; Harris, F; Karliner, I; Yount, D [Hawaii Univ., Honolulu (USA); Ely, R; Hamilton, R; Pun, T [California Univ., Berkeley (USA). Lawrence Berkeley Lab.; Guryn, W; Miller, D; Fries, R [Northwestern Univ., Evanston, IL (USA)

    1981-04-01

    Proportional chambers are used in the PEP Free Quark Search to identify and remove possible background sources such as particles traversing the edges of counters, to permit geometric corrections to the dE/dx and TOF information from the scintillator and Cerenkov counters, and to look for possible high-cross-section quarks. The present beam pipe has a thickness of 0.007 interaction lengths (λi) and is followed in both arms, each covering 45° ≤ θ ≤ 135°, Δφ = 90°, by 5 proportional chambers, each 0.0008 λi thick with 32 channels of pulse-height readout, and by 3 thin scintillator planes, each 0.003 λi thick. Following this thin front end, each arm of the detector has 8 layers of scintillator (one with scintillating light pipes) interspersed with 4 proportional chambers and a layer of lucite Cerenkov counters. Both the calculated ion statistics and measurements using He-CH4 gas in a test chamber indicate that the chamber efficiencies should be >98% for q = 1/3. The Landau spread measured in the test was equal to that observed for normal q = 1 traversals. One scintillator plane and thin chamber in each arm will have an extra set of ADCs with a wide gate bracketing the normal one, so timing errors and tails of earlier pulses should not produce fake quarks.

  1. Solar eruptions - soil radon - earthquakes

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time a new natural phenomenon was established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes. Most researchers consider this a phenomenon of exclusively terrestrial processes. Investigations of the link of earthquakes to solar activity carried out during the last decade in different countries are based on the analysis of statistical data ΣΕ (t) and W (t). As established, the overall seismicity of the Earth and its separate regions depends on an 11-year cycle of solar activity. Data provided in the paper, based on experimental studies, serve as a first step towards revealing cause-and-effect solar-terrestrial bonds in the series 'solar eruption - lithosphere radon - earthquakes'. Further collection of experimental data is needed. For the first time, through the radon constituent of terrestrial radiation, an objectification has been made of the elementary lattice of the Hartmann network contoured by the biolocation method. As found, radon concentration variations in the Hartmann network nodes determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial bonds, earthquakes are attributed to rapidly running destructive processes that occur most intensely at the junctures of tectonic massifs, along transform and deep faults. The basic factors provoking the earthquakes are both magnetic-structural effects and a long-term (over 5 months) bombardment of the surface of the lithosphere by highly energetic particles of corpuscular solar flows, as confirmed by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon was established, which is an earthquake indicator on the territory of Yerevan City. A month and a half later, earthquakes occurred in San Francisco, Iran, Turkey

  2. Earthquakes: hydrogeochemical precursors

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  3. Ground water and earthquakes

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  4. Smartphone MEMS accelerometers and earthquake early warning

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    Low-cost MEMS accelerometers in smartphones are attracting growing attention from the science community due to their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. Shake-table tests show that these accelerometers are also suitable for recording the large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake motion from daily human activity in the recordings made by the accelerometers in personal smartphones and upload trigger information/waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications such as earthquake early warning. In this talk I will lay out the method we used to recognize earthquake-like movement from a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
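For context, a classical trigger on an accelerometer stream can be sketched with an STA/LTA ratio. This is not MyShake's classifier (the abstract does not give its algorithm); it only illustrates, under illustrative window lengths and threshold, how a sudden rise in short-term signal energy can be flagged:

```python
# Generic STA/LTA (short-term average / long-term average) trigger sketch.
# NOT the MyShake method; windows and threshold below are illustrative.

def sta_lta_trigger(x, sta_n=10, lta_n=100, threshold=4.0):
    """Return the first sample index where STA/LTA exceeds threshold, else None."""
    for i in range(lta_n, len(x)):
        sta = sum(abs(v) for v in x[i - sta_n:i]) / sta_n   # recent energy
        lta = sum(abs(v) for v in x[i - lta_n:i]) / lta_n   # background energy
        if lta > 0 and sta / lta > threshold:
            return i
    return None

# Quiet background followed by strong shaking starting at sample 200:
signal = [0.01] * 200 + [0.5] * 50
idx = sta_lta_trigger(signal)   # triggers shortly after sample 200
```

The short window reacts quickly to the onset while the long window still reflects the quiet background, so the ratio spikes at the arrival.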

  5. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economic impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of megacities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructure, undertaken with insufficient knowledge of the regional seismicity and seismic hazard. The assessment of seismic hazard and the generation of earthquake scenarios are the first link in the prevention chain and the first step in the evaluation of seismic risk. Earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  6. Ionospheric earthquake precursors

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies on ionospheric earthquake precursors, the development of programs on processes in the earthquake focus, and physical mechanisms of the formation of various types of precursors are considered. The composition of an experimental space system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  7. Children's Ideas about Earthquakes

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  8. Incisors’ proportions in smile esthetics

    Alsulaimani, Fahad F; Batwa, Waeil

    2013-01-01

    Aims: To determine whether alteration of the maxillary central and lateral incisors’ length and width, respectively, would affect perceived smile esthetics, and to validate the most esthetic length and width for the central and lateral incisors. Materials and Methods: Photographic manipulation was undertaken to produce two sets of four photographs each, one set showing altered lateral incisor width and the other altered central incisor length. The eight resulting photographs were assessed by laypeople, dentists and orthodontists. Results: Alteration in the incisors’ proportions affected relative smile attractiveness for laypeople (n=124), dentists (n=115) and orthodontists (n=68); dentists and orthodontists did not accept lateral width reduction of more than 0.5 mm (P<0.01), which suggests that the lateral-to-central incisor width ratio ranges from 54% to 62%. However, laypeople did not accept lateral width reduction of more than 1 mm (P<0.01), widening the range to 48% to 62%. All groups had zero tolerance for changes in central crown length (P<0.01). Conclusion: All participants recognized changes in central incisor length. For lateral incisors, laypeople were more tolerant than dentists and orthodontists. This suggests that changing incisors’ proportions affects relative smile attractiveness. PMID:24987650

  9. Latin American contributions to the GEM’s Earthquake Consequences Database

    Cardona Arboleda, Omar Dario; Ordaz Schroeder, Mario Gustavo; Salgado Gálvez, Mario Andrés; Carreño Tibaduiza, Martha Liliana; Barbat Barbat, Horia Alejandro

    2016-01-01

    One of the projects of the Global Earthquake Model (GEM) was to develop a global earthquake consequences database (GEMECD), which serves both as an open, public repository of damage and losses to different types of elements at the global level and as a benchmark for the development of vulnerability models that can capture specific characteristics of the affected countries. The online earthquake consequences database has information on 71 events, of which 14 correspond to events that occ...

  10. Collaboratory for the Study of Earthquake Predictability

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  11. Constant Proportion Debt Obligations (CPDOs)

    Cont, Rama; Jessen, Cathrine

    2012-01-01

    Constant Proportion Debt Obligations (CPDOs) are structured credit derivatives that generate high coupon payments by dynamically leveraging a position in an underlying portfolio of investment-grade index default swaps. CPDO coupons and principal notes received high initial credit ratings from the major rating agencies, based on complex models for the joint transition of ratings and spreads for all names in the underlying portfolio. We propose a parsimonious model for analysing the performance of CPDO strategies using a top-down approach that captures the essential risk factors of the CPDO. Our … be made arbitrarily small—and thus the credit rating arbitrarily high—by increasing leverage, but the ratings obtained strongly depend on assumptions on the credit environment (high spread or low spread). More importantly, CPDO loss distributions are found to exhibit a wide range of tail risk measures…

  12. Estimating Source Duration for Moderate and Large Earthquakes in Taiwan

    Chang, Wen-Yen; Hwang, Ruey-Der; Ho, Chien-Yin; Lin, Tzu-Wei

    2017-04-01

    To construct a relationship between seismic moment (M0) and source duration (t) is important for seismic hazard in Taiwan, where earthquakes are quite active. In this study, we used a proposed inversion process using teleseismic P-waves to derive the M0-t relationship in the Taiwan region for the first time. Fifteen earthquakes with MW 5.5-7.1 and focal depths of less than 40 km were adopted. The inversion process could simultaneously determine source duration, focal depth, and pseudo radiation patterns of the direct P-wave and two depth phases, by which M0 and fault plane solutions were estimated. Results showed that the estimated t, ranging from 2.7 to 24.9 sec, varied with the one-third power of M0. That is, M0 is proportional to t**3, and the relationship between them is M0 = 0.76*10**23*(t)**3, where M0 is in dyne-cm and t in seconds. The M0-t relationship derived from this study is very close to those determined from global moderate to large earthquakes. To further check the validity of the derived relationship, we used it to infer the source duration of the 1999 Chi-Chi (Taiwan) earthquake with M0 = 2-5*10**27 dyne-cm (corresponding to Mw = 7.5-7.7) to be approximately 29-40 sec, in agreement with many previous studies of its source duration (28-42 sec).
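Inverting the reported moment-duration relation gives a quick duration estimate from seismic moment. A minimal sketch (the constant 0.76×10**23 is taken from the abstract above; the code itself is illustrative, not the authors'):

```python
# Invert the reported relation M0 = 0.76e23 * t**3
# (M0 in dyne-cm, t in seconds) to estimate source duration from moment.

def source_duration(m0_dyne_cm: float) -> float:
    """Source duration t (s) implied by M0 = 0.76e23 * t**3."""
    return (m0_dyne_cm / 0.76e23) ** (1.0 / 3.0)

# 1999 Chi-Chi earthquake, M0 ~ 2-5 x 10**27 dyne-cm (from the abstract):
t_low = source_duration(2e27)   # ~29.7 s
t_high = source_duration(5e27)  # ~40.4 s
```

For the Chi-Chi moment range quoted above this reproduces the ~29-40 s duration reported.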

  13. Earthquake forecasting and warning

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  14. Renormalization group theory of earthquakes

    H. Saleur

    1996-01-01

    We study theoretically the physical origin of the proposed discrete scale invariance of earthquake processes, at the origin of the universal log-periodic corrections to scaling recently discovered in regional seismic activity (Sornette and Sammis, 1995). The discrete scaling symmetries which may be present at smaller scales are shown to be robust on a global scale with respect to disorder. Furthermore, a single complex exponent is sufficient in practice to capture the essential properties of the leading correction to scaling, whose real part may be renormalized by disorder and thus be specific to the system. We then propose a new mechanism for discrete scale invariance, based on the interplay between dynamics and disorder. The existence of non-linear corrections to the renormalization group flow implies that an earthquake is not an isolated 'critical point', but is accompanied by an embedded set of 'critical points', its foreshocks and any subsequent shocks for which it may be a foreshock.

  15. Position-sensitive proportional counter

    Kopp, M.K.

    1980-01-01

    A position-sensitive proportional counter circuit uses a conventional (low-resistance, metal-wire anode) counter for spatial resolution of an ionizing event along the anode, which functions as an RC line. A pair of preamplifiers at the anode ends act as stabilized active-capacitance loads, each comprising a series-feedback, low-noise amplifier and a unity-gain, shunt-feedback amplifier whose output is connected through a feedback capacitor to the series-feedback amplifier input. The stabilized capacitance loading of the anode allows distributed RC-line position encoding and subsequent time-difference decoding by sensing the difference in rise times of pulses at the anode ends, where the difference is primarily in response to the distributed capacitance along the anode. This allows the use of lower-resistance wire anodes for spatial radiation detection, which simplifies the construction and handling of the anodes and stabilizes the anode resistivity at high count rates (>10^6 counts/sec). (author)

  16. Earthquake cycle deformation and the Moho: Implications for the rheology of continental lithosphere

    Wright, TJ; Elliott, JR; Wang, H; Ryder, I

    2013-01-01

    The last 20 years have seen a dramatic improvement in the quantity and quality of geodetic measurements of the earthquake loading cycle. In this paper we compile and review these observations and test whether crustal thickness exerts any control. We found 78 earthquake source mechanisms for continental earthquakes derived from satellite geodesy, 187 estimates of interseismic "locking depth", and 23 earthquakes (or sequences) for which postseismic deformation has been observed. Globally we est...

  17. Retrospective Evaluation of the Five-Year and Ten-Year CSEP-Italy Earthquake Forecasts

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-01-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten...

  18. Superconducting Gravimeters Detect Gravity Fluctuations Induced by Mw 5.7 Earthquake Along South Pacific Rise Few Hours Before the 2011 Mw 9.0 Tohoku-Oki Earthquake

    Keliang Zhang; Jin Ma

    2014-01-01

    Gravity changes sometimes appear before a big earthquake. Determining the possible sources is important for understanding the mechanism and for further geodynamic studies. During the first two hours of March 11, before the Mw 9.0 Tohoku-Oki earthquake, the non-tidal gravity time series of superconducting gravimeters worldwide showed low-frequency (<0.10 Hz) fluctuations with amplitudes of ~1 to 4 × 10^-8 m s^-2 lasting ~10-20 minutes. By comparing global seismicity with the arrival times of seismic waves, we find that the fluctuations were induced by the Mw 5.7 earthquake that occurred at 0:14:54.68 at (53.27°S, 118.18°W) along the eastern South Pacific Rise. Several body waves such as P and S are clearly recorded at the station ~400 km from the hypocenter. The fluctuations are a response to waves that propagate with a velocity of about 4 km s^-1. Their amplitudes are proportional to the inverse of the epicentral distance, even though the fluctuations at European sites were overlapped by waves associated with a smaller (Mw 2.6) event in Europe during this period. That is, the Mw 5.7 earthquake induced remarkable gravity fluctuations over long distances at stations all over the world. As such, foreshocks with larger magnitudes that occurred before the Mw 9.0 earthquake would have a more significant influence on the gravity recordings, and the seismic-wave-induced component should be removed during the analysis of anomalies prior to a great earthquake in future studies.

  19. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge, the others being the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity; the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset dates of significant earthquakes, the assumption being that each occurred earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes cluster on planetary trigger dates (conjunct Sun, Moon opposite Sun, Moon conjunct or opposite North or South Nodes). In order to test improvements to the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We developed the FDL numbers for each of those seeds and examined the earthquake hit rates (for a window of 3, i.e. ±1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is on the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present a prediction method capable of predicting global events or localized earthquakes, and we discuss the accuracy of the method as far as the prediction and location parts are concerned. We show example calendar-style predictions for global events as well as for the Greek region using

  20. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture-length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning-parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents.
    Figure: Record-breaking hazard map of southern California, 2012-08-06. "Warm" colors indicate local acceleration (elevated hazard
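The Gutenberg-Richter scaling invoked above can be made concrete with a short sketch. The a and b values below are illustrative placeholders, not fitted parameters from the study:

```python
# Gutenberg-Richter (GR) scaling: the rate of events at or above
# magnitude m falls off as N(>=m) = 10**(a - b*m).
# a (productivity) and b (~1 globally) here are illustrative.

def gr_rate(m: float, a: float = 5.0, b: float = 1.0) -> float:
    """Expected count of events with magnitude >= m."""
    return 10.0 ** (a - b * m)

# With b = 1, each unit increase in magnitude cuts the rate tenfold:
ratio = gr_rate(6.0) / gr_rate(5.0)   # -> 0.1

# Since scalar moment scales as M0 ~ 10**(1.5*m + const), the rate
# expressed in moment falls off as a power law ~ M0**(-b/1.5),
# the "1/f"-like self-similarity the abstract refers to.
```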

  1. Responses to the 2011 Earthquake on Facebook

    Hansen, Annette Skovsted

    In my investigation of how Japanese ODA policies and practices have engendered global networks, I have frequented the Association of Overseas Technical Scholarships (AOTS)' Facebook group. In the wake of the earthquake on March 11, 2011, many greetings came in from alumni who have within the last...

  2. Encyclopedia of earthquake engineering

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  3. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs very easy to access by both scholars and the public in general. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, provides access to tabular and cartographic data related to earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred, and on the descriptions being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we have made in upgrading and refining the data of the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented as a table and map. Data analysis allowed us to identify the following sources of error in the location of the epicenters in existing catalogs: • Incorrect coordinate entry • Erroneous or mistaken place names • Data too general to locate the epicenter, mainly for older earthquakes • Inconsistency between the earthquake and the reported tsunami: an earthquake epicenter located too far inland reported as tsunamigenic. The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  4. Major earthquakes occur regularly on an isolated plate boundary fault.

    Berryman, Kelvin R; Cochran, Ursula A; Clark, Kate J; Biasi, Glenn P; Langridge, Robert M; Villamor, Pilar

    2012-06-29

    The scarcity of long geological records of major earthquakes, on different types of faults, makes testing hypotheses of regular versus random or clustered earthquake recurrence behavior difficult. We provide a fault-proximal major earthquake record spanning 8000 years on the strike-slip Alpine Fault in New Zealand. Cyclic stratigraphy at Hokuri Creek suggests that the fault ruptured to the surface 24 times, and event ages yield a 0.33 coefficient of variation in recurrence interval. We associate this near-regular earthquake recurrence with a geometrically simple strike-slip fault, with high slip rate, accommodating a high proportion of plate boundary motion that works in isolation from other faults. We propose that it is valid to apply time-dependent earthquake recurrence models for seismic hazard estimation to similar faults worldwide.
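The coefficient of variation used above to quantify regularity is simply the standard deviation of the recurrence intervals divided by their mean. A minimal sketch with a synthetic interval list (not the Hokuri Creek data):

```python
import math

# Coefficient of variation (CV) of earthquake recurrence intervals.
# CV << 1 indicates near-periodic recurrence; CV ~ 1 is consistent
# with a Poisson (random) process. Intervals below are synthetic.

def cv(intervals):
    mean = sum(intervals) / len(intervals)
    var = sum((x - mean) ** 2 for x in intervals) / len(intervals)
    return math.sqrt(var) / mean

quasi_periodic = [300, 330, 290, 340, 310, 320]   # years between ruptures
regularity = cv(quasi_periodic)   # small CV -> near-regular recurrence
```

A CV near 0.33, as reported for the Alpine Fault, sits between these extremes: variable, but far more regular than random.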

  5. Earthquake at 40 feet

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  6. Earthquakes and economic growth

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it investigates how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer more in terms of economic growth than richer areas. In addition,...

  7. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is related to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS). It calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P-wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.

  8. A prospective earthquake forecast experiment in the western Pacific

    Eberhard, David A. J.; Zechar, J. Douglas; Wiemer, Stefan

    2012-09-01

    Since the beginning of 2009, the Collaboratory for the Study of Earthquake Predictability (CSEP) has been conducting an earthquake forecast experiment in the western Pacific. This experiment is an extension of the Kagan-Jackson experiments begun 15 years earlier and is a prototype for future global earthquake predictability experiments. At the beginning of each year, seismicity models make a spatially gridded forecast of the number of Mw≥ 5.8 earthquakes expected in the next year. For the three participating statistical models, we analyse the first two years of this experiment. We use likelihood-based metrics to evaluate the consistency of the forecasts with the observed target earthquakes and we apply measures based on Student's t-test and the Wilcoxon signed-rank test to compare the forecasts. Overall, a simple smoothed seismicity model (TripleS) performs the best, but there are some exceptions that indicate continued experiments are vital to fully understand the stability of these models, the robustness of model selection and, more generally, earthquake predictability in this region. We also estimate uncertainties in our results that are caused by uncertainties in earthquake location and seismic moment. Our uncertainty estimates are relatively small and suggest that the evaluation metrics are relatively robust. Finally, we consider the implications of our results for a global earthquake forecast experiment.
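The likelihood-based metrics mentioned above are typically built on a cell-wise Poisson model of the gridded forecast. A minimal sketch; the rates and counts below are made-up illustrations, not data from this experiment:

```python
import math

# CSEP-style joint log-likelihood for a gridded forecast: each cell k has
# a forecast rate lam[k] and an observed count n[k]; the score sums the
# Poisson log-pmf over cells. Higher (less negative) is better.

def poisson_loglik(lam, n):
    return sum(-l + k * math.log(l) - math.lgamma(k + 1)
               for l, k in zip(lam, n))

observed   = [2, 0, 1, 0]
forecast_a = [1.8, 0.2, 0.9, 0.1]   # rates close to what occurred
forecast_b = [0.2, 1.8, 0.1, 0.9]   # same total rate, wrong cells

better = poisson_loglik(forecast_a, observed) > poisson_loglik(forecast_b, observed)
# better -> True: the forecast concentrating rate where events occurred scores higher
```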

  9. OMG Earthquake! Can Twitter improve earthquake response?

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
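    The detection idea described above can be sketched as a simple rate threshold on per-minute keyword counts; the threshold and counts below are invented for illustration and are not the USGS algorithm.

```python
def detect_spike(counts_per_minute, threshold=10):
    """Return the index of the first minute whose count of 'earthquake'
    tweets reaches a fixed threshold -- a toy stand-in for rate-based
    detection (the threshold and windowing are illustrative assumptions)."""
    for i, count in enumerate(counts_per_minute):
        if count >= threshold:
            return i
    return None

# Background of <1 tweet/hour jumping to ~150/minute, as described for
# the Morgan Hill earthquake (counts invented for illustration).
print(detect_spike([0, 1, 0, 150, 90, 40]))
```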

  10. Earthquakes and Schools

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  11. Bam Earthquake in Iran

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  12. Tradable Earthquake Certificates

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  13. Historic Eastern Canadian earthquakes

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding of the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times; field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities; and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design

  14. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake, which has an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The Turkish Catastrophe Insurance Pool (TCIP) system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification of the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim-processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would be calculated directly on the basis of indexed ground-motion levels and damage.
    The immediate improvement of a parametric insurance model over the existing one would be the elimination of the claim processing

  15. Impossibility Theorem in Proportional Representation Problem

    Karpov, Alexander

    2010-01-01

    The study examines the general axiomatics of Balinski and Young and analyzes existing proportional representation methods using this approach. The second part of the paper provides a new axiomatics based on rational choice models. The new system of axioms is applied to study known proportional representation systems. It is shown that there is no proportional representation method satisfying a minimal set of the axioms (monotonicity and neutrality).

  16. Cognitive and Metacognitive Aspects of Proportional Reasoning

    Modestou, Modestina; Gagatsis, Athanasios

    2010-01-01

    In this study we attempt to propose a new model of proportional reasoning based on both bibliographical and research data. This is pursued with the help of three written tests involving analogical, proportional, and non-proportional situations, administered to pupils from grades 7 to 9. The results suggest the existence of a…

  17. Evaluating Middle Years Students' Proportional Reasoning

    Hilton, Annette; Dole, Shelley; Hilton, Geoff; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is a key aspect of numeracy that is not always developed naturally by students. Understanding the types of proportional reasoning that students apply to different problem types is a useful first step to identifying ways to support teachers and students to develop proportional reasoning in the classroom. This paper describes…

  18. Global teaching of global seismology

    Stein, S.; Wysession, M.

    2005-12-01

    Our recent textbook, Introduction to Seismology, Earthquakes, & Earth Structure (Blackwell, 2003), is used in many countries. Part of the reason for this may be our deliberate attempt to write the book for an international audience. This effort appears in several ways. We stress seismology's long tradition of global data interchange. Our brief discussions of the science's history illustrate the contributions of scientists around the world. Perhaps most importantly, our discussions of earthquakes, tectonics, and seismic hazards take a global view. Many examples are from North America; others are drawn from elsewhere. Our view is that non-North American students should be exposed to North American examples that are type examples, and that North American students should be similarly exposed to examples elsewhere. For example, we illustrate how the Euler vector geometry changes a plate boundary from spreading, to strike-slip, to convergence using both the Pacific-North America boundary from the Gulf of California to Alaska and the Eurasia-Africa boundary from the Azores to the Mediterranean. We illustrate diffuse plate boundary zones using western North America, the Andes, the Himalayas, the Mediterranean, and the East Africa Rift. The subduction zone discussions examine Japan, Tonga, and Chile. We discuss significant earthquakes both in the U.S. and elsewhere, and explore hazard mitigation issues in different contexts. Both comments from foreign colleagues and our experience lecturing overseas indicate that this approach works well. Beyond the specifics of our text, we believe that such a global approach is facilitated by the international traditions of the earth sciences and the world youth culture that gives students worldwide a common culture. For example, a video of the scene in New Madrid, Missouri, that arose from a nonsensical earthquake prediction in 1990 elicits similar responses from American and European students.

  19. Earthquakes, November-December 1977

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500.

  20. Earthquakes, September-October 1986

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  1. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the second most visited global earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, through online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. Initially, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it soon became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on an earthquake's effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves, and that eyewitnesses can therefore be considered ground motion sensors.
    Flashsourcing discriminates felt

  2. Proportioning of U3O8 powder

    Cermak, V.; Markvart, M.; Novy, P.; Vanka, M.

    1989-01-01

    Tests are briefly described of proportioning U3O8 powder with a granulometric grain size range of 0-160 μm using a vertical screw, a horizontal dual screw and a vibration dispenser, with a view to proportioning the very fine U3O8 powder fractions produced in the oxidation of UO2 fuel pellets. In the tests, the evenness of proportioning was assessed by the percentage value of the proportioning-rate spread measured at one-minute intervals at a proportioning rate of 1-3 kg/h. In feeding the U3O8 into a flame fluorator, it is advantageous to monitor the continuity of the powder column being proportioned and to assess it radiometrically by the value of the proportioning-rate spread at very short intervals (0.1 s). (author). 10 figs., 1 tab., 12 refs
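    The evenness measure described above (the percentage spread of the proportioning rate) reads like a coefficient of variation of the sampled rates; a minimal sketch under that assumption, since the exact formula used in the source may differ:

```python
import statistics

def proportioning_spread(rates):
    """Percentage spread of proportioning-rate samples taken at fixed
    intervals, computed here as the coefficient of variation
    (100 * standard deviation / mean).  This is an illustrative reading
    of the evenness measure, not necessarily the source's formula."""
    return 100.0 * statistics.stdev(rates) / statistics.mean(rates)

# Rates in kg/h sampled at one-minute intervals (invented values).
print(round(proportioning_spread([2.0, 2.1, 1.9, 2.0, 2.05, 1.95]), 1))
```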

  3. Earthquake hazard assessment and small earthquakes

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
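    The sensitivity to the lower-bound magnitude follows directly from the Gutenberg-Richter relation, since lowering the cutoff lets many more small events into the hazard integral. A one-line sketch, with illustrative a- and b-values rather than those of the LLNL or EPRI studies:

```python
def annual_rate_above(m_min, a=4.0, b=1.0):
    """Gutenberg-Richter annual rate of earthquakes with magnitude >=
    m_min: N = 10**(a - b*m_min).  The a- and b-values here are
    illustrative assumptions."""
    return 10.0 ** (a - b * m_min)

# Lowering the cutoff magnitude from 5.0 (EPRI) to 3.75 (LLNL) increases
# the number of events entering the hazard integral by a factor of
# 10**1.25, i.e. roughly 18, for b = 1.
print(round(annual_rate_above(3.75) / annual_rate_above(5.0), 1))
```

    How much of that extra event count translates into calculated hazard then depends on the ground-motion model, which is why the choice had a larger than expected effect.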

  4. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985 Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms were recorded for the 1985 earthquake both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended to use more suitable instrumental parameters in the future, such as the destructiveness potential factor, to describe earthquake demand

  5. Sun, Moon and Earthquakes

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]. The left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, EQ count 376). The right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All the 376 events including the main event faithfully follow the straight-line curve.

  6. Dynamic strains for earthquake source characterization

    Barbour, Andrew J.; Crowell, Brendan W

    2017-01-01

    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
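    A regression of peak strain against distance and magnitude of the kind described can be sketched as an ordinary least-squares fit in log space; the synthetic data and coefficients below are invented for illustration and are not the values fitted by Barbour and Crowell:

```python
import numpy as np

# Synthetic regression of log10(peak strain) on magnitude M and
# log10(hypocentral distance R); coefficients are made up for illustration.
rng = np.random.default_rng(0)
M = rng.uniform(4.5, 7.2, 200)
logR = np.log10(rng.uniform(20, 500, 200))
log_strain = -10.0 + 0.9 * M - 1.4 * logR + rng.normal(0, 0.05, 200)

# Least-squares fit of the model log_strain = a + b*M + c*logR.
G = np.column_stack([np.ones_like(M), M, logR])
a, b, c = np.linalg.lstsq(G, log_strain, rcond=None)[0]
print(round(b, 2), round(c, 2))  # recovered magnitude and distance terms
```

    Site-station and source-path biases would enter this framework as additional dummy-variable columns in the design matrix `G`.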

  7. Earthquake Source Spectral Study beyond the Omega-Square Model

    Uchide, T.; Imanishi, K.

    2017-12-01

    Earthquake source spectra have been used to characterize earthquake source processes quantitatively and, at the same time, simply, so that we can analyze the source spectra of many earthquakes, especially small earthquakes, at once and compare them with each other. A standard model for the source spectra is the omega-square model, which has a flat spectrum at low frequencies and a falloff inversely proportional to the square of frequency at high frequencies, the two regimes being separated by a corner frequency. The corner frequency has often been converted to the stress drop under the assumption of circular crack models. However, recent studies claimed the existence of another corner frequency [Denolle and Shearer, 2016; Uchide and Imanishi, 2016], thanks to the recent development of seismic networks. We have found that many earthquakes in areas other than the area studied by Uchide and Imanishi [2016] also have source spectra deviating from the omega-square model. Another part of the earthquake spectra we now focus on is the falloff rate at high frequencies, which affects the seismic energy estimation [e.g., Hirano and Yagi, 2017]. In June 2016, we deployed seven velocity seismometers in the northern Ibaraki prefecture, where shallow crustal seismicity, mainly normal-faulting events, was activated by the 2011 Tohoku-oki earthquake. We have recorded seismograms at 1000 samples per second and at short distances from the sources, so that we can investigate the high-frequency components of the earthquake source spectra. Although we are still in the stage of discovery and confirmation of the deviation from the standard omega-square model, updating the earthquake source spectrum model will help us systematically extract more information on the earthquake source process.
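    The omega-square model referenced above has a standard closed form, flat below the corner frequency and falling off as f⁻ⁿ above it (n = 2 for the standard model); a sketch with illustrative amplitude and corner frequency:

```python
import numpy as np

def source_spectrum(f, omega0=1.0, fc=2.0, n=2.0):
    """Far-field displacement amplitude spectrum of the omega-square
    model: u(f) = Omega_0 / (1 + (f/fc)**n).  Flat at Omega_0 for
    f << fc, proportional to f**-n for f >> fc.  Parameter values here
    are illustrative, not fitted to any data."""
    return omega0 / (1.0 + (f / fc) ** n)

# Sample the spectrum well below, at, and well above the corner frequency.
f = np.array([0.1, 2.0, 20.0])
print(np.round(source_spectrum(f), 4))
```

    Studies of a second corner frequency or of non-standard falloff rates amount to testing whether observed spectra require departures from this single-corner, n = 2 form.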

  8. What Googling Trends Tell Us About Public Interest in Earthquakes

    Tan, Y. J.; Maharjan, R.

    2017-12-01

    Previous studies have shown that immediately after large earthquakes, there is a period of increased public interest. This represents a window of opportunity for science communication and disaster relief fundraising efforts to reach more people. However, how public interest varies for different earthquakes has not been quantified systematically on a global scale. We analyze how global search interest for the term "earthquake" on Google varies following earthquakes of magnitude ≥ 5.5 from 2004 to 2016. We find that there is a spike in search interest after large earthquakes followed by an exponential temporal decay. Preliminary results suggest that the period of increased search interest scales with death toll and correlates with the period of increased media coverage. This suggests that the relationship between the period of increased public interest in earthquakes and death toll might be an effect of differences in media coverage. However, public interest never remains elevated for more than three weeks. Therefore, to take advantage of this short period of increased public interest, science communication and disaster relief fundraising efforts have to act promptly following devastating earthquakes.
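    The "spike followed by an exponential temporal decay" can be characterized by fitting a decay time to the interest curve; a sketch on synthetic data (not actual Google Trends values):

```python
import numpy as np

# Toy model of post-earthquake search interest, I(t) = I0 * exp(-t / tau),
# where tau is the e-folding time of public interest.  The values below
# are invented for illustration.
t = np.arange(0, 21)                 # days since the earthquake
interest = 100.0 * np.exp(-t / 4.0)  # synthetic spike decaying over ~3 weeks

# Fit a line to log(interest); the slope is -1/tau.
slope, intercept = np.polyfit(t, np.log(interest), 1)
print(round(-1.0 / slope, 1))        # recovered decay time tau, in days
```

    Comparing fitted decay times across events would be one way to quantify how the window of public interest scales with death toll.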

  9. Tidal controls on earthquake size-frequency statistics

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
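    The b-value of the Gutenberg-Richter relation, log₁₀ N = a − bM, is commonly estimated with Aki's (1965) maximum-likelihood formula; a sketch on a synthetic catalog (the catalog and completeness magnitude are invented):

```python
import numpy as np

def b_value(magnitudes, m_min):
    """Maximum-likelihood b-value (Aki, 1965) for events at or above the
    completeness magnitude m_min: b = log10(e) / (mean(M) - m_min)."""
    m = np.asarray(magnitudes)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic Gutenberg-Richter catalog with b = 1: magnitudes above the
# completeness level are exponentially distributed.
rng = np.random.default_rng(1)
mags = 2.0 + rng.exponential(scale=np.log10(np.e), size=5000)
print(round(b_value(mags, 2.0), 1))
```

    Computing this estimate in bins of tidal shear stress is the kind of analysis that reveals the b-value decrease with increasing stress described above.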

  10. Earthquake Ground Motion Selection

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  11. 1988 Spitak Earthquake Database

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  12. Why do card issuers charge proportional fees?

    Oz Shy; Zhu Wang

    2008-01-01

    This paper explains why payment card companies charge consumers and merchants fees which are proportional to the transaction values instead of charging a fixed per-transaction fee. Our theory shows that, even in the absence of any cost considerations, card companies earn much higher profit when they charge proportional fees. It is also shown that competition among merchants reduces card companies' gains from using proportional fees relative to a fixed per-transaction fee. Merchants are found ...

  13. Surface rupturing earthquakes repeated in the 300 years along the ISTL active fault system, central Japan

    Katsube, Aya; Kondo, Hisao; Kurosawa, Hideki

    2017-06-01

    Surface rupturing earthquakes produced by intraplate active faults generally have long recurrence intervals of a few thousands to tens of thousands of years. We here report the first evidence for an extremely short recurrence interval of 300 years for surface rupturing earthquakes on an intraplate system in Japan. The Kamishiro fault of the Itoigawa-Shizuoka Tectonic Line (ISTL) active fault system generated a Mw 6.2 earthquake in 2014. A paleoseismic trench excavation across the 2014 surface rupture showed the evidence for the 2014 event and two prior paleoearthquakes. The slip of the penultimate earthquake was similar to that of 2014 earthquake, and its timing was constrained to be after A.D. 1645. Judging from the timing, the damaged area, and the amount of slip, the penultimate earthquake most probably corresponds to a historical earthquake in A.D. 1714. The recurrence interval of the two most recent earthquakes is thus extremely short compared with intervals on other active faults known globally. Furthermore, the slip repetition during the last three earthquakes is in accordance with the time-predictable recurrence model rather than the characteristic earthquake model. In addition, the spatial extent of the 2014 surface rupture accords with the distribution of a serpentinite block, suggesting that the relatively low coefficient of friction may account for the unusually frequent earthquakes. These findings would affect long-term forecast of earthquake probability and seismic hazard assessment on active faults.
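    The time-predictable recurrence model invoked above can be stated in one line: the interval to the next event is the previous coseismic slip divided by the fault loading rate. A minimal sketch, with invented numbers rather than measurements from the Kamishiro fault:

```python
def time_predictable_interval(coseismic_slip_m, loading_rate_mm_per_yr):
    """Time-predictable recurrence model: the next earthquake occurs once
    steady loading has restored the slip released by the previous event.
    Returns the expected interval in years."""
    return coseismic_slip_m * 1000.0 / loading_rate_mm_per_yr

# ~0.9 m of slip reloaded at ~3 mm/yr implies a ~300-year interval,
# comparable in order of magnitude to the interval reported above.
print(round(time_predictable_interval(0.9, 3.0)))
```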

  14. Incorporating human-triggered earthquake risks into energy and water policies

    Klose, C. D.; Seeber, L.; Jacob, K. H.

    2010-12-01

    A comprehensive understanding of earthquake risks in urbanized regions requires an accurate assessment of both urban vulnerabilities and hazards from earthquakes, including ones whose timing might be affected by human activities. Socioeconomic risks associated with human-triggered earthquakes are often misconstrued and receive little scientific, legal, and public attention. Worldwide, more than 200 damaging earthquakes associated with industrialization and urbanization have been documented since the beginning of the 20th century. Geomechanical pollution due to large-scale geoengineering activities can advance the clock of earthquakes, trigger new seismic events or even shut down natural background seismicity. Activities include mining, hydrocarbon production, fluid injections, water reservoir impoundments and deep-well geothermal energy production. This type of geohazard has impacts on human security at regional and national levels. Some planned or considered future engineering projects raise particularly strong concerns about triggered earthquakes, for instance, sequestration of carbon dioxide by injecting it deep underground and large-scale natural gas production in the Marcellus shale in the Appalachian basin. Worldwide examples of earthquakes are discussed, including their associated losses of human life and monetary losses (e.g., 1989 Newcastle and Volkershausen earthquakes, 2001 Killari earthquake, 2006 Basel earthquake, 2010 Wenchuan earthquake). An overview is given of global statistics of human-triggered earthquakes, including depths and time delay of triggering. Lastly, strategies are described, including risk mitigation measures such as urban planning adaptations and seismic hazard mapping.

  15. Relating arithmetical techniques of proportion to geometry

    Wijayanti, Dyana

    2015-01-01

    The purpose of this study is to investigate how textbooks introduce and treat the theme of proportion in geometry (similarity) and arithmetic (ratio and proportion), and how these themes are linked to each other in the books. To pursue this aim, we use the anthropological theory of the didactic. Considering 6 common Indonesian textbooks in use, we describe how proportion is explained and appears in examples and exercises, using an explicit reference model of the mathematical organizations of both themes. We also identify how the proportion themes of the geometry and arithmetic domains are linked. Our…

  16. Electromagnetic Manifestation of Earthquakes

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified; their activity is closely correlated with the energy of the electromagnetic field. To explain this, a hypothesis about the cooperative character of these pulses is proposed.

  18. Charles Darwin's earthquake reports

    Galiev, Shamil

    2010-05-01

    As 2009 marked the 200th anniversary of Darwin's birth, it also marked 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked; volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects have sometimes been repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and dynamics. These ideas set the tone for the theory of plate tectonics to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin

  19. Signals of ENPEMF Used in Earthquake Prediction

    Hao, G.; Dong, H.; Zeng, Z.; Wu, G.; Zabrodin, S. M.

    2012-12-01

    The signals of the Earth's natural pulse electromagnetic field (ENPEMF) are a combination of abnormal crustal magnetic field pulses affected by earthquakes, the induced field of the Earth's endogenous magnetic field, the induced magnetic field of the exogenous variation field, geomagnetic pulsation disturbances, and other energy-coupling processes between the Sun and Earth. As an instantaneous disturbance of the variation field of natural geomagnetism, ENPEMF can be used to predict earthquakes. This theory was introduced by A. A. Vorobyov, who hypothesized that pulses can arise not only in the atmosphere but also within the Earth's crust due to processes of tectonic-to-electric energy conversion (Vorobyov, 1970; Vorobyov, 1979). The global field time scale of ENPEMF signals has specific stability. Although the wave curves may not overlap completely in different regions, the smoothed diurnal ENPEMF patterns always exhibit the same trend per month. This feature is a good reference for observing abnormalities of the Earth's natural magnetic field in a specific region. The frequencies of ENPEMF signals generally lie in the kHz range, where the 5-25 kHz band can be applied to monitor earthquakes. In Wuhan, the best observation frequency is 14.5 kHz. Two special devices are placed along the S-N and W-E directions. Dramatic variation between the pulse waveforms obtained from the instruments and the normal reference envelope diagram indicates a high possibility of an earthquake. The proposed ENPEMF-based earthquake detection method can improve geodynamic monitoring and enrich earthquake prediction methods. We suggest that promising directions for further research include the exact source composition of ENPEMF signals, the distinction between noise and useful signals, and the effects of the Earth's gravity tide and solid tidal waves. This method may also provide a promising application in

  20. Adaptively smoothed seismicity earthquake forecasts for Italy

    Yan Y. Kagan

    2010-11-01

    We present a model for estimating the probabilities of future earthquakes of magnitudes m ≥ 4.95 in Italy. This model is a modified version of that proposed for California, USA, by Helmstetter et al. [2007] and Werner et al. [2010a], and it approximates seismicity using a spatially heterogeneous, temporally homogeneous Poisson point process. The temporal, spatial and magnitude dimensions are entirely decoupled. Magnitudes are independently and identically distributed according to a tapered Gutenberg-Richter magnitude distribution. We have estimated the spatial distribution of future seismicity by smoothing the locations of past earthquakes listed in two Italian catalogs: a short instrumental catalog, and a longer instrumental and historic catalog. The bandwidth of the adaptive spatial kernel is estimated by optimizing the predictive power of the kernel estimate of the spatial earthquake density in retrospective forecasts. When available and reliable, we used small earthquakes of m ≥ 2.95 to reveal active fault structures and probable future epicenters. By calibrating the model with these two catalogs of different durations to create two forecasts, we intend to quantify the loss (or gain) of predictability incurred when only a short, but recent, data record is available. Both forecasts were scaled to five and ten years, and have been submitted to the Italian prospective forecasting experiment of the global Collaboratory for the Study of Earthquake Predictability (CSEP). An earlier forecast from the model was submitted by Helmstetter et al. [2007] to the Regional Earthquake Likelihood Model (RELM) experiment in California, and with more than half of the five-year experimental period over, the forecast has performed better than the others.
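
    The adaptively smoothed spatial forecast described above can be sketched in a few lines. This is a minimal illustration of the general technique, not the authors' implementation: the function name smoothed_rate and the specific bandwidth rule (distance to the n-th nearest past epicenter, one common way to make the kernel adaptive) are assumptions for the example.

```python
import numpy as np

def smoothed_rate(past_epicenters, grid, n_neighbor=5):
    """Adaptive Gaussian kernel estimate of the spatial earthquake density.

    The bandwidth at each past epicenter is the distance to its n-th
    nearest neighboring epicenter, so dense clusters get narrow kernels
    and sparse regions get broad ones.
    past_epicenters: (N, 2) array of event coordinates.
    grid: (G, 2) array of forecast cell centers.
    Returns a (G,) array normalized to a probability map.
    """
    # Pairwise distances between epicenters; row-sort so column 0 is self
    d = np.linalg.norm(past_epicenters[:, None, :] - past_epicenters[None, :, :], axis=-1)
    d.sort(axis=1)
    bandwidth = d[:, n_neighbor]  # adaptive bandwidth per past event

    # Sum normalized 2-D Gaussian kernels over the forecast grid
    diff = grid[:, None, :] - past_epicenters[None, :, :]
    r2 = np.sum(diff**2, axis=-1)
    kernels = np.exp(-0.5 * r2 / bandwidth**2) / (2 * np.pi * bandwidth**2)
    rate = kernels.sum(axis=1)
    return rate / rate.sum()
```

    In a full forecast, this spatial map would be multiplied by an overall rate and by a tapered Gutenberg-Richter magnitude term, since the dimensions are decoupled.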

  1. Proportional Reasoning and the Visually Impaired

    Hilton, Geoff; Hilton, Annette; Dole, Shelley L.; Goos, Merrilyn; O'Brien, Mia

    2012-01-01

    Proportional reasoning is an important aspect of formal thinking that is acquired during the developmental years that approximate the middle years of schooling. Students who fail to acquire sound proportional reasoning often experience difficulties in subjects that require quantitative thinking, such as science, technology, engineering, and…

  2. Adaptive bayesian analysis for binomial proportions

    Das, Sonali

    2008-10-01

    ...of testing the proportion of some trait. For example, say we are interested in inferring the effectiveness of a certain intervention teaching strategy by comparing the proportion of ‘proficient' teachers before and after the intervention. The number...

  3. Mix Proportion Design of Asphalt Concrete

    Wu, Xianhu; Gao, Lingling; Du, Shoujun

    2017-12-01

    Based on the gradations of AC and SMA, this paper designs a new type of anti-slide mixture combining the advantages of both. The paper introduces the material selection, the design calculation of the mineral aggregate mixture ratio, and the determination of the optimal asphalt content tests for the mix proportion design of asphalt concrete, and presents this new mix-proportioning technology.

  4. Proportional gas scintillation detectors and their applications

    Petr, I.

    1978-01-01

    The principle of a gas proportional scintillation detector and its function are described. The dependence of the energy resolution of Si(Li) and xenon proportional detectors on input window size is given. A typical design is shown of a xenon detector used for X-ray spectrometry at energies of 277 eV to 5.898 keV and gas pressures of 98 to 270 kPa. Gas proportional scintillation detectors show considerably better energy resolution than common proportional counters, and even better resolution than semiconductor Si(Li) detectors at low X-ray energies. For detection areas smaller than 25 mm2, Si(Li) detectors show better resolution, especially at higher X-ray energies. For window areas of 25 to 190 mm2 both types of detectors are comparable; for window areas exceeding 190 mm2 the proportional scintillation detector has the higher energy resolution. (B.S.)

  5. Indoor radon and earthquake

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of experience from the Spitak earthquake (Armenia, December 1988), it is shown that an earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open near-ground atmosphere, are contrastingly displayed in enclosed premises (dwellings, schools, kindergartens) even at considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes spans the period from the first foreshock to the last aftershock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times greater than the number of people injured in Spitak, Leninakan and other settlements (toll of injured: 25,000 people; radiation-induced diseases: over 300,000 people). The influence of radiation directly correlates with the earthquake force. This conclusion is underpinned by indoor radon monitoring data for Yerevan since 1987 (120 km from the epicenter; 5,450 measurements) and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of indoor radon concentration and effective equivalent dose, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  6. Icon arrays help younger children's proportional reasoning.

    Ruggeri, Azzurra; Vagharchakian, Laurianne; Xu, Fei

    2018-06-01

    We investigated the effects of two context variables, presentation format (icon arrays or numerical frequencies) and time limitation (limited or unlimited time), on the proportional reasoning abilities of children aged 7 and 10 years, as well as adults. Participants had to select, between two sets of tokens, the one that offered the highest likelihood of drawing a gold token, that is, the set of elements with the greater proportion of gold tokens. Results show that participants performed better in the unlimited time condition. Moreover, besides a general developmental improvement in accuracy, our results show that younger children performed better when proportions were presented as icon arrays, whereas older children and adults were similarly accurate in the two presentation format conditions. Statement of contribution What is already known on this subject? There is a developmental improvement in proportional reasoning accuracy. Icon arrays facilitate reasoning in adults with low numeracy. What does this study add? Participants were more accurate when they were given more time to make the proportional judgement. Younger children's proportional reasoning was more accurate when they were presented with icon arrays. Proportional reasoning abilities correlate with working memory, approximate number system, and subitizing skills. © 2018 The British Psychological Society.

  7. The performance of the Armenia Nuclear Power Plant and power facilities in the 1988 Armenia earthquake

    Yanev, P.I.

    1989-01-01

    The speaker presents the geology, seismology, and effects of the Armenian earthquake which occurred on December 7, 1988. This is a highly industrialized area with numerous power plants, including one nuclear power plant in addition to several conventional plants. The response of the nuclear plant to the earthquake is described in detail, and the speaker concludes with an outline of suggestions which could help with earthquake preparedness in the United States, particularly in Puget Sound, Washington; along the New Madrid fault in Missouri and Tennessee; in the Charleston, South Carolina area; and in the Salt Lake City region of Utah. All of these areas have earthquake potential and building types similar to Armenia's, a comparable absence of earthquake preparedness programs, and the possibility of an earthquake of comparable proportions.

  8. a Collaborative Cyberinfrastructure for Earthquake Seismology

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools, such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just experienced. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and, in some cases, map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records made by volunteers, and we are also involved in a project to detect earthquakes with ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...), not only to distribute earthquake information but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  9. Rupture, waves and earthquakes.

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  10. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  11. Testing earthquake source inversion methodologies

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  12. Earthquakes; May-June 1982

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  13. Count rate effect in proportional counters

    Bednarek, B.

    1980-01-01

    A critical evaluation is presented of the current state of investigations into, and explanations of, the resolution and pulse-height changes produced in proportional counters by variations in radiation intensity. (author)

  14. Ionospheric earthquake effects detection based on Total Electron Content (TEC) GPS Correlation

    Sunardi, Bambang; Muslim, Buldan; Eka Sakya, Andi; Rohadi, Supriyanto; Sulastri; Murjaya, Jaya

    2018-03-01

    Advances in science and technology have shown that ground-based GPS receivers are able to detect ionospheric Total Electron Content (TEC) disturbances caused by various natural phenomena such as earthquakes. One study of the Tohoku (Japan) earthquake of March 11, 2011, magnitude M 9.0, showed TEC fluctuations observed across the GPS observation network spread around the disaster area. This paper discusses the detection of ionospheric earthquake effects using TEC GPS data. The case studies taken were the Kebumen earthquake, January 25, 2014, magnitude M 6.2; the Sumba earthquake, February 12, 2016, M 6.2; and the Halmahera earthquake, February 17, 2016, M 6.1. TEC-GIM (Global Ionosphere Map) correlation over 31 days was used to monitor TEC anomalies in the ionosphere. To rule out geomagnetic disturbances due to solar activity, we also compared with the Dst index in the same time window. The results showed an anomalous ratio of the correlation coefficient deviation to its standard deviation upon the occurrences of the Kebumen and Sumba earthquakes, but no similar anomaly was detected for the Halmahera earthquake. Continuous monitoring of TEC GPS data is needed to detect earthquake effects in the ionosphere. This study gives hope of strengthening earthquake-effect early warning systems using TEC GPS data. Development of a method for continuous TEC observation derived from the GPS observation network that already exists in Indonesia is needed to support such early warning systems.
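
    The anomaly measure used above (ratio of the correlation coefficient's deviation to its standard deviation) can be illustrated with a short sketch. This is a hedged reconstruction of the general idea, not the authors' code: the function name tec_anomaly_ratio, the array layout, and the 2-sigma flagging threshold mentioned below are assumptions for the example.

```python
import numpy as np

def tec_anomaly_ratio(station_tec, gim_tec):
    """For each day, correlate station TEC against GIM TEC, then express
    each day's deviation from the mean correlation in units of the
    standard deviation over the whole (e.g. 31-day) window.

    station_tec, gim_tec: arrays of shape (n_days, n_epochs_per_day).
    Returns a (n_days,) array of deviation ratios.
    """
    # Daily Pearson correlation between the station series and the GIM series
    r = np.array([np.corrcoef(s, g)[0, 1] for s, g in zip(station_tec, gim_tec)])
    dev = r - r.mean()
    return dev / r.std()
```

    Days where the absolute ratio exceeds a chosen threshold (e.g. np.abs(ratio) > 2) would be treated as candidate ionospheric anomalies, provided the Dst index shows quiet geomagnetic conditions.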

  15. Sensing the earthquake

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  16. Turkish Children's Ideas about Earthquakes

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training regarding earthquakes received in primary schools is considered…

  17. Earthquakes, May-June 1991

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period: a magnitude 7.1 in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  18. Organizational changes at Earthquakes & Volcanoes

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  19. The 1976 Tangshan earthquake

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had a magnitude of 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  20. [Earthquakes in El Salvador].

    de Ville de Goyet, C

    2001-02-01

    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

  1. Globalization and business ethics

    Khadartseva, L.; Agnaeva, L.

    2014-01-01

    It is assumed that local market conditions may differ, but some global ethics and social responsibility principles should be applicable to all markets. As markets globalize and an increasing proportion of business activity transcends national borders, institutions are needed to help manage, regulate, and police the global marketplace, and to promote the establishment of multinational treaties to govern the global business system.

  2. Proportion congruency effects: Instructions may be enough

    Olga eEntel

    2014-10-01

    Learning takes time: one needs to be exposed to contingency relations between stimulus dimensions in order to learn, whereas intentional control can be recruited through task demands. Therefore, showing that control can be recruited as a function of experimental instructions alone, that is, that processing adapts to the instructions before exposure to the task, can be taken as evidence for the recruitment of control in the absence of learning. This was done by manipulating the information given at the outset of the experiment. In the first experiment, we manipulated list-level congruency proportion: half of the participants were informed that most of the stimuli would be congruent, whereas the other half were informed that most of the stimuli would be incongruent. This held true for the stimuli in the second part of each experiment; in the first part, however, the proportions of the two stimulus types were equal. A proportion congruency effect was found in both parts of the experiment, but it was larger in the second part. In our second experiment, we manipulated the proportion of the stimuli within participants by applying an item-specific design. This was done by presenting some color words most often in their congruent color, and other color words most often in incongruent colors. Participants were informed about the exact word-color pairings in advance. As in Experiment 1, this held true only for the second experimental part. In contrast to our first experiment, informing participants in advance did not produce an item-specific proportion effect, which was observed only in the second part. Thus our results support the hypothesis that instructions may be enough to trigger list-level control, yet learning does contribute to the proportion congruency effect under such conditions. The item-level proportion effect is apparently caused by learning, or is at least moderated by it.

  3. Groundwater electrical conductivity and soil radon gas monitoring for earthquake precursory studies in Koyna, India

    Reddy, D.V.; Nagabhushanam, P.

    2011-01-01

    Research highlights: → This is the first hydrochemical precursory study in the Koyna region, India. → Discrete conductivity measurements indicated a progressive increase over 4 years. → A strong precursory EC change was observed 40 h before the M 5.1 earthquake. → A precursory increase of soil Rn gas occurred 20 days before the M 4.7 and M 5.1 earthquakes. → On-line monitoring of these parameters may help in earthquake forecasting. - Abstract: Hourly monitoring of the electrical conductivity (EC) of groundwater, along with groundwater levels in 210 m deep boreholes (specially drilled for pore pressure/earthquake studies) and real-time soil Rn gas at 60 cm below ground level, in the Koyna-Warna region (characterized by basaltic rocks more than 1500 m thick and dotted with several sets of fault systems), western India, provided strong precursory signatures in response to two earthquakes (M 4.7 on 14/11/09 and M 5.1 on 12/12/09) that occurred in the study region. The EC measured in the Govare well water showed precursory perturbations about 40 h prior to the M 5.1 earthquake, which continued for about 20 h after the earthquake. In response to the M 4.7 earthquake, EC perturbations appeared 8 days after the earthquake. In another well (Koyna), located 4 km north of the Govare well, no precursory signatures were found for the M 4.7 earthquake, while for the M 5.1 earthquake post-seismic signatures were found 18 days after the earthquake. Increased porosity and a reduced pressure head, accompanied by mixing of a freshwater component from the top zone due to the earthquakes, are the suggested mechanisms responsible for the observed EC anomalies. Soil Rn gas showed signals of strength roughly proportional to these two earthquakes. In both cases, the pre-seismic increase in Rn concentration started about 20 days in advance. The co-seismic drop in Rn levels was 30% from the peak value for the M 4.7 earthquake and 50% for the M 5.1 earthquake. The Rn
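
    Flagging precursory perturbations in an hourly EC series, as described above, amounts to comparing each new reading against a recent baseline. The sketch below is purely illustrative and not the study's algorithm: the function name flag_ec_perturbations, the trailing 30-day window, and the robust z-score threshold are assumptions for the example.

```python
import numpy as np

def flag_ec_perturbations(ec_hourly, window=24 * 30, k=4.0):
    """Flag hours where conductivity departs from a trailing baseline.

    Uses a robust z-score (median and MAD over the trailing window) so
    that slow multi-year EC trends and isolated spikes do not inflate
    the baseline. Returns a boolean array; entries before the window is
    full are False.
    """
    ec = np.asarray(ec_hourly, dtype=float)
    flags = np.zeros(ec.size, dtype=bool)
    for i in range(window, ec.size):
        past = ec[i - window:i]
        med = np.median(past)
        mad = np.median(np.abs(past - med)) or 1e-12  # guard against zero MAD
        # 1.4826 * MAD approximates one standard deviation for Gaussian noise
        flags[i] = abs(ec[i] - med) / (1.4826 * mad) > k
    return flags
```

    In practice such flags would only be treated as candidate precursors after checking co-located parameters (water level, soil Rn) for consistent behavior.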

  4. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society, and highlights the necessity of building and enhancing an earthquake culture. Iran was considered as a research case study, and fifteen large earthquake disasters in Iran were investigated and analyzed over a time period of more than a century. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  5. Trajectories of posttraumatic growth and depreciation after two major earthquakes.

    Marshall, Emma M; Frazier, Patricia; Frankfurt, Sheila; Kuijer, Roeline G

    2015-03-01

    This study examined trajectories of posttraumatic growth or depreciation (i.e., positive or negative life change) in personal strength and relationships after 2 major earthquakes in Canterbury, New Zealand using group-based trajectory modeling. Participants completed questionnaires regarding posttraumatic growth or depreciation in personal strength and relationship domains 1 month after the first earthquake in September 2010 (N = 185) and 3 months (n = 156) and 12 months (n = 144) after the more severe February 2011 earthquake. Three classes of growth or depreciation patterns were found for both domains. For personal strength, most of the participants were grouped into a "no growth or depreciation" class and smaller proportions were grouped into either a "posttraumatic depreciation" or "posttraumatic growth" class. The 3 classes for relationships all reported posttraumatic growth, differing only in degree. None of the slopes were significant for any of the classes, indicating that levels of growth or depreciation reported after the first earthquake remained stable when assessed at 2 time points after the second earthquake. Multinomial logistic regression analyses examining pre- and postearthquake predictors of trajectory class membership revealed that those in the "posttraumatic growth" personal strength class were significantly younger and had significantly higher pre-earthquake mental health than those in the "posttraumatic depreciation" class. Sex was the only predictor of the relationship classes: No men were assigned to the "high posttraumatic growth" class. Implications and future directions are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).

  6. Earthquake forecasting test for Kanto district to reduce vulnerability of urban mega earthquake disasters

    Yokoi, S.; Tsuruoka, H.; Nanjo, K.; Hirata, N.

    2012-12-01

    The Collaboratory for the Study of Earthquake Predictability (CSEP) is a global project on earthquake predictability research. The final goal of this project is to search for the intrinsic predictability of the earthquake rupture process through forecast-testing experiments. The Earthquake Research Institute of the University of Tokyo joined CSEP and started the Japanese testing center, CSEP-Japan. This testing center provides open access to researchers contributing earthquake forecast models applied to Japan. More than 100 earthquake forecast models have now been submitted to the prospective experiment. The models are separated into 4 testing classes (1 day, 3 months, 1 year and 3 years) and 3 testing regions: an area of Japan including the sea area, the Japanese mainland, and the Kanto district. We evaluate the performance of the models in the official suite of tests defined by CSEP. Approximately 300 rounds of experiments have been implemented. These results provide new knowledge concerning statistical forecasting models. We have started a study for constructing a 3-dimensional earthquake forecasting model for the Kanto district in Japan based on CSEP experiments, under the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters. Because seismicity of the area ranges from the shallow part down to a depth of 80 km, due to the subducting Philippine Sea and Pacific plates, we need to study the effect of the depth distribution. We will develop forecasting models based on the results of 2-D modeling. We defined the 3-D forecasting area in the Kanto region with test classes of 1 day, 3 months, 1 year and 3 years, and magnitudes from 4.0 to 9.0, as in CSEP-Japan. In the first step of the study, we will install the RI10K model (Nanjo, 2011) and the HIST-ETAS models (Ogata, 2011) to determine whether those models perform as well as in the 3-month 2-D CSEP-Japan experiments in the Kanto region before the 2011 Tohoku event (Yokoi et al., in preparation). We use CSEP

  7. A matter of life or limb? A review of traumatic injury patterns and anesthesia techniques for disaster relief after major earthquakes.

    Missair, Andres; Pretto, Ernesto A; Visan, Alexandru; Lobo, Laila; Paula, Frank; Castillo-Pedraza, Catalina; Cooper, Lebron; Gebhard, Ralf E

    2013-10-01

    All modalities of anesthetic care, including conscious sedation, general, and regional anesthesia, have been used to manage earthquake survivors who require urgent surgical intervention during the acute phase of medical relief. Consequently, we felt that a review of epidemiologic data from major earthquakes in the context of urgent intraoperative management was warranted to optimize anesthesia disaster preparedness for future medical relief operations. The primary outcome measure of this study was to identify the predominant preoperative injury pattern (anatomic location and pathology) of survivors presenting for surgical care immediately after major earthquakes during the acute phase of medical relief (0-15 days after disaster). The injury pattern is of significant relevance because it closely relates to the anesthetic techniques available for patient management. We discuss our findings in the context of evidence-based strategies for anesthetic management during the acute phase of medical relief after major earthquakes and the associated obstacles of devastated medical infrastructure. To identify reports on acute medical care in the aftermath of natural disasters, a query was conducted using MEDLINE/PubMed, Embase, CINAHL, as well as an online search engine (Google Scholar). The search terms were "disaster" and "earthquake" in combination with "injury," "trauma," "surgery," "anesthesia," and "wounds." Our investigation focused only on studies of acute traumatic injury that specified surgical intervention among survivors in the acute phase of medical relief. A total of 31 articles reporting on 15 major earthquakes (between 1980 and 2010) and the treatment of more than 33,410 patients met our specific inclusion criteria. The mean incidence of traumatic limb injury per major earthquake was 68.0%. The global incidence of traumatic limb injury was 54.3% (18,144/33,410 patients). 
The pooled estimate of the proportion of limb injuries was calculated to be 67.95%, with a
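The headline figures above follow directly from the abstract's counts; a short sketch (the Wilson interval is our own choice of confidence-interval method, not necessarily the authors'):

```python
import math

def pooled_proportion(events, totals):
    """Pooled proportion: total events over total patients across studies."""
    return sum(events) / sum(totals)

def wilson_ci(k, n, z=1.96):
    """Wilson score 95% interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

# Abstract's totals: 18,144 limb injuries among 33,410 patients -> ~54.3%.
p = pooled_proportion([18144], [33410])
lo, hi = wilson_ci(18144, 33410)
```

Note the distinction the abstract draws: the unweighted mean of per-earthquake incidences (68.0%) differs from the patient-weighted global proportion (54.3%) because earthquakes with more casualties carry more weight in the latter.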

  8. Proportional hazards models of infrastructure system recovery

    Barker, Kash; Baroud, Hiba

    2014-01-01

    As emphasis is being placed on a system's ability to withstand and to recover from a disruptive event, collectively referred to as dynamic resilience, there exists a need to quantify a system's ability to bounce back after a disruptive event. This work applies a statistical technique from biostatistics, the proportional hazards model, to describe (i) the instantaneous rate of recovery of an infrastructure system and (ii) the likelihood that recovery occurs prior to a given point in time. A major benefit of the proportional hazards model is its ability to describe a recovery event as a function of time as well as covariates describing the infrastructure system or disruptive event, among others, which can also vary with time. The proportional hazards approach is illustrated with a publicly available electric power outage data set
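A minimal sketch of the idea, assuming the simplest case of a constant (exponential) baseline hazard and made-up covariates and coefficients; the paper's actual model and data set are richer:

```python
import math

def recovery_probability(t, baseline_rate, betas, covariates):
    """P(recovery before time t) under a proportional hazards model with a
    constant baseline hazard h0: h(t|x) = h0 * exp(beta . x), so
    F(t|x) = 1 - exp(-h(t|x) * t)."""
    hazard = baseline_rate * math.exp(
        sum(b * x for b, x in zip(betas, covariates)))
    return 1.0 - math.exp(-hazard * t)

# Hypothetical outage: baseline recovery rate 0.2/day; covariates are
# (storm severity, repair crews deployed) with illustrative coefficients.
p5 = recovery_probability(t=5.0, baseline_rate=0.2,
                          betas=[-0.5, 0.3], covariates=[2.0, 1.0])
```

The covariate term scales the instantaneous recovery rate multiplicatively, which is the defining property of the proportional hazards family; time-varying covariates, as in the paper, require integrating the hazard over time instead.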

  9. The Origins of Scintillator Non-Proportionality

    Moses, W. W.; Bizarri, G. A.; Williams, R. T.; Payne, S. A.; Vasil'ev, A. N.; Singh, J.; Li, Q.; Grim, J. Q.; Choong, W.-S.

    2012-10-01

    Recent years have seen significant advances in both theoretically understanding and mathematically modeling the underlying causes of scintillator non-proportionality. The core cause is that the interaction of radiation with matter invariably leads to a non-uniform ionization density in the scintillator, coupled with the fact that the light yield depends on the ionization density. The mechanisms that lead to the luminescence dependence on ionization density are incompletely understood, but several important features have been identified, notably Auger-like processes (where two carriers of excitation interact with each other, causing one to de-excite non-radiatively), the inability of excitation carriers to recombine (caused either by trapping or physical separation), and the carrier mobility. This paper reviews the present understanding of the fundamental origins of scintillator non-proportionality, specifically the various theories that have been used to explain non-proportionality.

  10. The EM Earthquake Precursor

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and, in turn, to warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate with an identifiable precursor. Second, there is no networked array for finding epicentral locations, nor have there been any attempts to establish one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, it is simply higher than what modern technological advances now require. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and the laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February 2013 detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  11. Large-Scale Analysis of Art Proportions

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo…

  12. Simulated earthquake ground motions

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra

  13. The HayWired Earthquake Scenario—Earthquake Hazards

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude-6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  14. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers. A space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
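The detection step can be sketched with a plain STA/LTA ratio on a tweet-count time series (window lengths and threshold here are illustrative, not the USGS's tuned values):

```python
def sta_lta(series, sta_len, lta_len):
    """Short-Term-Average / Long-Term-Average ratio at each point where the
    long window is full; a peak in the ratio flags a candidate detection."""
    ratios = []
    for i in range(lta_len, len(series)):
        sta = sum(series[i - sta_len:i]) / sta_len
        lta = sum(series[i - lta_len:i]) / lta_len
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# Tweets-per-minute containing "earthquake": quiet background, then a burst.
counts = [2, 2, 2, 2, 2, 2, 2, 2, 60, 80, 70]
ratios = sta_lta(counts, sta_len=2, lta_len=8)
detections = [r > 3.0 for r in ratios]  # -> [False, True, True]
```

Raising the threshold trades missed events for fewer false triggers, which is the tuning trade-off the abstract describes.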

  15. A multiwire proportional counter for very high counting rates

    Barbosa, A.F.; Guedes, G.P.; Tamura, E.; Pepe, I.M.; Oliveira, N.B.

    1997-12-01

    Preliminary measurements in a proportional counter with two independently counting wires showed that counting rates up to 10^6 counts/s per wire can be reached without critical loss in the true-versus-measured linearity relation. Results obtained with a detector containing 30 active wires (2 mm pitch) are presented. Each wire is associated with a fast pre-amplifier and a discriminator channel. Global counting rates in excess of 10^7 events/s are reported. Data acquisition systems are described for 1D (real-time) and 2D (off-line) position-sensitive detection systems. (author)
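The "true versus measured linearity" at such rates is commonly analyzed with a dead-time model; a sketch assuming a non-paralyzable dead time (the 50 ns figure is hypothetical, not from the paper):

```python
def measured_rate(true, dead_time):
    """Non-paralyzable dead-time model: m = n / (1 + n * tau)."""
    return true / (1.0 + true * dead_time)

def corrected_rate(measured, dead_time):
    """Invert the model to recover the true rate: n = m / (1 - m * tau)."""
    return measured / (1.0 - measured * dead_time)

# At 1e6 counts/s per wire, a hypothetical 50 ns dead time costs about 5%.
m = measured_rate(1e6, 50e-9)   # ~952,381 counts/s
n = corrected_rate(m, 50e-9)    # recovers 1e6
```

Per-wire readout, as in this detector, keeps each channel's rate (and hence its dead-time loss) low even when the global rate exceeds 10^7 events/s.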

  16. A multiwire proportional counter for very high counting rates

    Barbosa, A F; Guedes, G P [Centro Brasileiro de Pesquisas Fisicas (CBPF), Rio de Janeiro, RJ (Brazil); Tamura, E [Laboratorio Nacional de Luz Sincrotron (LNLS), Campinas, SP (Brazil); Pepe, I M; Oliveira, N B [Bahia Univ., Salvador, BA (Brazil). Inst. de Fisica

    1997-12-01

    Preliminary measurements in a proportional counter with two independently counting wires showed that counting rates up to 10^6 counts/s per wire can be reached without critical loss in the true-versus-measured linearity relation. Results obtained with a detector containing 30 active wires (2 mm pitch) are presented. Each wire is associated with a fast pre-amplifier and a discriminator channel. Global counting rates in excess of 10^7 events/s are reported. Data acquisition systems are described for 1D (real-time) and 2D (off-line) position-sensitive detection systems. (author) 13 refs., 6 figs.

  17. Investigation of a multiwire proportional chamber

    Konijn, J.

    1976-01-01

    The article discusses some aspects of a prototype multiwire proportional chamber for electron detection located at IKO in Amsterdam, i.e. voltage, counting rates, noise and gas mixture (argon, ethylene bromide). The efficiency and performance of the chamber have been investigated and an error analysis is given

  18. Author: IM Rautenbach PROPORTIONALITY AND THE LIMITATION ...

    10332324

    public law, it is not clear to me that there are any differences at this more abstract ... Bills of Rights entrench "basic principles of rationality and proportionality – of ..... Sweet Europe of Rights 10-11; Van Dijk and Van Hoof Theory and Practice.

  19. Obtaining a Proportional Allocation by Deleting Items

    Dorn, B.; de Haan, R.; Schlotter, I.; Röthe, J.

    2017-01-01

    We consider the following control problem on fair allocation of indivisible goods. Given a set I of items and a set of agents, each having strict linear preference over the items, we ask for a minimum subset of the items whose deletion guarantees the existence of a proportional allocation in the

  20. Proportional green time scheduling for traffic lights

    P. Kovacs; Le, T. (Tung); R. Núñez Queija (Rudesindo); Vu, H. (Hai); N. Walton

    2016-01-01

    We consider the decentralized scheduling of a large number of urban traffic lights. We investigate factors determining system performance, in particular, the length of the traffic light cycle and the proportion of green time allocated to each junction. We study the effect of the length

  1. Triangular tube proportional wire chamber system

    Badtke, D H; Bakken, J A; Barnett, B A; Blumenfeld, B J; Chien, C Y; Madansky, L; Matthews, J A.J.; Pevsner, A; Spangler, W J [Johns Hopkins Univ., Baltimore, MD (USA); Lee, K L [California Univ., Berkeley (USA). Lawrence Berkeley Lab.

    1981-10-15

    We report on the characteristics of the proportional tube chamber system which has been constructed for muon identification in the PEP-4 experiment at SLAC. The mechanical and electrical properties of the extruded aluminum triangular tubes allow these detectors to be used as crude drift chambers.

  2. Commanding to 'Nudge' via the Proportionality Principle?

    K.P. Purnhagen (Kai); E. van Kleef (Ellen)

    2017-01-01

    This piece assesses whether nudging techniques can be argued to be a less restrictive but equally effective way to regulate diets in EU law. It has been argued that nudging techniques, due to their freedom-preserving nature, might influence the proportionality test in such a way that

  3. A review on remotely sensed land surface temperature anomaly as an earthquake precursor

    Bhardwaj, Anshuman; Singh, Shaktiman; Sam, Lydia; Joshi, P. K.; Bhardwaj, Akanksha; Martín-Torres, F. Javier; Kumar, Rajesh

    2017-12-01

    The low predictability of earthquakes and the high uncertainty associated with their forecasts make earthquakes one of the worst natural calamities, capable of causing instant loss of life and property. Here, we discuss the studies reporting the observed anomalies in the satellite-derived Land Surface Temperature (LST) before an earthquake. We compile the conclusions of these studies and evaluate the use of remotely sensed LST anomalies as precursors of earthquakes. The arrival times and the amplitudes of the anomalies vary widely, thus making it difficult to consider them as universal markers to issue earthquake warnings. Based on the randomness in the observations of these precursors, we support employing a global-scale monitoring system to detect statistically robust anomalous geophysical signals prior to earthquakes before considering them as definite precursors.
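A "statistically robust anomaly" in this context is often defined as a deviation beyond a few standard deviations of the background; a minimal stand-in for the varied definitions used across the reviewed studies:

```python
def lst_anomalies(lst_series, threshold=2.0):
    """Indices where LST deviates from the series mean by more than
    `threshold` standard deviations -- a simple stand-in for the anomaly
    definitions used in the reviewed LST-precursor studies."""
    n = len(lst_series)
    mean = sum(lst_series) / n
    std = (sum((x - mean) ** 2 for x in lst_series) / n) ** 0.5
    return [i for i, x in enumerate(lst_series)
            if std > 0 and abs(x - mean) > threshold * std]

# Synthetic pre-event series (deg C): stable background, one warm excursion.
lst = [20.1, 19.8, 20.3, 20.0, 19.9, 25.5, 20.2, 20.0]
flagged = lst_anomalies(lst)  # -> [5]
```

Real studies must additionally remove seasonal and diurnal cycles before thresholding, which is one reason reported arrival times and amplitudes vary so widely.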

  4. Large earthquake rates from geologic, geodetic, and seismological perspectives

    Jackson, D. D.

    2017-12-01

    Earthquake rate and recurrence information comes primarily from geology, geodesy, and seismology. Geology gives the longest temporal perspective, but it reveals only surface deformation, relatable to earthquakes only with many assumptions. Geodesy is also limited to surface observations, but it detects evidence of the processes leading to earthquakes, again subject to important assumptions. Seismology reveals actual earthquakes, but its history is too short to capture important properties of very large ones. Unfortunately, the ranges of these observation types barely overlap, so that integrating them into a consistent picture adequate to infer future prospects requires a great deal of trust. Perhaps the most important boundary is the temporal one at the beginning of the instrumental seismic era, about a century ago. We have virtually no seismological or geodetic information on large earthquakes before then, and little geological information after. Virtually all modern forecasts of large earthquakes assume some form of equivalence between tectonic and seismic moment rates as functions of location, time, and magnitude threshold. That assumption links geology, geodesy, and seismology, but it invokes a host of other assumptions and incurs very significant uncertainties. Questions include the temporal behavior of seismic and tectonic moment rates; the shape of the earthquake magnitude distribution; the upper magnitude limit; scaling between rupture length, width, and displacement; depth dependence of stress coupling; the value of crustal rigidity; and the relation between faults at depth and their surface fault traces, to name just a few. In this report I estimate the quantitative implications for large earthquake rates. 
Global studies like the GEAR1 project suggest that surface deformation from geology and geodesy best shows the geography of very large, rare earthquakes in the long term, while seismological observations of small earthquakes best forecast moderate earthquakes

  5. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas constitutes a challenge met through collaboration of the scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; developing dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because this plate caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which had 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. The Earthquake Research Committee of Japan evaluates that this earthquake will occur with a probability of 70% within 30 years. In order to mitigate the disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. The results obtained in the respective fields will be integrated by project termination to improve information on the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. 
Discussion is extended to our effort in progress and
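For a rough sense of the quoted 70%-in-30-years figure: under a memoryless Poisson occurrence model (our simplifying assumption; the committee's evaluation may use a renewal model) it corresponds to:

```python
import math

# 70% probability of at least one event in 30 years, Poisson assumption:
# P = 1 - exp(-lambda * T)  =>  lambda = -ln(1 - P) / T
annual_rate = -math.log(1.0 - 0.70) / 30.0    # ~0.040 events/year
p_10yr = 1.0 - math.exp(-annual_rate * 10.0)  # ~0.33 within 10 years
```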

  6. Rapid Estimates of Rupture Extent for Large Earthquakes Using Aftershocks

    Polet, J.; Thio, H. K.; Kremer, M.

    2009-12-01

    of the rupture extent and dimensions, but not necessarily the strike. We found that using standard earthquake catalogs, such as the National Earthquake Information Center catalog, we can constrain the rupture extent, rupture direction, and, in many cases, the type of faulting of the mainshock with the aftershocks that occur within the first hour after the mainshock. However, these data may not currently be available in near real-time. Since our results show that these early aftershock locations may be used to estimate first-order rupture parameters for large global earthquakes, the near real-time availability of these data would be useful for fast earthquake damage assessment.
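A first-order version of the aftershock-based estimate: take early aftershock epicenters and report the longest separation as rupture length and its azimuth as rupture direction (a crude flat-earth sketch, not the authors' procedure):

```python
import math

def rupture_extent(epicenters):
    """Longest separation among aftershock epicenters (lat, lon in degrees)
    as a first-order rupture length, with its azimuth as rupture direction.
    Flat-earth approximation: adequate for a quick near-real-time estimate."""
    km_per_deg = 111.2
    best_len, best_az = 0.0, 0.0
    for i, (lat1, lon1) in enumerate(epicenters):
        for lat2, lon2 in epicenters[i + 1:]:
            mid = math.radians((lat1 + lat2) / 2.0)
            dx = (lon2 - lon1) * km_per_deg * math.cos(mid)  # east, km
            dy = (lat2 - lat1) * km_per_deg                  # north, km
            d = math.hypot(dx, dy)
            if d > best_len:
                best_len = d
                best_az = math.degrees(math.atan2(dx, dy)) % 360.0
    return best_len, best_az

# Aftershocks strung north-south over ~1 degree -> ~111 km, azimuth ~0.
length_km, azimuth = rupture_extent([(0.0, 0.0), (0.5, 0.1), (1.0, 0.0)])
```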

  7. Historical earthquake research in Austria

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. Interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly, along with the most important studies, and, last but not least, a recently carried out case study of one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  8. Earthquake hazard evaluation for Switzerland

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
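Once completeness periods are fixed, annual rates follow by counting each event only inside the period for which the catalog is complete at its intensity; a sketch with hypothetical event years, the abstract's 1750 completeness year for intensities greater than VI, and the 1995 publication year assumed as the catalog's end:

```python
def annual_rate(event_years, complete_since, end_year):
    """Events per year for one intensity class, counting only events inside
    the period for which the catalog is complete at that intensity."""
    usable = [y for y in event_years if y >= complete_since]
    return len(usable) / (end_year - complete_since + 1)

# Hypothetical years of intensity > VI events; the 1356 event predates the
# complete period and is excluded rather than biasing the rate downward.
rate = annual_rate([1356, 1755, 1855, 1946],
                   complete_since=1750, end_year=1995)
```

Counting incomplete early centuries at face value would understate the rate, which is exactly why the completeness analysis precedes the hazard calculation.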

  9. Earthquake likelihood model testing

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
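The likelihood scoring at the core of such tests rates a forecast by the joint Poisson log-likelihood of the observed bin counts; a minimal sketch with made-up bins (forecast rates must be positive):

```python
import math

def poisson_log_likelihood(forecast, observed):
    """Joint log-likelihood of observed counts given forecast rates,
    assuming independent Poisson counts in each space-magnitude bin:
    log L = sum_i(-lam_i + n_i * log(lam_i) - log(n_i!))."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(forecast, observed))

# Made-up 4-bin forecast (expected counts) and the counts that occurred.
rates = [0.5, 1.2, 0.1, 2.0]
counts = [1, 1, 0, 3]
ll = poisson_log_likelihood(rates, counts)  # ~ -4.02
```

Comparing this score against the distribution of scores from catalogs simulated under the forecast itself gives the consistency test; comparing scores between two models on the same observed catalog gives the pairwise test.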

  10. Iranian earthquakes, a uniform catalog with moment magnitudes

    Karimiparidari, Sepideh; Zaré, Mehdi; Memarian, Hossein; Kijko, Andrzej

    2013-07-01

    A uniform earthquake catalog is an essential tool in any seismic hazard analysis. In this study, an earthquake catalog of Iran and adjacent areas was compiled, using international and national databanks. The following priorities were applied in selecting magnitude and earthquake location: (a) local catalogs were given higher priority for establishing the location of an earthquake and (b) global catalogs were preferred for determining earthquake magnitudes. Earthquakes that have occurred within the bounds between 23-42° N and 42-65° E, with a magnitude range of MW 3.5-7.9, from the third millennium BC until April 2010 were included. In an effort to avoid the "boundary effect," since the newly compiled catalog will be mainly used for seismic hazard assessment, the study area includes the areas adjacent to Iran. The standardization of the catalog in terms of magnitude was achieved by the conversion of all types of magnitude into moment magnitude, MW, by using the orthogonal regression technique. In the newly compiled catalog, all aftershocks were detected, based on the procedure described by Gardner and Knopoff (Bull Seismol Soc Am 64:1363-1367, 1974). The seismicity parameters were calculated for the six main tectonic seismic zones of Iran, i.e., the Zagros Mountain Range, the Alborz Mountain Range, Central Iran, Kope Dagh, Azerbaijan, and Makran.
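The orthogonal regression step (total least squares, which allows measurement error in both magnitude scales) can be sketched as follows, with hypothetical (mb, MW) pairs in place of the catalog data and equal error variances assumed on both scales:

```python
import math

def orthogonal_regression(x, y):
    """Total least squares fit y = a + b*x minimizing perpendicular
    distances, for use when both variables carry measurement error
    (equal error variances assumed)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / n
    syy = sum((yi - my) ** 2 for yi in y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    b = (syy - sxx + math.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)
    return my - b * mx, b  # intercept, slope

# Hypothetical (mb, Mw) pairs standing in for real catalog magnitudes.
mb = [4.0, 4.5, 5.0, 5.5, 6.0]
mw = [4.2, 4.6, 5.1, 5.7, 6.1]
a, b = orthogonal_regression(mb, mw)
```

Ordinary least squares would bias the slope toward zero here because the predictor magnitude is itself noisy; that is the usual motivation for orthogonal regression in magnitude conversion.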

  11. Identified EM Earthquake Precursors

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and, in turn, to warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement, either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating long-wave generation with sufficient amplitude and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field, occurring with associated phases, using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. 
The antennae have mobility and observations were noted for

  12. Fast rise times and the physical mechanism of deep earthquakes

    Houston, H.; Williams, Q.

    1991-01-01

    A systematic global survey of the rise times and stress drops of deep and intermediate earthquakes is reported. When the rise times are scaled to the seismic moment release of the events, their average is nearly twice as fast for events deeper than about 450 km as for shallower events.

  13. Multistate cohort models with proportional transfer rates

    Schoen, Robert; Canudas-Romo, Vladimir

    2006-01-01

    We present a new, broadly applicable approach to summarizing the behavior of a cohort as it moves through a variety of statuses (or states). The approach is based on the assumption that all rates of transfer maintain a constant ratio to one another over age. We present closed-form expressions … of transfer rates. The two living state case and hierarchical multistate models with any number of living states are analyzed in detail. Applying our approach to 1997 U.S. fertility data, we find that observed rates of parity progression are roughly proportional over age. Our proportional transfer rate approach provides trajectories by parity state and facilitates analyses of the implications of changes in parity rate levels and patterns. More women complete childbearing at parity 2 than at any other parity, and parity 2 would be the modal parity in models with total fertility rates (TFRs) of 1.40 to 2…
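The proportional-transfer-rate assumption can be illustrated with a toy cohort projection. This is a hypothetical sketch, not the paper's model or data: the common age schedule `k`, the ratio constants `c`, and the age range are all invented; only the structural idea (every transfer rate is a constant multiple of one shared age schedule, with hierarchical parity states) comes from the abstract.

```python
import numpy as np

# Hypothetical illustration of proportional transfer rates: every transfer
# rate is a constant multiple c[i] of one shared age schedule k(x).
# Schedule, constants, and ages are invented, not the paper's 1997 U.S. fits.
ages = np.arange(15, 50)
k = 0.25 * np.exp(-0.5 * ((ages - 27.0) / 6.0) ** 2)  # common age schedule k(x)
c = [1.0, 0.9, 0.5, 0.2]  # constant ratios for parity 0->1, 1->2, 2->3, 3->4

pop = np.zeros(len(c) + 1)
pop[0] = 1.0  # the whole cohort starts childless (parity 0)

for kx in k:
    # hierarchical model: transfers only go from parity i to parity i + 1
    moves = [pop[i] * c[i] * kx for i in range(len(c))]
    for i, m in enumerate(moves):
        pop[i] -= m
        pop[i + 1] += m

print("final parity distribution:", np.round(pop, 3))
```

Because every rate shares the schedule `k(x)`, raising or lowering the constants `c` changes parity quantum and tempo together, which is what makes the approach convenient for exploring changes in parity rate levels and patterns.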

  14. Pulse triggering mechanism of air proportional counters

    Aoyama, T.; Mori, T.; Watanabe, T.

    1983-01-01

    This paper describes the pulse triggering mechanism of a cylindrical proportional counter filled with air at atmospheric pressure for the incidence of β-rays. Experimental results indicate that primary electrons created distantly from the anode wire by a β-ray are transformed into negative ions, which then detach electrons close to the anode wire and generate electron avalanches thus triggering pulses, while electrons created near the anode wire by a β-ray directly trigger a pulse. Since a negative ion pulse is triggered by a single electron detached from a negative ion, multiple pulses are generated by a large number of ions produced by the incidence of a single β-ray. It is therefore necessary not to count pulses triggered by negative ions but to count those by primary electrons alone when use is made of air proportional counters for the detection of β-rays. (orig.)

  15. Very large area multiwire spectroscopic proportional counters

    Ubertini, P.; Bazzano, A.; Boccaccini, L.; Mastropietro, M.; La Padula, C.D.; Patriarca, R.; Polcaro, V.F.

    1981-01-01

    As a result of a five year development program, a final prototype of a Very Large Area Spectroscopic Proportional Counter (VLASPC), to be employed in space borne payloads, was produced at the Istituto di Astrofisica Spaziale, Frascati. The instrument is the last version of a new generation of Multiwire Spectroscopic Proportional Counters (MWSPC) successfully employed in many balloon borne flights, devoted to hard X-ray astronomy. The sensitive area of this standard unit is 2700 cm² with an efficiency higher than 10% in the range 15-180 keV (80% at 60 keV). The low cost and weight make this new type of VLASPC competitive with NaI arrays, phoswich and GSPC detectors in terms of achievable scientific results. (orig.)

  16. Very large area multiwire spectroscopic proportional counters

    Ubertini, P.; Bazzano, A.; Boccaccini, L.; Mastropietro, M.; La Padula, C.D.; Patriarca, R.; Polcaro, V.F. (Istituto di Astrofisica Spaziale, Frascati (Italy))

    1981-07-01

    As a result of a five year development program, a final prototype of a Very Large Area Spectroscopic Proportional Counter (VLASPC), to be employed in space borne payloads, was produced at the Istituto di Astrofisica Spaziale, Frascati. The instrument is the last version of a new generation of Multiwire Spectroscopic Proportional Counters (MWSPC) successfully employed in many balloon borne flights, devoted to hard X-ray astronomy. The sensitive area of this standard unit is 2700 cm² with an efficiency higher than 10% in the range 15-180 keV (80% at 60 keV). The low cost and weight make this new type of VLASPC competitive with NaI arrays, phoswich and GSPC detectors in terms of achievable scientific results.

  17. Geophysical Anomalies and Earthquake Prediction

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  18. Earthquake Drill using the Earthquake Early Warning System at an Elementary School

    Oki, Satoko; Yazaki, Yoshiaki; Koketsu, Kazuki

    2010-05-01

    Japan frequently suffers from many kinds of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. On average, we have lost about 120 people a year to natural hazards in this decade. Above all, earthquakes are noteworthy, since they may kill thousands of people in a moment, as in Kobe in 1995. People know that we may have "a big one" some day as long as we live on this land, and they know what to do: retrofit houses, anchor heavy furniture to walls, add latches to kitchen cabinets, and prepare emergency packs. Yet most of them do not take action, and this results in the loss of many lives. Only the victims learn something from an earthquake, and the lessons have never become common knowledge. One of the most essential ways to reduce the damage is to educate the general public so that they can make sound decisions on what to do at the moment an earthquake hits. This requires knowledge of the background of the ongoing phenomenon. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore invited public subscriptions to choose several model areas for adopting scientific education in local elementary schools. This presentation reports on the year and a half of courses that we held at the model elementary school in the Tokyo Metropolitan Area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates are subducting beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to occur with a probability of 70% in 30 years. 
This is of immediate concern for the devastating loss of life and property because the Tokyo urban region now has a population of 42 million and is the center of approximately 40 % of the nation's activities, which may cause great global

  19. Pain after earthquake

    Angeletti Chiara

    2012-06-01

    Abstract Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results About a third of patients reported pain (prevalence 34.6%). More than half of pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  20. Cylindrical geometry for proportional and drift chambers

    Sadoulet, B.

    1975-06-01

    For experiments performed around storage rings such as e⁺e⁻ rings or the ISR pp rings, cylindrical wire chambers are very attractive. They surround the beam pipe completely, without any dead region in azimuth, and fit well with the geometry of events in which particles are produced more or less spherically. Unfortunately, cylindrical proportional or drift chambers are difficult to make. The problems involved are discussed, along with two approaches to fabricating the cathodes. (WHK)

  1. 2π gaseous flux proportional detector

    Guevara, E.A.; Costello, E.D.; Di Carlo, R.O.

    1986-01-01

    A counting system has been developed in order to measure carbon-14 samples obtained in the course of a study of a plasmapheresis treatment for diabetic children. The system is based on the use of a 2π gaseous flux proportional detector especially designed for the stated purpose. The detector is described and experimental results are given, determining the characteristic parameters which set up the working conditions. (Author)

  2. General methods for analyzing bounded proportion data

    Hossain, Abu

    2017-01-01

    This thesis introduces two general classes of models for analyzing a proportion response variable Y that can take values between zero and one, inclusive of zero and/or one. The models are the inflated GAMLSS model and the generalized Tobit GAMLSS model. The inflated GAMLSS model extends the flexibility of beta inflated models by allowing the distribution on (0,1) of the continuous component of the dependent variable to come from any explicit or transformed (i.e. logit or truncated...

  3. Commanding to 'Nudge' via the Proportionality Principle?

    Purnhagen, Kai; van Kleef, Ellen

    2017-01-01

    This piece assesses whether nudging techniques can be argued to be a less restrictive but equally effective way to regulate diets in EU law. It has been argued that nudging techniques, due to their freedom-preserving nature, might influence the proportionality test in such a way that authorities need to give preference to nudging techniques over content-related or information regulation. We will illustrate on the example of EU food law how behavioural sciences have first altered t...

  4. Contingency proportion systematically influences contingency learning.

    Forrin, Noah D; MacLeod, Colin M

    2018-01-01

    In the color-word contingency learning paradigm, each word appears more often in one color (high contingency) than in the other colors (low contingency). Shortly after beginning the task, color identification responses become faster on the high-contingency trials than on the low-contingency trials-the contingency learning effect. Across five groups, we varied the high-contingency proportion in 10% steps, from 80% to 40%. The size of the contingency learning effect was positively related to high-contingency proportion, with the effect disappearing when high contingency was reduced to 40%. At the two highest contingency proportions, the magnitude of the effect increased over trials, the pattern suggesting that there was an increasing cost for the low-contingency trials rather than an increasing benefit for the high-contingency trials. Overall, the results fit a modified version of Schmidt's (2013, Acta Psychologica, 142, 119-126) parallel episodic processing account in which prior trial instances are routinely retrieved from memory and influence current trial performance.

  5. Fault lubrication during earthquakes.

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s⁻¹) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s⁻¹. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  6. Housing Damage Following Earthquake

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grained materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  7. Earthquake engineering for nuclear facilities

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  8. Earthquake resistant design of structures

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and the analysis of earthquake damage, the equivalent static analysis method and its application, dynamic analysis methods such as time history analysis by mode superposition and by direct integration, and design spectrum analysis for earthquake-resistant design in Korea, including the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.

  9. Foreshock occurrence rates before large earthquakes worldwide

    Reasenberg, P.A.

    1999-01-01

    Global rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured, using earthquakes listed in the Harvard CMT catalog for the period 1978-1996. These rates are similar to those measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering, which is based on patterns of small and moderate aftershocks in California, and were found to exceed the California model by a factor of approximately 2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, registered a high foreshock rate, while a minority, located in continental thrust belts, measured a low rate. These differences may explain why previous surveys have revealed low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical of continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich.

  10. The severity of an earthquake

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments
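The logarithmic character of magnitude can be made concrete with a small sketch. This is generic textbook material rather than part of the record above; the 1.5 coefficient is the usual Gutenberg-Richter energy-magnitude relation.

```python
# Generic illustration (not from the record above): magnitude is logarithmic,
# so +1 magnitude unit means 10x the recorded wave amplitude and, via the
# Gutenberg-Richter energy relation log10(E) ~ 1.5*M + const,
# roughly 31.6x the radiated seismic energy.

def amplitude_ratio(m1: float, m2: float) -> float:
    """Ratio of recorded wave amplitudes for magnitudes m1 vs m2."""
    return 10.0 ** (m1 - m2)

def energy_ratio(m1: float, m2: float) -> float:
    """Approximate ratio of radiated seismic energies."""
    return 10.0 ** (1.5 * (m1 - m2))

print(amplitude_ratio(7.0, 6.0))         # -> 10.0
print(round(energy_ratio(7.0, 6.0), 1))  # -> 31.6
```

Intensity, by contrast, has no such formula: it is an observed, location-dependent ranking of shaking effects, which is why the two measures cannot be converted directly into one another.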

  11. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    , and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  12. Education for Earthquake Disaster Prevention in the Tokyo Metropolitan Area

    Oki, S.; Tsuji, H.; Koketsu, K.; Yazaki, Y.

    2008-12-01

    Japan frequently suffers from all types of disasters such as earthquakes, typhoons, floods, volcanic eruptions, and landslides. In the first half of this year alone, we already had three big earthquakes and heavy rainfall, which killed more than 30 people. This is not unique to Japan: Asia is the most disaster-afflicted region in the world, accounting for about 90% of all those affected by disasters, and more than 50% of the total fatalities and economic losses. One of the most essential ways to reduce the damage of natural disasters is to educate the general public so that they understand what is going on during such disasters. This leads individuals to make sound decisions on what to do to prevent or reduce the damage. The Ministry of Education, Culture, Sports, Science and Technology (MEXT) therefore invited public subscriptions to choose several model areas for adopting scientific education in local elementary schools, and ERI, the Earthquake Research Institute, was selected to develop education for earthquake disaster prevention in the Tokyo metropolitan area. The tectonic setting of this area is very complicated; the Pacific and Philippine Sea plates are subducting beneath the North America and Eurasia plates. The subduction of the Philippine Sea plate causes mega-thrust earthquakes such as the 1703 Genroku earthquake (M 8.0) and the 1923 Kanto earthquake (M 7.9), which caused 105,000 fatalities. A magnitude 7 or greater earthquake beneath this area has recently been evaluated to occur with a probability of 70% in 30 years. This is of immediate concern for the devastating loss of life and property, because the Tokyo urban region now has a population of 42 million and is the center of approximately 40% of the nation's activities, whose disruption would have great global economic repercussions. To better understand earthquakes in this region, the "Special Project for Earthquake Disaster Mitigation in Tokyo Metropolitan Area" has been conducted, mainly by ERI. 
It is a 4-year

  13. Statistical validation of earthquake related observations

    Kossobokov, V. G.

    2011-12-01

    optional "antipodal strategy", one can make the predictions efficient, so that the wins systematically outscore the losses. This sounds easy; however, many precursor phenomena lack information on rigorous controls and, in many cases, even the necessary precondition of any scientific study, i.e., an unambiguous definition of the "precursor/signal". On the other hand, understanding the complexity of the seismic process, along with its non-stationary, hierarchically organized behavior, has already led to a reproducible intermediate-term, middle-range earthquake prediction technique that has passed control tests in forward real-time applications during at least the last two decades. In particular, the place and time of each of the mega-earthquakes of 27 February 2010 in Chile and 11 March 2011 in Japan were recognized as being in a state of increased probability in advance of their occurrence in the Global Test of the algorithms M8 and MSc, ongoing since 1992. These observations, in conjunction with a retrospective analysis of seismic activity preceding the 26 December 2004 Indian Ocean earthquake and other mega-earthquakes of the 20th century, give grounds for assuming that the algorithms, whose effectiveness has been validated in the magnitude ranges M7.5+ and M8.0+, are applicable to predicting the mega-earthquakes as well.

  14. Nucleation speed limit on remote fluid induced earthquakes

    Parsons, Thomas E.; Akinci, Aybige; Malagnini, Luca

    2017-01-01

    Earthquakes triggered by other remote seismic events are explained as a response to long-traveling seismic waves that temporarily stress the crust. However, delays of hours or days after seismic waves pass through are reported by several studies, which are difficult to reconcile with the transient stresses imparted by seismic waves. We show that these delays are proportional to magnitude and that nucleation times are best fit to a fluid diffusion process if the governing rupture process involves unlocking a magnitude-dependent critical nucleation zone. It is well established that distant earthquakes can strongly affect the pressure and distribution of crustal pore fluids. Earth’s crust contains hydraulically isolated, pressurized compartments in which fluids are contained within low-permeability walls. We know that strong shaking induced by seismic waves from large earthquakes can change the permeability of rocks. Thus, the boundary of a pressurized compartment may see its permeability rise. Previously confined, overpressurized pore fluids may then diffuse away, infiltrate faults, decrease their strength, and induce earthquakes. Magnitude-dependent delays and critical nucleation zone conclusions can also be applied to human-induced earthquakes.
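The diffusion-controlled delay idea can be caricatured in a few lines. This is a back-of-the-envelope sketch under invented assumptions, not the authors' fitted model: the hydraulic diffusivity `diffusivity_m2_s`, the reference length `l0_m`, and the magnitude scaling of the critical nucleation zone are all placeholders, chosen only to show how a diffusion time across a magnitude-dependent zone yields magnitude-dependent delays.

```python
# Placeholder values throughout: hydraulic diffusivity D, reference length l0,
# and the magnitude scaling of the nucleation-zone size are all invented.
def nucleation_delay_hours(magnitude: float,
                           diffusivity_m2_s: float = 1.0,
                           l0_m: float = 1.0) -> float:
    """Diffusion time t ~ L^2 / (4*D) across an assumed magnitude-dependent
    critical nucleation zone of size L = l0 * 10**(0.5 * M)."""
    length_m = l0_m * 10.0 ** (0.5 * magnitude)
    t_seconds = length_m ** 2 / (4.0 * diffusivity_m2_s)
    return t_seconds / 3600.0

for m in (3.0, 4.0, 5.0):
    print(m, round(nucleation_delay_hours(m), 2))
```

Under this assumed scaling each added magnitude unit multiplies the delay tenfold, so larger triggered events would nucleate after longer waits, qualitatively matching the magnitude-dependent delays described above.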

  15. Nucleation speed limit on remote fluid-induced earthquakes

    Parsons, Tom; Malagnini, Luca; Akinci, Aybige

    2017-01-01

    Earthquakes triggered by other remote seismic events are explained as a response to long-traveling seismic waves that temporarily stress the crust. However, delays of hours or days after seismic waves pass through are reported by several studies, which are difficult to reconcile with the transient stresses imparted by seismic waves. We show that these delays are proportional to magnitude and that nucleation times are best fit to a fluid diffusion process if the governing rupture process involves unlocking a magnitude-dependent critical nucleation zone. It is well established that distant earthquakes can strongly affect the pressure and distribution of crustal pore fluids. Earth’s crust contains hydraulically isolated, pressurized compartments in which fluids are contained within low-permeability walls. We know that strong shaking induced by seismic waves from large earthquakes can change the permeability of rocks. Thus, the boundary of a pressurized compartment may see its permeability rise. Previously confined, overpressurized pore fluids may then diffuse away, infiltrate faults, decrease their strength, and induce earthquakes. Magnitude-dependent delays and critical nucleation zone conclusions can also be applied to human-induced earthquakes. PMID:28845448

  16. Accounting for orphaned aftershocks in the earthquake background rate

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
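The branching-ratio arithmetic behind the last point is simple enough to sketch. Assuming the standard ETAS-style cascade picture (this is not code from the study itself), the expected number of descendants of one spontaneous event over all generations is a geometric series:

```python
# Sketch of ETAS-style cascade bookkeeping, not code from the study:
# with branching ratio n (mean number of directly triggered aftershocks
# per event), the expected total cascade per spontaneous event is
# n + n**2 + n**3 + ... = n / (1 - n), which blows up as n approaches 1.
def expected_descendants(n: float) -> float:
    """Expected number of triggered events, summed over all generations."""
    if not 0.0 <= n < 1.0:
        raise ValueError("branching ratio must lie in [0, 1) for a finite cascade")
    return n / (1.0 - n)

for n in (0.5, 0.9, 0.99):
    print(n, expected_descendants(n))
```

A branching ratio near unity, as inferred for California above, therefore implies enormous triggered cascades and very few earthquakes that are truly isolated from the rest of the fault system.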

  17. Designing an optimally proportional inorganic scintillator

    Singh, Jai, E-mail: jai.singh@cdu.edu.au [School of Engineering and IT, B-Purple-12, Faculty of EHSE, Charles Darwin University, NT 0909 (Australia); Koblov, Alexander [School of Engineering and IT, B-Purple-12, Faculty of EHSE, Charles Darwin University, NT 0909 (Australia)

    2012-09-01

    The nonproportionality observed in the light yield of inorganic scintillators is studied theoretically as a function of the rates of bimolecular and Auger quenching processes occurring within the electron track initiated by a gamma- or X-ray photon incident on a scintillator. Assuming a cylindrical track, the influence of the track radius and concentration of excitations created within the track on the scintillator light yield is also studied. Analysing the calculated light yield a guideline for inventing an optimally proportional scintillator with optimal energy resolution is presented.
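The competition between linear radiative decay and higher-order quenching can be sketched with a toy rate model. The functional form below is a common simplification and the rate constants are arbitrary; it is not the authors' calculation, only an illustration of why light yield per excitation drops as the excitation density in the track rises.

```python
import numpy as np

# Toy model with arbitrary rate constants (not the authors' equations):
# radiative decay ~ r1*n competes with bimolecular (r2*n**2) and
# Auger (r3*n**3) quenching, so the light yield per excitation is
#   Y(n) = r1*n / (r1*n + r2*n**2 + r3*n**3)
def light_yield(n, r1=1.0, r2=1e-3, r3=1e-6):
    n = np.asarray(n, dtype=float)
    return (r1 * n) / (r1 * n + r2 * n**2 + r3 * n**3)

densities = np.array([1.0, 10.0, 100.0, 1000.0])
print(light_yield(densities))  # yield falls as excitation density rises
```

Since the excitation density along a track varies with the energy of the incident photon, a density-dependent yield of this kind directly produces a nonproportional light output; an "optimally proportional" scintillator is one whose quenching constants keep Y(n) flat over the relevant densities.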

  18. Designing an optimally proportional inorganic scintillator

    Singh, Jai; Koblov, Alexander

    2012-01-01

    The nonproportionality observed in the light yield of inorganic scintillators is studied theoretically as a function of the rates of bimolecular and Auger quenching processes occurring within the electron track initiated by a gamma- or X-ray photon incident on a scintillator. Assuming a cylindrical track, the influence of the track radius and concentration of excitations created within the track on the scintillator light yield is also studied. Analysing the calculated light yield a guideline for inventing an optimally proportional scintillator with optimal energy resolution is presented.

  19. Rate dependent image distortions in proportional counters

    Trow, M.W.; Bento, A.C.; Smith, A.

    1994-01-01

    The positional linearity of imaging proportional counters is affected by the intensity distribution of the incident radiation. A mechanism for this effect is described, in which drifting positive ions in the gas produce a distorting electric field which perturbs the trajectories of the primary electrons. In certain cases, the phenomenon causes an apparent improvement of the position resolution. We demonstrate the effect in a detector filled with a xenon-argon-CO₂ mixture. The images obtained are compared with the results of a simulation. If quantitative predictions for a particular detector are required, accurate values of the absolute detector gain, ion mobility and electron drift velocity are needed. (orig.)

  20. Hydrogen high pressure proportional drift detector

    Arefiev, A.; Balaev, A.

    1983-01-01

    The design and operating performance of a proportional drift detector (PDD) are described. The high sensitivity of the PDD makes it possible to detect neutron-proton elastic scattering at recoil-proton energies as low as 1 keV. The PDD is filled with hydrogen to a pressure of 40 bar. High purity of the gas is maintained by a continuously operating purification system. The detector has been operating for several years in a neutron beam at the North Area of the CERN SPS.

  1. Natural time analysis of the Centennial Earthquake Catalog

    Sarlis, N. V.; Christopoulos, S.-R. G.

    2012-01-01

    By using the most recent version (1900–2007) of the Centennial Earthquake Catalog, we examine the properties of the global seismicity. Natural time analysis reveals that the fluctuations of the order parameter κ1 of seismicity exhibit, over at least three orders of magnitude, a characteristic feature similar to that of the order parameter for other equilibrium or non-equilibrium critical systems, including self-organized critical systems. Moreover, we find non-trivial magnitude correlations for earthquakes of magnitude greater than or equal to 7.
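
For orientation, the order parameter κ1 of natural time analysis is the variance of the natural time χk = k/N weighted by the normalised event energies pk = Qk/ΣQ. A minimal sketch, with made-up event energies rather than catalogue data:

```python
def kappa1(energies):
    """Order parameter of seismicity in natural time analysis:
    kappa1 = <chi^2> - <chi>^2, where chi_k = k/N is the natural time of
    the k-th event and the averages are weighted by p_k = Q_k / sum(Q),
    the normalised energy of each event."""
    n = len(energies)
    total = sum(energies)
    p = [q / total for q in energies]
    chi = [(k + 1) / n for k in range(n)]
    mean = sum(pk * c for pk, c in zip(p, chi))
    mean_sq = sum(pk * c * c for pk, c in zip(p, chi))
    return mean_sq - mean ** 2

# Equal energies reduce kappa1 to the variance of a uniform grid on (0, 1],
# which tends to 1/12 as the number of events grows.
value = kappa1([1.0] * 1000)
```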

  2. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
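
The competing-rates idea can be sketched numerically: an event is a foreshock with probability equal to the foreshock rate divided by the total rate, and an Omori-law aftershock term temporarily inflates the total. All rates and Omori parameters below are hypothetical, not the paper's calibrated values:

```python
def foreshock_probability(rate_foreshock, rate_background, k, c, p, t):
    """Illustrative version of the abstract's argument: aftershocks of a
    previous mainshock add an Omori-law rate k/(t + c)**p (t in days since
    that mainshock) to the background, lowering the chance that a given
    nearby event is a foreshock.  All rates are hypothetical, per day."""
    rate_aftershock = k / (t + c) ** p
    return rate_foreshock / (rate_foreshock + rate_background + rate_aftershock)

# Just after a mainshock, aftershocks dominate and the probability is low;
# years later it recovers toward its long-term (aftershock-free) value.
early = foreshock_probability(0.01, 0.05, 10.0, 0.1, 1.0, t=1.0)
late = foreshock_probability(0.01, 0.05, 10.0, 0.1, 1.0, t=1000.0)
```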

  3. Global Geomorphology

    Douglas, I.

    1985-01-01

    Any global view of landforms must include an evaluation of the link between plate tectonics and geomorphology. To explain the broad features of the continents and ocean floors, a basic distinction between the tectogene and cratogene part of the Earth's surface must be made. The tectogene areas are those that are dominated by crustal movements, earthquakes and volcanicity at the present time and are essentially those of the great mountain belts and mid ocean ridges. Cratogene areas comprise the plate interiors, especially the old lands of Gondwanaland and Laurasia. Fundamental as this division between plate margin areas and plate interiors is, it cannot be said to be a simple case of a distinction between tectonically active and stable areas. Indeed, in terms of megageomorphology, former plate margins and tectonic activity up to 600 million years ago have to be considered.

  4. Viking-Age Sails: Form and Proportion

    Bischoff, Vibeke

    2017-04-01

    Archaeological ship-finds have shed much light on the design and construction of vessels from the Viking Age. However, the exact proportions of their sails remain unknown due to the lack of fully preserved sails, or other definite indicators of their proportions. Key Viking-Age ship-finds from Scandinavia—the Oseberg Ship, the Gokstad Ship and Skuldelev 3—have all revealed traces of rigging. In all three finds, the keelson—with the mast position—is preserved, together with fastenings for the sheets and the tack, indicating the breadth of the sail. The sail area can then be estimated based on practical experience of how large a sail the specific ship can carry, in conjunction with hull form and displacement. This article presents reconstructions of the form and dimensions of rigging and sail based on the archaeological finds, evidence from iconographic and written sources, and ethnographic parallels with traditional Nordic boats. When these sources are analysed, not only do the similarities become apparent, but so too does the relative disparity between the archaeological record and the other sources. Preferential selection in terms of which source is given the greatest merit is therefore required, as it is not possible to afford them all equal value.

  5. Amplifier Design for Proportional Ionization Chambers

    Baker, W. H.

    1950-08-24

    This paper presents the requirements of a nuclear amplifier of short resolving time, designed to accept pulses of widely varying amplitudes. Data are given which show that a proportional ionization chamber loaded with a 1,000-ohm resistor develops pulses of 0.5 microsecond duration and several volts amplitude. Results indicate that seven basic requirements are imposed on the amplifier when counting soft beta and gamma radiation in the presence of alpha particles, without absorbers. It should: (1) have a fast recovery time; (2) have a relatively good low-frequency response; (3) accept pulses of widely varying heights without developing spurious pulses; (4) have a limiting output stage; (5) preserve the inherently short rise time of the chamber; (6) minimize pulse integration; and (7) have sufficient gain to detect the weak pulses well below the chamber voltage at which continuous discharge takes place. The results obtained with an amplifier which meets these requirements are described. A formula is derived which indicates that redesign of the proportional ionization chamber might eliminate the need for an amplifier. This may be possible if the radioactive particles are collimated parallel to the collecting electrode.

  6. Divine proportions in attractive and nonattractive faces.

    Pancherz, Hans; Knapp, Verena; Erbe, Christina; Heiss, Anja Melina

    2010-01-01

    To test Ricketts' 1982 hypothesis that facial beauty is measurable by comparing attractive and nonattractive faces of females and males with respect to the presence of the divine proportions. The analysis of frontal view facial photos of 90 cover models (50 females, 40 males) from famous fashion magazines and of 34 attractive (29 females, five males) and 34 nonattractive (13 females, 21 males) persons selected from a group of former orthodontic patients was carried out in this study. Based on Ricketts' method, five transverse and seven vertical facial reference distances were measured and compared with the corresponding calculated divine distances expressed as phi relationships (φ = 1.618). Furthermore, transverse and vertical facial disproportion indices were created. For both the models and patients, all the reference distances varied largely from the respective divine values. The average deviations ranged from 0.3% to 7.8% in the female groups of models and attractive patients with no difference between them. In the male groups of models and attractive patients, the average deviations ranged from 0.2% to 11.2%. When comparing attractive and nonattractive female, as well as male, patients, deviations from the divine values for all variables were larger in the nonattractive sample. Attractive individuals have facial proportions closer to the divine values than nonattractive ones. In accordance with the hypothesis of Ricketts, facial beauty is measurable to some degree.

  7. Kalman-predictive-proportional-integral-derivative (KPPID)

    Fluerasu, A.; Sutton, M.

    2004-01-01

    With third generation synchrotron X-ray sources, it is possible to acquire detailed structural information about the system under study with time resolution orders of magnitude faster than was possible a few years ago. These advances have generated many new challenges for changing and controlling the state of the system on very short time scales, in a uniform and controlled manner. For our particular X-ray experiments on crystallization or order-disorder phase transitions in metallic alloys, we need to change the sample temperature by hundreds of degrees as fast as possible while avoiding over- or undershooting. To achieve this, we designed and implemented a computer-controlled temperature tracking system which combines standard Proportional-Integral-Derivative (PID) feedback, thermal modeling and finite difference thermal calculations (feedforward), and Kalman filtering of the temperature readings in order to reduce the noise. The resulting Kalman-Predictive-Proportional-Integral-Derivative (KPPID) algorithm allows us to obtain accurate control, to minimize the response time and to avoid over- or undershooting, even in systems with inherently noisy temperature readings and time delays. The KPPID temperature controller was successfully implemented at the Advanced Photon Source at Argonne National Laboratory and was used to perform coherent and time-resolved X-ray diffraction experiments.
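
The core idea, filtering the noisy measurement before the control law acts on it, can be sketched in one dimension. This is a toy illustration of the combination, not the authors' KPPID implementation; the plant model, gains, and noise variances are all assumed:

```python
import random

class KalmanPID:
    """Toy 1-D version of the KPPID idea: a scalar Kalman filter (with a
    random-walk state model) smooths noisy temperature readings, and a
    PID law acts on the filtered estimate instead of the raw reading."""
    def __init__(self, kp, ki, kd, q=0.01, r=1.0, dt=0.1):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.q, self.r = q, r          # process / measurement noise variances
        self.x, self.p = 0.0, 1.0      # filtered state and its variance
        self.integral, self.prev_err = 0.0, 0.0

    def update(self, measurement, setpoint):
        # Kalman predict (random-walk model) and correct
        self.p += self.q
        gain = self.p / (self.p + self.r)
        self.x += gain * (measurement - self.x)
        self.p *= (1.0 - gain)
        # PID acting on the filtered temperature
        err = setpoint - self.x
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Usage sketch: drive a simple first-order plant to a 100-degree setpoint
# through noisy measurements (plant model and gains are illustrative).
random.seed(1)
controller = KalmanPID(kp=2.0, ki=0.5, kd=0.0)
temperature = 20.0
for _ in range(2000):
    noisy = temperature + random.gauss(0.0, 0.5)
    power = controller.update(noisy, setpoint=100.0)
    temperature += 0.1 * (power - 0.1 * temperature)
```

The paper's full algorithm adds a feedforward (predictive) term from a thermal model, which the pure feedback loop above omits.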

  8. Proportional-Integral-Resonant AC Current Controller

    STOJIC, D.

    2017-02-01

    In this paper an improved stationary-frame AC current controller based on the proportional-integral-resonant control action (PIR is proposed. Namely, the novel two-parameter PIR controller is applied in the stationary-frame AC current control, accompanied by the corresponding parameter-tuning procedure. In this way, the proportional-resonant (PR controller, common in the stationary-frame AC current control, is extended by the integral (I action in order to enable the AC current DC component tracking, and, also, to enable the DC disturbance compensation, caused by the voltage source inverter (VSI nonidealities and by nonlinear loads. The proposed controller parameter-tuning procedure is based on the three-phase back-EMF-type load, which corresponds to a wide range of AC power converter applications, such as AC motor drives, uninterruptible power supplies, and active filters. While the PIR controllers commonly have three parameters, the novel controller has two. Also, the provided parameter-tuning procedure needs only one parameter to be tuned in relation to the load and power converter model parameters, since the second controller parameter is directly derived from the required controller bandwidth value. The dynamic performance of the proposed controller is verified by means of simulation and experimental runs.
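
A discrete-time sketch of the three control actions may help: the integral term gives infinite gain at DC, while the resonant term, an undamped second-order section tuned at ω0, gives infinite gain at the AC frequency, which is what enables zero steady-state error when tracking a sinusoid. The gains and step size below are illustrative, not the paper's tuning:

```python
import math

def pir_output(errors, kp, ki, kr, w0, dt):
    """Discrete sketch of a proportional-integral-resonant (PIR) law:
    u = kp*e + ki*integral(e) + resonant term with continuous-time
    transfer function kr*s/(s^2 + w0^2), stepped here with simple
    (semi-implicit) Euler integration.  Illustrative only."""
    integ = 0.0
    r1, r2 = 0.0, 0.0          # resonator states: r1'' + w0^2 * r1 = e
    out = []
    for e in errors:
        integ += e * dt
        r2 += (e - w0 * w0 * r1) * dt
        r1 += r2 * dt
        out.append(kp * e + ki * integ + kr * r2)
    return out

# Feeding an error sinusoid at exactly w0 excites the resonator, whose
# response grows without bound: the infinite-gain property used for
# zero-error AC tracking.
w0 = 2 * math.pi * 50.0
dt = 1e-5
e = [math.sin(w0 * k * dt) for k in range(20000)]
u = pir_output(e, kp=1.0, ki=100.0, kr=1000.0, w0=w0, dt=dt)
```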

  9. Searching for evidence of a preferred rupture direction in small earthquakes at Parkfield

    Kane, D. L.; Shearer, P. M.; Allmann, B.; Vernon, F. L.

    2009-12-01

    Theoretical modeling of strike-slip ruptures along a bimaterial interface suggests that the interface will have a preferred rupture direction and will produce asymmetric ground motion (Shi and Ben-Zion, 2006). This could have widespread implications for earthquake source physics and for hazard analysis on mature faults because larger ground motions would be expected in the direction of rupture propagation. Studies have shown that many large global earthquakes exhibit unilateral rupture, but a consistently preferred rupture direction along faults has not been observed. Some researchers have argued that the bimaterial interface model does not apply to natural faults, noting that the rupture of the M 6 2004 Parkfield earthquake propagated in the opposite direction from previous M 6 earthquakes along that section of the San Andreas Fault (Harris and Day, 2005). We analyze earthquake spectra from the Parkfield area to look for evidence of consistent rupture directivity along the San Andreas Fault. We separate the earthquakes into spatially defined clusters and quantify the differences in high-frequency energy among earthquakes recorded at each station. Propagation path effects are minimized in this analysis because we compare earthquakes located within a small volume and recorded by the same stations. By considering a number of potential end-member models, we seek to determine if a preferred rupture direction is present among small earthquakes at Parkfield.

  10. Generation of earthquake signals

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level are considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
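
The digital route described in the report, generating a time history by band-pass filtering white noise, can be sketched with a simple FFT brick-wall filter. The band limits, sampling rate, and length are assumptions for illustration, and a real accelerogram would additionally be shaped to match a target response spectrum:

```python
import numpy as np

def synthetic_accelerogram(n=4096, fs=100.0, f_lo=0.5, f_hi=10.0, seed=0):
    """Generate an earthquake-like acceleration time history by band-pass
    filtering Gaussian white noise.  A brick-wall filter in the frequency
    domain stands in for the analogue or digital filters discussed in the
    report; fs is the sampling rate in Hz, f_lo/f_hi the pass band."""
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal(n)
    spectrum = np.fft.rfft(noise)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    spectrum[(freqs < f_lo) | (freqs > f_hi)] = 0.0  # zero out-of-band bins
    return np.fft.irfft(spectrum, n)

acc = synthetic_accelerogram()
```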

  11. Earthquakes Threaten Many American Schools

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  12. Make an Earthquake: Ground Shaking!

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  13. Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view

    Locati, Mario; Rovida, Andrea; Albini, Paola

    2014-05-01

    Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are forced to perform their daily job using general-purpose tools and/or coding their own to address their specific tasks. The lack of out-of-the-box tools expressly conceived to deal with historical data leads to a huge amount of time lost performing tedious tasks: searching for the data and manually reformatting it in order to jump from one tool to the other, sometimes causing a loss of the original data. This reality is common to all activities related to the study of earthquakes of the past centuries, from the interpretation of past historical sources to the compilation of earthquake catalogues. A platform able to preserve historical earthquake data, trace back their source, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (Global Earthquake History, GEM), two new data portals were designed and implemented. The European portal "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA) are aimed at addressing at least some of the above-mentioned issues. The availability of these new portals and their well-defined standards makes the development of side tools for archiving, publishing and processing the available historical earthquake data easier than before. The AHEAD and GHEA portals, their underlying technologies and the developed side tools are presented.

  14. Earthquake Catalogue of the Caucasus

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provides an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved the locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  15. Hepatitis E virus seroepidemiology: a post-earthquake study among blood donors in Nepal

    Ashish C. Shrestha

    2016-11-01

    Background As one of the causative agents of viral hepatitis, hepatitis E virus (HEV has gained public health attention globally. HEV epidemics occur in developing countries, associated with faecal contamination of water and poor sanitation. In industrialised nations, HEV infections are associated with travel to countries endemic for HEV, however, autochthonous infections, mainly through zoonotic transmission, are increasingly being reported. HEV can also be transmitted by blood transfusion. Nepal has experienced a number of HEV outbreaks, and recent earthquakes resulted in predictions raising the risk of an HEV outbreak to very high. This study aimed to measure HEV exposure in Nepalese blood donors after large earthquakes. Methods Samples (n = 1,845 were collected from blood donors from Kathmandu, Chitwan, Bhaktapur and Kavre. Demographic details, including age and sex along with possible risk factors associated with HEV exposure were collected via a study-specific questionnaire. Samples were tested for HEV IgM, IgG and antigen. The proportion of donors positive for HEV IgM or IgG was calculated overall, and for each of the variables studied. Chi square and regression analyses were performed to identify factors associated with HEV exposure. Results Of the donors residing in earthquake affected regions (Kathmandu, Bhaktapur and Kavre, 3.2% (54/1,686; 95% CI 2.7–4.0% were HEV IgM positive and two donors were positive for HEV antigen. Overall, 41.9% (773/1,845; 95% CI 39.7–44.2% of donors were HEV IgG positive, with regional variation observed. Higher HEV IgG and IgM prevalence was observed in donors who reported eating pork, likely an indicator of zoonotic transmission. Previous exposure to HEV in Nepalese blood donors is relatively high. Conclusion Detection of recent markers of HEV infection in healthy donors suggests recent asymptomatic HEV infection and therefore transfusion-transmission in vulnerable patients is a risk in

  16. Hepatitis E virus seroepidemiology: a post-earthquake study among blood donors in Nepal.

    Shrestha, Ashish C; Flower, Robert L P; Seed, Clive R; Rajkarnikar, Manita; Shrestha, Shrawan K; Thapa, Uru; Hoad, Veronica C; Faddy, Helen M

    2016-11-25

    As one of the causative agents of viral hepatitis, hepatitis E virus (HEV) has gained public health attention globally. HEV epidemics occur in developing countries, associated with faecal contamination of water and poor sanitation. In industrialised nations, HEV infections are associated with travel to countries endemic for HEV, however, autochthonous infections, mainly through zoonotic transmission, are increasingly being reported. HEV can also be transmitted by blood transfusion. Nepal has experienced a number of HEV outbreaks, and recent earthquakes resulted in predictions raising the risk of an HEV outbreak to very high. This study aimed to measure HEV exposure in Nepalese blood donors after large earthquakes. Samples (n = 1,845) were collected from blood donors from Kathmandu, Chitwan, Bhaktapur and Kavre. Demographic details, including age and sex along with possible risk factors associated with HEV exposure were collected via a study-specific questionnaire. Samples were tested for HEV IgM, IgG and antigen. The proportion of donors positive for HEV IgM or IgG was calculated overall, and for each of the variables studied. Chi square and regression analyses were performed to identify factors associated with HEV exposure. Of the donors residing in earthquake affected regions (Kathmandu, Bhaktapur and Kavre), 3.2% (54/1,686; 95% CI 2.7-4.0%) were HEV IgM positive and two donors were positive for HEV antigen. Overall, 41.9% (773/1,845; 95% CI 39.7-44.2%) of donors were HEV IgG positive, with regional variation observed. Higher HEV IgG and IgM prevalence was observed in donors who reported eating pork, likely an indicator of zoonotic transmission. Previous exposure to HEV in Nepalese blood donors is relatively high. Detection of recent markers of HEV infection in healthy donors suggests recent asymptomatic HEV infection and therefore transfusion-transmission in vulnerable patients is a risk in Nepal. Surprisingly, this study did not provide evidence of a large
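
The point estimates quoted above can be reproduced directly from the reported counts. The abstract does not state which confidence-interval method was used, so the normal-approximation (Wald) interval below only approximates the published bounds:

```python
import math

def prevalence_ci(positives, n, z=1.96):
    """Point estimate and normal-approximation (Wald) 95% confidence
    interval for a seroprevalence.  The paper's exact CI method is not
    stated, so these bounds only approximate the published ones."""
    p = positives / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half), min(1.0, p + half)

# IgM among donors in earthquake-affected regions: 54 of 1,686
p, lo, hi = prevalence_ci(54, 1686)
# IgG overall: 773 of 1,845
p2, lo2, hi2 = prevalence_ci(773, 1845)
```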

  17. Testing earthquake source inversion methodologies

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  18. BCE selector valves and flow proportional sampler

    Rippy, G.L.

    1994-01-01

    This Acceptance Test Procedure (ATP) has been prepared to demonstrate that the Electrical/Instrumentation systems for the B-Plant Process Condensate Treatment Facility (BCE) function as required by project criteria. Tests will be run to: verify the operation of the solenoid valve and associated limit switches installed for the BCE portion of W-007H; operate the solenoid valve and verify the proper operation of the associated limit switches based on the position of the solenoid valve; and demonstrate the integrity of the Sample Failure Alarm Relay XFA-211BA-BCE-1 and Power Failure Alarm Relay JFA-211BA-BCE-1 located inside the Flow Proportional Sampler in Building 211 BA.

  19. Absolute calibration of TFTR helium proportional counters

    Strachan, J.D.; Diesso, M.; Jassby, D.; Johnson, L.; McCauley, S.; Munsat, T.; Roquemore, A.L.; Loughlin, M.

    1995-06-01

    The TFTR helium proportional counters are located in the central five (5) channels of the TFTR multichannel neutron collimator. These detectors were absolutely calibrated using a 14 MeV neutron generator positioned at the horizontal midplane of the TFTR vacuum vessel. The neutron generator position was scanned in centimeter steps to determine the collimator aperture width to 14 MeV neutrons and the absolute sensitivity of each channel. Neutron profiles were measured for TFTR plasmas with time resolution between 5 msec and 50 msec depending upon count rates. The He detectors were used to measure the burnup of 1 MeV tritons in deuterium plasmas, the transport of tritium in trace tritium experiments, and the residual tritium levels in plasmas following 50:50 DT experiments

  20. CWRU multiwire proportional counter readout system

    Bevington, P.R.; Leskovec, R.A.

    1977-01-01

    An electronic system is described which translates pulses from individual wires of multiwire proportional counters into binary addresses indicating the location of the wires in the chambers. The system combines a fast (<100 ns) serial scan of an event buffer with parallel encoding to provide fast transfer of addresses (250 ns per hit). The buffer has provision for disabling the input less than 40 ns after detection of an event to suppress recording of multiple hits caused by individual events. The encoder can digitize the address of every hit encountered or just the first addresses of contiguous hits. The system includes a coincidence trigger for determining whether timing criteria have been satisfied between chambers and with external devices. Events which do not meet the coincidence criteria are typically reset within 400 ns. The addresses are transferred to a computer interface through CAMAC modules. Multiple buffering permits further data acquisition during CAMAC transfer cycles.

  1. Fabrication of preamplifier for proportional counter

    Lotfi, Y.; Yazdanpanah, M.; Talebi, B.; Mohammadi, A.; Etaati, Gh.

    2002-01-01

    We describe techniques of preamplifier fabrication for a proportional counter. First, the electronic circuit of the preamplifier was analyzed by means of OrCAD 9.1. Then we assembled the circuit. Thereafter, the essential and standard parameters of the preamplifier were measured and compared with a foreign-made one according to the IEEE standard method (IEEE Std 301-1988). The specifications of our preamplifier are: 1. Rise time of output pulse: 25 ns. 2. Fall time of output pulse: 50 μs. 3. Charge sensitivity: 46.3 mV/pC. 4. Average noise: 500 ion pairs (rms). 5. Count rate limit: 9.14×10¹⁰ counts/s. 6. Resolution: 1.3%. 7. The spectrum of a BF3 detector with a 300 μCi Am-Be source is the same for this preamplifier as for the foreign one. On the whole, comparison of this preamplifier with the foreign one shows that their parameters agree to about 95%.

  2. Tables of Confidence Limits for Proportions

    1990-09-01

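
Entries in tables of this kind can be reproduced numerically. A stdlib-only sketch of the exact (Clopper-Pearson) one-sided upper confidence limit for a binomial proportion, the kind of quantity such tables tabulate (the example parameters are assumptions, not taken from the tables):

```python
import math

def upper_confidence_limit(successes, n, confidence=0.95):
    """Exact (Clopper-Pearson) one-sided upper confidence limit for a
    binomial proportion, found by bisecting on the binomial CDF."""
    if successes >= n:
        return 1.0
    alpha = 1.0 - confidence

    def cdf(p):  # P(X <= successes) for X ~ Binomial(n, p)
        return sum(math.comb(n, k) * p ** k * (1 - p) ** (n - k)
                   for k in range(successes + 1))

    lo, hi = 0.0, 1.0
    for _ in range(60):        # cdf decreases as p grows, so bisect
        mid = (lo + hi) / 2
        if cdf(mid) > alpha:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Zero successes in 59 trials: the upper 95% limit is close to 3/n,
# the well-known "rule of three" approximation.
limit = upper_confidence_limit(0, 59)
```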

  3. Proportional chamber with data analog output

    Popov, V.E.; Prokof'ev, A.N.

    1977-01-01

    A proportional multiwire chamber is described. The chamber makes it possible to determine the angles at which a pion strikes a polarized target. A delay line, made of 60-core flat cable, is used for reading signals out of the chamber. From the delay line, signals are amplified and successively injected into shapers and a time-to-amplitude converter. The amplitude of the time-to-amplitude converter output signal unambiguously determines the coordinate of the point at which a particle strikes the chamber plane. Circuits of the amplifiers are also given; each consists of a preamplifier with gain 30 and a main amplifier with adjustable gain. Data on testing the chamber with the 450 MeV pion beam are presented. The chamber has an efficiency of about 98 per cent under a load of 2×10⁵ s⁻¹.

  4. Proportional representation apportionment methods and their applications

    Pukelsheim, Friedrich

    2017-01-01

    The book offers an in-depth study of the translation of vote counts into seat numbers in proportional representation systems  – an approach guided by practical needs. It also provides plenty of empirical instances illustrating the results. It analyzes in detail the 2014 elections to the European Parliament in the 28 member states, as well as the 2009 and 2013 elections to the German Bundestag. This second edition is a complete revision and expanded version of the first edition published in 2014, and many empirical election results that serve as examples have been updated. Further, a final chapter has been added assembling biographical sketches and authoritative quotes from individuals who pioneered the development of apportionment methodology. The mathematical exposition and the interrelations with political science and constitutional jurisprudence make this an apt resource for interdisciplinary courses and seminars on electoral systems and apportionment methods.
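
The highest-averages rules analysed in the book translate vote counts into seat numbers mechanically. A compact sketch of the D'Hondt rule, one of the divisor methods, with made-up vote counts:

```python
import heapq

def dhondt(votes, seats):
    """Allocate seats by the D'Hondt highest-averages rule: repeatedly
    give the next seat to the party with the largest quotient
    votes / (seats_won + 1).  Ties here break alphabetically."""
    allocation = {party: 0 for party in votes}
    heap = [(-v, party) for party, v in votes.items()]  # max-heap via negation
    heapq.heapify(heap)
    for _ in range(seats):
        _, party = heapq.heappop(heap)
        allocation[party] += 1
        heapq.heappush(heap, (-votes[party] / (allocation[party] + 1), party))
    return allocation

result = dhondt({"A": 340, "B": 280, "C": 160, "D": 60}, 7)
```

Other divisor methods covered by the book (Sainte-Laguë, for instance) differ only in the divisor sequence applied to each party's vote count.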

  5. Quality measurement by proportional counter with B

    Onizuka, Yoshihiko; Endo, Satoru; Tanaka, Kenichi

    2005-01-01

    Dosimetry of air and of an acrylic tissue-equivalent phantom was carried out with a tissue-equivalent proportional counter (TEPC) and with a TEPC whose wall contains boron, and the results were compared. The changes of radiation quality with distance from the beam center are characterized by the frequency-mean lineal energy yF and the dose-mean lineal energy yD as indicators of quality. Both yF and yD of the tissue-equivalent phantom are larger than those of air, but no very large change was observed at any distance. The dose rate is determined from yD, the number of events and the measurement time. The change of dose rate was larger than the change of quality. The maximum dose rate for both the γ-ray and neutron beams was observed at a point 2 cm away from the center.
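
The two microdosimetric means used as quality indicators above follow directly from a measured lineal-energy spectrum. A minimal sketch, with illustrative bin values rather than the paper's data:

```python
def microdosimetric_means(y_bins, frequencies):
    """Frequency-mean and dose-mean lineal energy from a measured TEPC
    spectrum: y_F = sum(f*y)/sum(f) and y_D = sum(f*y^2)/sum(f*y), where
    f is the number of events recorded in each lineal-energy bin y."""
    f_tot = sum(frequencies)
    fy = sum(f * y for y, f in zip(y_bins, frequencies))
    fy2 = sum(f * y * y for y, f in zip(y_bins, frequencies))
    return fy / f_tot, fy2 / fy

# Illustrative three-bin spectrum; y_D always exceeds y_F because it
# weights each event by its energy deposit.
y_f, y_d = microdosimetric_means([1.0, 2.0, 3.0], [1.0, 1.0, 1.0])
```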

  6. Proportional counter system for radiation measurement

    Sugimoto, M; Okudera, S

    1970-11-21

    A gas such as Xe or Kr employed in counter tubes is charged into the counter tube of a gas-flow type proportional counter for radiation measurement and into a vessel having a volume larger than that of the counter tube. The vessel communicates with the counter tube to circulate the gas via a pump through both the vessel and tube during measurement. An organic film such as a polyester synthetic resin film is used for the window of the counter tube to measure X-rays in the long wavelength range. Accordingly, a wide range of X-rays can be measured, including both long and short wavelength ranges, by utilizing only one counter tube, thus permitting the gases employed to be used effectively.

  7. Count rate effect in proportional counters

    Bednarek, B.

    1980-01-01

    A new concept is presented explaining changes in spectrometric parameters of proportional counters which occur due to varying count rate. The basic feature of this concept is that the gas gain of the counter remains constant in a wide range of count rate and that the decrease in the pulse amplitude and the deterioration of the energy resolution observed are the results of changes in the shape of original current pulses generated in the active volume of the counter. In order to confirm the validity of this statement, measurements of the gas amplification factor have been made in a wide count rate range. It is shown that above a certain critical value the gas gain depends on both the operating voltage and the count rate. (author)

  8. Near real-time aftershock hazard maps for earthquakes

    McCloskey, J.; Nalbant, S. S.

    2009-04-01

    Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 a M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later, 20 km to the south-east. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event, it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area which were especially at risk and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts and for dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.

  9. Mw 8.5 BENGKULU EARTHQUAKES FROM CONTINUOUS GPS DATA

    W. A. W. Aris

    2016-09-01

    The Mw 8.5 Bengkulu earthquake of 30 September 2007 and the Mw 8.6 earthquake of 28 March 2005 are considered amongst the largest earthquakes ever recorded in Southeast Asia. The resulting tectonic deformation was recorded by a network of Global Positioning System (GPS) Continuously Operating Reference Stations (CORS) within southern Sumatra and the west coast of Peninsular Malaysia. Data from the GPS CORS network were used to investigate the characteristics of postseismic deformation due to the earthquakes. Analytical logarithmic and exponential functions were fitted to investigate the decay period of the postseismic deformation. This investigation provides a preliminary insight into the postseismic cycle along the Sumatra subduction zone in particular and into the dynamics of Peninsular Malaysia in general.
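
The logarithmic decay model commonly fitted to postseismic GPS time series can be sketched as follows. The model form u(t) = A·ln(1 + t/τ), the grid of relaxation times, and the synthetic data are illustrative assumptions, not the paper's actual fit:

```python
import math

# Postseismic displacement is often modeled as
#   u(t) = A * ln(1 + t / tau)
# For a fixed relaxation time tau the amplitude A has a closed-form
# least-squares solution; tau itself is found by a coarse grid search.
# The data below are synthetic, not the Bengkulu GPS series.

def fit_log_decay(t, u, taus):
    best = None
    for tau in taus:
        g = [math.log(1.0 + ti / tau) for ti in t]
        a = sum(gi * ui for gi, ui in zip(g, u)) / sum(gi * gi for gi in g)
        sse = sum((ui - a * gi) ** 2 for gi, ui in zip(g, u))
        if best is None or sse < best[2]:
            best = (tau, a, sse)
    return best[0], best[1]

if __name__ == "__main__":
    true_tau, true_a = 25.0, 40.0                     # days, mm
    t = [1.0 * k for k in range(1, 201)]
    u = [true_a * math.log(1.0 + ti / true_tau) for ti in t]
    tau, a = fit_log_decay(t, u, taus=[5.0, 10.0, 25.0, 50.0, 100.0])
    print(tau, round(a, 2))   # recovers the true (tau, A) for noise-free data
```

In practice the exponential alternative u(t) = A·(1 − exp(−t/τ)) would be fitted the same way and the two models compared by residual misfit.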

  10. Neoliberalism and criticisms of earthquake insurance arrangements in New Zealand.

    Hay, I

    1996-03-01

    Global collapse of the Fordist-Keynesian regime of accumulation and an attendant philosophical shift in New Zealand politics to neoliberalism have prompted criticisms of, and changes to, the Earthquake and War Damage Commission. Earthquake insurance arrangements made 50 years ago in an era of collectivist, welfarist political action are now set in an environment in which emphasis is given to competitive relations and individualism. Six specific criticisms of the Commission are identified, each of which is founded in the rhetoric and ideology of a neoliberal political project which has underpinned radical social and economic changes in New Zealand since the early 1980s. On the basis of those criticisms, and in terms of the Earthquake Commission Act 1993, the Commission has been restructured. The new Commission is withdrawing from its primary position as the nation's non-residential property hazards insurer and is restricting its coverage of residential properties.

  11. Regional dependence in earthquake early warning and real time seismology

    Caprio, M.

    2013-01-01

    An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is to provide the maximum available information as soon as possible. This can help in reducing the impact of the quake on the population and in better organizing rescue operations after the event. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in real-time seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient EEW system can inform numerous people and thus potentially reduce the economic and human losses by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, the common communications channels can be overloaded or broken in the first minutes to days after the main shock. In such cases, a precise knowledge of the macroseismic intensity distribution represents a decisive contribution to the management of relief efforts and the evaluation of losses. In this work, I focused on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, in chapter 1 we present an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve from the part of the spectrum constrained by the available data. Our method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection and potentially stabilize within 10 seconds of the initial earthquake detection
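
The final step of any spectral magnitude estimate, converting a seismic moment M0 into moment magnitude, uses the standard Hanks-Kanamori relation. This is the conventional conversion, not the authors' SI code:

```python
import math

# Moment magnitude from seismic moment (Hanks-Kanamori):
#   Mw = (2/3) * (log10(M0) - 9.1),  with M0 in N*m.
# This is the standard last step once a displacement-spectrum
# inversion has yielded M0; it is not the SI implementation itself.

def moment_magnitude(m0_newton_meters):
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

if __name__ == "__main__":
    # The 2011 Tohoku earthquake had M0 of roughly 4e22 N*m.
    print(round(moment_magnitude(4.0e22), 1))  # -> 9.0
```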

  13. A phenomenological SMA model for combined axial–torsional proportional/non-proportional loading conditions

    Bodaghi, M.; Damanpack, A.R.; Aghdam, M.M.; Shakeri, M.

    2013-01-01

    In this paper, a simple and robust phenomenological model for shape memory alloys (SMAs) is proposed to simulate main features of SMAs under uniaxial as well as biaxial combined axial–torsional proportional/non-proportional loadings. The constitutive model for polycrystalline SMAs is developed within the framework of continuum thermodynamics of irreversible processes. The model nominates the volume fractions of self-accommodated and oriented martensite as scalar internal variables and the preferred direction of oriented martensitic variants as directional internal variable. An algorithm is introduced to develop explicit relationships for the thermo-mechanical behavior of SMAs under uniaxial and biaxial combined axial–torsional proportional/non-proportional loading conditions and also thermal loading. It is shown that the model is able to simulate main aspects of SMAs including self-accommodation, martensitic transformation, orientation and reorientation of martensite, shape memory effect, ferro-elasticity and pseudo-elasticity. A description of the time-discrete counterpart of the proposed SMA model is presented. Experimental results of uniaxial tension and biaxial combined tension–torsion non-proportional tests are simulated and a good qualitative correlation between numerical and experimental responses is achieved. Due to simplicity and accuracy, the model is expected to be used in the future studies dealing with the analysis of SMA devices in which two stress components including one normal and one shear stress are dominant
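
As a much simplified illustration of how a martensite volume fraction can serve as an internal variable, a one-dimensional pseudo-elastic loading branch can be sketched as below. The linear transformation kinetics and all material constants are invented for illustration and are far cruder than the paper's thermodynamic model:

```python
# Minimal 1D pseudo-elastic SMA sketch (illustrative only): on loading,
# the oriented-martensite fraction xi grows linearly between the
# transformation start/finish stresses, and total strain is elastic
# strain plus transformation strain. All constants are made up.

E = 50e9        # Young's modulus, Pa (illustrative)
EPS_L = 0.05    # maximum transformation strain (illustrative)
SIG_S = 400e6   # transformation start stress, Pa (illustrative)
SIG_F = 500e6   # transformation finish stress, Pa (illustrative)

def martensite_fraction(sigma):
    """Oriented martensite fraction xi on the loading branch."""
    if sigma <= SIG_S:
        return 0.0
    if sigma >= SIG_F:
        return 1.0
    return (sigma - SIG_S) / (SIG_F - SIG_S)

def strain(sigma):
    """Total strain: elastic part plus transformation part."""
    return sigma / E + martensite_fraction(sigma) * EPS_L

if __name__ == "__main__":
    for s in (200e6, 450e6, 600e6):
        print(round(strain(s), 4))
```

The stress plateau between SIG_S and SIG_F is what produces the characteristic flag-shaped pseudo-elastic loop once an analogous (lower-stress) unloading branch is added.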

  14. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.
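
The Earthquake Impact Scale ties PAGER alert levels to order-of-magnitude loss thresholds. A sketch of the fatality-based mapping follows; the cutoffs (1 / 100 / 1,000) match the published scale but are best treated as illustrative of the approach, and the parallel economic-loss criterion is omitted:

```python
# PAGER-style alert levels keyed to estimated fatalities, following the
# order-of-magnitude thresholds of the Earthquake Impact Scale
# (green < 1, yellow 1-99, orange 100-999, red >= 1000). The real
# system also considers estimated economic losses, omitted here.

def alert_level(estimated_fatalities):
    if estimated_fatalities < 1:
        return "green"
    if estimated_fatalities < 100:
        return "yellow"
    if estimated_fatalities < 1000:
        return "orange"
    return "red"

if __name__ == "__main__":
    for n in (0, 10, 500, 20000):
        print(n, alert_level(n))
```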

  15. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    Delorey, Andrew A.; Elst, Nicholas J. van der; Johnson, Paul Allan

    2016-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding a fault's stress state and poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest success in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified, and thus we developed a technique to improve the identification of very small magnitude events. We identify events by applying a method known as inter-station seismic coherence, in which we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find that the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress-threshold and critical-state behavior. Lastly, our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.
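
A classical statistic for testing tidal triggering (distinct from the coherence-based event detection described in this abstract) is the Schuster test, which measures clustering of earthquake occurrence times in tidal phase:

```python
import math

# Schuster test for tidal-phase clustering. Each earthquake is assigned
# a tidal phase in [0, 2*pi); the phase unit vectors are summed and
#   p = exp(-R^2 / N)
# is the probability of observing a resultant length >= R under the
# null hypothesis of uniformly random phases. Small p suggests
# non-random clustering, i.e. possible tidal triggering.

def schuster_p(phases):
    n = len(phases)
    c = sum(math.cos(p) for p in phases)
    s = sum(math.sin(p) for p in phases)
    r2 = c * c + s * s
    return math.exp(-r2 / n)

if __name__ == "__main__":
    # Strongly clustered phases -> very small p.
    clustered = [0.1 * k for k in range(20)]
    # Evenly spread phases -> p near 1.
    spread = [2.0 * math.pi * k / 20 for k in range(20)]
    print(schuster_p(clustered) < 0.01, schuster_p(spread) > 0.9)
```

The test's power drops quickly for small catalogs, which is one reason pushing the detection threshold down to smaller events matters for triggering studies.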

  16. Analyses of surface motions caused by the magnitude 9.0 2004 Sumatra earthquake

    Khan, Shfaqat Abbas; Gudmundsson, Ó.

    The Sumatra, Indonesia, earthquake on December 26th was one of the most devastating earthquakes in history. With a magnitude of Mw = 9.0 it is the fourth largest earthquake recorded since 1900. It occurred about one hundred kilometers off the west coast of northern Sumatra, where the relatively thin...... of years. The result was a devastating tsunami hitting coastlines across the Indian Ocean killing more than 225,000 people in Sri Lanka, India, Indonesia, Thailand and Malaysia. An earthquake of this magnitude is expected to involve a displacement on the fault on the order of 10 meters. But, what...... was the actual amplitude of the surface motions that triggered the tsunami? This can be constrained using the amplitudes of elastic waves radiated from the earthquake, or by direct measurements of deformation. Here we present estimates of the deformation based on continuous Global Positioning System (GPS...

  17. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  18. Tidal triggering of earthquakes suggests poroelastic behavior on the San Andreas Fault

    Delorey, Andrew; Van Der Elst, Nicholas; Johnson, Paul

    2017-01-01

    Tidal triggering of earthquakes is hypothesized to provide quantitative information regarding a fault's stress state and poroelastic properties, and may be significant for our understanding of seismic hazard. To date, studies of regional or global earthquake catalogs have had only modest success in identifying tidal triggering. We posit that the smallest events that may provide additional evidence of triggering go unidentified, and thus we developed a technique to improve the identification of very small magnitude events. We identify events by applying a method known as inter-station seismic coherence, in which we prioritize detection and discrimination over characterization. Here we show tidal triggering of earthquakes on the San Andreas Fault. We find that the complex interaction of semi-diurnal and fortnightly tidal periods exposes both stress-threshold and critical-state behavior. Our findings reveal earthquake nucleation processes and pore pressure conditions – properties of faults that are difficult to measure, yet extremely important for characterizing earthquake physics and seismic hazards.

  19. Assessment of earthquake effects - contribution from online communication

    D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline

    2014-05-01

    The rapid increase of social media and online newspapers in recent years has made it possible to conduct a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday 24th April 2011 at 13:02 GMT. The earthquake was preceded and followed by a series of smaller magnitude quakes throughout the day, most of which were felt by the locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta (http://seismic.research.um.edu.mt/). The results yield interesting information about the demographics of the island and the different felt experiences, possibly relating to geological settings and to diverse structural and age-classified buildings. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using the Global Positioning System can be incorporated to provide a 'real-time' intensity map that can be used by the Civil Protection Department.

  20. Investigation of Backprojection Uncertainties With M6 Earthquakes

    Fan, Wenyuan; Shearer, Peter M.

    2017-10-01

    We investigate possible biasing effects of inaccurate timing corrections on teleseismic P wave backprojection imaging of large earthquake ruptures. These errors occur because empirically estimated time shifts based on aligning P wave first arrivals are exact only at the hypocenter and provide approximate corrections for other parts of the rupture. Using the Japan subduction zone as a test region, we analyze 46 M6-M7 earthquakes over a 10 year period, including many aftershocks of the 2011 M9 Tohoku earthquake, performing waveform cross correlation of their initial P wave arrivals to obtain hypocenter timing corrections to global seismic stations. We then compare backprojection images for each earthquake using its own timing corrections with those obtained using the time corrections from other earthquakes. This provides a measure of how well subevents can be resolved with backprojection of a large rupture as a function of distance from the hypocenter. Our results show that backprojection is generally very robust and that the median subevent location error is about 25 km across the entire study region (˜700 km). The backprojection coherence loss and location errors do not noticeably converge to zero even when the event pairs are very close (<20 km). This indicates that most of the timing differences are due to 3-D structure close to each of the hypocenter regions, which limits the effectiveness of attempts to refine backprojection images using aftershock calibration, at least in this region.
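
Backprojection itself is a delay-and-sum operation: station traces are shifted by the predicted travel times from each trial source and stacked, and the trial source with the largest stack power is the imaged subevent. A toy one-dimensional sketch with synthetic impulses (all traces and delays are invented stand-ins for real teleseismic data):

```python
# Toy delay-and-sum backprojection. Each trial source predicts a travel
# time (here, an integer sample delay) to every station; traces are
# aligned by removing those delays and stacked, and the trial source
# whose stack has maximum power wins. Synthetic data, not real seismograms.

def stack_power(traces, delays_samples, length):
    power = 0.0
    for k in range(length):
        s = 0.0
        for trace, d in zip(traces, delays_samples):
            idx = k + d
            if 0 <= idx < len(trace):
                s += trace[idx]
        power += s * s
    return power

def backproject(traces, predicted_delays_by_source):
    best_src, best_pow = None, -1.0
    for src, delays in predicted_delays_by_source.items():
        p = stack_power(traces, delays, length=8)
        if p > best_pow:
            best_src, best_pow = src, p
    return best_src

if __name__ == "__main__":
    # Impulse arriving at sample 5 (station A) and sample 9 (station B).
    trace_a = [0.0] * 16; trace_a[5] = 1.0
    trace_b = [0.0] * 16; trace_b[9] = 1.0
    # Predicted sample delays for two hypothetical trial sources.
    sources = {"near_hypocenter": [5, 9], "offset_source": [5, 7]}
    print(backproject([trace_a, trace_b], sources))
```

The timing-correction issue studied in the paper enters through the delay tables: an error that grows with distance from the hypocenter misaligns the stack and smears or mislocates subevents.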

  1. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    Yue, Z.

    2013-12-01

    A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds, over distances of ten to hundreds of kilometers in the Earth's crust and on the ground surface. It also generates seismic waves that can be received globally and co-seismic ground damage such as co-seismic ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and ground soils cannot be known or predicted weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot predict earthquakes at present. Therefore, damaging earthquakes have caused, and will continue to cause, huge disasters, fatalities and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes in addition to the conventional explanation of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean and caused many casualties and devastating damage to the environment. The author will give a brief review of the impacts of the mega-earthquakes of recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrences of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, the 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, and the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch Earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena.
He will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  2. Leveraging geodetic data to reduce losses from earthquakes

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    event response products and by expanded use of geodetic imaging data to assess fault rupture and source parameters. Uncertainties in the NSHM, and in regional earthquake models, are reduced by fully incorporating geodetic data into earthquake probability calculations. Geodetic networks and data are integrated into the operations and earthquake information products of the Advanced National Seismic System (ANSS). Earthquake early warnings are improved by more rapidly assessing ground displacement and the dynamic faulting process for the largest earthquakes using real-time geodetic data. Methodology for probabilistic earthquake forecasting is refined by including geodetic data when calculating evolving moment release during aftershock sequences and by better understanding the implications of transient deformation for earthquake likelihood. A geodesy program that encompasses a balanced mix of activities to sustain mission-critical capabilities, grows new competencies through the continuum of fundamental to applied research, and ensures sufficient resources for these endeavors provides a foundation by which the EHP can be a leader in the application of geodesy to earthquake science.
With this in mind, the following objectives provide a framework to guide EHP efforts: Fully utilize geodetic information to improve key products, such as the NSHM and EEW, and to address new ventures like the USGS Subduction Zone Science Plan. Expand the variety, accuracy, and timeliness of post-earthquake information products, such as PAGER (Prompt Assessment of Global Earthquakes for Response), through incorporation of geodetic observations. Determine if geodetic measurements of transient deformation can significantly improve estimates of earthquake probability. Maintain an observational strategy aligned with the target outcomes of this document that includes continuous monitoring, recording of ephemeral observations, focused data collection for use in research, and application-driven data processing and

  3. Earthquake Emergency Education in Dushanbe, Tajikistan

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  4. Determination of Design Basis Earthquake ground motion

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples for actual sites, including the earthquake sources to be considered, the earthquake response spectrum and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for NPPs designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  6. The ordered network structure of M ≥ 6 strong earthquakes and its prediction in the Jiangsu-South Yellow Sea region

    Men, Ke-Pei [Nanjing Univ. of Information Science and Technology (China). College of Mathematics and Statistics; Cui, Lei [California Univ., Santa Barbara, CA (United States). Applied Probability and Statistics Dept.

    2013-05-15

    The Jiangsu-South Yellow Sea region is one of the key seismic monitoring defence areas in the eastern part of China. Since 1846, M ≥ 6 strong earthquakes have shown an obvious commensurability and orderliness in this region. The main orderly values are 74–75 a, 57–58 a, 11–12 a, and 5–6 a, of which 74–75 a and 57–58 a play an outstanding predictive role. According to the information prediction theory of Wen-Bo Weng, we constructed the ordered network structure of M ≥ 6 strong earthquakes in the South Yellow Sea and the whole region. Based on this, we analyzed and discussed the variation of seismicity in detail and made a trend prediction of M ≥ 6 strong earthquakes in the future. The results show that a new quiet episode began in 1998 and may continue until about 2042; the first M ≥ 6 strong earthquake of the next active episode will probably occur around 2053, likely in the sea area of the South Yellow Sea; the second and third events, or a strong earthquake swarm, will probably occur around 2058 and 2070. (orig.)

  7. Calibration of proportional counters in microdosimetry

    Varma, M.N.

    1982-01-01

    Many microdosimetric spectra for low-LET as well as high-LET radiations are measured using commercially available (e.g. EG&G) Rossi proportional counters. This paper discusses the corrections to be applied to data when calibration of the counter is made using one type of radiation and the counter is then used in a different radiation field. The principal correction factor is due to differences in the W-value of the radiation used for calibration and the radiation for which microdosimetric measurements are made. Both propane- and methane-based tissue-equivalent (TE) gases are used in these counters. When calibrating the detectors, it is important to use the correct stopping-power value for that gas. Deviations in ȳF and ȳD are calculated for 60Co using different extrapolation procedures from 0.15 keV/μm to zero event size. These deviations can be as large as 30%. Advantages of reporting microdosimetric parameters such as ȳF and ȳD above a certain minimum cut-off are discussed
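
The W-value correction amounts to rescaling the calibrated event size by the ratio of W-values (mean energy expended per ion pair), since the counter measures ionization while the calibration assigns energy. A minimal sketch; the W-values used are round illustrative numbers, not recommended data:

```python
# W-value correction for proportional-counter calibration: deposited
# energy = ionization * W, so a calibration established with radiation 1
# (W_cal) must be rescaled by W_field / W_cal when the counter is used
# in a field of radiation 2 (W_field). Values below are illustrative.

def corrected_event_size(raw_kev_per_um, w_cal_ev, w_field_ev):
    """Rescale a calibrated lineal energy to the measured field's W-value."""
    return raw_kev_per_um * (w_field_ev / w_cal_ev)

if __name__ == "__main__":
    # e.g. calibration with alpha particles, measurement in a photon
    # field; a ~3% W-value difference shifts every event size by ~3%.
    y = corrected_event_size(1.00, w_cal_ev=30.0, w_field_ev=29.0)
    print(round(y, 4))
```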

  8. Environmental drivers of sapwood and heartwood proportions

    Thurner, Martin; Beer, Christian

    2017-04-01

    Recent advances combining information on stem volume from remote sensing with allometric relationships derived from forest inventory databases have led to spatially continuous estimates of stem, branch, root and foliage biomass in northern boreal and temperate forests. However, a separation of stem biomass into sapwood and heartwood mass has remained unsolved, despite their important differences in biogeochemical function, for instance concerning their contribution to tree respiratory costs. Although relationships between sapwood cross-sectional area and supported leaf area are well established, less is known about the relations between sapwood or heartwood mass and other traits (e.g. stem mass), since these biomass compartments are more difficult to measure in practice. Here we investigate the variability in sapwood and heartwood proportions and the environmental factors determining them. For this task we explore an available biomass and allometry database (BAAD) and study relative sapwood and heartwood area, volume, mass and density as functions of tree species, age and climate. First, a theoretical framework on how to estimate sapwood and heartwood mass from stem mass is developed. Subsequently, the underlying assumptions and relationships are explored with the help of the BAAD. The established relationships can be used to derive spatially continuous sapwood and heartwood mass estimates by applying them to remote-sensing-based stem volume products. This would be a fundamental step toward a data-driven estimate of autotrophic respiration.

  9. Physics of Earthquake Rupture Propagation

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  10. Radon observation for earthquake prediction

    Wakita, Hiroshi [Tokyo Univ. (Japan)

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those before the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or by migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  11. Earthquake prediction by the Kiana Method

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has long been one of humanity's desires, and scientists have worked hard at it for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) statistical methods, and 2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors for earthquake prediction. The latter method is time consuming and more costly; however, neither method has yet produced fully satisfactory results. In this paper a new method, entitled the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method, electrical and magnetic precursors are measured in an area. The time and magnitude of a future earthquake are then calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurement of electrical resistance in an area indicates whether or not the area is prone to earthquake occurrence in the future. If the result shows a positive sign, the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  12. Elastic energy release in great earthquakes and eruptions

    Agust Gudmundsson

    2014-05-01

    The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = pe(-dVc). This formula can be used as a basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km3), the volume of the feeder-dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials (Ve), so that the elastic energy is U = pe·Ve. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Columbia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km3, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
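The order-of-magnitude figure can be checked by direct arithmetic: the released elastic energy is the excess pressure times the erupted volume. A quick check in Python, using the values quoted in the abstract:

```python
# Elastic energy released in a very large eruption: U = excess pressure x erupted volume
p_e = 5e6           # excess pressure at chamber rupture, Pa (5 MPa)
V_e = 5000 * 1e9    # erupted volume, m^3 (5000 km^3)
U = p_e * V_e       # released elastic energy, J
print(U)            # 2.5e+19 J (25 EJ), i.e. of order 10 EJ as quoted
```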

  13. Precisely locating the Klamath Falls, Oregon, earthquakes

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 years. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes.

  14. ANALYSIS OF REGULARITIES IN DISTRIBUTION OF EARTHQUAKES BY FOCAL DISPLACEMENT IN THE KURIL-OKHOTSK REGION BEFORE THE CATASTROPHIC SIMUSHIR EARTHQUAKE OF 15 NOVEMBER 2006

    Timofei K. Zlobin

    2012-01-01

    The catastrophic Simushir earthquake occurred on 15 November 2006 in the Kuril-Okhotsk region in the Middle Kuril Islands, a transition zone between the Eurasian continent and the Pacific Ocean. It was followed by numerous strong earthquakes. It is established that the catastrophic earthquake was prepared on a site characterized by increased relative effective pressures, located at the border of the low-pressure area (Figure 1). Based on data from GlobalCMT (Harvard), earthquake focal mechanisms were reconstructed, and tectonic stresses, the seismotectonic setting and the earthquake distribution pattern were studied to analyze the stress field in the region before the Simushir earthquake (Figures 2 and 3; Table 1). Five areas of various types of movement were determined. Three of them stretch along the Kuril Islands. It is established that seismodislocations in earthquake focal areas are regularly distributed. In each of the determined areas, displacements of a specific type (shear or reverse shear) are concentrated and give evidence of the alternation and change of zones characterized by horizontal stretching and compression. The presence of the horizontal stretching and compression zones can be explained by a model of subduction (Figure 4). Detailed studies of the state of stresses of the Kuril region confirm such zones (Figure 5). The established specific features of tectonic stresses before the catastrophic Simushir earthquake of 15 November 2006 contribute to studies of earthquake forecasting problems. The state of stresses and the geodynamic conditions suggesting occurrence of new earthquakes can be assessed from the data on the distribution of horizontal compression, stretching and shear areas of the Earth's crust and the upper mantle in the Kuril region.

  15. The Technical Efficiency of Earthquake Medical Rapid Response Teams Following Disasters: The Case of the 2010 Yushu Earthquake in China.

    Liu, Xu; Tang, Bihan; Yang, Hongyang; Liu, Yuan; Xue, Chen; Zhang, Lulu

    2015-12-04

    Performance assessments of earthquake medical rapid response teams (EMRRTs), particularly the first responders deployed to the hardest hit areas following major earthquakes, should consider efficient and effective use of resources. This study assesses the daily technical efficiency of EMRRTs in the emergency period immediately following the 2010 Yushu earthquake in China. Data on EMRRTs were obtained from official daily reports of the general headquarters for Yushu earthquake relief, the emergency office of the National Ministry of Health, and the Health Department of Qinghai Province, for a sample of data on 15 EMRRTs over 62 days. Data envelopment analysis was used to examine the technical efficiency in a constant returns to scale model, a variable returns to scale model, and the scale efficiency of EMRRTs. Tobit regression was applied to analyze the effects of corresponding influencing factors. The average technical efficiency scores under constant returns to scale, variable returns to scale, and the scale efficiency scores of the 62 units of analysis were 77.95%, 89.00%, and 87.47%, respectively. The staff-to-bed ratio was significantly related to global technical efficiency. The date of rescue was significantly related to pure technical efficiency. The type of institution to which an EMRRT belonged and the staff-to-bed ratio were significantly related to scale efficiency. This study provides evidence that supports improvements to EMRRT efficiency and serves as a reference for earthquake emergency medical rapid assistance leaders and teams.
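The constant-returns-to-scale efficiency scores reported here come from data envelopment analysis (DEA), which solves one small linear program per decision-making unit. The sketch below is a minimal input-oriented CCR model using `scipy.optimize.linprog`; the toy data are invented for illustration and are not the study's EMRRT inputs:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y):
    """Input-oriented CCR (constant returns to scale) DEA.

    X: (n_dmu, n_inputs) input matrix; Y: (n_dmu, n_outputs) output matrix.
    Returns an efficiency score in (0, 1] for each decision-making unit (DMU).
    """
    n, m = X.shape
    s = Y.shape[1]
    scores = []
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
        c = np.zeros(n + 1)
        c[0] = 1.0
        A_ub, b_ub = [], []
        for i in range(m):   # inputs: sum_j lambda_j * x_ij <= theta * x_io
            A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
            b_ub.append(0.0)
        for r in range(s):   # outputs: sum_j lambda_j * y_rj >= y_ro
            A_ub.append(np.concatenate(([0.0], -Y[:, r])))
            b_ub.append(-Y[o, r])
        res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                      bounds=[(0, None)] * (n + 1), method="highs")
        scores.append(res.fun)
    return np.array(scores)
```

With one input and one output, a unit using twice the input of its efficient peer for the same output scores 0.5, matching the intuition behind the efficiency figures above.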

  17. The Pocatello Valley, Idaho, earthquake

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. Mountain Daylight Time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, with a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  18. The threat of silent earthquakes

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  19. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  20. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Savaş TOPAL

    2003-02-01

    Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, introduced and fractured gravels and pillow structures in shallow lakes, and in the form of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep lake sediments. Drawing on previous studies, earthquake-induced deformation structures were ordered according to their formation and the corresponding earthquake magnitudes. In this ordering, the lowest earthquake record is loop bedding and the highest is introduced and fractured gravels in lacustrine deposits.

  1. Earthquake hazard zonation using peak ground acceleration (PGA) approach

    Irwansyah, E; Winarko, E; Rasjid, Z E; Bekti, R D

    2013-01-01

    The objective of this research is to develop seismic hazard zones for the building infrastructure of Banda Aceh City, Indonesia, using peak ground acceleration (PGA) computed with global and local attenuation functions. PGA is calculated using an attenuation function that describes the correlation between the local ground-motion intensity, the earthquake magnitude, and the distance from the earthquake's epicentre. The data used come from the earthquake damage catalogue available from the Indonesian meteorology, climatology and geophysics agency (BMKG), covering the years 1973-2011. The research methodology consists of six steps: developing the grid, calculating the distance from the epicentre to the centroid of each grid cell, calculating the PGA values, developing the computer application, plotting the PGA values at the grid centroids, and delineating earthquake hazard zones using a kriging algorithm. The conclusion of this research is that the global attenuation function developed by [20] can be applied to calculate PGA values in the city of Banda Aceh. At the micro scale, Banda Aceh city can be divided into three hazard zones: a low hazard zone with PGA values of 0.8767-0.8780 gals, a medium hazard zone with PGA values of 0.8781-0.8793 gals, and a high hazard zone with PGA values of 0.8794-0.8806 gals.
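Attenuation (ground-motion prediction) functions of the kind described above commonly take the form ln PGA = a + b·M - c·ln(R + d), so PGA grows with magnitude and decays with epicentral distance. The coefficients in this sketch are purely illustrative placeholders, not the values of the attenuation function used in the paper:

```python
import math

def pga_attenuation(magnitude, distance_km, a=-3.5, b=0.9, c=1.2, d=10.0):
    """Generic attenuation form: ln(PGA) = a + b*M - c*ln(R + d).

    The coefficients a, b, c, d are hypothetical; real studies regress
    them against regional strong-motion data. Returns PGA (arbitrary units).
    """
    return math.exp(a + b * magnitude - c * math.log(distance_km + d))
```

By construction the predicted PGA increases with magnitude at fixed distance and decreases with distance at fixed magnitude, which is the behaviour any fitted attenuation relation must reproduce.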

  2. Development of fragility functions to estimate homelessness after an earthquake

    Brink, Susan A.; Daniell, James; Khazai, Bijan; Wenzel, Friedemann

    2014-05-01

    used to estimate homelessness as a function of information that is readily available immediately after an earthquake. These fragility functions could be used by relief agencies and governments to provide an initial assessment of the need for allocation of emergency shelter immediately after an earthquake. Daniell JE (2014) The development of socio-economic fragility functions for use in worldwide rapid earthquake loss estimation procedures, Ph.D. Thesis (in publishing), Karlsruhe, Germany. Daniell, J. E., Khazai, B., Wenzel, F., & Vervaeck, A. (2011). The CATDAT damaging earthquakes database. Natural Hazards and Earth System Science, 11(8), 2235-2251. doi:10.5194/nhess-11-2235-2011 Daniell, J.E., Wenzel, F. and Vervaeck, A. (2012). "The Normalisation of socio-economic losses from historic worldwide earthquakes from 1900 to 2012", 15th WCEE, Lisbon, Portugal, Paper No. 2027. Jaiswal, K., & Wald, D. (2010). An Empirical Model for Global Earthquake Fatality Estimation. Earthquake Spectra, 26(4), 1017-1037. doi:10.1193/1.3480331

  3. An Experimental Study of a Midbroken 2-Bay 6-Storey Reinforced Concrete Frame subject to Earthquakes

    Skjærbæk, P. S.; Taskin, B.; Kirkegaard, Poul Henning

    1997-01-01

    A 2-bay, 6-storey model test reinforced concrete frame (scale 1:5) subjected to sequential earthquakes of increasing magnitude is considered in this paper. The frame was designed with a weak storey, in which the columns are weakened by using thinner and weaker reinforcement bars. The aim of the work is to study the global response of such buildings to a damaging strong-motion earthquake event. Special emphasis is put on examining to what extent damage in the weak storey can be identified from global response measurements during an earthquake where the structure survives, and what level...

  4. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
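The detection logic described above, a significant short-term increase in keyword tweet counts, can be sketched as a sliding-window threshold detector. The window length and threshold below are illustrative placeholders, not the operational parameters of the USGS TED system:

```python
from collections import deque

def tweet_detector(timestamps, window_s=60.0, threshold=20):
    """Flag times at which the count of keyword tweets within a trailing
    window reaches a threshold. timestamps must be sorted ascending.
    Returns the list of alert times."""
    window, alerts = deque(), []
    for t in timestamps:
        window.append(t)
        while window[0] < t - window_s:   # drop tweets older than the window
            window.popleft()
        if len(window) >= threshold:
            alerts.append(t)
            window.clear()                # avoid duplicate alerts for one burst
    return alerts
```

A production system would additionally normalize against the background tweet rate per region and language, as the abstract's "significant increases" phrasing implies.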

  5. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. A numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  6. On the reported ionospheric precursor of the 1999 Hector Mine, California earthquake

    Thomas, Jeremy N.; Love, Jeffrey J.; Komjathy, Attila; Verkhoglyadova, Olga P.; Butala, Mark; Rivera, Nicholas

    2012-01-01

    Using Global Positioning System (GPS) data from sites near the 16 Oct. 1999 Hector Mine, California earthquake, Pulinets et al. (2007) identified anomalous changes in the ionospheric total electron content (TEC) starting one week prior to the earthquake. Pulinets (2007) suggested that precursory phenomena of this type could be useful for predicting earthquakes. On the other hand, and in a separate analysis, Afraimovich et al. (2004) concluded that TEC variations near the epicenter were controlled by solar and geomagnetic activity that were unrelated to the earthquake. In an investigation of these very different results, we examine TEC time series of long duration from GPS stations near and far from the epicenter of the Hector Mine earthquake, and long before and long after the earthquake. While we can reproduce the essential time series results of Pulinets et al., we find that the signal they identify as anomalous is not actually anomalous. Instead, it is just part of normal global-scale TEC variation. We conclude that the TEC anomaly reported by Pulinets et al. is unrelated to the Hector Mine earthquake.

  7. Retrospective evaluation of the five-year and ten-year CSEP-Italy earthquake forecasts

    Stefan Wiemer

    2010-11-01

    On August 1, 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of this CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented 18 five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We have considered here the twelve time-independent earthquake forecasts among this set, and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. We present the results of the tests that measure the consistency of the forecasts with past observations. As well as being an evaluation of the time-independent forecasts submitted, this exercise provides insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between robustness of results and experiment duration. We conclude with suggestions for the design of future earthquake predictability experiments.
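One of the standard consistency tests in CSEP-style evaluations is the number (N-) test, which compares the observed earthquake count with the forecast rate under a Poisson assumption. A minimal sketch (the full CSEP test suite also includes likelihood-based consistency tests, which are not shown here):

```python
from math import exp, factorial

def n_test(forecast_rate, n_observed):
    """Poisson N-test: returns (delta1, delta2), the probabilities of observing
    at least / at most n_observed events given the forecast rate. A very small
    value of either indicates the forecast is inconsistent with observations."""
    def poisson_cdf(k):
        return sum(forecast_rate**i * exp(-forecast_rate) / factorial(i)
                   for i in range(k + 1))
    delta1 = 1.0 - (poisson_cdf(n_observed - 1) if n_observed > 0 else 0.0)
    delta2 = poisson_cdf(n_observed)
    return delta1, delta2
```

For example, a forecast of 5 events with 5 observed gives two comfortable tail probabilities near 0.56 and 0.62, so the forecast passes; observing 20 events against the same forecast would drive delta1 toward zero and reject it.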

  8. Fault parameters and macroseismic observations of the May 10, 1997 Ardekul-Ghaen earthquake

    Amini, H.; Zare, M.; Ansari, A.

    2018-01-01

    The Ardekul (Zirkuh) earthquake (May 10, 1997) is the largest recent earthquake to have occurred in the Ardekul-Ghaen region of eastern Iran. The greatest destruction was concentrated around Ardekul, Haji-Abad, Esfargh, Pishbar, Bashiran, Abiz-Qadim, and Fakhr-Abad (completely destroyed). The total surface fault rupture was about 125 km, with the longest uninterrupted segment in the south of the region. The maximum horizontal and vertical displacements, about 210 and 70 cm, were reported in Korizan and Bohn-Abad, respectively; other building damage and environmental effects were also reported for this earthquake. In this study, an intensity value of XI on the European Macroseismic Scale (EMS) and the Environmental Seismic Intensity (ESI) scale was assigned to this earthquake according to the maximum effects on the macroseismic data points it affected. Then, based on the macroseismic data points of this earthquake and the Boxer code, macroseismic parameters including magnitude, location, source dimensions, and orientation were estimated at 7.3, 33.52° N-59.99° E, 75 km long and 21 km wide, and 152°, respectively. As the estimated macroseismic parameters are consistent with the instrumental ones (the Global Centroid Moment Tensor (GCMT) location and magnitude are 33.58° N-60.02° E and 7.2, respectively), this method and dataset are suggested not only for other instrumental earthquakes, but also for historical events.

  9. Studying geodesy and earthquake hazard in and around the New Madrid Seismic Zone

    Boyd, Oliver Salz; Magistrale, Harold

    2011-01-01

    Workshop on New Madrid Geodesy and the Challenges of Understanding Intraplate Earthquakes; Norwood, Massachusetts, 4 March 2011 Twenty-six researchers gathered for a workshop sponsored by the U.S. Geological Survey (USGS) and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazards. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. The workshop presentations and conclusions will be available in a forthcoming USGS open-file report (http://pubs.usgs.gov).

  10. An Atlas of ShakeMaps and population exposure catalog for earthquake loss modeling

    Allen, T.I.; Wald, D.J.; Earle, P.S.; Marano, K.D.; Hotovec, A.J.; Lin, K.; Hearne, M.G.

    2009-01-01

    We present an Atlas of ShakeMaps and a catalog of human population exposures to moderate-to-strong ground shaking (EXPO-CAT) for recent historical earthquakes (1973-2007). The common purpose of the Atlas and exposure catalog is to calibrate earthquake loss models to be used in the US Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER). The full ShakeMap Atlas currently comprises over 5,600 earthquakes from January 1973 through December 2007, with almost 500 of these maps constrained, to varying degrees, by instrumental ground motions, macroseismic intensity data, community internet intensity observations, and published earthquake rupture models. The catalog of human exposures is derived using current PAGER methodologies. Exposure to discrete levels of shaking intensity is obtained by correlating Atlas ShakeMaps with a global population database. Combining this population exposure dataset with historical earthquake loss data, such as PAGER-CAT, provides a useful resource for calibrating loss methodologies against a systematically-derived set of ShakeMap hazard outputs. We illustrate two example uses for EXPO-CAT: (1) simple objective ranking of country vulnerability to earthquakes, and (2) the influence of time-of-day on earthquake mortality. In general, we observe that countries in similar geographic regions with similar construction practices tend to cluster spatially in terms of relative vulnerability. We also find little quantitative evidence to suggest that time-of-day is a significant factor in earthquake mortality. Moreover, earthquake mortality appears to be more systematically linked to the population exposed to severe ground shaking (Modified Mercalli Intensity VIII+). Finally, equipped with the full Atlas of ShakeMaps, we merge each of these maps and find the maximum estimated peak ground acceleration at any grid point in the world for the past 35 years. We subsequently compare this "composite ShakeMap" with existing global
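The core exposure computation described above, correlating a ShakeMap intensity grid with a co-registered population grid, amounts to summing population per shaking-intensity bin. A minimal sketch with made-up grids (not PAGER's actual implementation):

```python
import numpy as np

def exposure_by_intensity(mmi_grid, pop_grid):
    """Sum the population exposed at each whole MMI level (I-X).

    mmi_grid and pop_grid must be co-registered arrays of the same shape;
    the grids used here are illustrative, not real ShakeMap/population data.
    """
    mmi = np.clip(np.rint(mmi_grid), 1, 10).astype(int)
    return {level: float(pop_grid[mmi == level].sum()) for level in range(1, 11)}
```

Summing the returned dictionary over levels VIII and above gives the "population exposed to severe ground shaking" quantity that the abstract links to mortality.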

  11. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Funning, G. J.; Cockett, R.

    2012-12-01

    models. We envisage that the ensemble of contributed models will be useful both as a research resource and in the classroom. Locations of earthquakes derived from InSAR data have already been demonstrated to differ significantly from those obtained from global seismic networks (Weston et al., 2011), and the locations obtained by our users will enable us to identify systematic mislocations that are likely due to errors in Earth velocity models used to locate earthquakes. If the tool is incorporated into geophysics, tectonics and/or structural geology classes, in addition to familiarizing students with InSAR and elastic deformation modeling, the spread of different results for each individual earthquake will allow the teaching of concepts such as model uncertainty and non-uniqueness when modeling real scientific data. Additionally, the process students go through to optimize their estimates of fault parameters can easily be tied into teaching about the concepts of forward and inverse problems, which are common in geophysics.

  12. Centrality in earthquake multiplex networks

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, in which a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
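Eigenvector centrality of a single network layer can be computed by power iteration on the adjacency matrix; multiplex variants then aggregate or couple the layers. A minimal single-layer sketch (not the authors' code):

```python
import numpy as np

def eigenvector_centrality(A, tol=1e-10, max_iter=10000):
    """Eigenvector centrality of a nonnegative adjacency matrix A.

    Uses power iteration on A + I; the identity shift keeps the eigenvectors
    unchanged while guaranteeing convergence even for bipartite graphs.
    """
    n = A.shape[0]
    M = A + np.eye(n)
    x = np.ones(n) / np.sqrt(n)
    for _ in range(max_iter):
        x_new = M @ x
        x_new /= np.linalg.norm(x_new)
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

On a three-node path graph the middle node scores sqrt(2) times the endpoints, matching the closed-form leading eigenvector, which is a convenient sanity check for the iteration.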

  13. Automated Determination of Magnitude and Source Length of Large Earthquakes

    Wang, D.; Kawakatsu, H.; Zhuang, J.; Mori, J. J.; Maeda, T.; Tsuruoka, H.; Zhao, X.

    2017-12-01

    Rapid determination of earthquake magnitude is important for estimating shaking damage and tsunami hazards. However, due to the complexity of the source process, accurately estimating the magnitude of great earthquakes within minutes of the origin time is still a challenge. Mw is an accurate estimate for large earthquakes, but calculating Mw requires the whole wave train, including P, S, and surface phases, which takes tens of minutes to reach stations at tele-seismic distances. To speed up the calculation, methods using the W phase and body waves have been developed for rapidly estimating earthquake size. Besides these methods, which involve Green's functions and inversions, there are other approaches that use empirically calibrated relations to estimate earthquake magnitudes, usually for large earthquakes. Their simple implementation and straightforward calculation have made these approaches widely applied at many institutions, such as the Pacific Tsunami Warning Center, the Japan Meteorological Agency, and the USGS. Here we developed an approach, originating from Hara [2007], that estimates magnitude by considering P-wave displacement and source duration. We instead introduced a back-projection technique [Wang et al., 2016] to estimate source duration using array data from a high-sensitivity seismograph network (Hi-net). The introduction of back-projection improves the method in two ways. Firstly, the source duration can be accurately determined by the seismic array. Secondly, the results can be calculated more rapidly, and data derived from farther stations are not required. We propose to develop an automated system for determining fast and reliable source information for large shallow seismic events based on real-time data from a dense regional array and global data, for earthquakes that occur at distances of roughly 30°-85° from the array center. This system can offer fast and robust estimates of the magnitudes and rupture extents of large earthquakes in 6 to 13 min (plus
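Empirical relations of the Hara [2007] type express magnitude as a linear combination of the logarithms of maximum P-wave displacement, epicentral distance, and source duration. The sketch below shows only the functional form; all coefficients are hypothetical placeholders, not the published regression values:

```python
import math

def empirical_magnitude(p_disp_m, distance_deg, duration_s,
                        a=0.8, b=0.7, c=0.8, d=6.0):
    """Hara-style empirical magnitude estimate.

    p_disp_m:     maximum P-wave displacement (m)
    distance_deg: epicentral distance (degrees)
    duration_s:   source duration (s), e.g. from back-projection
    a, b, c, d are illustrative coefficients, not fitted values.
    """
    return (a * math.log10(p_disp_m) + b * math.log10(distance_deg)
            + c * math.log10(duration_s) + d)
```

The key operational point from the abstract survives in the sketch: once an array-based back-projection supplies the source duration, the magnitude follows from a closed-form evaluation rather than a waveform inversion.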

  14. Change of Japanese risk perception after Tohoku earthquake March 11, 2011

    Nakajima, Reiko

    2011-01-01

    The present study reports changes in Japanese risk perception based on the results of national surveys carried out in 2010 and 2011. After the Tohoku earthquake, major earthquake and nuclear power plant risk items were perceived as much more serious, while other risks such as global warming, illicit drugs, and terrorism were perceived as less serious. Anxiety about radiological material differed according to distance from the Fukushima No. 1 nuclear power plant. (author)

  15. Exceptional Ground Accelerations and Velocities Caused by Earthquakes

    Anderson, John

    2008-01-17

    This project aims to understand the characteristics of the free-field strong-motion records that have yielded the 100 largest peak accelerations and the 100 largest peak velocities recorded to date. The peak is defined as the maximum magnitude of the acceleration or velocity vector during the strong shaking. This compilation includes 35 records with peak acceleration greater than gravity, and 41 records with peak velocities greater than 100 cm/s. The results represent an estimated 150,000 instrument-years of strong-motion recordings. The mean horizontal acceleration or velocity, as used for the NGA ground motion models, is typically 0.76 times the magnitude of this vector peak. Accelerations in the top 100 come from earthquakes as small as magnitude 5, while velocities in the top 100 all come from earthquakes with magnitude 6 or larger. Records are dominated by crustal earthquakes with thrust, oblique-thrust, or strike-slip mechanisms. Normal faulting mechanisms in crustal earthquakes constitute under 5% of the records in the databases searched, and an even smaller percentage of the exceptional records. All NEHRP site categories have contributed exceptional records, in proportions similar to the extent that they are represented in the larger database.
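    The peak defined here is the maximum magnitude of the acceleration (or velocity) vector over the record, which can be computed directly from a three-component time series. A minimal sketch on toy data (not an actual strong-motion record), including the ~0.76 conversion to an NGA-style mean horizontal measure quoted above:

```python
import math

def vector_peak(ax, ay, az):
    """Peak of the acceleration vector magnitude over a record:
    the definition used when ranking the top-100 records."""
    return max(math.sqrt(x*x + y*y + z*z) for x, y, z in zip(ax, ay, az))

# Toy 3-component record (units of g); real records are long digitized series.
ax = [0.1, 0.6, 0.3]
ay = [0.2, 0.8, 0.1]
az = [0.0, 0.3, 0.2]
peak = vector_peak(ax, ay, az)

# The NGA-style mean horizontal measure is typically ~0.76x the vector peak,
# so a record can exceed 1 g on the vector peak while its mean horizontal
# measure stays below 1 g.
approx_mean_horizontal = 0.76 * peak
```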

  16. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extended > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  17. Risk assessment study of fire following earthquake: a case study of petrochemical enterprises in China

    Li, J.; Wang, Y.; Chen, H.; Lin, L.

    2013-04-01

    After an earthquake, the fire risk of petrochemical enterprises is higher than that of other enterprises, as their production processes involve inflammable and explosive materials. Using Chinese petrochemical enterprises as the research object, this paper uses a literature review and case summaries to study, amongst others, the classification of petrochemical enterprises, the proportion of daily fires, and the fire loss ratio. This paper builds a fire-following-earthquake risk assessment model for petrochemical enterprises based on a previous earthquake fire hazard model and the earthquake loss prediction assessment method, calculates the expected loss from fire following earthquake in various counties, and draws a risk map. Moreover, this research identifies high-risk areas, concentrated in the Beijing-Tianjin-Tangshan region and Shandong, Jiangsu, and Zhejiang provinces. Differences in enterprise type produce different levels and distributions of petrochemical enterprise earthquake fire risk. Furthermore, areas at high risk of post-earthquake fires and with low levels of seismic fortification require extra attention to ensure appropriate mechanisms are in place.

  18. Risk assessment study of fire following an earthquake: a case study of petrochemical enterprises in China

    Li, J.; Wang, Y.; Chen, H.; Lin, L.

    2014-04-01

    After an earthquake, the fire risk of petrochemical enterprises is higher than that of other enterprises, as their production processes involve inflammable and explosive materials. Using Chinese petrochemical enterprises as the research object, this paper uses a literature review and case summaries to study, amongst others, the classification of petrochemical enterprises, the proportion of daily fires, and the fire loss ratio. This paper builds a fire-following-an-earthquake risk assessment model for petrochemical enterprises based on a previous earthquake fire hazard model and the earthquake loss prediction assessment method, calculates the expected loss from fire following an earthquake in various counties, and draws a risk map. Moreover, this research identifies high-risk areas, concentrated in the Beijing-Tianjin-Tangshan region and Shandong, Jiangsu, and Zhejiang provinces. Differences in enterprise type produce different levels and distributions of petrochemical enterprise earthquake fire risk. Furthermore, areas at high risk of post-earthquake fires and with low levels of seismic fortification require extra attention to ensure appropriate mechanisms are in place.

  19. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    A geographic information system (GIS) for the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with the standard earthquake intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and their relationships with tectonics, site conditions, and basins are analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction, and earthquake emergency response are discussed.

  20. Mapping Tectonic Stress Using Earthquakes

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available - earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the earth's crust.

  1. Swedish earthquakes and acceleration probabilities

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations at Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance, and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)
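    The step from a target annual probability (here 10^-5) to a ground acceleration amounts to reading a hazard curve. A sketch of that lookup with log-log interpolation, using a purely hypothetical hazard curve rather than the Swedish values derived in the study:

```python
import math

# Hypothetical annual-exceedance hazard curve for a bedrock site:
# (peak ground acceleration in % g, annual probability of exceedance).
# The numbers are illustrative only, not the study's Swedish values.
hazard_curve = [(1, 1e-2), (5, 1e-3), (10, 1e-4), (20, 1e-5), (40, 1e-6)]

def acceleration_at_probability(curve, p_target):
    """Log-log interpolate the acceleration whose annual exceedance
    probability equals p_target."""
    for (a1, p1), (a2, p2) in zip(curve, curve[1:]):
        if p2 <= p_target <= p1:
            t = (math.log(p_target) - math.log(p1)) / (math.log(p2) - math.log(p1))
            return math.exp(math.log(a1) + t * (math.log(a2) - math.log(a1)))
    raise ValueError("target probability outside the curve")

pga = acceleration_at_probability(hazard_curve, 1e-5)  # 20 % g on this toy curve
```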

  2. Building with Earthquakes in Mind

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  3. Large earthquakes and creeping faults

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  4. Earthquake damage to underground facilities

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document damage and non-damage due to earthquakes at underground facilities. Damage from earthquakes to tunnels, shafts, and wells, and damage (rock bursts) from mining operations, were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrances to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  5. Evidence for Ancient Mesoamerican Earthquakes

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms, and tableros; toppling of columns; and deformation, settling, and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 % g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic Period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage at the above sites. As a consequence this earthquake may have accelerated the

  6. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  7. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
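    The natural-time idea of counting small earthquakes since the last large one can be sketched with a Weibull hazard in natural time. The parameters below (n_mean, beta) are hypothetical placeholders, not the fitted values of the NTW papers:

```python
import math

def ntw_probability(n_since, delta_n, n_mean, beta=1.5):
    """Illustrative Natural Time Weibull-style forecast: probability that
    a large earthquake occurs within the next delta_n small earthquakes,
    given n_since small events since the last large one.

    Natural time is counted in small-earthquake counts rather than clock
    time; n_mean and beta are hypothetical scale/shape parameters.
    """
    def cdf(n):  # Weibull CDF in natural time
        return 1.0 - math.exp(-((n / n_mean) ** beta))
    # Conditional probability of failure in (n_since, n_since + delta_n],
    # given survival to n_since.
    return (cdf(n_since + delta_n) - cdf(n_since)) / (1.0 - cdf(n_since))

# With shape beta > 1, the hazard grows as more small earthquakes
# accumulate without a large one.
p_early = ntw_probability(n_since=100, delta_n=50, n_mean=500)
p_late = ntw_probability(n_since=600, delta_n=50, n_mean=500)
```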

  8. Do earthquakes exhibit self-organized criticality?

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that SOC theory should not be used to oppose efforts at earthquake prediction.
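    The rearrangement test described here can be sketched on a synthetic catalog: compute first-return times of large events, randomly rearrange the catalog, and compare the two distributions. Everything below is illustrative; the Letter's analysis uses the Southern California Earthquake Catalog:

```python
import random
import statistics

def first_return_times(times, mags, m_min):
    """Inter-event times between successive events with magnitude >= m_min."""
    big = [t for t, m in zip(times, mags) if m >= m_min]
    return [t2 - t1 for t1, t2 in zip(big, big[1:])]

# Synthetic clustered catalog: alternating active and quiet phases for
# large events, so return times have memory.
random.seed(1)
times, mags, t = [], [], 0.0
for _ in range(400):
    t += random.expovariate(1.0)
    times.append(t)
    mags.append(5.5 if (int(t) // 50) % 2 == 0 else 3.0)

original = first_return_times(times, mags, 5.0)

# Rearranged catalog (the SOC-style memoryless null): shuffle which
# events carry the large magnitudes.
shuffled_mags = mags[:]
random.shuffle(shuffled_mags)
rearranged = first_return_times(times, shuffled_mags, 5.0)

# Clustering shows up as a higher coefficient of variation of return
# times in the original than in the rearranged series.
cv_orig = statistics.stdev(original) / statistics.mean(original)
cv_shuf = statistics.stdev(rearranged) / statistics.mean(rearranged)
```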

  9. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    M. Rebuffat

    1995-06-01

    A series of multimedia monographs concerning the main seismic events that have affected the Italian territory is being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public education about earthquakes. Producing the monographs, developed in ARC/INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences it caused at such an important seaside resort.

  10. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    B. H. Lavenda

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
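    Fitting a Pareto tail to an exceedance distribution is commonly done with the Hill estimator on the largest order statistics. A self-contained sketch on synthetic Pareto-distributed "energies" (illustrative data, not the Chinese catalogue used in the paper):

```python
import math
import random

def hill_estimator(samples, k):
    """Hill estimator of the Pareto tail index alpha from the k largest
    order statistics of a sample."""
    xs = sorted(samples, reverse=True)[:k + 1]
    logs = [math.log(x) for x in xs]
    gamma = sum(l - logs[k] for l in logs[:k]) / k  # mean log-excess
    return 1.0 / gamma

# Synthetic 'seismic energies' drawn from a pure Pareto law with
# tail index alpha = 1.5 (a Gutenberg-Richter-like power-law tail).
random.seed(0)
alpha_true = 1.5
energies = [random.paretovariate(alpha_true) for _ in range(20000)]

alpha_hat = hill_estimator(energies, k=2000)  # should be close to 1.5
```

    Because the pure Pareto law has no right endpoint, any finite sample maximum is exceeded eventually, which is the point the abstract makes about a maximum earthquake energy.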

  11. Earthquake Prediction in a Big Data World

    Kossobokov, V. G.

    2016-12-01

    The digital revolution started just about 15 years ago and has already surpassed a global information storage capacity of more than 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task and implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the error diagram introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space, is evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, which permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. 
This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
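    Molchan's error diagram plots the fraction of space-time alerted against the rate of failures-to-predict; random guessing falls on the diagonal. A minimal sketch on a toy gridded forecast (all cells, scores, and target locations below are hypothetical):

```python
def molchan_trajectory(alarm_scores, target_cells, n_cells):
    """Molchan error-diagram trajectory: as the alarm threshold is lowered,
    record (tau, nu) = (fraction of space alerted, fraction of target
    earthquakes missed). alarm_scores maps cell -> forecast score;
    target_cells lists the cells where large earthquakes occurred."""
    ranked = sorted(alarm_scores, key=alarm_scores.get, reverse=True)
    targets = list(target_cells)
    hits, points = 0, [(0.0, 1.0)]  # no alarm: tau = 0, all targets missed
    for i, cell in enumerate(ranked, start=1):
        hits += targets.count(cell)
        points.append((i / n_cells, 1.0 - hits / len(targets)))
    return points

# Toy 10-cell forecast; the target earthquakes happened in cells 2 and 7.
scores = {c: s for c, s in enumerate([0.1, 0.3, 0.9, 0.2, 0.05,
                                      0.15, 0.4, 0.8, 0.25, 0.12])}
traj = molchan_trajectory(scores, target_cells=[2, 7], n_cells=10)
```

    A skillful forecast dips below the diagonal nu = 1 - tau; here both targets are captured after alerting only 2 of 10 cells, far better than random guessing.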

  12. Performance of underground coal mines during the 1976 Tangshan earthquake

    Lee, C.F.

    1987-01-01

    The Tangshan earthquake of 1976 cost 242 000 lives and was responsible for 164 000 serious injuries and structural damage of immense proportions. The area has eight coal mines, which together form the largest underground coal mining operation in China. Approximately 10 000 miners were working underground at the time of the earthquake. With few exceptions they survived and returned safely to the surface, only to find their families and belongings largely destroyed. Based on a comprehensive survey of the miners' observations, subsurface intensity profiles were drawn up. The profiles clearly indicated that seismic damage in the underground mines was far less severe than at the surface. 16 refs., 4 figs., 2 tabs.

  13. Laboratory generated M -6 earthquakes

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
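    The stress drops quoted here can be checked against the standard circular-crack scaling delta_sigma = (7/16) * M0 / r^3. A quick sketch of that arithmetic, which shows that an M -6 event with a mid-range stress drop implies a millimeter-scale source patch, consistent with the mm-scale events described above:

```python
import math

def moment_from_mw(mw):
    """Seismic moment M0 in N*m from moment magnitude (standard relation
    Mw = (2/3) * (log10 M0 - 9.1))."""
    return 10 ** (1.5 * mw + 9.1)

def source_radius(m0, stress_drop_pa):
    """Circular-crack radius implied by the Eshelby stress-drop relation
    delta_sigma = (7/16) * M0 / r**3."""
    return (7.0 * m0 / (16.0 * stress_drop_pa)) ** (1.0 / 3.0)

# An M -6 event with a 3 MPa stress drop (mid-range of the 1-10 MPa
# quoted above) implies a source patch a few millimeters across.
m0 = moment_from_mw(-6.0)   # ~1.26 N*m
r = source_radius(m0, 3e6)  # ~5.7 mm
```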

  14. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and

  15. Three Millennia of Seemingly Time-Predictable Earthquakes, Tell Ateret

    Agnon, Amotz; Marco, Shmuel; Ellenblum, Ronnie

    2014-05-01

    Among various idealized recurrence models of large earthquakes, the "time-predictable" model has a straightforward mechanical interpretation, consistent with simple friction laws. On a time-predictable fault, the time interval between an earthquake and its predecessor is proportional to the slip during the predecessor. The alternative "slip-predictable" model states that the slip during earthquake rupture is proportional to the preceding time interval. Verifying these models requires extended records of high-precision data for both the timing and the amount of slip. The precision of paleoearthquake data can rarely confirm or rule out predictability, and recent papers argue for either time- or slip-predictable behavior. The Ateret site, on the trace of the Dead Sea fault at the Jordan Gorge segment, offers unique precision for determining space-time patterns. Five consecutive slip events, each associated with deformed and offset sets of walls, are correlated with historical earthquakes. Two correlations are based on detailed archaeological, historical, and numismatic evidence. The other three are tentative. The offsets of three of the events are determined with high precision; the other two are not as certain. Accepting all five correlations, the fault exhibits a striking time-predictable behavior, with a long-term slip rate of 3 mm/yr. However, the 30 October 1759 ~0.5 m rupture predicts a subsequent rupture along the Jordan Gorge toward the end of the last century. We speculate that earthquakes on secondary faults (the 25 November 1759 earthquake on the Rachaya branch and the 1 January 1837 earthquake on the Roum branch, both M≥7) have disrupted the 3 kyr time-predictable pattern.
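    The time-predictable model reduces to simple arithmetic: the quiescent interval after an earthquake is the time needed for steady loading to re-accumulate the slip just released. A sketch using the round numbers quoted above (the paper's own reconstruction rests on the full five-event record):

```python
def time_predictable_next(event_year, coseismic_slip_m, slip_rate_m_per_yr):
    """Time-predictable model: the interval to the next earthquake is the
    predecessor's slip divided by the long-term slip rate, so
    t_next = t_event + slip / rate."""
    return event_year + coseismic_slip_m / slip_rate_m_per_yr

# ~0.5 m of slip in 1759 at a long-term rate of 3 mm/yr implies a
# recurrence interval of ~167 years.
predicted = time_predictable_next(1759, 0.5, 0.003)  # ~1925.7
```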

  16. The music of earthquakes and Earthquake Quartet #1

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  17. ELER software - a new tool for urban earthquake loss assessment

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to help enable effective emergency response, on both the local and global levels, as well as public information.

  18. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  19. Book review: Earthquakes and water

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  20. Relations between source parameters for large Persian earthquakes

    Majid Nemati

    2015-11-01

    Empirical relationships for magnitude scales and fault parameters were produced using 436 Iranian intraplate earthquakes from recent regional databases, since the continental events represent a large portion of the total seismicity of Iran. The relations between different source parameters of the earthquakes were derived using input information provided by the databases after 1900. Suggested equations for magnitude scales relate the body-wave, surface-wave and local magnitude scales to the scalar moment of the earthquakes. Also, the dependence of source parameters such as surface and subsurface rupture length and maximum surface displacement on the moment magnitude was investigated for some well documented earthquakes. To this end, ordinary linear regression procedures were employed for all relations. Our evaluations reveal a fair agreement between the obtained relations and equations described in other worldwide and regional works in the literature. The M0-mb and M0-MS equations correlate well with the worldwide relations. Also, both the M0-MS and M0-ML relations agree well with regional studies in Taiwan. The equations derived from this study mainly confirm the results of global investigations of the rupture length of historical and instrumental events. However, some relations, such as MW-MN and MN-ML, that differ remarkably from available regional works (e.g., American and Canadian) were also found.
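The ordinary-linear-regression step described above can be sketched as follows; the paired magnitude values are invented for illustration and are not the Iranian dataset:

```python
import numpy as np

# Hypothetical paired magnitude estimates for the same events (illustrative
# numbers only): surface-wave magnitude MS vs. moment magnitude MW.
ms = np.array([5.1, 5.6, 6.0, 6.4, 7.0])
mw = np.array([5.3, 5.7, 6.1, 6.5, 6.9])

# Ordinary least-squares line MW = a*MS + b, the same kind of fit the study
# applies to each pair of magnitude scales.
a, b = np.polyfit(ms, mw, 1)
```

The fitted slope and intercept can then be compared term by term against published worldwide or regional relations, which is how the agreement claims above are checked.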

  1. The search for Infrared radiation prior to major earthquakes

    Ouzounov, D.; Taylor, P.; Pulinets, S.

    2004-12-01

    This work describes our search for a relationship between tectonic stresses and electro-chemical and thermodynamic processes in the Earth and increases in mid-IR flux as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. Recent analysis of continuous outgoing long-wavelength Earth radiation (OLR) indicates significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of a triggering by an interaction between the lithosphere, hydrosphere and atmosphere related to changes in the near-surface electrical field and gas composition prior to the earthquake. The OLR anomaly covers large areas surrounding the main epicenter. We have used the NOAA IR data to differentiate between the global and seasonal variability and these transient local anomalies. Indeed, on the basis of a temporal and spatial distribution analysis, an anomaly pattern is found to occur several days prior to some major earthquakes. The significance of these observations was explored using data sets of some recent worldwide events.

  2. Estimating economic losses from earthquakes using an empirical approach

    Jaiswal, Kishor; Wald, David J.

    2013-01-01

    We extended the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) empirical fatality estimation methodology proposed by Jaiswal et al. (2009) to rapidly estimate economic losses after significant earthquakes worldwide. The requisite model inputs are shaking intensity estimates made by the ShakeMap system, the spatial distribution of population available from the LandScan database, modern and historic country or sub-country population and Gross Domestic Product (GDP) data, and economic loss data from Munich Re's historical earthquakes catalog. We developed a strategy to approximately scale GDP-based economic exposure for historical and recent earthquakes in order to estimate economic losses. The process consists of using a country-specific multiplicative factor to accommodate the disparity between economic exposure and the annual per capita GDP, and it has proven successful in hindcasting past losses. Although loss, population, shaking estimates, and economic data used in the calibration process are uncertain, approximate ranges of losses can be estimated for the primary purpose of gauging the overall scope of the disaster and coordinating response. The proposed methodology is both indirect and approximate and is thus best suited as a rapid loss estimation model for applications like the PAGER system.
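The exposure-scaling idea can be sketched as follows; the country factor, per-capita GDP, exposed populations, and loss ratios below are placeholders for illustration, not PAGER's calibrated values:

```python
# Shaking-intensity bins and hypothetical exposed populations in each bin.
INTENSITY_BINS = ["VI", "VII", "VIII"]

def economic_exposure(population, gdp_per_capita, country_factor):
    """GDP-proxied economic exposure, scaled by a country-specific
    multiplicative factor that accommodates the gap between built wealth
    and annual per-capita GDP. All values here are illustrative."""
    return population * gdp_per_capita * country_factor

def expected_loss(exposure_by_bin, mean_loss_ratio_by_bin):
    """Total loss: exposure times the bin's mean loss ratio, summed over
    the shaking-intensity bins."""
    return sum(exposure_by_bin[b] * mean_loss_ratio_by_bin[b]
               for b in exposure_by_bin)

exposure = {b: economic_exposure(pop, 2000.0, 3.0)
            for b, pop in zip(INTENSITY_BINS, [500_000, 120_000, 30_000])}
loss = expected_loss(exposure, {"VI": 0.001, "VII": 0.02, "VIII": 0.10})
```

The intensity-dependent loss ratios play the role that calibrated vulnerability curves play in the actual model.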

  3. Unbonded Prestressed Columns for Earthquake Resistance

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  4. Extreme value distribution of earthquake magnitude

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
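The least-squares fitting of an extreme-value distribution to observed maximum magnitudes can be sketched as follows; the magnitudes and the plotting-position formula are illustrative (a simple Gumbel linearization), not the paper's 1904-1980 dataset or its exact iterative scheme:

```python
import math

# Illustrative annual-maximum magnitudes (not the 1904-1980 world catalog).
maxima = sorted([7.9, 8.1, 7.6, 8.4, 7.8, 8.0, 8.6, 7.7, 8.2, 8.3])
n = len(maxima)

# Empirical CDF via plotting positions, mapped to the Gumbel reduced variate
# y = -ln(-ln F), which linearizes F(m) = exp(-exp(-(m - mu)/sigma)).
y = [-math.log(-math.log((i + 1) / (n + 1))) for i in range(n)]

# Least-squares line m = sigma*y + mu then yields the two parameters.
ybar = sum(y) / n
mbar = sum(maxima) / n
sigma = (sum((yi - ybar) * (mi - mbar) for yi, mi in zip(y, maxima))
         / sum((yi - ybar) ** 2 for yi in y))
mu = mbar - sigma * ybar
```

Comparing the fitted curve against the empirical distribution of the sorted maxima is the kind of agreement check the abstract reports.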

  5. Retrospective Evaluation of the Long-Term CSEP-Italy Earthquake Forecasts

    Werner, M. J.; Zechar, J. D.; Marzocchi, W.; Wiemer, S.

    2010-12-01

    On 1 August 2009, the global Collaboratory for the Study of Earthquake Predictability (CSEP) launched a prospective and comparative earthquake predictability experiment in Italy. The goal of the CSEP-Italy experiment is to test earthquake occurrence hypotheses that have been formalized as probabilistic earthquake forecasts over temporal scales that range from days to years. In the first round of forecast submissions, members of the CSEP-Italy Working Group presented eighteen five-year and ten-year earthquake forecasts to the European CSEP Testing Center at ETH Zurich. We considered the twelve time-independent earthquake forecasts among this set and evaluated them with respect to past seismicity data from two Italian earthquake catalogs. Here, we present the results of tests that measure the consistency of the forecasts with the past observations. Besides being an evaluation of the submitted time-independent forecasts, this exercise provided insight into a number of important issues in predictability experiments with regard to the specification of the forecasts, the performance of the tests, and the trade-off between the robustness of results and experiment duration.

  6. Dynamic triggering of low magnitude earthquakes in the Middle American Subduction Zone

    Escudero, C. R.; Velasco, A. A.

    2010-12-01

    We analyze global and Middle American Subduction Zone (MASZ) seismicity from 1998 to 2008 to quantify the effects of transient stresses at teleseismic distances. We use the Bulletin of the International Seismological Centre Catalog (ISCCD) published by the Incorporated Research Institutions for Seismology (IRIS). To identify MASZ seismicity changes due to distant, large (Mw >7) earthquakes, we first identify local earthquakes that occurred before and after the mainshocks. We then group the local earthquakes within a cluster radius between 75 and 200 km. We obtain statistics based on characteristics of both mainshocks and local earthquake clusters, such as local cluster-mainshock azimuth, mainshock focal mechanism, and local earthquake clusters within the MASZ. Due to lateral variations of the dip along the subducted oceanic plate, we divide the Mexican subduction zone into four segments. We then apply the Paired Samples Statistical Test (PSST) to the sorted data to identify increases, decreases, or no change in the local seismicity associated with distant large earthquakes. We identify dynamic triggering in all MASZ segments produced by large earthquakes arriving from specific azimuths, as well as a decrease in some cases. We find no dependence of seismicity changes on the mainshock focal mechanism.
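A paired-samples comparison of before/after counts can be sketched as follows; this is a plain paired t statistic on hypothetical per-cluster counts, not necessarily the exact PSST variant used in the study:

```python
import math

def paired_t_statistic(before, after):
    """Paired-samples t statistic on per-cluster earthquake counts in equal
    time windows before and after a distant mainshock. A large positive value
    suggests triggering; a large negative value suggests quiescence."""
    diffs = [a - b for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Hypothetical counts for five local clusters before and after a mainshock.
t = paired_t_statistic([3, 4, 2, 5, 3], [6, 7, 4, 9, 6])
```

The statistic is then compared to a t distribution with n-1 degrees of freedom to decide whether the change in local seismicity is significant.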

  7. The 2008 M7.9 Wenchuan earthquake - a human-caused event

    Klose, C. D.

    2013-12-01

    A catalog of global human-caused earthquakes shows statistical evidence that the triggering of earthquakes by large-scale geoengineering activities depends on geological and tectonic constraints (in Klose 2013). Such geoengineering activities also include the filling of water reservoirs. This presentation illuminates mechanical and statistical aspects of the 2008 M7.9 Wenchuan earthquake in light of the hypothesis of being NOT human-caused. However, available data suggest that the Wenchuan earthquake was triggered by the filling of the Zipungpu water reservoir 30 months prior to the mainshock. The reservoir spatially extended parallel and near to the main Beichuan fault zone in a highly stressed reverse fault regime. It is mechanically evident that reverse faults tend to be very trigger-sensitive due to mass shifts (static loads) that occur on the surface of the Earth's crust. These circumstances made a triggering of a seismic event of this magnitude at this location possible (in Klose 2008, 2012). The data show that the Wenchuan earthquake is not an outlier. From a statistical viewpoint, the earthquake falls into the upper range of the family of reverse fault earthquakes that were caused by humans worldwide.

  8. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Mustafa ULAS

    2013-01-01

    A lot of people die because of earthquakes every year. It is therefore crucial to predict earthquakes a reasonable time before they happen. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted to guide new research toward novel prediction methods.

  9. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  10. Historical earthquake investigations in Greece

    K. Makropoulos

    2004-06-01

    The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  11. Demonstration of the Cascadia G‐FAST geodetic earthquake early warning system for the Nisqually, Washington, earthquake

    Crowell, Brendan; Schmidt, David; Bodin, Paul; Vidale, John; Gomberg, Joan S.; Hartog, Renate; Kress, Victor; Melbourne, Tim; Santillian, Marcelo; Minson, Sarah E.; Jamison, Dylan

    2016-01-01

    A prototype earthquake early warning (EEW) system is currently in development in the Pacific Northwest. We have taken a two‐stage approach to EEW: (1) detection and initial characterization using strong‐motion data with the Earthquake Alarm Systems (ElarmS) seismic early warning package and (2) the triggering of geodetic modeling modules using Global Navigation Satellite Systems data that help provide robust estimates of large‐magnitude earthquakes. In this article we demonstrate the performance of the latter, the Geodetic First Approximation of Size and Time (G‐FAST) geodetic early warning system, using simulated displacements for the 2001 Mw 6.8 Nisqually earthquake. We test the timing and performance of the two G‐FAST source characterization modules, peak ground displacement scaling, and Centroid Moment Tensor‐driven finite‐fault‐slip modeling under ideal, latent, noisy, and incomplete data conditions. We show good agreement between source parameters computed by G‐FAST with previously published and postprocessed seismic and geodetic results for all test cases and modeling modules, and we discuss the challenges with integration into the U.S. Geological Survey's ShakeAlert EEW system.

  12. Fault failure with moderate earthquakes

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  13. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  14. 13 CFR 120.174 - Earthquake hazards.

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  15. Multi-Parameter Observation and Detection of Pre-Earthquake Signals in Seismically Active Areas

    Ouzounov, D.; Pulinets, S.; Parrot, M.; Liu, J. Y.; Hattori, K.; Kafatos, M.; Taylor, P.

    2012-01-01

    The recent large earthquakes (M9.0 Tohoku, 03/2011; M7.0 Haiti, 01/2010; M6.3 L'Aquila, 04/2009; and M7.9 Wenchuan, 05/2008) have renewed interest in pre-anomalous seismic signals associated with them. Recent workshops (DEMETER 2006, 2011 and VESTO 2009) have shown that there were precursory atmospheric/ionospheric signals observed in space prior to these events. Our initial results indicate that no single pre-earthquake observation (seismic, magnetic field, electric field, thermal infrared [TIR], or GPS/TEC) can provide a consistent and successful global scale early warning. This is most likely due to the complexity and chaotic nature of earthquakes and the limitations in existing ground (temporal/spatial) and global satellite observations. In this study we analyze preseismic temporal and spatial variations (gas/radon counting rate, atmospheric temperature and humidity change, long-wave radiation transitions and ionospheric electron density/plasma variations) which we propose occur before the onset of major earthquakes. We propose an Integrated Space-Terrestrial Framework (ISTF) as a different approach for revealing pre-earthquake phenomena in seismically active areas. ISTF is a sensor web of a coordinated observation infrastructure employing multiple sensors that are distributed on one or more platforms; data from satellite sensors (Terra, Aqua, POES, DEMETER and others) and ground observations, e.g., Global Positioning System, Total Electron Content (GPS/TEC). As a theoretical guide we use the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model to explain the generation of multiple earthquake precursors. Using our methodology, we evaluated retrospectively the signals preceding the most devastating earthquakes during 2005-2011. We observed a correlation between both atmospheric and ionospheric anomalies preceding most of these earthquakes. The second phase of our validation includes systematic retrospective analysis for more than 100 major earthquakes (M>5

  16. Computational methods in earthquake engineering

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  17. Earthquake Education in Prime Time

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  18. Radon as an earthquake precursor

    Planinic, J.; Radolic, V.; Vukovic, B.

    2004-01-01

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined

  20. Geodetic constraints on afterslip characteristics following the March 9, 2011, Sanriku-oki earthquake, Japan

    Ohta, Yusaku; Hino, Ryota; Inazu, Daisuke; Ohzono, Mako; Ito, Yoshihiro; Mishina, Masaaki; Iinuma, Takeshi; Nakajima, Junichi; Osada, Yukihito; Suzuki, Kensuke; Fujimoto, Hiromi; Tachibana, Kenji; Demachi, Tomotsugu; Miura, Satoshi

    2012-08-01

    A magnitude 7.3 foreshock occurred at the subducting Pacific plate interface on March 9, 2011, 51 h before the magnitude 9.0 Tohoku earthquake off the Pacific coast of Japan. We propose a coseismic and postseismic afterslip model of the magnitude 7.3 event based on a global positioning system network and ocean bottom pressure gauge sites. The estimated coseismic slip and afterslip areas show complementary spatial distributions; the afterslip distribution is located up-dip of the coseismic slip for the foreshock and northward of the hypocenter of the Tohoku earthquake. The slip amount for the afterslip is roughly consistent with that determined by repeating earthquake analysis carried out in a previous study. The estimated moment release for the afterslip reached magnitude 6.8, even within a short time period of 51 h. A volumetric strainmeter time series also suggests that this event advanced with a rapid decay time constant compared with other typical large earthquakes.

  1. The use of waveform shapes to automatically determine earthquake focal depth

    Sipkin, S.A.

    2000-01-01

    Earthquake focal depth is an important parameter for rapidly determining probable damage caused by a large earthquake. In addition, it is significant both for discriminating between natural events and explosions and for discriminating between tsunamigenic and nontsunamigenic earthquakes. For the purpose of notifying emergency management and disaster relief organizations as well as issuing tsunami warnings, potential time delays in determining source parameters are particularly detrimental. We present a method for determining earthquake focal depth that is well suited for implementation in an automated system that utilizes the wealth of broadband teleseismic data that is now available in real time from the global seismograph networks. This method uses waveform shapes to determine focal depth and is demonstrated to be valid for events with magnitudes as low as approximately 5.5.

  2. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  3. PROPORTIONS AND HUMAN SCALE IN DAMASCENE COURTYARD HOUSES

    M. Salim Ferwati

    2008-03-01

    Interior designers, architects, landscape architects, and even urban designers agree that the environment, as a form of non-verbal communication, has a symbolic dimension to it. As for its aesthetic dimension, it seems that beauty is related to a certain proportion, partially and as a whole. A suitable proportion leaves a good impression upon beholders, especially when it matches human proportion. That, in fact, was the underlying belief of Le Corbusier, from which he developed his Modulor concept. The study searches for a modular, or proportional, system that governs the design of the Damascene traditional house. Through geometrical and mathematical examination of 28 traditional houses, it was found that certain proportional relationships existed; however, these proportional relationships were not fixed ones. The study relied on analyzing the Iwan elevation as well as the inner courtyard proportion in relation to the building area. Charts, diagrams and tables were produced to summarize the results.

  4. Coping with the challenges of early disaster response: 24 years of field hospital experience after earthquakes.

    Bar-On, Elhanan; Abargel, Avi; Peleg, Kobi; Kreiss, Yitshak

    2013-10-01

To propose strategies and recommendations for future planning and deployment of field hospitals after earthquakes by comparing the experience of 4 field hospitals deployed by the Israel Defense Forces (IDF) Medical Corps in Armenia, Turkey, India, and Haiti. Quantitative data regarding the earthquakes were collected from published sources; data regarding hospital activity were collected from IDF records; and qualitative information was obtained from structured interviews with key figures involved in the missions. The hospitals started operating between 89 and 262 hours after the earthquakes. Their sizes ranged from 25 to 72 beds, and their personnel numbered between 34 and 100. The number of patients treated varied from 1111 to 2400. The proportion of earthquake-related diagnoses ranged from 28% to 67% (P …). Across the earthquakes, patient caseload and treatment requirements varied widely. The variables affecting the patient profile most significantly were time until deployment, total number of injured, availability of adjacent medical facilities, and possibility of evacuation from the disaster area. When deploying a field hospital in the early phase after an earthquake, a wide variability in patient caseload should be anticipated. Customization is difficult due to the paucity of information; therefore, early deployment necessitates full logistic self-sufficiency and operational versatility. Also, collaboration with local and international medical teams can greatly enhance treatment capabilities.

  5. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Wurman, G.; Price, M.

    2014-12-01

In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimation. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.

  6. Earthquake predictions using seismic velocity ratios

    Sherburne, R. W.

    1979-01-01

Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction for planning, are obvious. Less clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  7. Measuring the size of an earthquake

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Earthquakes range broadly in size. A rock-burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Island earthquake in the Aleutian arc involved a 650-kilometer length of the Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths and are either simply not felt or their felt pattern does not really indicate their true size.

  8. Earthquakes-Rattling the Earth's Plumbing System

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  9. Scientists Examine Challenges and Lessons From Japan's Earthquake and Tsunami

    Showstack, Randy

    2011-03-01

    A week after the magnitude 9.0 great Tohoku earthquake and the resulting tragic and damaging tsunami of 11 March struck Japan, the ramifications continued, with a series of major aftershocks (as Eos went to press, there had been about 4 dozen with magnitudes greater than 6); the grim search for missing people—the death toll was expected to approximate 10,000; the urgent assistance needed for the more than 400,000 homeless and the 1 million people without water; and the frantic efforts to avert an environmental catastrophe at Japan's damaged Fukushima Daiichi Nuclear Power Station, about 225 kilometers northeast of Tokyo, where radiation was leaking. The earthquake offshore of Honshu in northeastern Japan (see Figure 1) was a plate boundary rupture along the Japan Trench subduction zone, with the source area of the earthquake estimated at 400-500 kilometers long with a maximum slip of 20 meters, determined through various means including Global Positioning System (GPS) and seismographic data, according to Kenji Satake, professor at the Earthquake Research Institute of the University of Tokyo. In some places the tsunami may have topped 7 meters—the maximum instrumental measurement at many coastal tide gauges—and some parts of the coastline may have been inundated more than 5 kilometers inland, Satake indicated. The International Tsunami Information Center (ITIC) noted that eyewitnesses reported that the highest tsunami waves were 13 meters high. Satake also noted that continuous GPS stations indicate that the coast near Sendai—which is 130 kilometers west of the earthquake and is the largest city in the Tohoku region of Honshu—moved more than 4 meters horizontally and subsided about 0.8 meter.

  10. Investigation of Back-Projection Uncertainties with M6 Earthquakes

    Fan, W.; Shearer, P. M.

    2017-12-01

We investigate possible biasing effects of inaccurate timing corrections on teleseismic P-wave back-projection imaging of large earthquake ruptures. These errors occur because empirically-estimated time shifts based on aligning P-wave first arrivals are exact only at the hypocenter and provide approximate corrections for other parts of the rupture. Using the Japan subduction zone as a test region, we analyze 46 M6-7 earthquakes over a ten-year period, including many aftershocks of the 2011 M9 Tohoku earthquake, performing waveform cross-correlation of their initial P-wave arrivals to obtain hypocenter timing corrections to global seismic stations. We then compare back-projection images for each earthquake using its own timing corrections with those obtained using the time corrections for other earthquakes. This provides a measure of how well sub-events can be resolved with back-projection of a large rupture as a function of distance from the hypocenter. Our results show that back-projection is generally very robust and that sub-event location errors average about 20 km across the entire study region (~700 km). The back-projection coherence loss and location errors do not noticeably converge to zero even when the event pairs are very close (<20 km). This indicates that most of the timing differences are due to 3D structure close to each of the hypocenter regions, which limits the effectiveness of attempts to refine back-projection images using aftershock calibration, at least in this region.
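The hypocenter timing corrections described above come from aligning initial P-wave arrivals by cross-correlation. A minimal sketch of that alignment step (function and variable names are illustrative, not from the study):

```python
import numpy as np

def p_wave_time_shift(ref, trace, dt):
    """Lag (in seconds) that best aligns `trace` with `ref`,
    taken at the peak of their full cross-correlation."""
    corr = np.correlate(trace, ref, mode="full")
    lag_samples = int(np.argmax(corr)) - (len(ref) - 1)
    return lag_samples * dt

# two synthetic P onsets; the second is delayed by 5 samples (0.05 s at 100 Hz)
ref = np.zeros(64); ref[10] = 1.0
trace = np.zeros(64); trace[15] = 1.0
shift = p_wave_time_shift(ref, trace, dt=0.01)  # → 0.05
```

In practice the correlation would be done on windowed, band-passed waveforms, and sub-sample precision would require interpolation around the correlation peak.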

  11. Use of Fault Displacement Vector to Identify Future Zones of Seismicity: An Example from the Earthquakes of Nepal Himalayas.

    Naim, F.; Mukherjee, M. K.

    2017-12-01

Earthquakes occur due to fault slip in the subsurface. They can occur either as interplate or intraplate earthquakes. The region of study is the Nepal Himalayas, which defines the boundary of the Indian-Eurasian plate and houses the foci of the most devastating earthquakes. The aim of the study was to analyze all the earthquakes that occurred in the Nepal Himalayas up to the May 12, 2015 earthquake in order to mark the regions still under stress and vulnerable to future earthquakes. Three different fault systems in the Nepal Himalayas define the tectonic set up of the area. They are: (1) the Main Frontal Thrust (MFT), (2) the Main Central Thrust (MCT) and (3) the Main Boundary Thrust (MBT), which extend from NW to SE. Most of the earthquakes were observed to occur between the MBT and MCT. Since the thrust faults dip towards the NE, the foci of most of the earthquakes lie on the MBT. The methodology includes estimating the dip of the fault by considering the depths of different earthquake events and their corresponding distances from the MBT. In order to carry out stress analysis on the fault, the beach ball diagrams associated with the different earthquakes were plotted on a map. Earthquakes in the NW and central region of the fault zone were associated with reverse fault slip while those in the south-eastern part were associated with a strike-slip component. The direction of net slip on the fault associated with the different earthquakes was known, and from this a 3D slip diagram of the fault was constructed. The regions vulnerable to future earthquakes in the Nepal Himalaya were demarcated on the 3D slip diagram of the fault. Such zones were marked owing to the fact that slip during earthquakes causes the adjoining areas to come under immense stress, and this stress is directly proportional to the amount of slip occurring on the fault.
These vulnerable zones were in turn projected on the map to show their position and are predicted to contain the epicenter of the future earthquakes.
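The dip estimate described above, from event depths and horizontal distances to the MBT surface trace, reduces to simple trigonometry under a planar-fault assumption; a sketch (the function name is illustrative):

```python
import math

def planar_fault_dip(depth_km, distance_km):
    """Dip angle (degrees) of a planar thrust inferred from a hypocentral
    depth and the horizontal distance of the event from the fault's
    surface trace (here, the MBT)."""
    return math.degrees(math.atan2(depth_km, distance_km))

# e.g. an event 12 km deep located 20.8 km from the trace implies a ~30 degree dip
dip = planar_fault_dip(12.0, 20.8)
```

With several events, a least-squares fit of depth against distance would give a more robust dip than any single event pair.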

  12. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.

    2016-01-01

We propose a simple approach to measure earthquake magnitude M using the time difference (Top) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2logTop for earthquakes 5≤Mw≤7, which is the theoretical proportionality if Top is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and MTop (M estimated from Top) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from the lower-frequency data. Top depends only weakly on epicentral distance, and this dependence can be ignored over the distances considered. Applied retrospectively to the 2011 Tohoku earthquake, the method produces a final magnitude estimate of M 9.0 at 120 s after the origin time. We conclude that Top of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.
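The scaling Mw ∝ 2 log Top can be sketched directly. The intercept c below is a hypothetical calibration constant chosen for illustration, not a value from the study:

```python
import math

def magnitude_from_top(t_op_seconds, c=5.0):
    """Magnitude estimate from the time Top between the body-wave onset and
    the peak high-frequency amplitude: Mw = 2*log10(Top) + c.
    The intercept c is a placeholder calibration constant."""
    return 2.0 * math.log10(t_op_seconds) + c

# with c = 5.0, Top = 10 s maps to M 7.0 and Top = 100 s to M 9.0
m_small = magnitude_from_top(10.0)
m_large = magnitude_from_top(100.0)
```

The slope of 2 encodes the assumption that Top scales with source dimension while stress drop stays scale invariant; a regression against known Mw values would fix c in practice.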

  13. Evaluation of earthquake parameters used in the Indonesian Tsunami Early Warning System

    Madlazim; Prastowo, Tjipto

    2016-02-01

Twenty-two of the 30 earthquake events reported by the Indonesian Agency for Geophysics, Climatology and Meteorology during 2007-2010 were falsely flagged as tsunamigenic by the Indonesian Tsunami Early Warning System (Ina-TEWS). These 30 earthquakes were of different magnitudes and occurred in different locations. This study aimed to evaluate the performance of the Ina-TEWS using common earthquake parameters: magnitude, origin time, depth, and epicenter. In total, 298 datasets from the Ina-TEWS and the global centroid moment tensor (CMT) catalog were assessed. The global CMT solutions are widely regarded by seismologists as a reference for these parameters, as they have proved to be accurate. It was found that the earthquake magnitude, origin time, and depth provided by the Ina-TEWS differed significantly from those given in the global CMT catalog, whereas the latitude and longitude positions of the events provided by both systems coincided. The performance of the Ina-TEWS, particularly in terms of accuracy, remains questionable and needs to be improved.

  14. Summary of earthquake experience database

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  15. Earthquake design for controlled structures

    Nikos G. Pnevmatikos

    2017-04-01

An alternative design philosophy is described for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to respond elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and design forces. The maximum capacity of the control devices is then compared with the required control force. If the capacity of the control devices exceeds the required control force, the devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, a scale factor, α, reducing the elastic forces to new design forces is calculated; the structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost beyond buying and installing the control devices.
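The force bookkeeping in this design philosophy can be sketched as follows. The function name and the exact form of the scale factor α are illustrative assumptions, one plausible reading of the procedure rather than the paper's formulation:

```python
def design_force_with_control(f_elastic, f_design, device_capacity):
    """Force the structure must be designed for, given the code elastic
    forces, the code elastoplastic design forces, and the total capacity
    of the control devices (all in consistent units)."""
    f_required = f_elastic - f_design      # force the devices must absorb
    if device_capacity >= f_required:
        return f_design                    # devices suffice; design as planned
    # undersized devices: scale factor alpha reduces the elastic forces
    # to a new (larger) design force that the structure must carry
    alpha = (f_elastic - device_capacity) / f_elastic
    return alpha * f_elastic

# capacity 50 covers the required 100 - 60 = 40, so the design force stays 60
f1 = design_force_with_control(100.0, 60.0, 50.0)
# capacity 30 falls short; the structure is redesigned for 100 - 30 = 70
f2 = design_force_with_control(100.0, 60.0, 30.0)
```

In either branch the structure plus devices together resist the full elastic demand, which is what keeps the response in the elastic range.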

  16. Using Smartphones to Detect Earthquakes

    Kong, Q.; Allen, R. M.

    2012-12-01

We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplemental network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation between sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones that records acceleration in real time; the records can be saved on the phone or transmitted to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer on controlled shake tables in a variety of tests. The results show that a smartphone accelerometer can reproduce the characteristics of the shaking very well, even when the phone is left lying freely on the shake table. The nature of these datasets is also quite different from traditional networks because smartphones move around with their owners, so earthquake signals must be distinguished from other daily activity. In addition to the shake-table tests that provided earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these records, achieving a 99.7% success rate in separating earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
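Discriminating earthquake shaking from everyday phone motion starts with features computed on each acceleration record. The study used an artificial neural network; the two features below are illustrative inputs such a classifier might use, not the study's actual feature set:

```python
import numpy as np

def shaking_features(accel, fs):
    """Two simple discriminants for an acceleration record sampled at fs Hz:
    interquartile amplitude range and dominant frequency (Hz)."""
    iqr = np.percentile(accel, 75) - np.percentile(accel, 25)
    demeaned = accel - accel.mean()
    spectrum = np.abs(np.fft.rfft(demeaned))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return iqr, freqs[np.argmax(spectrum)]

# a 2 Hz sinusoid sampled at 100 Hz should report a 2 Hz dominant frequency
t = np.arange(200) / 100.0
iqr, f_dom = shaking_features(np.sin(2 * np.pi * 2.0 * t), fs=100.0)
```

Features like these, stacked into a vector per record, would form the input layer of the neural network that separates earthquakes from walking, running, or driving.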

  17. Explanation of earthquake response spectra

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.

  18. Injuries and Traumatic Psychological Exposures Associated with the South Napa Earthquake - California, 2014.

    Attfield, Kathleen R; Dobson, Christine B; Henn, Jennifer B; Acosta, Meileen; Smorodinsky, Svetlana; Wilken, Jason A; Barreau, Tracy; Schreiber, Merritt; Windham, Gayle C; Materna, Barbara L; Roisman, Rachel

    2015-09-11

    On August 24, 2014, at 3:20 a.m., a magnitude 6.0 earthquake struck California, with its epicenter in Napa County (1). The earthquake was the largest to affect the San Francisco Bay area in 25 years and caused significant damage in Napa and Solano counties, including widespread power outages, five residential fires, and damage to roadways, waterlines, and 1,600 buildings (2). Two deaths resulted (2). On August 25, Napa County Public Health asked the California Department of Public Health (CDPH) for assistance in assessing postdisaster health effects, including earthquake-related injuries and effects on mental health. On September 23, Solano County Public Health requested similar assistance. A household-level Community Assessment for Public Health Emergency Response (CASPER) was conducted for these counties in two cities (Napa, 3 weeks after the earthquake, and Vallejo, 6 weeks after the earthquake). Among households reporting injuries, a substantial proportion (48% in Napa and 37% in western Vallejo) reported that the injuries occurred during the cleanup period, suggesting that increased messaging on safety precautions after a disaster might be needed. One fifth of respondents overall (27% in Napa and 9% in western Vallejo) reported one or more traumatic psychological exposures in their households. These findings were used by Napa County Mental Health to guide immediate-term mental health resource allocations and to conduct public training sessions and education campaigns to support persons with mental health risks following the earthquake. In addition, to promote community resilience and future earthquake preparedness, Napa County Public Health subsequently conducted community events on the earthquake anniversary and provided outreach workers with psychological first aid training.

  19. Scientific Information Platform for the 2008 Great Wenchuan Earthquake

    Liang, C.

    2012-12-01

The 2008 MS 8.0 Wenchuan earthquake is one of the deadliest in recent human history. This earthquake not only united the whole world in helping local people through the difficult time, it also fostered significant global cooperation to study the event from various aspects, including pre-seismic phenomena (seismicity, gravity, electromagnetic fields, well water levels, radon levels in water, etc.), co-seismic phenomena (fault slip, landslides, damage to man-made structures, etc.), and post-seismic phenomena (aftershocks, changing well water levels, etc.), as well as the disaster relief efforts. In the last four years, more than 300 scientific articles have been published in peer-reviewed journals; about 50% are in Chinese, 30% in English, and about 20% in both languages. These studies have advanced our understanding of earthquake science in general. They have also sparked open debates on many issues; notably, the role of the Zipingpu reservoir (built not long before the earthquake) in triggering this monstrous event remains one of many continuing debates. Given that all these articles are sporadically spread across different journals, numerous issues, and different languages, it can be very inefficient, and sometimes impossible, to dig out the information that is needed. The Earthquake Research Group at the Chengdu University of Technology (ERGCDUT) has initiated an effort to develop an information platform to collect and analyze scientific research on or related to this earthquake, the hosting faults, and the surrounding tectonic regions. A preliminary website has been set up for this purpose: http://www.wenchuaneqresearch.org. As of July 2012, articles published in 6 Chinese journals and 7 international journals have been collected. Articles are listed journal by journal and also grouped by content into four major categories, including pre-seismic, co-seismic, and post-seismic events.

  20. Pressure control valve using proportional electro-magnetic solenoid actuator

    Yun, So Nam; Ham, Young Bog; Park, Pyoung Won

    2006-01-01

This paper presents the experimental characteristics of an electro-hydraulic proportional pressure control valve. In this study, the poppet and valve body assembled into the proportional solenoid were designed and manufactured. The constant-force characteristic of a proportional solenoid actuator in the control region should be independent of plunger position so that it can be used to control valve position in a fluid flow control system. The stroke-force characteristic of the proportional solenoid actuator is determined by the shape (or parameters) of the control cone. In this paper, steady-state and transient characteristics of the solenoid actuator for the electro-hydraulic proportional valve are analyzed using the finite element method, and it is confirmed that the proportional solenoid actuator has a constant attraction force in the control region, independent of stroke position. The effects of parameters such as control cone length, thickness, and taper length are also discussed.

  1. Development of extruded resistive plastic tubes for proportional chamber cathodes

    Kondo, K.

    1982-01-01

Carbon-mixed plastic tubes with resistivity of 10³ to 10⁴ Ω·cm have been molded by extrusion and used as the d.c. cathode of a proportional counter and a multi-wire proportional chamber. The signal from gas multiplication was picked up from a strip r.f. cathode set outside the tube. The characteristics of the counter in the proportional and limited streamer modes have been studied.

  2. Multiwire proportional chamber for Moessbauer spectroscopy: development and results

    Costa, M.S. da.

    1985-12-01

A new multiwire proportional chamber designed for Mössbauer spectroscopy is presented. This detector allows transmission and backscattering experiments using either photons or electrons. The Mössbauer data acquisition system, partially developed for this work, is described. A simple method is proposed for determining the boundary between the true proportional and semi-proportional regions of operation in gaseous detectors. The study of the ternary gas mixture He-Ar-CH₄ leads to a straightforward energy calibration of the electron spectra. Mössbauer spectra obtained with an Fe-57 source are presented; in particular, those obtained with backscattered electrons show the feasibility of depth-selective analysis with gaseous proportional counters. (author)

  3. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first; then the areas of alarm are reduced by MSc, at the cost that some earthquakes are missed in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40%, and five were predicted by M8-MSc in 13%, of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reverse faulting. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8. Phys. Earth Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction. J. Geophys. Res. 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  4. Rapid characterization of the 2015 Mw 7.8 Gorkha, Nepal, earthquake sequence and its seismotectonic context

    Hayes, Gavin; Briggs, Richard; Barnhart, William D.; Yeck, William; McNamara, Daniel E.; Wald, David J.; Nealy, Jennifer; Benz, Harley M.; Gold, Ryan D.; Jaiswal, Kishor S.; Marano, Kristin; Earle, Paul S.; Hearne, Mike; Smoczyk, Gregory M.; Wald, Lisa A.; Samsonov, Sergey

    2015-01-01

Earthquake response and related information products are important for placing recent seismic events into context and particularly for understanding the impact earthquakes can have on the regional community and its infrastructure. These tools are even more useful if they are available quickly, ahead of detailed information from the areas affected by such earthquakes. Here we provide an overview of the response activities and related information products generated and provided by the U.S. Geological Survey National Earthquake Information Center in association with the 2015 M 7.8 Gorkha, Nepal, earthquake. This group monitors global earthquakes 24 hrs/day and 7 days/week to provide rapid information on the location and size of recent events and to characterize the source properties, tectonic setting, and potential fatalities and economic losses associated with significant earthquakes. We present the timeline over which these products became available, discuss what they tell us about the seismotectonics of the Gorkha earthquake and its aftershocks, and examine how their information is used today, and might be used in the future, to help mitigate the impact of such natural disasters.

  5. Ionospheric Anomalies of the 2011 Tohoku Earthquake with Multiple Observations during Magnetic Storm Phase

    Liu, Yang

    2017-04-01

Ionospheric anomalies linked to devastating earthquakes have been widely investigated. It has been reported that GNSS TEC undergoes drastic increases or decreases during some diurnal periods prior to earthquakes. Liu et al. (2008) applied a TEC anomaly detection method to M>=5.9 earthquakes in Indonesia and found TEC decreases within 2-7 days prior to the earthquakes. Nevertheless, strong TEC enhancement was observed before the M8.0 Wenchuan earthquake (Zhao et al. 2008). Moreover, the ionospheric plasma critical frequency (foF2) has been found to diminish before large earthquakes (Pulinets et al. 1998; Liu et al. 2006). Little has been done, however, regarding ionospheric irregularities and their association with earthquakes, and the real mechanism linking ionospheric anomaly activity to impending large earthquakes remains difficult to understand. The M9.0 Tohoku earthquake of 11 March 2011, 05:46 UT, is recognized as one of the most significant events in this research field (Liu et al. 2011). A moderate geomagnetic disturbance also occurred at the time of the earthquake, which makes the ionospheric anomaly activity more complicated to study. Seismo-ionospheric disturbances were observed as a result of the drastic activity of the solid earth. To further address the phenomenon, this paper investigates different categories of ionospheric anomalies induced by seismic activity using multiple data sources. GNSS ground data from several IGS stations around the epicenter were chosen to discuss the spatial-temporal correlations of ionospheric TEC with distance from the epicenter. We also apply GIM TEC maps, because of their global coverage, to find diurnal differences in ionospheric anomalies relative to a geomagnetically quiet day in the same month. The results accord with Liu's conclusion that TEC depletion occurred on days close to the earthquake day; however, the variation of TEC follows a pattern distinct from that of the normal quiet day.

  6. Countermeasures to earthquakes in nuclear plants

    Sato, Kazuhide

    1979-01-01

The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity is a special concern. Therefore, in the design of nuclear power plants safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures against earthquakes have naturally been incorporated into the safety examination. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted reliably, and the resulting disasters can be remarkably large. In nuclear plants, damage to the facilities must be prevented and their functions maintained during earthquakes. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground, and the position of the plant should be examined. After the place of installation has been decided, the design earthquake is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads, and allowable stresses are explained. (Kako, I.)

  7. Update earthquake risk assessment in Cairo, Egypt

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, 25 years on, one of the most painful events etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families left homeless). Nowadays, the most frequent and important question that should arise is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damages. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the assessment clearly indicates that the losses and damages could be two or three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, while three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates indicate that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk: deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management,
and, finally, for mitigation, risk reduction, and improving the seismic performance of structures to assure life safety

  8. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and evaluation of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) residual risk. The guiding principle of the revision required a new evaluation method of earthquake vibration using a fault model and evaluation of the probability of earthquake vibration. Residual risk means that the facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, this scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  9. X-ray proportional counter for the Viking Lander

    Glesius, F.L.; Kroon, J.C.; Castro, A.J.; Clark, B.C.

    1978-01-01

    A set of four sealed proportional counters with optimized energy response is employed in the X-ray fluorescence spectrometer units aboard the two Viking Landers. The instruments have provided quantitative elemental analyses of soil samples taken from the Martian surface. This paper discusses the design and development of these miniature proportional counters, and describes their performance on Mars

  10. Putative golden proportions as predictors of facial esthetics in adolescents.

    Kiekens, Rosemie M A; Kuijpers-Jagtman, Anne Marie; van 't Hof, Martin A; van 't Hof, Bep E; Maltha, Jaap C

    2008-10-01

    In orthodontics, facial esthetics is assumed to be related to golden proportions apparent in the ideal human face. The aim of the study was to analyze the putative relationship between facial esthetics and golden proportions in white adolescents. Seventy-six adult laypeople evaluated sets of photographs of 64 adolescents on a visual analog scale (VAS) from 0 to 100. The facial esthetic value of each subject was calculated as a mean VAS score. Three observers recorded the position of 13 facial landmarks included in 19 putative golden proportions, based on the golden proportions as defined by Ricketts. The proportions and each proportion's deviation from the golden target (1.618) were calculated. This deviation was then related to the VAS scores. Only 4 of the 19 proportions had a significant negative correlation with the VAS scores, indicating that beautiful faces showed less deviation from the golden standard than less beautiful faces. Together, these variables explained only 16% of the variance. Few golden proportions have a significant relationship with facial esthetics in adolescents. The explained variance of these variables is too small to be of clinical importance.
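The analysis described above, correlating each proportion's deviation from the golden target (1.618) with mean VAS scores, can be sketched numerically. The data below are synthetic and purely illustrative; a negative Pearson correlation corresponds to the paper's finding that faces closer to the golden standard score higher.

```python
import numpy as np

PHI = 1.618  # the golden target used in the study

def deviation_from_golden(proportions):
    """Absolute deviation of each measured proportion from 1.618."""
    return np.abs(np.asarray(proportions, dtype=float) - PHI)

def pearson_r(x, y):
    """Plain Pearson correlation coefficient."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    xm, ym = x - x.mean(), y - y.mean()
    return (xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum())

# Synthetic subjects: proportions closer to PHI paired with higher VAS scores
props = np.array([1.62, 1.70, 1.55, 1.80, 1.60])
vas   = np.array([82.0, 60.0, 70.0, 45.0, 85.0])
r = pearson_r(deviation_from_golden(props), vas)  # negative by construction
```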

  11. Putative golden proportions as predictors of facial esthetics in adolescents.

    Kiekens, R.M.A.; Kuijpers-Jagtman, A.M.; Hof, M.A. van 't; Hof, B.E. van 't; Maltha, J.C.

    2008-01-01

    INTRODUCTION: In orthodontics, facial esthetics is assumed to be related to golden proportions apparent in the ideal human face. The aim of the study was to analyze the putative relationship between facial esthetics and golden proportions in white adolescents. METHODS: Seventy-six adult laypeople

  12. The principle of proportionality and European contract law

    Cauffman, C.; Rutgers, J.; Sirena, P.

    2015-01-01

    The paper investigates the role of the principle of proportionality within contract law, in balancing the rights and obligations of the contracting parties. It illustrates that the principle of proportionality is one of the general principles which govern contractual relations, and as such it is an

  13. The Improved Estimation of Ratio of Two Population Proportions

    Solanki, Ramkrishna S.; Singh, Housila P.

    2016-01-01

    In this article, first we obtained the correct mean square error expression of Gupta and Shabbir's linear weighted estimator of the ratio of two population proportions. Later we suggested the general class of ratio estimators of two population proportions. The usual ratio estimator, Wynn-type estimator, Singh, Singh, and Kaur difference-type…

  14. Attention Modulation by Proportion Congruency: The Asymmetrical List Shifting Effect

    Abrahamse, Elger L.; Duthoo, Wout; Notebaert, Wim; Risko, Evan F.

    2013-01-01

    Proportion congruency effects represent hallmark phenomena in current theorizing about cognitive control. This is based on the notion that proportion congruency determines the relative levels of attention to relevant and irrelevant information in conflict tasks. However, little empirical evidence exists that uniquely supports such an attention…

  15. 16 CFR 240.9 - Proportionally equal terms.

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Proportionally equal terms. 240.9 Section 240.9 Commercial Practices FEDERAL TRADE COMMISSION GUIDES AND TRADE PRACTICE RULES GUIDES FOR ADVERTISING ALLOWANCES AND OTHER MERCHANDISING PAYMENTS AND SERVICES § 240.9 Proportionally equal terms. (a...

  16. A smartphone application for earthquakes that matter!

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid delivery of earthquake information to the public, some having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet: the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue comes from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses delete the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, only increasing anxiety among the population with each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities); they are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims at providing suitable notifications by collating different information threads covering tsunamigenic, potentially damaging, and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected

  17. Health education and promotion at the site of an emergency: experience from the Chinese Wenchuan earthquake response.

    Tian, Xiangyang; Zhao, Genming; Cao, Dequan; Wang, Duoquan; Wang, Liang

    2016-03-01

    Theories and strategies of social mobilization, capacity building, mass and interpersonal communication, as well as risk communication and behavioral change, were used to develop health education and promotion campaigns to prevent and reduce injuries and infectious diseases among the survivors of the Wenchuan earthquake in May 2008. We evaluated the effectiveness of the campaigns and short-term interventions using mixed methods. The earthquake survivors' health knowledge, skills, and practice improved significantly with respect to injury protection, food and water safety, environmental and personal hygiene, and disease prevention. No infectious disease outbreaks were reported after the earthquake, and the epidemic level was lower than before the earthquake. After a short-term intervention among the students of Leigu Township Primary and Junior School, the proportion of those practicing personal hygiene increased from 59.7% to 98.3%. Health education and promotion campaigns after earthquakes play an important role in preventing injuries and infectious diseases among survivors. © The Author(s) 2014.

  18. Anomalous variation in GPS based TEC measurements prior to the 30 September 2009 Sumatra Earthquake

    Karia, Sheetal; Pathak, Kamlesh

    This paper investigates the features of pre-earthquake ionospheric anomalies in total electron content (TEC) data obtained from regular GPS observations by the GPS receiver at SVNIT Surat (21.16 N, 72.78 E Geog), located at the northern crest of the equatorial anomaly region. The data have been analysed for 5 different earthquakes that occurred during 2009 in India and its neighbouring regions. Our observations show that, for earthquakes whose preparation area lies between the crests of the equatorial anomaly close to the geomagnetic equator, an enhancement in TEC was followed by a depletion in TEC on the day of the earthquake, which may be connected to distortions of the equatorial anomaly shape. For the analysis of the ionospheric effects of one such case, the 30 September 2009 Sumatra earthquake, Global Ionospheric Maps of TEC were used. The possible influence of earthquake preparation processes on the main low-latitude ionospheric peculiarity, the equatorial anomaly, is discussed.

  19. Abundant aftershock sequence of the 2015 Mw7.5 Hindu Kush intermediate-depth earthquake

    Li, Chenyu; Peng, Zhigang; Yao, Dongdong; Guo, Hao; Zhan, Zhongwen; Zhang, Haijiang

    2018-05-01

    The 2015 Mw7.5 Hindu Kush earthquake occurred at a depth of 213 km beneath the Hindu Kush region of Afghanistan. While many early aftershocks were missing from the global earthquake catalogues, this sequence was recorded continuously by eight broad-band stations within 500 km. Here we use a waveform matching technique to systematically detect earthquakes around the main shock. More than 3000 events are detected within 35 days after the main shock, as compared with 42 listed in the Advanced National Seismic System catalogue (or 196 in the International Seismological Centre catalogue). The aftershock sequence generally follows Omori's law with a decay constant p = 0.92. We also apply the recently developed double-pair double-difference technique to relocate all detected aftershocks. Most of them are located to the west of the hypocentre of the main shock, consistent with the westward propagation of the main-shock rupture. The aftershocks outline a nearly vertical southward-dipping plane, which matches well with one of the nodal planes of the main shock. We conclude that the aftershock sequence of this intermediate-depth earthquake shares many similarities with those of shallow earthquakes and infer that there are some common mechanisms responsible for shallow and intermediate-depth earthquakes.
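The Omori-law decay reported above can be made concrete with a short sketch. The modified Omori law gives the aftershock rate n(t) = K / (c + t)^p; integrating it gives the expected event count over a time interval. The productivity K and offset c below are illustrative values, not fitted to this catalogue; only the decay constant p = 0.92 comes from the record.

```python
def omori_rate(t, K, c, p):
    """Modified Omori law: aftershock rate n(t) = K / (c + t)**p."""
    return K / (c + t) ** p

def omori_count(t1, t2, K, c, p):
    """Expected number of aftershocks in [t1, t2], from the analytic
    integral of n(t) (valid for p != 1)."""
    g = lambda t: (c + t) ** (1.0 - p) / (1.0 - p)
    return K * (g(t2) - g(t1))

# p = 0.92 (slightly slower than 1/t decay); K and c are assumed.
K, c, p = 100.0, 0.05, 0.92
early = omori_rate(1.0, K, c, p)    # rate one day after the mainshock
late  = omori_rate(10.0, K, c, p)   # rate ten days after: much lower
n35   = omori_count(0.0, 35.0, K, c, p)  # expected events in the 35-d window
```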

  20. The 2016 Kumamoto earthquake sequence.

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An M j 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an M j 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  1. Earthquake lights and rupture processes

    T. V. Losseva

    2005-01-01

    A physical model of earthquake lights is proposed. It is suggested that magnetic diffusion from the source region of the electric and magnetic fields is the dominant process, explaining the rather high localization of the light flashes. A 3D numerical code has been developed that takes into account an arbitrary distribution of currents caused by ground motion and the conductivity in the ground and at its surface, including the presence of sea water above the epicenter and/or near the ruptured segments of the fault. Simulations for the 1995 Kobe earthquake were conducted taking into account the presence of sea water with realistic shoreline geometry. The results do not contradict the eyewitness reports and the scarce measurements of electric and magnetic fields at large distances from the epicenter.

  2. The 2016 Kumamoto earthquake sequence

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  3. Earthquake clustering in the tectonic pattern and volcanism of the Andaman Sea region

    Špičák, Aleš; Vaněk, Jiří

    2013-01-01

    Roč. 608, November (2013), s. 728-736 ISSN 0040-1951 R&D Projects: GA MŠk ME09011 Institutional support: RVO:67985530 Keywords : earthquake swarm * Andaman Sea region * global seismological data * submarine volcanism * magma intrusion Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 2.866, year: 2013

  4. Earthquake occurrence reveals magma ascent beneath volcanoes and seamounts in the Banda Region

    Špičák, Aleš; Kuna, Václav; Vaněk, Jiří

    2013-01-01

    Roč. 75, č. 777 (2013), 777/1-777/8 ISSN 0258-8900 R&D Projects: GA MŠk ME09011 Institutional support: RVO:67985530 Keywords : Banda region * global seismological data * earthquake swarm Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 2.667, year: 2013

  5. Dim prospects for earthquake prediction

    Geller, Robert J.

    I was misquoted in C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: "I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a 'proven nonscience' [Geller, 1997a] is a paradigm for others to copy." Readers are invited to verify for themselves that neither "proven nonscience" nor any similar phrase was used by Geller [1997a].

  6. On the plant operators performance during earthquake

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There is little data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and thereby enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness of switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if the chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required. Although there were differences between the simulated earthquake motions and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers

  7. Earthquake evaluation of a substation network

    Matsuda, E.N.; Savage, W.U.; Williams, K.K.; Laguens, G.C.

    1991-01-01

    The impact of the occurrence of a large, damaging earthquake on a regional electric power system is a function of the geographical distribution of strong shaking, the vulnerability of various types of electric equipment located within the affected region, and operational resources available to maintain or restore electric system functionality. Experience from numerous worldwide earthquake occurrences has shown that seismic damage to high-voltage substation equipment is typically the reason for post-earthquake loss of electric service. In this paper, the authors develop and apply a methodology to analyze earthquake impacts on Pacific Gas and Electric Company's (PG and E's) high-voltage electric substation network in central and northern California. The authors' objectives are to identify and prioritize ways to reduce the potential impact of future earthquakes on our electric system, refine PG and E's earthquake preparedness and response plans to be more realistic, and optimize seismic criteria for future equipment purchases for the electric system

  8. Earthquake forewarning in the Cascadia region

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a "prediction" refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a "forecast" describes the probability of an earthquake occurring, which may range from >0 to <100 percent. The report considers forewarning processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region

  9. Data base pertinent to earthquake design basis

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  10. Communicating Earthquake Preparedness: The Influence of Induced Mood, Perceived Risk, and Gain or Loss Frames on Homeowners' Attitudes Toward General Precautionary Measures for Earthquakes.

    Marti, Michèle; Stauffacher, Michael; Matthes, Jörg; Wiemer, Stefan

    2018-04-01

    Despite global efforts to reduce seismic risk, actual preparedness levels remain universally low. Although earthquake-resistant building design is the most efficient way to decrease potential losses, its application is not a legal requirement across all earthquake-prone countries, and even where it is, it is often not strictly enforced. Risk communication encouraging homeowners to take precautionary measures is therefore an important means to enhance a country's earthquake resilience. Our study illustrates that specific interactions of mood, perceived risk, and frame type significantly affect homeowners' attitudes toward general precautionary measures for earthquakes. The interdependencies of the variables mood, risk information, and frame type were tested in an experimental 2 × 2 × 2 design (N = 156). Only in combination, and not on their own, do these variables effectively influence attitudes toward general precautionary measures for earthquakes. The control variables gender, "trait anxiety" index, and alteration of perceived risk adjust the effect. Overall, the group with the strongest attitudes toward general precautionary actions for earthquakes is homeowners with induced negative mood who process high-risk information and gain-framed messages. However, the conditions comprising induced negative mood, low-risk information, and loss-framed messages, and induced positive mood, low-risk information, and gain-framed messages, both also significantly influence homeowners' attitudes toward general precautionary measures for earthquakes. These results mostly confirm previous findings in the field of health communication. For practitioners, our study emphasizes that carefully compiled communication measures are a powerful means to encourage precautionary attitudes among homeowners, especially those with an elevated perceived risk. © 2017 Society for Risk Analysis.

  11. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Caroline, Etivant

    2015-04-01

    Historical earthquakes are known to us only through written recollections, and so seismologists have long experience of interpreting the reports of eyewitnesses, probably explaining why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered a digital nervous system comprising digital veins and intertwined sensors that capture the pulse of our planet in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We also describe how public expectations within tens of seconds of ground shaking are at the basis of improved, diversified information tools which integrate this user-generated content. Special attention is given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application, and browser add-on, which deals with the only earthquakes that matter for the public: the felt and damaging earthquakes. In conclusion, we demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.

  12. Investigation of the TEC Changes in the vicinity of the Earthquake Preparation Zone

    Ulukavak, Mustafa; Yalcinkaya, Mualla

    2016-04-01

    Recently, the investigation of ionospheric anomalies before earthquakes has attracted much attention. Total Electron Content (TEC) data are used to monitor changes in the ionosphere; hence, researchers examine TEC changes before strong earthquakes to detect ionospheric anomalies. In this study, the GPS-TEC variations obtained from GNSS stations in the vicinity of the earthquake preparation zone were investigated. The Nidra earthquake (M6.5), which occurred northwest of Greece on November 17th, 2015 (38.755°N, 20.552°E), was selected for this study. First, the equation proposed by Dobrovolsky et al. (1979) was used to calculate the radius of the earthquake preparation zone. International GNSS Service (IGS) stations in the region were classified with respect to this radius. The observation data of each station were obtained from the Crustal Dynamics Data and Information System (CDDIS) archive to estimate GPS-TEC variations between 16 October 2015 and 16 December 2015. Global Ionosphere Map (GIM) products obtained from the IGS were used to check the robustness of the GPS-TEC variations. Possible anomalies were analyzed for each GNSS station using the 15-day moving median method. In order to analyze these pre-earthquake ionospheric anomalies, we investigated three indices (Kp, F10.7, and Dst) related to the space weather conditions between 16 October 2015 and 16 December 2015. Solar and geomagnetic indices were obtained from the National Oceanic and Atmospheric Administration (NOAA), the Canadian Space Weather Forecast Centre (CSWFC), and the Data Analysis Center for Geomagnetism and Space Magnetism, Graduate School of Science, Kyoto University (WDC). This study aims at investigating the possible effects of the earthquake on TEC variations.
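The Dobrovolsky et al. (1979) preparation-zone radius cited above is the standard strain-radius relation rho = 10^(0.43 M) km. A minimal sketch, applied to the M6.5 event studied in this record:

```python
def dobrovolsky_radius_km(magnitude):
    """Radius of the earthquake preparation (strain) zone,
    rho = 10**(0.43 * M) km, after Dobrovolsky et al. (1979)."""
    return 10 ** (0.43 * magnitude)

# For an M6.5 event the zone extends to roughly 620 km; stations inside
# this distance would be the candidates for precursor analysis.
radius = dobrovolsky_radius_km(6.5)
```

Classifying IGS stations by whether their epicentral distance falls inside this radius is then a simple comparison against `radius`.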

  13. The typical seismic behavior in the vicinity of a large earthquake

    Rodkin, M. V.; Tikhonov, I. N.

    2016-10-01

    The Global Centroid Moment Tensor (GCMT) catalog was used to construct the spatio-temporal generalized vicinity of a large earthquake (GVLE) and to investigate the behavior of seismicity within it. The vicinity is made up of earthquakes falling into the zone of influence of a large number (100, 300, or 1000) of the largest earthquakes. The GVLE construction aims at enlarging the available statistics, diminishing the strong random component, and revealing typical features of pre- and post-shock seismic activity in more detail. As a result, the character of fore- and aftershock cascades was examined in more detail than would have been possible without the GVLE approach, and several anomalies in the behavior of a variety of earthquake parameters were identified. The amplitudes of all these anomalies increase with the approach of the generalized large earthquake (GLE) as the logarithm of the time interval from the GLE occurrence. Most of the discussed anomalies agree with features commonly expected in the evolution of instability. In addition to these common-type precursors, one earthquake-specific precursor was found: a decrease in mean earthquake depth, presumably occurring in a smaller GVLE, probably provides evidence of a deep fluid being involved in the process. The typical features in the evolution of shear instability revealed in the GVLE agree with results obtained in laboratory studies of acoustic emission (AE). The majority of the anomalies in earthquake parameters appear to be of secondary character, largely connected with an increase in mean magnitude and a decreasing fraction of moderate-size events (Mw 5.0-6.0) in the immediate GLE vicinity. This deficit of moderate-size events can hardly be caused entirely by incomplete reporting and presumably reflects features in the evolution of seismic instability.
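The GVLE construction is essentially superposed-epoch stacking: catalogue events near each of many large earthquakes are pooled after expressing their times relative to the respective mainshock. A minimal sketch of the time-axis part of that stacking, with an assumed window length (the record does not state one):

```python
import numpy as np

def stack_relative_times(mainshock_times, event_times, window=365.0):
    """Superposed-epoch stacking: pool catalogue event times relative to
    each large (main) event, keeping only events within +/- window."""
    event_times = np.asarray(event_times, dtype=float)
    rel = []
    for t0 in mainshock_times:
        dt = event_times - t0
        rel.extend(dt[np.abs(dt) <= window].tolist())
    return np.array(sorted(rel))

# Toy catalogue: one mainshock at t=100 with a foreshock at t=99, an
# aftershock at t=101, and one unrelated event outside the window.
stacked = stack_relative_times([100.0], [50.0, 99.0, 101.0, 600.0])
```

A histogram of `stacked` around zero would then show the fore- and aftershock cascades that the GVLE approach examines; the real construction also restricts events in space to each mainshock's zone of influence.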

  14. Deformation analysis of Aceh April 11th 2012 earthquake using GPS observation data

    Maulida, Putra, E-mail: putra.maulida@gmail.com [Bandung Institute of Technology (ITB), Jalan Ganesha 10, Bandung 40132 (Indonesia); Meilano, Irwan; Sarsito, Dina A. [Bandung Institute of Technology (ITB), Jalan Ganesha 10, Bandung 40132 (Indonesia); Geodesy Research Group, geodesy and geomatic Engineering, ITB (Indonesia); Susilo [Bandung Institute of Technology (ITB), Jalan Ganesha 10, Bandung 40132 (Indonesia); Geospatial Information Agency (BIG) (Indonesia)

    2015-04-24

    This research estimates the co-seismic deformation of the intraplate earthquake that occurred off the northern Sumatra coast, about 100-200 km southwest of the Sumatra subduction zone. The earthquake mechanism was strike-slip with magnitude 8.6, triggering an aftershock of magnitude 8.2 two hours later. We estimated the co-seismic deformation by using continuous GPS (Global Positioning System) data along the western Sumatra coast, derived from the Sumatran GPS Array (SuGAr) and the Geospatial Information Agency (BIG). For data processing we used the GPS Analysis at Massachusetts Institute of Technology (GAMIT) software and the Global Kalman Filter (GLOBK) to estimate the co-seismic deformation. The GPS daily solutions show that the earthquake caused displacement at the GPS stations in Sumatra. Stations in northern Sumatra were displaced to the northeast, with an average displacement of 15 cm. The largest displacement was found at station BSIM, located on Simeulue Island off the northwest Sumatra coast. For stations in the middle part of Sumatra, the displacement was to the northwest. The earthquake also caused subsidence at stations in northern Sumatra, but the time series showed no sign of subsidence in the middle part of Sumatra. In addition, the effect of the earthquake was worldwide and affected other GPS stations around the Indian Ocean.

  15. Deformation analysis of Aceh April 11th 2012 earthquake using GPS observation data

    Maulida, Putra; Meilano, Irwan; Sarsito, Dina A.; Susilo

    2015-04-01

    This research estimates the co-seismic deformation of the intraplate earthquake that occurred off the northern Sumatra coast, about 100-200 km southwest of the Sumatra subduction zone. The earthquake mechanism was strike-slip with magnitude 8.6, and it triggered an aftershock with magnitude 8.2 two hours later. We estimated the co-seismic deformation by using continuous GPS (Global Positioning System) data along the western Sumatra coast. The GPS observations were derived from the Sumatran GPS Array (SuGAr) and the Geospatial Information Agency (BIG). For data processing we used the GPS Analysis at Massachusetts Institute of Technology (GAMIT) software and the Global Kalman Filter (GLOBK) to estimate the co-seismic deformation. The GPS daily solutions show that the earthquake displaced the GPS stations in Sumatra. Stations in northern Sumatra moved to the northeast with an average displacement of 15 cm. The largest displacement was found at station BSIM, located on Simeulue Island off the northwest Sumatra coast. For GPS stations in the middle part of Sumatra, the displacement was to the northwest. The earthquake also caused subsidence at stations in northern Sumatra, but the time series showed no sign of subsidence in the middle part of Sumatra. In addition, the effect of the earthquake was worldwide and affected other GPS stations around the Indian Ocean.
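The coseismic offset at each station is essentially the jump in the daily position time series across the event epoch. A minimal sketch under that assumption (the function name and the fixed averaging window are illustrative, not the GAMIT/GLOBK processing chain):

```python
def coseismic_offset(times, positions, t_eq, window=10):
    """Estimate a coseismic offset from a daily GPS position series as the
    difference of mean positions over `window` days after vs. before the
    earthquake epoch t_eq (times and t_eq in the same units, e.g. days)."""
    before = [p for t, p in zip(times, positions) if t_eq - window <= t < t_eq]
    after = [p for t, p in zip(times, positions) if t_eq < t <= t_eq + window]
    if not before or not after:
        raise ValueError("not enough daily solutions around the event")
    return sum(after) / len(after) - sum(before) / len(before)
```

Applied per component (east, north, up), this yields a static displacement vector at each station; the actual GAMIT/GLOBK estimate additionally models trends, periodic terms, and reference-frame effects.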

  16. Global Citizenship Education

    Roesgaard, Marie Højlund

    2016-01-01

    Global citizenship as an idea has become an increasingly important issue on the educational agenda since the late 1970s. The importance allotted to this issue is clear in the attention given to it by, for example, UNESCO, where global citizenship education (GCED) is an area of strategic focus. Increasingly, schools all over the world are attempting, or are expected, to educate the global citizen, but how exactly do you educate the global citizen? What does this global citizenship consist of? While surely the type of training and education needed to train a global citizen will vary greatly depending... ...published after 2000 was written by researchers based in the US, and if you add other English-speaking countries such as Canada, England, Australia and New Zealand, the proportion is even higher. English in the field of education research often serves as the international lingua franca. Since there is also...

  17. Which Mixed-Member Proportional Electoral Formula Fits You Best? Assessing the Proportionality Principle of Positive Vote Transfer Systems

    Bochsler, Daniel

    2014-01-01

    Mixed-member proportional systems (MMP) are a family of electoral systems which combine district-based elections with a proportional seat allocation. Positive vote transfer systems belong to this family. This article explains why they might be better than their siblings, and examines under which ...

  18. Neoliberalism and the regulation of global labor mobility

    Overbeek, H.W.

    2002-01-01

    Globalization involves the international expansion of market relations and the global pursuit of economic liberalism. The essential factor in this process is commodification, including the commodification of human labor. Globalization integrates an increasing proportion of the world population

  19. Understanding Great Earthquakes in Japan's Kanto Region

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  20. Earthquake-triggered landslides in southwest China

    X. L. Chen; Q. Zhou; H. Ran; R. Dong

    2012-01-01

    Southwest China is located on the southeastern margin of the Tibetan Plateau and is a region of high seismic activity. Historically, strong earthquakes that occurred here usually triggered numerous landslides and caused destructive damage. This paper introduces several earthquake-triggered landslide events in this region and describes their characteristics. Also, the historical record of earthquakes with a magnitude of 7.0 or greater that have occurred in this region is col...

  1. Crowdsourcing earthquake damage assessment using remote sensing imagery

    Stuart Gill

    2011-06-01

    This paper describes the evolution of recent work on using crowdsourced analysis of remote sensing imagery, particularly high-resolution aerial imagery, to provide rapid, reliable assessments of damage caused by earthquakes and potentially other disasters. The initial effort examined online imagery taken after the 2008 Wenchuan, China, earthquake. A more recent response to the 2010 Haiti earthquake led to the formation of an international consortium: the Global Earth Observation Catastrophe Assessment Network (GEO-CAN). The success of GEO-CAN in contributing to the official damage assessments made by the Government of Haiti, the United Nations, and the World Bank led to further development of a web-based interface. A current initiative in Christchurch, New Zealand, is underway where remote sensing experts are analyzing satellite imagery, geotechnical engineers are marking liquefaction areas, and structural engineers are identifying building damage. The current site includes online training to improve the accuracy of the assessments and make it possible for even novice users to contribute to the crowdsourced solution. The paper discusses lessons learned from these initiatives and presents a way forward for using crowdsourced remote sensing as a tool for rapid assessment of damage caused by natural disasters around the world.

  2. THE MAY 23RD 2007 GULF OF MEXICO EARTHQUAKE

    Yamamoto, J.; Jimenez, Z.

    2009-12-01

    On the 23rd of May 2007 at 14:09 local time (19:09 UT), an isolated earthquake of local magnitude 5.2 occurred offshore of northern Veracruz in the Gulf of Mexico. The seismic focus was located, using local and regional data, at 20.11° N, 97.38° W and 7.8 km depth, 175 km from Tuxpan, a city of 134,394 inhabitants. The earthquake was widely felt along the coastal states of southern Tamaulipas and Veracruz, where several schools and public buildings were evacuated. Neither the Laguna Verde nuclear plant, located approximately 245 km from the epicenter, nor the PEMEX petroleum company reported damage. First-motion data indicate that the rupture occurred as strike-slip faulting along two possible planes, one oriented roughly north-south and the other east-west. In the present paper a global analysis of the earthquake is made to elucidate its origin and possible correlation with known geotectonic features of the region.

  3. The proportionate value of proportionality in palliative sedation.

    Berger, Jeffrey T

    2014-01-01

    Proportionality, as it pertains to palliative sedation, is the notion that sedation should be induced at the lowest degree effective for symptom control, so that the patient's consciousness may be preserved. The pursuit of proportionality in palliative sedation is a widely accepted imperative advocated in position statements and guidelines on this treatment. The priority assigned to the pursuit of proportionality, and the extent to which it is relevant for patients who qualify for palliative sedation, have been overstated. Copyright 2014 The Journal of Clinical Ethics. All rights reserved.

  4. An integrated photosensor readout for gas proportional scintillation counters

    Lopes, J.A.M.; Santos, J.M.F. dos; Conde, C.A.N.

    1996-01-01

    A xenon gas proportional scintillation counter has been instrumented with a novel photosensor that replaces the photomultiplier tube normally used to detect the VUV secondary scintillation light. In this implementation, the collection grid of a planar gas proportional scintillation counter also functions as a multiwire proportional chamber to amplify and detect the photoelectrons emitted by a reflective CsI photocathode in direct contact with the xenon gas. This integrated concept combines greater simplicity, compactness, and ruggedness (no optical window is used) with low power consumption. An energy resolution of 12% was obtained for 59.6 keV x-rays

  5. Retrospective analysis of the Spitak earthquake

    A. K. Tovmassian

    1995-06-01

    Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event, which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in the geosphere, atmosphere, and biosphere during the preparation of the Spitak earthquake; foreshocks; the main shock; aftershocks; focal mechanisms; historical seismicity; the seismotectonic position of the source; strong motion records; site effects; the macroseismic effect; the collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  6. Smoking prevalence increases following Canterbury earthquakes.

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. The aim was to explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for smoking more. Overall, 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence, and tobacco consumption levels increased in around one-third of current smokers.
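The prevalence shift can be reproduced from the reported counts. A small sketch (treating every relapsed ex-smoker as a current smoker afterwards is our simplifying assumption; note the three categories sum to 998 of the 1001 interviewed):

```python
def prevalence(smokers, total):
    """Point prevalence: current smokers as a fraction of those surveyed."""
    return smokers / total

# Counts reported in the abstract
never, current_pre, ex = 409, 273, 316
relapsed = 76                      # ex-smokers who smoked again post-quake
total = never + current_pre + ex   # 998 of the 1001 interviewed

pre_quake = prevalence(current_pre, total)              # ~27%
post_quake = prevalence(current_pre + relapsed, total)  # ~35%
```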

  7. Thermal infrared anomalies of several strong earthquakes.

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies appear before and after strong earthquakes; these have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes of magnitude Ms 7.0 or above by using satellite infrared remote sensing information. We used new types of data and a new method to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behavior of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies with the "time-frequency relative power spectrum" method. (2) Each case exhibits evident and distinct characteristic periods and magnitudes of anomalous thermal radiation. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation anomalies have distinctive characteristics in duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.
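A simplified reading of a windowed, background-normalized spectral measure is sketched below (the naive DFT, the window length, and normalization by a single background window are our assumptions; the abstract does not specify the authors' exact "time-frequency relative power spectrum" algorithm):

```python
import cmath

def power_spectrum(segment):
    """Naive DFT power spectrum of a (short) real-valued window."""
    n = len(segment)
    return [abs(sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, x in enumerate(segment))) ** 2
            for k in range(n // 2 + 1)]

def relative_power_spectrum(series, background, window=32, step=16):
    """Sliding-window power spectra of a radiation time series, each
    normalized by the spectrum of a reference (background) window -- a
    simplified reading of a 'time-frequency relative power spectrum'."""
    ref = [p + 1e-12 for p in power_spectrum(background[:window])]
    rows = []
    for start in range(0, len(series) - window + 1, step):
        seg = series[start:start + window]
        rows.append([p / r for p, r in zip(power_spectrum(seg), ref)])
    return rows  # shape: n_windows x (window // 2 + 1)
```

Bins whose ratio rises well above 1 in successive windows would flag anomalous radiated power at that frequency.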

  8. Real Time Earthquake Information System in Japan

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for the prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that concerned governmental organizations could decide whether they needed to launch an emergency response. At present, JMA issues the following kinds of information in succession when a large earthquake occurs: 1) a prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, about two minutes after the earthquake; 2) a tsunami forecast in around three minutes; 3) information on expected arrival times and maximum heights of tsunami waves in around five minutes; 4) information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  9. Impact- and earthquake- proof roof structure

    Shohara, Ryoichi.

    1990-01-01

    The building roof consists of roof slabs, an earthquake-proof layer on their upper surface, and an impact-proof layer of steel-reinforced concrete placed above that. Because the roof forms an earthquake-proof structure, with building dampers loaded on the upper surface of the slabs by the concrete layer, seismic inputs to the building are moderated, while the impact-proof layer ensures safety against external events, such as earthquakes or aircraft crashes, in important facilities such as reactor buildings. (T.M.)

  10. A minimalist model of characteristic earthquakes

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. The model, which has no parameters, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.
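One possible reading of such a parameter-free asperity automaton (the rupture rule below, in which a hit on the asperity site fails the contiguous occupied run, is our illustrative assumption, not necessarily the authors' exact update):

```python
import random

def minimalist_model(n_sites=10, steps=10000, seed=1):
    """One reading of a parameter-free asperity automaton: particles land
    on random sites of a 1-D fault; when the asperity (site 0) is hit,
    the contiguous run of occupied sites starting there fails as an
    earthquake of that size and is emptied."""
    rng = random.Random(seed)
    occupied = [False] * n_sites
    sizes = []
    for _ in range(steps):
        i = rng.randrange(n_sites)
        if i == 0:
            size = 1                      # the asperity itself always fails
            while size < n_sites and occupied[size]:
                size += 1
            for j in range(size):
                occupied[j] = False
            sizes.append(size)
        else:
            occupied[i] = True
    return sizes
```

Histogramming `sizes` gives the size-frequency relation; runs dominated by system-size events mimic characteristic-earthquake behaviour.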

  11. The 1985 México earthquake

    Moreno Murillo Juan Manuel

    1995-10-01


    This paper includes a bibliographic review describing various aspects of the (Ms = 8.1) Michoacán, Mexico earthquake, which comprised three events. The main shock of the September 19, 1985 earthquake occurred on Thursday at 7h 17m 46.6s local time in Mexico City and had Ms = 8.1. The focus of the event was at a depth of approximately 18 km. A second shock occurred on Friday evening, 21 September, at 7:38 p.m. local time. The last aftershock occurred on 30 April 1986 (Ms = 7.0). A precursory event to the September 1985 earthquake, which occurred on 28 May 1985 (mb = 5.2), is also described. This event was a terrible natural disaster for the country: at least 9,500 people were killed, about 30,000 were injured, more than 100,000 were left homeless, and severe damage occurred in many parts of Mexico City and several states of central Mexico. According to some sources, it is estimated that the earthquake seriously affected an area of approximately 825,000 square kilometers. This paper presents a summary of the global tectonic setting, the genesis and location of the epicenter, an interpretation of the source mechanism, an analysis of records from stations that recorded this earthquake, and a comparison between the two largest earthquakes of 1985. Moreover, the paper describes the principal resulting damage and the effects of the tsunami produced by the earthquake. The 1985 Mexico earthquake occurred as a result of slip in the subduction process between the Cocos and North American plates. This was a shallow interplate thrust-type event which occurred at the intersection of the Orozco Fracture Zone with the Middle America Trench.

  12. An ongoing earthquake sequence near Dhaka, Bangladesh, from regional recordings

    Howe, M.; Mondal, D. R.; Akhter, S. H.; Kim, W.; Seeber, L.; Steckler, M. S.

    2013-12-01

    of this population to earthquakes is amplified by poor infrastructure and building codes. The only event in this sequence included in the global Centroid Moment Tensor (CMT) catalog is a Mw 5.1 strike-slip event 18 km deep. At least 10 events in this sequence have been recorded globally (ISC). Many more events from the sequence have been recorded by a regional array of seismographs we have operated in Bangladesh since 2007. We apply several techniques to these data to explore source parameters and the dimensions of seismogenesis in this sequence. We present both double-difference relocations and waveform modeling, which provide constraints on the source characteristics. Using the Mw 5.1 and other regional events as calibration, we obtain source parameters for several other events in the sequence. This sequence is ideal for double-difference relocation techniques because the source-receiver paths of the events in the sequence, recorded regionally, are very similar. The event relocation enables us to obtain accurate estimates of the fault dimensions of this source. By combining accurate spatial dimensions of the source, the depth range of seismogenesis for the source zone, and well-constrained source parameters of events within the sequence, we assess the maximum size of possible ruptures in this source.

  13. Offshore Earthquakes Do Not Influence Marine Mammal Stranding Risk on the Washington and Oregon Coasts

    Grant, Rachel A.; Savirina, Anna

    2018-01-01

    Simple Summary Marine mammals stranding on coastal beaches is not unusual. However, there appears to be no single cause for this, with several causes being probable, such as starvation, contact with humans (for example boat strike or entanglement with fishing gear), disease, and parasitism. We evaluated marine mammal stranding off the Washington and Oregon coasts and looked at offshore earthquakes as a possible contributing factor. Our analysis showed that offshore earthquakes did not make marine mammals more likely to strand. We also analysed a subset of data from the north of Washington State and found that non-adult animals made up a large proportion of stranded animals, and for dead animals the commonest cause of death was disease, traumatic injury, or starvation. Abstract The causes of marine mammals stranding on coastal beaches are not well understood, but may relate to topography, currents, wind, water temperature, disease, toxic algal blooms, and anthropogenic activity. Offshore earthquakes are a source of intense sound and disturbance and could be a contributing factor to stranding probability. We tested the hypothesis that the probability of marine mammal stranding events on the coasts of Washington and Oregon, USA is increased by the occurrence of offshore earthquakes in the nearby Cascadia subduction zone. The analysis carried out here indicated that earthquakes are at most, a very minor predictor of either single, or large (six or more animals) stranding events, at least for the study period and location. We also tested whether earthquakes inhibit stranding and again, there was no link. Although we did not find a substantial association of earthquakes with strandings in this study, it is likely that there are many factors influencing stranding of marine mammals and a single cause is unlikely to be responsible. Analysis of a subset of data for which detailed descriptions were available showed that most live stranded animals were pups, calves, or

  14. The Loma Prieta, California, Earthquake of October 17, 1989: Societal Response

    Coordinated by Mileti, Dennis S.

    1993-01-01

    Professional Paper 1553 describes how people and organizations responded to the earthquake and how the earthquake impacted people and society. The investigations evaluate the tools available to the research community to measure the nature, extent, and causes of damage and losses. They describe human behavior during and immediately after the earthquake and how citizens participated in emergency response. They review the challenges confronted by police and fire departments and disruptions to transbay transportation systems. And they survey the challenges of post-earthquake recovery. Some significant findings were: * Loma Prieta provided the first test of ATC-20, the red, yellow, and green tagging of buildings. Its successful application has led to widespread use in other disasters, including the September 11, 2001, New York City terrorist incident. * Most people responded calmly and without panic to the earthquake and acted to get themselves to a safe location. * Actions by people to help alleviate emergency conditions were proportional to the level of need at the community level. * Some solutions caused problems of their own. The police perimeter around the Cypress Viaduct isolated businesses from their customers, leading to a loss of business, and the evacuation of employees from those businesses hindered the movement of supplies to the disaster scene. * Emergency transbay ferry service was established 6 days after the earthquake, but required constant revision of service contracts and schedules. * The Loma Prieta earthquake produced minimal disruption to the regional economy. The total economic disruption resulted in maximum losses to the Gross Regional Product of $725 million in 1 month and $2.9 billion in 2 months, but 80% of the loss was recovered during the first 6 months of 1990. Approximately 7,100 workers were laid off.

  15. A reduced feedback proportional fair multiuser scheduling scheme

    Shaqfeh, Mohammad; Alnuweiri, Hussein M.; Alouini, Mohamed-Slim

    2011-01-01

    A slight reduction in the prospective multiuser diversity gains is an acceptable trade-off for great savings in terms of required channel-state-information feedback messages. In this work, we propose a novel proportional fair multiuser switched

  16. DC motor proportional control system for orthotic devices

    Blaise, H. T.; Allen, J. R.

    1972-01-01

    Multi-channel proportional control system for operation of dc motors for use with externally-powered orthotic arm braces is described. Components of circuitry and principles of operation are described. Schematic diagram of control circuit is provided.

  17. Proportional feedback control of laminar flow over a hemisphere

    Lee, Jung Il [Dept. of Mechanical Engineering, Ajou University, Suwon (Korea, Republic of); Son, Dong Gun [Severe Accident and PHWR Safety Research Division, Korea Atomic Energy Research Institute (KAERI), Daejeon (Korea, Republic of)

    2016-08-15

    In the present study, we perform proportional feedback control of laminar flow over a hemisphere at Re = 300 to reduce its lift fluctuations by attenuating the strength of the vortex shedding. As the control input, blowing/suction is distributed on the surface of the hemisphere upstream of the separation, and its strength is linearly proportional to the transverse velocity at a sensing location on the centerline of the wake. The sensing location is determined from a correlation function between the lift force and the time derivative of the sensing velocity. The optimal proportional gains for the control are obtained for the sensing locations considered. The present control successfully attenuates the velocity fluctuations at the sensing location and the three-dimensional vortical structures in the wake, resulting in a reduction of the lift fluctuations of the hemisphere.
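The control law itself is simple to state: the wall blowing/suction is a gain times the sensed transverse velocity. A toy sketch (the sign convention, the gain value, and the damped-oscillator surrogate for the shedding wake are our assumptions, not the paper's Navier-Stokes setup):

```python
def control_input(v_sense, gain):
    """Blowing/suction strength linearly proportional to the sensed
    transverse velocity (sign convention assumed: opposing the motion)."""
    return -gain * v_sense

def simulate(gain, steps=2000, dt=0.01):
    """Toy surrogate for the shedding wake: a lightly damped oscillator
    whose velocity is fed back through the proportional law above."""
    x, v = 1.0, 0.0
    for _ in range(steps):
        a = -x - 0.01 * v + control_input(v, gain)
        v += a * dt          # semi-implicit Euler step
        x += v * dt
    return abs(x) + abs(v)   # crude residual-fluctuation measure
```

With any positive gain the feedback adds damping, so the controlled run ends with a smaller residual than the uncontrolled one, mirroring the reported attenuation of fluctuations at the sensing location.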

  18. Operability test report for 211BA flow proportional sampler

    Weissenfels, R.D.

    1995-01-01

    This operability report will verify that the 211-BA flow proportional sampler functions as intended by design. The sampler was installed by Project W-007H and is part of BAT/AKART for the BCE liquid effluent stream

  19. A Bayesian Approach to Real-Time Earthquake Phase Association

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

    Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach, based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity, has been in use for many years. One of the most significant problems that has emerged with this approach is the extreme variation in seismic station density throughout the global seismic network. To address this problem we have developed a novel Bayesian association algorithm, which treats the association problem as a dynamically evolving complex system of "many-to-many relationships". While the end result must be an array of one-to-many relations (one earthquake, many phases), during the association process the situation is quite different: both the evolving candidate hypocenters and the relationships between phases and all nascent hypocenters are many-to-many (many earthquakes, many phases). The computational framework we use to address this is a responsive NoSQL graph database where the earthquake-phase associations are represented as intersecting Bayesian learning networks. The approach directly addresses the network inhomogeneity issue while at the same time allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on the estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.
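The classical time-space stacking that the Bayesian scheme generalizes can be illustrated as a 1-D grid search over candidate origins, scoring each by the number of arrivals it explains (an illustrative toy with an assumed uniform wave speed and straight-ray travel times, not the catalog algorithm):

```python
def associate(arrival_times, station_xs, speed=6.0, tol=0.5,
              grid_x=range(0, 101, 10), grid_t=range(0, 61)):
    """Grid-search flavour of phase association: score each candidate
    (origin position, origin time) by how many arrivals it explains
    within `tol` seconds, and return the best-scoring candidate."""
    best_hypo, best_score = None, -1
    for x0 in grid_x:
        for t0 in grid_t:
            score = sum(
                1 for t, xs in zip(arrival_times, station_xs)
                if abs(t - (t0 + abs(xs - x0) / speed)) <= tol
            )
            if score > best_score:
                best_hypo, best_score = (x0, t0), score
    return best_hypo, best_score
```

Sparse networks make such hard counting fragile, which is one motivation for replacing the count with probabilistic (Bayesian) weights.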

  20. Effects in Morocco of the Lisboa earthquake 1 November 1755

    Levret, A.

    1988-05-01

    Within the framework of cooperative agreements between Sofratome and the Office National d'Electricite of Morocco and between Sofratome and Electricidade de Portugal, a study has been conducted of the effects in Morocco of the November 1, 1755 Lisbon earthquake. This event, whose effects have been described at length in Portugal, was likewise strongly felt in Morocco, especially on the Atlantic coast, which was laid waste not only through the direct agency of seismic waves, but also through that of a formidable tsunami. In old texts, the descriptions of these conjugate effects have been rendered with varying degrees of overstatement. The procedure adopted in order to arrive at a precise identification of the effects and their origin and an evaluation of intensity involves three stages: a) an assessment of the reliability of the documents used; b) a thoroughgoing analysis of the descriptions with the object of discriminating between the direct effects of the earthquake and those ascribable to the action of the tidal wave; c) a readjustment of the intensities by analysis of the global effects of the earthquake not only in Morocco but also in Portugal and Spain, followed by a comparison of these with the well-documented effects of the recent February 28, 1969 earthquake, originating at the same source. Extrapolated isoseismals for the effects in Morocco of the 1755 event derived from this study are then assigned.

  1. A parsimonious model for the proportional control valve

    Elmer, KF; Gentle, CR

    2001-01-01

    A generic non-linear dynamic model of a direct-acting electrohydraulic proportional solenoid valve is presented. The valve consists of two subsystems: a spool assembly and one or two unidirectional proportional solenoids. These two subsystems are modelled separately. The solenoid is modelled as a non-linear resistor-inductor combination, with inductance parameters that change with current. An innovative modelling method has been used to represent these components. The spool assembly is model...

  2. Reduction of degraded events in miniaturized proportional counters

    Plaga, R.; Kirsten, T. (Max Planck Inst. fuer Kernphysik, Heidelberg (Germany))

    1991-11-15

    A method to reduce the number of degraded events in miniaturized proportional counters is described. A shaping of the outer cathode leads to a more uniform gas gain along the counter axis. The method is useful in situations in which the total number of decay events is very low. The effects leading to degraded events are studied theoretically and experimentally. The usefulness of the method is demonstrated by using it for the proportional counter of the GALLEX solar neutrino experiment. (orig.).

  3. Spatial Evaluation and Verification of Earthquake Simulators

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of the fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of spatial forecast verification for simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed M > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
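The ETAS-style smoothing can be sketched as a power-law kernel that spreads each simulated event's rate over the whole test region (the kernel form `(r + d)^(-q)` and the parameter values are illustrative assumptions, not the paper's fitted values):

```python
import math

def smoothed_rate(grid_points, quake_points, d=5.0, q=1.5):
    """Spread each simulated earthquake's rate over the whole test region
    with a power-law decay in epicentral distance, instead of pinning it
    to the nearest fault element. Distances in km; rate is unnormalized."""
    rates = []
    for gx, gy in grid_points:
        r = 0.0
        for qx, qy in quake_points:
            dist = math.hypot(gx - qx, gy - qy)
            r += (dist + d) ** (-q)
        rates.append(r)
    return rates
```

Because every cell receives a nonzero rate, off-fault observed epicenters no longer score as total misses in the ROC comparison.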

  4. Workshop on New Madrid geodesy and the challenges of understanding intraplate earthquakes

    Boyd, Oliver; Calais, Eric; Langbein, John; Magistrale, Harold; Stein, Seth; Zoback, Mark

    2013-01-01

    On March 4, 2011, 26 researchers gathered in Norwood, Massachusetts, for a workshop sponsored by the U.S. Geological Survey and FM Global to discuss geodesy in and around the New Madrid seismic zone (NMSZ) and its relation to earthquake hazard. The group addressed the challenge of reconciling current geodetic measurements, which show low present-day surface strain rates, with paleoseismic evidence of recent, relatively frequent, major earthquakes in the region. Several researchers were invited by the organizing committee to give overview presentations while all participants were encouraged to present their most recent ideas. The overview presentations appear in this report along with a set of recommendations.

  5. Coseismic and postseismic deformation associated with the 2016 Mw 7.8 Kaikoura earthquake, New Zealand: fault movement investigation and seismic hazard analysis

    Jiang, Zhongshan; Huang, Dingfa; Yuan, Linguo; Hassan, Abubakr; Zhang, Lupeng; Yang, Zhongrong

    2018-04-01

    The 2016 moment magnitude (Mw) 7.8 Kaikoura earthquake demonstrated that multiple fault segments can undergo rupture during a single seismic event. Here, we employ Global Positioning System (GPS) observations and geodetic modeling methods to create detailed images of coseismic slip and postseismic afterslip associated with the Kaikoura earthquake. Our optimal geodetic coseismic model suggests that rupture not only occurred on shallow crustal faults but also, to some extent, at the Hikurangi subduction interface. The GPS-inverted moment release during the earthquake is equivalent to a Mw 7.9 event. The near-field postseismic deformation is mainly derived from right-lateral strike-slip motions on shallow crustal faults. The afterslip not only extended significantly northeastward on the Needles fault but also appeared at the plate interface, slowly releasing energy over the past 6 months, equivalent to a Mw 7.3 earthquake. Coulomb stress changes induced by coseismic deformation exhibit complex patterns and diversity at different depths, undoubtedly reflecting the multi-fault rupture complexity of the earthquake. The Coulomb stress change can reach several MPa during coseismic deformation, which can explain the trigger mechanisms of afterslip in two high-slip regions and the majority of aftershocks. Based on the deformation characteristics of the Kaikoura earthquake, interseismic plate coupling, and historical earthquakes, we conclude that Wellington is under a higher seismic threat after the earthquake, and great attention should be paid to potential large earthquake disasters in the near future.

  6. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
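A gridded forecast of expected counts per cell can be scored against observed epicenters with a joint Poisson log-likelihood, which is the spirit of the RELM likelihood tests. This is a minimal sketch with toy numbers, not the exact RELM test statistics:

```python
import numpy as np
from math import lgamma

def poisson_log_likelihood(forecast, observed):
    """Joint Poisson log-likelihood of observed counts given forecast rates.

    forecast: expected number of target earthquakes per cell (must be > 0;
              real tests floor zero-rate cells at a small "water level")
    observed: observed earthquake counts per cell
    Higher (less negative) scores indicate a better forecast.
    """
    forecast = np.asarray(forecast, dtype=float)
    observed = np.asarray(observed, dtype=float)
    # log P(n | lam) = -lam + n*log(lam) - log(n!)
    return float(np.sum(-forecast + observed * np.log(forecast)
                        - np.array([lgamma(n + 1) for n in observed])))

# toy comparison: forecast A concentrates rate where events occurred,
# forecast B spreads rate onto empty cells
obs = [0, 1, 0, 2]
score_a = poisson_log_likelihood([0.1, 0.8, 0.1, 1.5], obs)
score_b = poisson_log_likelihood([0.6, 0.2, 0.6, 0.2], obs)
```

Here forecast A scores higher than forecast B because it places rate in the cells where the events actually occurred, illustrating how the likelihood rewards spatial skill.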

  7. Stress triggering of the Lushan M7.0 earthquake by the Wenchuan Ms8.0 earthquake

    Wu Jianchao

    2013-08-01

    The Wenchuan Ms8.0 earthquake and the Lushan M7.0 earthquake occurred in the north and south segments of the Longmenshan nappe tectonic belt, respectively. Based on the focal mechanism and finite fault model of the Wenchuan Ms8.0 earthquake, we calculated the Coulomb failure stress change. The inverted Coulomb stress changes based on the Nishimura and Chenji models both show that the Lushan M7.0 earthquake occurred in an area of increased Coulomb failure stress induced by the Wenchuan Ms8.0 earthquake. The Coulomb failure stress increased by approximately 0.135–0.152 bar at the source of the Lushan M7.0 earthquake, which is far more than the stress triggering threshold. Therefore, the Lushan M7.0 earthquake was most likely triggered by the Coulomb failure stress change.
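The triggering comparison above reduces to a simple resolved-stress formula. A minimal sketch with illustrative numbers (not values from the study), using the common convention ΔCFS = Δτ + μ′Δσn with unclamping positive and an assumed effective friction coefficient of 0.4:

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Coulomb failure stress change on a receiver fault (units of input).

    d_shear: shear stress change resolved in the slip direction
    d_normal: normal stress change (positive = unclamping)
    mu_eff: effective friction coefficient (assumed 0.4; values of
            0.2-0.8 appear in the literature)
    """
    return d_shear + mu_eff * d_normal

# illustrative numbers, not from the study: a receiver fault with
# +0.10 bar shear loading and +0.10 bar unclamping
dcfs = coulomb_stress_change(0.10, 0.10)   # 0.14 bar
trigger_threshold = 0.1                    # bar, a commonly cited value
triggered = dcfs >= trigger_threshold
```

With these assumed inputs, ΔCFS = 0.14 bar exceeds the 0.1 bar threshold, mirroring the 0.135–0.152 bar increase the authors report at the Lushan source.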

  8. Foreshock occurrence before large earthquakes

    Reasenberg, P.A.

    1999-01-01

    Rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured in two worldwide catalogs over ~20-year intervals. The overall rates observed are similar to ones measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering based on patterns of small and moderate aftershocks in California. The aftershock model was extended to the case of moderate foreshocks preceding large mainshocks. Overall, the observed worldwide foreshock rates exceed the extended California generic model by a factor of ~2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, had a high foreshock rate, while a minority, located in continental thrust belts, had a low rate. These differences may explain why previous surveys have found low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical for continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich. If this is so, then the California generic model may significantly underestimate the conditional probability of a very large (M ≥ 8) earthquake following a potential (M ≥ 7) foreshock in Cascadia.
The magnitude differences among the identified foreshock-mainshock pairs in the Harvard catalog are consistent with a uniform

  9. Earthquakes, detecting and understanding them

    2008-05-01

    The surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements manifest as earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of ocean waves on the coasts, the wind in the trees, and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events, which are then discriminated, analyzed, and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates.' The study of the propagation of seismic waves has made it possible to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates which are colliding, diverging, or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960s. It explains that convection inside the Earth is the source of the forces required for these movements. This science, as well as explaining these great movements, has provided a coherent, unifying, and quantitative framework which unites the explanations for all geophysical phenomena under one mechanism. (authors)

  10. Against proportional shortfall as a priority-setting principle.

    Altmann, Samuel

    2018-05-01

    As the demand for healthcare rises, so does the need for priority setting in healthcare. In this paper, I consider a prominent priority-setting principle: proportional shortfall. My purpose is to argue that proportional shortfall, as a principle, should not be adopted. My key criticism is that proportional shortfall fails to consider past health. Proportional shortfall is justified as it supposedly balances concern for prospective health while still accounting for lifetime health, even though past health is deemed irrelevant. Accounting for this lifetime perspective means that the principle may indirectly consider past health by accounting for how far an individual is from achieving a complete, healthy life. I argue that proportional shortfall does not account for this lifetime perspective, as it fails to incorporate the fair innings argument as originally claimed, undermining its purported justification. I go on to demonstrate that the case for ignoring past health is weak, and argue that past health is at least sometimes relevant for priority-setting decisions: specifically, when an individual's past health has a direct impact on current or future health, and when one individual has enjoyed significantly more healthy life years than another. Finally, I demonstrate that by ignoring past illnesses, even those entirely unrelated to the current illness, proportional shortfall can lead to instances of double jeopardy, a highly problematic implication. These arguments give us reason to reject proportional shortfall.
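The principle's insensitivity to past health can be made concrete with a toy calculation. This sketch is illustrative, not from the paper: proportional shortfall is computed here as the fraction of remaining quality-adjusted life expectancy (QALE) lost to illness, and the numbers are hypothetical.

```python
def proportional_shortfall(remaining_qale_healthy, remaining_qale_ill):
    """Fraction of remaining healthy life expectancy lost to illness.

    remaining_qale_healthy: QALE from now, had the illness not occurred
    remaining_qale_ill: QALE from now, given the illness
    Note that past health is not a parameter at all -- the property
    the paper criticizes.
    """
    lost = remaining_qale_healthy - remaining_qale_ill
    return lost / remaining_qale_healthy

# two patients with identical prospects: one has already enjoyed 70
# healthy years, the other only 20 -- the principle cannot tell them apart
ps_patient_a = proportional_shortfall(50.0, 20.0)   # 0.6
ps_patient_b = proportional_shortfall(50.0, 20.0)   # 0.6
```

Because history never enters the function, a patient who has already suffered decades of unrelated illness receives no extra priority, which is how the double-jeopardy concern arises.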

  11. Rapid Source Characterization of the 2011 Mw 9.0 off the Pacific coast of Tohoku Earthquake

    Hayes, Gavin P.

    2011-01-01

    On March 11th, 2011, a moment magnitude 9.0 earthquake struck off the coast of northeast Honshu, Japan, generating what may well turn out to be the most costly natural disaster ever. In the hours following the event, the U.S. Geological Survey National Earthquake Information Center led a rapid response to characterize the earthquake in terms of its location, size, faulting source, shaking and slip distributions, and population exposure, in order to place the disaster in a framework necessary for timely humanitarian response. As part of this effort, fast finite-fault inversions using globally distributed body- and surface-wave data were used to estimate the slip distribution of the earthquake rupture. Models generated within 7 hours of the earthquake origin time indicated that the event ruptured a fault up to 300 km long, roughly centered on the earthquake hypocenter, and involved peak slips of 20 m or more. Updates since this preliminary solution improve the details of this inversion solution and thus our understanding of the rupture process. However, significant observations such as the up-dip nature of rupture propagation and the along-strike length of faulting did not significantly change, demonstrating the usefulness of rapid source characterization for understanding the first order characteristics of major earthquakes.

  12. Statistical properties of earthquakes clustering

    A. Vecchio

    2008-04-01

    Often in nature the temporal distribution of inhomogeneous stochastic point processes can be modeled as a realization of renewal Poisson processes with a variable rate. Here we investigate one of the classical examples, namely, the temporal distribution of earthquakes. We show that this process strongly departs from Poisson statistics for both catalogue and sequence data sets. This indicates the presence of correlations in the system, probably related to the stressing perturbation characterizing the seismicity in the area under analysis. As shown by this analysis, the catalogues, at variance with the sequences, show common statistical properties.
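A quick way to see a departure from Poisson statistics is the coefficient of variation (CV) of inter-event times: a stationary Poisson process has exponential waiting times with CV ≈ 1, while clustering pushes the CV above 1. A minimal synthetic sketch (not the authors' method, which fits renewal-process models to real catalogues):

```python
import numpy as np

def interevent_cv(event_times):
    """Coefficient of variation (std/mean) of inter-event times.

    CV ~ 1 for a stationary Poisson process; CV > 1 indicates
    temporal clustering.
    """
    dt = np.diff(np.sort(np.asarray(event_times, dtype=float)))
    return dt.std() / dt.mean()

rng = np.random.default_rng(0)

# stationary Poisson catalogue: exponential waiting times
poisson_times = np.cumsum(rng.exponential(1.0, size=5000))

# crude "clustered" catalogue: the same background plus short
# aftershock-like bursts attached to every 50th event
bursts = np.concatenate([t + rng.exponential(0.01, size=20)
                         for t in poisson_times[::50]])
clustered_times = np.sort(np.concatenate([poisson_times, bursts]))

cv_poisson = interevent_cv(poisson_times)       # close to 1
cv_clustered = interevent_cv(clustered_times)   # noticeably above 1
```

The elevated CV of the clustered catalogue reflects the same kind of correlation the paper detects when real seismicity departs from a variable-rate Poisson model.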

  13. Refresher Course on Physics of Earthquakes -98 ...

    The objective of this course is to help teachers gain an understanding of the earthquake phenomenon and the physical processes involved in its genesis, as well as of the earthquake waves which propagate the energy released by the earthquake rupture outward from the source. The Course will begin with mathematical ...

  14. Tutorial on earthquake rotational effects: historical examples

    Kozák, Jan

    2009-01-01

    Roč. 99, 2B (2009), s. 998-1010 ISSN 0037-1106 Institutional research plan: CEZ:AV0Z30120515 Keywords: rotational seismic models * earthquake rotational effects * historical earthquakes Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.860, year: 2009

  15. Wood-framed houses for earthquake zones

    Hansen, Klavs Feilberg

    Wood-framed houses with sheathing are suitable for use in earthquake zones. The Direction describes a method of determining the earthquake forces on a house and shows how these forces can be resisted by diaphragm action in the walls, floors, and roof of the house. An appendix explains how...

  16. Earthquake effect on the geological environment

    Kawamura, Makoto

    1999-01-01

    Acceleration caused by earthquakes, changes in water pressure, and rock-mass strain were monitored for a series of 344 earthquakes from 1990 to 1998 at the Kamaishi In Situ Test Site. The largest acceleration registered was 57.14 gal, from the 'North coast of Iwate Earthquake' (M4.4) of June 1996. Changes in water pressure were recorded for 27 earthquakes; the largest change was −0.35 kgf/cm². The water-pressure change caused by an earthquake was, however, usually smaller than that caused by rainfall in this area. No change in the electric conductivity or pH of ground water was detected before or after earthquakes throughout the entire monitoring period. The rock-mass strain was measured with an extensometer whose detection limit was of the order of 10⁻⁸ to 10⁻⁹, and a remaining strain of about 2.5×10⁻⁹ was detected following the 'Offshore Miyagi Earthquake' (M5.1) of October 1997. (H. Baba)

  17. Designing an Earthquake-Resistant Building

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth-grade…

  18. Passive containment system in high earthquake motion

    Kleimola, F.W.; Falls, O.B. Jr.

    1977-01-01

    High earthquake motion necessitates major design modifications to the complex of structures, systems, and components in a nuclear power plant. Distinctive features imposed by seismic category, safety class, and quality classification requirements for high seismic ground acceleration loadings are reflected significantly in plant costs. The design features of the Passive Containment System (PCS) that respond to high earthquake ground motion are described.

  19. Napa Earthquake impact on water systems

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Winemakers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface-water pollution from earthquakes is valuable. This research gives a clear view of the drinking water system in California and pollution of river systems, as well as an estimate of earthquake impacts on the water supply. The Sacramento-San Joaquin River delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and salt-water intrusion has reduced fresh water outflows. Strong shaking from a nearby earthquake can liquefy saturated, loose, sandy soils and could potentially damage major delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  20. Instruction system upon occurrence of earthquakes

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid restarting of a nuclear reactor after an earthquake by informing operators of the properties of the earthquake encountered and displaying the state of any damage in comparison with the designed standard values of the facilities. Constitution: Even when the maximum accelerations of an encountered earthquake exceed designed standard values, equipment may remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this, the instruction system comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system that compares the floor response spectrum of the encountered earthquake's waveforms with the design floor response spectrum used in the design of the equipment, and a system that indicates the equipment requiring inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant that undergoes an earthquake and to improve power saving and safety by clearly defining the portions to inspect after the earthquake. (Kawakami, Y.)
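The floor-response-spectrum comparison described above can be sketched for a single record: compute the peak response of damped single-degree-of-freedom oscillators across a range of natural periods, then compare each value to the design spectrum. A simplified illustration using the Newmark constant-average-acceleration method; the ground motion, damping, and design values are hypothetical, not from the system described.

```python
import numpy as np

def response_spectrum(accel, dt, periods, damping=0.05):
    """Peak absolute acceleration of damped SDOF oscillators driven by a
    ground-acceleration record (Newmark average-acceleration integration).

    accel: ground acceleration samples (m/s^2)
    dt: sample interval (s)
    periods: oscillator natural periods (s)
    """
    accel = np.asarray(accel, dtype=float)
    spectrum = []
    for T in periods:
        wn = 2.0 * np.pi / T
        c = 2.0 * damping * wn          # damping term (unit mass)
        k = wn * wn                     # stiffness term (unit mass)
        u, v = 0.0, 0.0                 # relative displacement, velocity
        a = -accel[0] - c * v - k * u   # relative acceleration at t=0
        peak = 0.0
        for ag in accel[1:]:
            # Newmark step with gamma=1/2, beta=1/4 (average acceleration)
            u_pred = u + dt * v + dt * dt / 4.0 * a
            v_pred = v + dt / 2.0 * a
            a = (-ag - c * v_pred - k * u_pred) / \
                (1.0 + c * dt / 2.0 + k * dt * dt / 4.0)
            u = u_pred + dt * dt / 4.0 * a
            v = v_pred + dt / 2.0 * a
            peak = max(peak, abs(a + ag))   # absolute acceleration
        spectrum.append(peak)
    return np.array(spectrum)

# hypothetical record: 10 s of 2 Hz shaking at 0.5 m/s^2
dt = 0.01
t = np.arange(0, 10, dt)
ground = 0.5 * np.sin(2 * np.pi * 2.0 * t)
periods = np.array([0.1, 0.5, 1.0])
sa = response_spectrum(ground, dt, periods)
design = np.array([3.0, 3.0, 3.0])   # hypothetical design spectrum (m/s^2)
exceeds = sa > design                # flags equipment needing inspection
```

The oscillator whose period matches the shaking (0.5 s here) responds far more strongly than the others, which is exactly why comparing the computed spectrum against the design spectrum, rather than peak ground acceleration alone, identifies which equipment actually needs post-earthquake inspection.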