WorldWideScience

Sample records for earthquake accelerograms digitized

  1. Strong Motion Earthquake Data Values of Digitized Strong-Motion Accelerograms, 1933-1994

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Strong Motion Earthquake Data Values of Digitized Strong-Motion Accelerograms is a database of over 15,000 digitized and processed accelerograph records from...

  2. BASLIKO. A program for baseline-correction of earthquake-accelerograms

    International Nuclear Information System (INIS)

    Koschmieder, D.; Altes, J.

    1978-12-01

    This report presents a program for the baseline correction of earthquake accelerograms. The program eliminates errors introduced into the curves by the chronographs and digitizers. (orig.)
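
    The operation BASLIKO automates — removing a spurious baseline introduced by the recording and digitizing chain — can be sketched in a few lines. The Python fragment below illustrates the general technique, not the BASLIKO program itself; the record, sampling interval, and polynomial order are hypothetical.

        import numpy as np

        def baseline_correct(acc, dt, order=2):
            """Subtract a least-squares polynomial baseline from an
            acceleration trace (illustrative; not the BASLIKO code)."""
            t = np.arange(len(acc)) * dt
            coeffs = np.polyfit(t, acc, order)   # fit the slowly varying baseline
            return acc - np.polyval(coeffs, t)   # remove it from the record

        # Demo on a synthetic record contaminated with an offset plus drift,
        # the kind of error a chronograph/digitizer chain can introduce.
        dt = 0.01
        t = np.arange(0, 20, dt)
        clean = np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.2 * t)
        contaminated = clean + 0.05 + 0.01 * t
        corrected = baseline_correct(contaminated, dt, order=1)
        print(f"mean after correction: {corrected.mean():.2e}")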

  3. Phase characteristics of earthquake accelerogram and its application

    International Nuclear Information System (INIS)

    Ohsaki, Y.; Iwasaki, R.; Ohkawa, I.; Masao, T.

    1979-01-01

    As the input earthquake motion for the seismic design of nuclear power plant structures and equipment, an artificial time history compatible with a smoothed design response spectrum is frequently used. This paper deals with a wave-generation technique based on the phase characteristics of earthquake accelerograms, as an alternative to an envelope time function. The concept of a 'phase-difference distribution' is defined to represent the phase characteristics of earthquake motion. The procedure proposed in this paper consists of the following steps: (1) Specify a design response spectrum and derive a corresponding initial modal amplitude. (2) Determine a phase-difference distribution corresponding to an envelope function, the shape of which depends on the magnitude and epicentral distance of an earthquake. (3) Derive the phase angles at all modal frequencies from the phase-difference distribution. (4) Generate a time history by inverse Fourier transform on the basis of the amplitudes and the phase angles thus determined. (5) Calculate the response spectrum. (6) Compare the specified and calculated response spectra, and correct the amplitude at each frequency so that the response spectrum becomes consistent with the specified one. (7) Repeat steps 4 through 6 until the specified and calculated response spectra agree with sufficient accuracy. (orig.)
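
    The generation steps above translate directly into code. The sketch below (an illustration, not the authors' implementation) covers steps 2-4: phase differences drawn from an assumed distribution are accumulated into phase angles and combined with modal amplitudes in an inverse FFT. All inputs are hypothetical, and the iterative amplitude correction of steps 5-7 is only noted in comments.

        import numpy as np

        def synth_from_phase_differences(amps, phase_diffs):
            """Steps (3)-(4): accumulate phase differences into phase angles
            and inverse-Fourier-transform amplitudes/phases into a time
            history."""
            phases = np.cumsum(phase_diffs)           # step (3)
            spectrum = amps * np.exp(1j * phases)     # one-sided spectrum
            return np.fft.irfft(spectrum, n=2 * (len(amps) - 1))  # step (4)

        # Hypothetical inputs: flat initial amplitudes (step 1) and phase
        # differences drawn from a normal distribution whose mean and spread
        # stand in for an envelope-dependent distribution (step 2).
        rng = np.random.default_rng(1)
        n_freq = 1025
        amps = np.ones(n_freq)
        phase_diffs = rng.normal(loc=-0.05, scale=0.02, size=n_freq)
        acc = synth_from_phase_differences(amps, phase_diffs)
        # Steps (5)-(7), iteratively correcting `amps` until the response
        # spectrum matches the design spectrum, are omitted here.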

  4. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...

  5. The near-source strong-motion accelerograms recorded by an experimental array in Tangshan, China

    Science.gov (United States)

    Peng, K.; Xie, Lingtian; Li, S.; Boore, D.M.; Iwan, W.D.; Teng, T.L.

    1985-01-01

    A joint research project on strong-motion earthquake studies between the People's Republic of China and the United States is in progress. As a part of this project, an experimental strong-motion array, consisting of twelve Kinemetrics PDR-1 Digital Event Recorders, was deployed in the meizoseismal area of the Ms = 7.8 Tangshan earthquake of July 28, 1976. These instruments have automatic gain ranging, a specified dynamic range of 102 dB, a 2.5 s pre-event memory, and programmable triggering, and are equipped with TCG-1B Time Code Generators with a stability of 3 parts in 10⁷ over a range of 0-50 °C. In two years of operation beginning in July 1982, a total of 603 near-source three-component accelerograms was gathered from 243 earthquakes of magnitude ML = 1.2-5.3. Most of these accelerograms recorded the initial P wave. The configuration of the experimental array and a representative set of near-source strong-motion accelerograms are presented in this paper. The set of accelerograms exhibited was obtained during the ML = 5.3 Lulong earthquake of October 19, 1982, when the digital event recorders were triggered. The epicentral distances ranged from 4 to 41 km, and the corresponding range of peak horizontal accelerations was 0.232 g to 0.009 g. A preliminary analysis of the data indicates that, compared to motions in the western United States, the peak acceleration attenuates much more rapidly in the Tangshan area. The scaling of peak acceleration with magnitude, however, is similar in the two regions. Data at more distant sites are needed to confirm the more rapid attenuation.

  6. Spectral Shapes for accelerograms recorded at rock sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Muralidharan, N.; Sharma, R.D.

    1986-01-01

    Earthquake accelerograms recorded on rock sites have been analysed to develop site-specific response spectra for use in aseismic design. Normalized pseudo-absolute acceleration spectra for various values of damping, pertinent in particular to nuclear power plant design, are presented. Various ground motion parameters, viz. peak displacement, velocity and acceleration (including v/a, ad/v² and the ratios of the three orthogonal components), for fifty-four accelerograms are examined, together with the motion time histories to be used in structural response analysis. The analysis presented in this paper aims at specifying site-specific response spectra for the earthquake-resistant design of structures and the generation of spectrum-compatible accelerograms. The salient features of the data set are discussed. (author)
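
    The ratio parameters mentioned above are simple combinations of the peak ground-motion values. A minimal illustration follows, with hypothetical peak values rather than data from the fifty-four-record set.

        # Ground-motion ratios for one illustrative record; the numbers are
        # hypothetical, not taken from the fifty-four-accelerogram set.
        a = 0.25 * 9.81            # peak ground acceleration (m/s^2)
        v = 0.30                   # peak ground velocity (m/s)
        d = 0.15                   # peak ground displacement (m)

        v_over_a = v / a           # seconds; an indicator of predominant period
        ad_over_v2 = a * d / v**2  # dimensionless shape parameter of the motion
        print(f"v/a = {v_over_a:.3f} s, ad/v^2 = {ad_over_v2:.2f}")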

  7. Synthesis of artificial spectrum-compatible seismic accelerograms

    International Nuclear Information System (INIS)

    Vrochidou, E; Alvanitopoulos, P F; Andreadis, I; Mallousi, K; Elenas, A

    2014-01-01

    The Hilbert–Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis; moreover, it requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, not many actual records of different seismic intensities are available for many regions. In addition, a large number of seismic accelerograms are required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of the structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms, which could be useful in earthquake engineering for the dynamic analysis and design of buildings. According to the proposed method, a single natural earthquake record is deconstructed into amplitude and frequency components using the Hilbert–Huang transform. The proposed method is illustrated by studying 20 natural seismic records with different characteristics, such as different frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established and industrial methods in the literature. (paper)
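
    The amplitude/frequency deconstruction at the heart of this method can be illustrated with the Hilbert spectral analysis step alone. In the sketch below, a synthetic chirp stands in for one intrinsic mode function; the empirical mode decomposition stage of the full Hilbert–Huang transform is omitted.

        import numpy as np
        from scipy.signal import hilbert

        # A synthetic decaying chirp stands in for one intrinsic mode
        # function of a real accelerogram.
        dt = 0.01
        t = np.arange(0, 10, dt)
        component = np.exp(-0.2 * t) * np.sin(2 * np.pi * (1.0 + 0.3 * t) * t)

        analytic = hilbert(component)                  # analytic signal
        amplitude = np.abs(analytic)                   # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) / (2 * np.pi * dt)  # instantaneous frequency (Hz)
        print(f"mean instantaneous frequency: {inst_freq.mean():.2f} Hz")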

  8. On the Relationships Between the Fundamental Parameters of Calculation Accelerograms

    Energy Technology Data Exchange (ETDEWEB)

    Savich, A. I., E-mail: office@geodyn.ru; Burdina, N. A., E-mail: nina-burdina@mail.ru [Center of the Office of Geodynamic Observations in the Power Sector, an affiliate of JSC “Institut Gidroproekt,” (Russian Federation)

    2016-05-15

    Analysis of published data on the fundamental parameters of actual accelerograms of strong earthquakes, namely peak ground acceleration A_max, predominant period T_pr, and duration τ_0.5 at the 0.5·A_max level, determined that, for earthquakes of intensity greater than 6.5-7.0, the relationship between these quantities is sufficiently well described by the parameters B = A·T·τ and C = A·τ·T^(−1.338), the former of which depends little on earthquake intensity I and is almost completely determined by the earthquake magnitude, while the latter, on the contrary, depends weakly on magnitude and is determined principally by the quantity I. Methods are proposed for using the parameters B and C to improve the reliability of determining parameters of accelerograms used to calculate the seismic resistance of hydraulic engineering facilities.
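
    The combined parameters B and C are direct products of the three accelerogram fundamentals. A small sketch with hypothetical values (not data from the paper):

        # Combined parameters for hypothetical accelerogram fundamentals.
        A_max = 3.2      # peak ground acceleration (m/s^2)
        T_pr = 0.4       # predominant period (s)
        tau_05 = 8.0     # duration at the 0.5*A_max level (s)

        B = A_max * T_pr * tau_05           # magnitude-controlled parameter
        C = A_max * tau_05 * T_pr**-1.338   # intensity-controlled parameter
        print(f"B = {B:.1f}, C = {C:.1f}")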

  9. Spectral shapes for accelerograms recorded at soil sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Sharma, R.D.

    1987-01-01

    Earthquake accelerograms recorded on soil sites have been analysed to develop site-specific response spectra. This report presents the normalised pseudo-absolute acceleration spectra for various values of damping and ground motion parameters, viz. v/a, ad/v² and the ratios of peak accelerations in the three orthogonal directions. These results will be useful in the earthquake-resistant design of structures. 4 tables, 14 figures. (author)

  10. Compilation, assessment and expansion of the strong earthquake ground motion data base. Seismic Safety Margins Research Program (SSMRP)

    International Nuclear Information System (INIS)

    Crouse, C.B.; Hileman, J.A.; Turner, B.E.; Martin, G.R.

    1980-09-01

    A catalog has been prepared which contains information for: (1) world-wide, ground-motion accelerograms (2) the accelerograph sites where these records were obtained, and (3) the seismological parameters of the causative earthquakes. The catalog is limited to data for those accelerograms which have been digitized and published. In addition, the quality and completeness of these data are assessed. This catalog is unique because it is the only publication which contains comprehensive information on the recording conditions of all known digitized accelerograms. However, information for many accelerograms is missing. Although some literature may have been overlooked, most of the missing data has not been published. Nevertheless, the catalog provides a convenient reference and useful tool for earthquake engineering research and applications. (author)

  11. Seismic Safety Margins Research Program, Phase I. Project II: seismic input. Compilation, assessment and expansion of the strong earthquake ground motion data base

    Energy Technology Data Exchange (ETDEWEB)

    Crouse, C B; Hileman, J A; Turner, B E; Martin, G R

    1980-04-01

    A catalog has been prepared which contains information for: (1) world-wide, ground-motion accelerograms, (2) the accelerograph sites where these records were obtained, and (3) the seismological parameters of the causative earthquakes. The catalog is limited to data for those accelerograms which have been digitized and published. In addition, the quality and completeness of these data are assessed. This catalog is unique because it is the only publication which contains comprehensive information on the recording conditions of all known digitized accelerograms. However, information for many accelerograms is missing. Although some literature may have been overlooked, most of the missing data has not been published. Nevertheless, the catalog provides a convenient reference and useful tool for earthquake engineering research and applications.

  12. Studies on Fourier amplitude spectra of accelerograms recorded on rock sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Rao, K.S.

    1990-01-01

    Fourier spectra of 54 earthquake accelerograms recorded on rock sites in the U.S.A. have been analysed. These could be used in the generation of synthetic accelerograms for seismic design. (author). 19 figs., 1 tab., 1 appendix, 19 refs.

  13. Development of spectral shapes and attenuation relations from accelerograms recorded on rock and soil sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Rao, K.S.; Kushwaha, H.S.

    1998-06-01

    Earthquake accelerograms recorded on rock and soil sites have been analysed. Site-specific response spectra and peak ground acceleration attenuation relations have been developed. This report presents the normalised pseudo-absolute acceleration spectra for various values of damping and for various confidence levels. Scaling laws have been developed for the response spectra. The present results are based on a large database and comparison has been made with earlier results. These results will be useful in the earthquake resistant design of structures. (author)

  14. Generation of artificial accelerograms using neural networks for data of Iran

    International Nuclear Information System (INIS)

    Bargi, Kh.; Loux, C.; Rohani, H.

    2002-01-01

    A new method for the generation of artificial earthquake accelerograms from response spectra was proposed by Ghaboussi and Lin in 1997 using neural networks. In this paper the methodology has been extended and enhanced for data of Iran. For this purpose, first, 40 Iranian acceleration records are chosen; then an RBF neural network, called a generalized regression neural network, learns the inverse mapping directly from the response spectrum to the discrete cosine transform of the accelerogram. The discrete cosine transform is used as an assisting device to extract the frequency-domain content. Training of the network is straightforward: a generalized regression neural network learns the mapping in a few seconds. Outputs are presented to demonstrate the performance of this method and show its capabilities.
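
    Lacking the original records and network, the shape of the computation can still be sketched. The fragment below uses an RBF kernel-ridge regressor from scikit-learn as a stand-in for the paper's generalized regression neural network, and scipy's DCT; the training data are random placeholders, not the 40 Iranian records.

        import numpy as np
        from scipy.fft import dct, idct
        from sklearn.kernel_ridge import KernelRidge

        # Stand-in for the paper's GRNN: an RBF kernel-ridge regressor
        # learning (response spectrum) -> (DCT coefficients of accelerogram).
        rng = np.random.default_rng(2)
        n_records, n_spec, n_dct = 40, 50, 256

        accelerograms = rng.standard_normal((n_records, n_dct))  # placeholders
        X = np.abs(rng.standard_normal((n_records, n_spec)))     # placeholder spectra
        Y = dct(accelerograms, norm="ortho", axis=1)              # frequency content

        model = KernelRidge(kernel="rbf", gamma=0.1, alpha=1e-3).fit(X, Y)

        # Generate an "artificial" accelerogram from a new target spectrum.
        target_spectrum = np.abs(rng.standard_normal((1, n_spec)))
        coeffs = model.predict(target_spectrum)
        artificial = idct(coeffs, norm="ortho", axis=1)
        print(artificial.shape)   # (1, 256)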

  15. Neural Models: An Option to Estimate Seismic Parameters of Accelerograms

    Science.gov (United States)

    Alcántara, L.; García, S.; Ovando-Shelley, E.; Macías, M. A.

    2014-12-01

    Seismic instrumentation for recording strong earthquakes in Mexico goes back to the 1960s, owing to the activities carried out by the Institute of Engineering at Universidad Nacional Autónoma de México. However, it was after the big earthquake of September 19, 1985 (M=8.1) that the seismic instrumentation project assumed great importance. Currently, strong ground motion networks have been installed for monitoring seismic activity, mainly along the Mexican subduction zone and in Mexico City. Nevertheless, there are other major regions and cities that can be affected by strong earthquakes and have not yet begun their seismic instrumentation program, or whose program is still in development. Because of this situation, some relevant earthquakes (e.g., Huajuapan de León, Oct 24, 1980, M=7.1; Tehuacán, Jun 15, 1999, M=7; and Puerto Escondido, Sep 30, 1999, M=7.5) were not properly recorded in cities such as Puebla and Oaxaca, which were damaged during those earthquakes. Fortunately, the good maintenance work carried out on the seismic network has permitted the recording of an important number of small events in those cities. In this research we therefore present a methodology based on the use of neural networks to estimate significant duration and, in some cases, the response spectra for those seismic events. The neural model developed predicts significant duration in terms of magnitude, epicentral distance, focal depth and soil characterization. Additionally, for response spectra we used a vector of spectral accelerations. For training the model we selected a set of accelerogram records obtained from the small events recorded by the strong-motion instruments installed in the cities of Puebla and Oaxaca. The final results show that neural networks, as a soft-computing tool using a multi-layer feed-forward architecture, provide good estimates of the target parameters and also have good predictive capacity for strong ground motion duration and response spectra.

  16. SISMA (Site of Italian Strong Motion Accelerograms): a Web-Database of Ground Motion Recordings for Engineering Applications

    International Nuclear Information System (INIS)

    Scasserra, Giuseppe; Lanzo, Giuseppe; D'Elia, Beniamino; Stewart, Jonathan P.

    2008-01-01

    The paper describes a new website called SISMA, i.e. Site of Italian Strong Motion Accelerograms, which is an Internet portal intended to provide natural records for use in engineering applications for dynamic analyses of structural and geotechnical systems. SISMA contains 247 three-component corrected motions recorded at 101 stations from 89 earthquakes that occurred in Italy in the period 1972-2002. The database of strong motion accelerograms was developed in the framework of a joint project between Sapienza University of Rome and University of California at Los Angeles (USA) and is described elsewhere. Acceleration histories and pseudo-acceleration response spectra (5% damping) are available for download from the website. Recordings can be located using simple search parameters related to the seismic source and the recording station (e.g., magnitude, V_s30, etc.) as well as ground motion characteristics (e.g. peak ground acceleration, peak ground velocity, peak ground displacement, Arias intensity, etc.)

  17. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereinafter is a brief account of an ongoing research project concerning the high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested on synthetic signals simulating ground motion designed for this purpose. The correction methods operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of the accelerations yielded satisfactory results.

  18. Ground amplification determined from borehole accelerograms

    International Nuclear Information System (INIS)

    Archuleta, R.J.; Seale, S.H.

    1991-01-01

    The Garner Valley downhole array (GVDA) consists of one surface accelerometer and four downhole accelerometers at depths of 6 m, 15 m, 22 m, and 220 m. The five three-component stations of the vertical array use dual-gain accelerometers capable of measuring accelerations from 3×10⁻⁶ g to 2.0 g over a frequency range from 0.0 Hz (0.025 Hz, high-gain) to 100 Hz. The site (33°41.60′ N, 116°40.20′ W) is only seven kilometers off the trace of the San Jacinto fault, the most active strand of the San Andreas fault system in southern California, and only about 35 km from the San Andreas fault itself. Analysis of individual spectra and spectral ratios for the various depths shows that the zone of weathered granite has a pronounced effect on the spectral amplitudes for frequencies greater than 40 Hz. The soil layer impedance may amplify the high frequencies more than it attenuates them. This result must be checked more thoroughly, with special consideration of the spectra of the P-wave coda on the horizontal components. Analysis of the P-wave spectra and the spectral ratios shows an increased amplification in the same frequency range (60-90 Hz) where the S-wave spectral ratios imply a change in the attenuation. Comparison of acceleration spectra from two earthquakes, M_L 4.2 and M_L 2.5, that have nearly the same hypocenter shows that the near-surface amplification and attenuation are nearly the same for both earthquakes. However, the earthquakes themselves are different if we can assume that the recording at 220 m reflects the source spectra with slight attenuation. The M_L 2.5 earthquake has significantly greater high-frequency content if the spectra are normalized at low frequency, i.e., normalized by seismic moment.
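
    The surface-to-borehole spectral ratios used in this analysis reduce to dividing smoothed Fourier amplitude spectra. A minimal sketch follows, in which white-noise placeholders stand in for the recorded surface and 220 m traces.

        import numpy as np

        def smoothed_amp_spectrum(acc, dt, width=5):
            """Fourier amplitude spectrum, lightly smoothed with a moving
            average."""
            spec = np.abs(np.fft.rfft(acc))
            kernel = np.ones(width) / width
            return np.convolve(spec, kernel, mode="same"), np.fft.rfftfreq(len(acc), dt)

        # Hypothetical surface and borehole traces (white-noise placeholders).
        rng = np.random.default_rng(3)
        dt = 0.005                       # 200 samples per second
        surface = rng.standard_normal(4096)
        borehole = rng.standard_normal(4096)

        s_surf, freqs = smoothed_amp_spectrum(surface, dt)
        s_hole, _ = smoothed_amp_spectrum(borehole, dt)
        ratio = s_surf / s_hole          # site amplification estimate vs. frequency
        print("peak apparent amplification near", freqs[np.argmax(ratio)], "Hz")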

  19. An integrated digital system for earthquake damage reconnaissance

    Science.gov (United States)

    Deaton, Scott Lowrey

    PQuake™ is an integrated digital system that facilitates earthquake damage reconnaissance. It combines digital photography, handheld GPS technology and custom software for a Palm® handheld computer to provide a user-friendly field data collection system. It mitigates the deficiencies of traditional reconnaissance techniques by allowing the rapid collection of consistent quantitative and qualitative damage data for both man-made structures and natural features. At the end of each day of reconnaissance, personnel can upload their data to a personal computer and, in minutes, use the GIS extension to create comprehensive maps of the damage. Consequently, PQuake™ facilitates more sophisticated planning of reconnaissance activities, collection of larger quantities of consistent data, collaboration among researchers, and near real-time reporting, analysis, visualization and mapping of the data. Additionally, it utilizes a relational database for managing, storing and archiving damage data, as well as linking data to digital photographs and GPS waypoints. PQuake thus supports the complete workflow from data collection through analysis and reporting. The limitations of traditional reconnaissance are illustrated through a case history utilizing reconnaissance data collected in Adapazari, Turkey, following the Kocaeli earthquake of August 17, 1999. The damage data were combined with liquefaction analyses performed on geotechnical soundings obtained by PEER months after the event to investigate the building damage associated with local site effects in Adapazari. In particular, this case history demonstrates the necessity and benefits of the PQuake system. The PQuake™ system was first field-tested following the Gujarat, India, earthquake in January 2001. Additionally, the system was modified following the September 11, 2001, terrorist attack on the World Trade Centers to document structural and non-structural damage to the

  20. Computing broadband accelerograms using kinematic rupture modeling

    International Nuclear Information System (INIS)

    Ruiz Paredes, J.A.

    2007-05-01

    In order to make broadband kinematic rupture modeling more realistic with respect to dynamic modeling, physical constraints are added to the rupture parameters. To improve the modeling of the slip velocity function (SVF), an evolution of the k⁻² source model is proposed, which consists of decomposing the slip as a sum of sub-events by band of k. This model yields SVFs close to the solution proposed by Kostrov for a crack, while preserving the spectral characteristics of the radiated wave field, i.e., an ω² model with spectral amplitudes at high frequency scaled to the coefficient of directivity C_d. To better control directivity effects, a composite source description is combined with a scaling law defining the extent of the nucleation area for each sub-event. The resulting model allows the apparent coefficient of directivity to be reduced to a fraction of C_d, and reproduces the standard deviation of the new empirical attenuation relationships proposed for Japan. To make source models more realistic, a variable rupture velocity in agreement with the physics of the rupture must be considered. The approach followed, based on an analytical relation between fracture energy, slip, and rupture velocity, leads to higher values of the peak ground acceleration in the vicinity of the fault. Finally, to better account for the interaction of the wave field with the geological medium, a semi-empirical methodology is developed combining a composite source model with empirical Green functions, and is applied to the Yamaguchi M_w 5.9 earthquake. The modeled synthetics satisfactorily reproduce the main observed characteristics of the ground motions. (author)

  1. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  2. Earthquakes.

    Science.gov (United States)

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  3. Effects of Digital Filtering in Data Processing of Seismic Acceleration Records

    Directory of Open Access Journals (Sweden)

    Mollova Guergana

    2007-01-01

    The paper presents an application of digital filtering in the data processing of acceleration records from earthquakes. Butterworth, Chebyshev, and Bessel filters with different orders are considered to eliminate the frequency noise. The dataset under investigation includes accelerograms from three stations located in Turkey (Dinar, Izmit, Kusadasi), all working with an analogue type of seismograph, the SMA-1. Records from stations near the earthquake sources (i.e., with a distance to the epicenter of less than 20 km) with different moment magnitudes Mw = 3.8, 6.4, and 7.4 have been examined. We have evaluated the influence of the type of digital filter on the time series (acceleration, velocity, displacement), on some strong-motion parameters (PGA, PGV, PGD, etc.), and on the FAS (Fourier amplitude spectrum) of acceleration. Several 5%-damped displacement response spectra applying the examined filtering techniques with different filter orders are shown. The SeismoSignal software tool has been used for the examples.
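
    The filtering step evaluated in the paper is straightforward to reproduce with scipy. The sketch below applies a zero-phase Butterworth band-pass and integrates to velocity and displacement; the corner frequencies (0.1-25 Hz), filter order, and record are illustrative assumptions, not the paper's exact settings.

        import numpy as np
        from scipy.signal import butter, filtfilt
        from scipy.integrate import cumulative_trapezoid

        def bandpass(acc, dt, low=0.1, high=25.0, order=4):
            """Zero-phase Butterworth band-pass, as commonly applied to
            strong-motion records before integration."""
            nyq = 0.5 / dt
            b, a = butter(order, [low / nyq, high / nyq], btype="band")
            return filtfilt(b, a, acc)   # filtfilt avoids phase distortion

        rng = np.random.default_rng(4)
        dt = 0.01
        acc = rng.standard_normal(6000)          # placeholder analogue-style record
        acc_f = bandpass(acc, dt)
        vel = cumulative_trapezoid(acc_f, dx=dt, initial=0.0)
        disp = cumulative_trapezoid(vel, dx=dt, initial=0.0)
        print(f"PGA={np.max(np.abs(acc_f)):.3f}, PGV={np.max(np.abs(vel)):.3f}")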

  4. An algorithm of local earthquake detection from digital records

    Directory of Open Access Journals (Sweden)

    A. PROZOROV

    1978-06-01

    The problem of automatic detection of earthquake signals in seismograms and definition of the first arrivals of P and S waves is considered. The algorithm is based on the analysis of the t(A) function, which represents the time of first appearance of a number of successive swings with amplitudes greater than A in the seismic signal. It exploits common features of earthquake seismograms: a sudden first P arrival with amplitude greater than the general amplitude of the noise, followed after a definite interval of time by an S arrival whose amplitude exceeds that of the P arrival. The method was applied to 3-channel records of Friuli aftershocks. S arrivals were defined correctly in all cases; P arrivals were defined in most cases using strict detection criteria, and no false signals were detected. Using soft detection criteria, all P arrivals were defined, but with less reliability, and two false events were obtained.
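
    The t(A) idea lends itself to a compact implementation. The sketch below is a free reading of the description above, not the 1978 code: it reports the time at which a run of consecutive samples first exceeds amplitude A, and uses it as a crude P-onset pick.

        import numpy as np

        def t_of_A(signal, dt, A, n_swings=3):
            """Time of first appearance of `n_swings` consecutive excursions
            with amplitude greater than A -- a simple reading of the t(A)
            function (illustrative, not the original implementation)."""
            above = np.abs(signal) > A
            count = 0
            for i, flag in enumerate(above):
                count = count + 1 if flag else 0
                if count >= n_swings:
                    return (i - n_swings + 1) * dt
            return None   # threshold never reached

        # A crude P-onset pick: the time at which t(A) is defined for an
        # amplitude level several times the pre-event noise.
        rng = np.random.default_rng(5)
        dt = 0.01
        noise = 0.1 * rng.standard_normal(500)
        trace = np.concatenate([noise, rng.standard_normal(500)])
        onset = t_of_A(trace, dt, A=3 * np.std(noise))
        print(f"estimated P arrival at {onset:.2f} s (true onset 5.00 s)")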

  5. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding of the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  6. Earthquakes

    Science.gov (United States)


  7. Strong ground motion data from the 1983 Borah Peak, Idaho earthquake recorded at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Jackson, S.M.; Boatwright, J.

    1985-01-01

    The 1983 Borah Peak, Idaho, earthquake was the largest normal-faulting event to occur in the last 20 years. There were no near-field recordings of ground motion during the main shock; however, thirteen accelerographs in a permanent array at the Idaho National Engineering Laboratory (INEL) recorded the event at epicentral distances of 90-110 km. Peak horizontal accelerations (PGA) recorded at accelerographs above ground-floor level range from 0.037 to 0.187 g. Accelerographs at basement and free-field sites recorded as low as 0.022 g and as high as 0.078 g. Peak vertical accelerations range from 0.016 g at ground level to 0.059 g above ground-floor level. A temporary array of digital seismographs deployed by the US Geological Survey (USGS) in the epicentral area recorded ground motion from six large aftershocks at epicentral distances of 4-45 km; the largest of these aftershocks also triggered four accelerographs in the INEL array. Two separate analyses were used to estimate near-field ground motion. The first analysis uses the attenuation of the aftershock PGA measurements to extrapolate the INEL main-shock PGA measurements into the near field, giving an estimated upper limit of 0.8 g for near-field ground motion. In the second analysis, a set of main-shock accelerograms was synthesized. Wave-propagation effects were determined from aftershock recordings at one of the USGS portable stations and an INEL seismograph station, and removed from one of the INEL main-shock acceleration traces. The synthetic accelerograms were derived for a hypothetical station southwest of Mackay, Idaho. The PGAs measured from the synthetic accelerograms were 0.08, 0.14, 0.15, and 0.23 g. These estimates correlate well with ground motion expected for an area of intensity VII. 12 references, 8 figures, 1 table

  8. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since the 1985 earthquake was recorded by accelerograms both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  9. Istanbul Earthquake Early Warning System

    Science.gov (United States)

    Alcik, H.; Mert, A.; Ozel, O.; Erdik, M.

    2007-12-01

    As part of the preparations for the future earthquake in Istanbul, a Rapid Response and Early Warning system is in operation in the metropolitan area. For the Early Warning system, ten strong-motion stations were installed as close as possible to the fault zone. Continuous on-line data from these stations via digital radio modem provide early warning for potentially disastrous earthquakes. Considering the complexity of fault rupture and the short fault distances involved, a simple and robust Early Warning algorithm, based on the exceedance of specified threshold time-domain amplitude levels, is implemented. The band-pass filtered accelerations and the cumulative absolute velocity (CAV) are compared with specified threshold levels. When any acceleration or CAV (on any channel) at a given station exceeds its specific threshold value, it is considered a vote. Whenever 2 station votes occur within a selectable time interval after the first vote, the first alarm is declared. In order to specify the appropriate threshold levels, a dataset of near-field strong ground motion records from Turkey and the world has been analyzed. Correlations among these thresholds in terms of the epicentral distance and the magnitude of the earthquake have been studied. The encrypted early warning signals will be communicated to the respective end users. Depending on the location of the earthquake (initiation of fault rupture) and the recipient facility, the alarm time can be as long as about 8 s. The first users of the early warning signal will be the Istanbul gas company (IGDAS) and the metro line using the immersed tube tunnel (MARMARAY). Other prospective users are power plants and power distribution systems, nuclear research facilities, critical chemical factories, petroleum facilities and high-rise buildings. In this study, different algorithms based on PGA, CAV and various definitions of instrumental intensity are discussed, and triggering threshold levels of these parameters are studied.
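
    The threshold-and-vote logic described above can be sketched compactly. In the fragment below the pass band (1-10 Hz), the thresholds, and the records are placeholders, not the operational Istanbul values.

        import numpy as np
        from scipy.signal import butter, filtfilt

        def cav(acc, dt):
            """Cumulative absolute velocity: running integral of |a(t)| dt."""
            return np.cumsum(np.abs(acc)) * dt

        def station_votes(acc, dt, pga_thresh=0.2, cav_thresh=0.4):
            """True once band-passed acceleration or CAV exceeds its
            threshold (placeholder levels and band, for illustration)."""
            b, a = butter(4, [1.0 * 2 * dt, 10.0 * 2 * dt], btype="band")
            filtered = filtfilt(b, a, acc)
            return (np.abs(filtered) > pga_thresh) | (cav(acc, dt) > cav_thresh)

        def first_alarm(votes_by_station, dt, window=5.0):
            """Alarm time: second corroborating station vote within the
            selectable window after the first vote, else None."""
            times = sorted(dt * np.argmax(v) for v in votes_by_station if v.any())
            for t1, t2 in zip(times, times[1:]):
                if t2 - t1 <= window:
                    return t2
            return None

        rng = np.random.default_rng(6)
        dt = 0.01
        traces = 0.02 * rng.standard_normal((3, 3000))       # quiet phase
        traces[:, 1500:] += 0.5 * rng.standard_normal((3, 1500))  # strong phase
        votes = [station_votes(s, dt) for s in traces]
        print("first alarm at", first_alarm(votes, dt), "s")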

  10. Did we really #prayfornepal? Instagram posts as a massive digital funeral in Nepal earthquake aftermath

    Science.gov (United States)

    Kamil, P. I.; Pratama, A. J.; Hidayatulloh, A.

    2016-05-01

    Social media has been part of our daily life for years, and it has now become a treasure trove of data for social scientists to mine. Using our own data-mining engine we downloaded 1500 Instagram posts related to the Nepal earthquake in April 2015, a disaster which caused tremendous losses in human lives and infrastructure. We predicted that social media would be a place where people respond and express themselves emotionally in light of a disaster of such massive scale, a "megadeath" event. We ended up with data on 1017 posts tracked with the hashtag #prayfornepal, consisting of each post's date, time, geolocation, image, post ID, username and user ID, caption, and hashtags. We categorized the posts into 7 categories and found that most of the photos (30.29%) are related to Nepal but not directly related to the disaster, which reflects death imprint, one of the psychosocial responses after a megadeath event. Other analyses were done to compare each photo category, including geolocation, hashtag network and caption network, visualized with ArcGIS, NodeXL, Gephi, and our own word-cloud engine, to examine other digital reactions to the Nepal earthquake on Instagram. This study can give an overview of how a community reacts to a disaster in the digital world, which can be utilized for disaster response and awareness.

  11. VOLUNTARY ACTIVITIES AND ONLINE EDUCATION FOR DIGITAL HERITAGE INVENTORY DEVELOPMENT AFTER THE GREAT EAST JAPAN EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    Y. Kondo

    2013-07-01

    Consortium for Earthquake-Damaged Cultural Heritage (CEDACH) is a voluntary initiative launched just after the Great East Japan Earthquake on 11 March 2011. The consortium is developing a social network between local cultural resource managers restoring disaster-damaged cultural heritage on one side and remote researchers including historians, archaeologists and specialists of cultural information studies on the other side, in order to facilitate collaborative projects. This paper presents three projects in which CEDACH contributed to the development of a digital inventory for disaster-damaged heritage management through web-based collaborations by self-motivated workers. The first project, CEDACH GIS, developed an online archaeological site inventory for the disaster area. Although a number of individuals voluntarily participated in the project at the beginning, it gradually stagnated due to limited need for local rescue archaeology. However, the experience of online-based collaborations worked well for the second project proposed by local specialists, in which CEDACH restored the book catalogue of a tsunami-devastated research library. This experience highlighted the need for online education to improve information and communication technologies (ICT) skills of data builders. Therefore, in the third project, called CEDACHeLi, an e-Learning management system was developed to facilitate learning the fundamental knowledge and techniques required for information processing in rescue operations of disaster-damaged cultural heritage. This system will contribute to improved skills and motivation of potential workers for further developments in digital heritage inventory.

  12. Voluntary Activities and Online Education for Digital Heritage Inventory Development after the Great East Japan Earthquake

    Science.gov (United States)

    Kondo, Y.; Uozu, T.; Seino, Y.; Ako, T.; Goda, Y.; Fujimoto, Y.; Yamaguchi, H.

    2013-07-01

    Consortium for Earthquake-Damaged Cultural Heritage (CEDACH) is a voluntary initiative launched just after the Great East Japan Earthquake on 11 March 2011. The consortium is developing a social network between local cultural resource managers restoring disaster-damaged cultural heritage on one side and remote researchers including historians, archaeologists and specialists of cultural information studies on the other side, in order to facilitate collaborative projects. This paper presents three projects in which CEDACH contributed to the development of a digital inventory for disaster-damaged heritage management through web-based collaborations by self-motivated workers. The first project, CEDACH GIS, developed an online archaeological site inventory for the disaster area. Although a number of individuals voluntarily participated in the project at the beginning, it gradually stagnated due to limited need for local rescue archaeology. However, the experience of online-based collaborations worked well for the second project proposed by local specialists, in which CEDACH restored the book catalogue of a tsunami-devastated research library. This experience highlighted the need for online education to improve information and communication technologies (ICT) skills of data builders. Therefore, in the third project called CEDACHeLi, an e-Learning management system was developed to facilitate learning the fundamental knowledge and techniques required for information processing in rescue operations of disaster-damaged cultural heritage. This system will contribute to improved skills and motivation of potential workers for further developments in digital heritage inventory.

  13. CRITERIOS SISMOLÓGICOS PARA SELECCIONAR ACELEROGRAMAS REALES DE LA RED NACIONAL DE ACELERÓGRAFOS DE COLOMBIA PARA SU USO EN ANÁLISIS DINÁMICOS CRITÉRIOS SISMOLÓGICOS PARA SELECIONAR ACELEROGRAMAS REAIS DA REDE NACIONAL DE ACELERÓGRAFOS DE COLÔMBIA PARA SEU USO EM ANÁLISES DINÂMICAS SEISMOLOGICAL CRITERIA FOR THE SELECTION OF REAL ACCELEROGRAMS FROM THE COLOMBIAN NATIONAL NETWORK OF ACCELEROGRAMS FOR USE IN DYNAMIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    Ana Beatriz Acevedo

    2012-06-01

    The use of real accelerograms as input to dynamic analysis is desirable, as they provide realistic information about the nature of strong ground motion; in addition, accelerograms capture different characteristics that can be produced by earthquakes at different locations. The Colombian design code allows the use of real accelerograms as input to time-history analysis. The horizontal component of a minimum of three different accelerograms is required by the code; all of the accelerograms must be representative of the expected ground motion at the site. In this paper the database of the Colombian National Network of Accelerographs (Red Nacional de Acelerógrafos de Colombia, RNAC) is analyzed in order to assess the availability of records with different ground motion characteristics for use as input in dynamic analysis. Groups of accelerograms that could be used as input for dynamic analysis were identified. However, if only records from the RNAC are used, it is not possible to completely cover the selection requirements specified in the Colombian design code.

  14. Simulation of artificial earthquake records compatible with site specific response spectra using time series analysis

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Fadavi Amiri

    2017-11-01

    Time-history analysis of infrastructure such as dams, bridges and nuclear power plants is a fundamental part of the design process. However, there are not sufficient suitable site-specific earthquake records for such analyses; therefore, the generation of artificial accelerograms is required for research work in this area. Using time-series analysis, wavelet transforms, artificial neural networks and a genetic algorithm, a new method is introduced to produce artificial accelerograms compatible with response spectra for a specified site condition. In the proposed method, first, some recorded accelerograms are selected based on the soil condition at the recording station. The soils at these stations are divided into two groups, soil and rock, according to their measured shear-wave velocity. These accelerograms are then analyzed using the wavelet transform. Next, the ability of artificial neural networks to produce an inverse signal from response spectra is used to produce wavelet coefficients. Furthermore, a genetic algorithm is employed to optimize the network weight and bias matrices by searching a wide range of values, preventing the neural network from converging on local optima. In the end, site-specific accelerograms are produced. In this paper a number of accelerograms recorded in Iran are employed to test the neural network performance and to demonstrate the effectiveness of the method. It is shown that combining time-series analysis, a genetic algorithm, a neural network and the wavelet transform increases the capabilities of the algorithm and improves its speed and accuracy in generating accelerograms compatible with site-specific response spectra for different site conditions.
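
    Of the several ingredients above, the wavelet stage is the easiest to isolate. The sketch below uses PyWavelets to decompose and reconstruct a placeholder accelerogram; in the full method, a genetic-algorithm-tuned neural network would predict such coefficients from a target response spectrum, a stage omitted here.

        import numpy as np
        import pywt  # PyWavelets

        rng = np.random.default_rng(7)
        record = rng.standard_normal(2048)             # placeholder accelerogram

        coeffs = pywt.wavedec(record, "db4", level=5)  # multilevel decomposition
        rebuilt = pywt.waverec(coeffs, "db4")          # inverse transform
        print(np.allclose(record, rebuilt[: len(record)]))   # True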

  15. On the Regional Dependence of Earthquake Response Spectra

    OpenAIRE

    Douglas , John

    2007-01-01

    It is common practice to use ground-motion models, often developed by regression on recorded accelerograms, to predict the expected earthquake response spectra at sites of interest. An important consideration when selecting these models is the possible dependence of ground motions on geographical region, i.e., are median ground motions in the (target) region of interest for a given magnitude and distance the same as those in the (host) region where a ground-motion mode...

  16. Effects of Digital Filtering in Data Processing of Seismic Acceleration Records

    Directory of Open Access Journals (Sweden)

    Guergana Mollova

    2007-01-01

    The paper presents an application of digital filtering in the data processing of acceleration records from earthquakes. Butterworth, Chebyshev, and Bessel filters with different orders are considered to eliminate the frequency noise. The dataset under investigation includes accelerograms from three stations located in Turkey (Dinar, Izmit, Kusadasi), all working with an analogue type of seismograph, the SMA-1. Records from stations near the earthquake sources (i.e., with a distance to the epicenter of less than 20 km) with different moment magnitudes Mw = 3.8, 6.4, and 7.4 have been examined. We have evaluated the influence of the type of digital filter on the time series (acceleration, velocity, displacement), on some strong-motion parameters (PGA, PGV, PGD, etc.), and on the FAS (Fourier amplitude spectrum) of acceleration. Several 5%-damped displacement response spectra applying the examined filtering techniques with different filter orders are shown. The SeismoSignal software tool has been used for the examples.

  17. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids, as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to vary in time or to be constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
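
    The superposition model described here is easy to sketch: sinusoids with random phase angles, amplitudes tied to a spectral density, and a deterministic intensity envelope. The fragment below is a minimal illustration with placeholder amplitudes and an assumed trapezoid-like envelope; the iterative matching to a smooth design spectrum, which programs like SIMQKE automate, is omitted.

        import numpy as np

        def simulate_motion(amps, freqs, duration, dt, seed=0):
            """Superpose sinusoids with random phases, modulated by a simple
            intensity envelope (illustrative, not the SIMQKE code)."""
            rng = np.random.default_rng(seed)
            t = np.arange(0, duration, dt)
            phases = rng.uniform(0, 2 * np.pi, size=len(freqs))
            motion = sum(A * np.sin(2 * np.pi * f * t + p)
                         for A, f, p in zip(amps, freqs, phases))
            # Envelope: 2 s rise, flat strong phase, exponential tail.
            env = np.minimum(1.0, t / 2.0) * np.where(
                t < 0.7 * duration, 1.0, np.exp(-(t - 0.7 * duration)))
            return t, motion * env

        freqs = np.linspace(0.2, 25.0, 120)        # Hz
        amps = np.ones_like(freqs) / len(freqs)    # flat placeholder amplitudes
        t, acc = simulate_motion(amps, freqs, duration=20.0, dt=0.01)
        # Iterating `amps` until the response spectrum of `acc` matches a
        # smooth design spectrum is the step such programs automate.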

  18. Development of uniform hazard response spectra from accelerograms recorded on rock sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2000-05-01

    Traditionally, the seismic design basis ground motion has been specified by response spectral shapes and the peak ground acceleration (PGA). The mean recurrence interval (MRI) is evaluated for PGA only. The present work has developed response spectra having the same MRI at all frequencies. This report extends the work of Cornell (on PGA) to consider an areal source model and a general form of the spectral acceleration at various frequencies. The latter has been derived from a number of strong-motion earthquake records from rock sites. Sensitivity of the results to changes in various parameters has also been presented. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)

  19. State-of-the-Art for Assessing Earthquake Hazards in the United States. Report 28. Recommended Accelerograms for Earthquake Ground Motions

    Science.gov (United States)

    1992-06-01

    [The available abstract consists only of an OCR fragment of the report's record-listing table: Mexican strong-motion stations (e.g., Sicartsa, Infiernillo), mostly soft-soil sites, magnitudes 6.2-6.5, with dates and coordinates; the running text is not recoverable.]

  20. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude

    Science.gov (United States)

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.

    2016-01-01

    We propose a simple approach to measure earthquake magnitude M using the time difference (T_op) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2 log T_op for earthquakes 5 ≤ Mw ≤ 7, which is the theoretical proportionality if T_op is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and M_Top (M estimated from T_op) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from the lower-frequency data. T_op depends weakly on epicentral distance, and this dependence can be ignored for distances of less than about 200 km. Application of this approach to the 2011 Tohoku earthquake produces a final magnitude estimate of M 9.0 at 120 s after the origin time. We conclude that T_op of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.
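
    The core scaling is a one-line formula. In the sketch below the slope of 2 follows the abstract; the intercept is a placeholder that would have to be calibrated on regional data.

        import numpy as np

        def magnitude_from_top(top_seconds, c=4.9):
            """M estimated from the onset-to-peak time T_op via
            M = 2*log10(T_op) + c. The slope is the paper's proportionality;
            the intercept c is a hypothetical calibration constant."""
            return 2.0 * np.log10(top_seconds) + c

        for top in (2.0, 10.0, 60.0):
            print(f"T_op = {top:5.1f} s  ->  M ~ {magnitude_from_top(top):.1f}")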

  1. Fundamental principles of earthquake resistance calculation to be reflected in the next generation regulations

    Directory of Open Access Journals (Sweden)

    Mkrtychev Oleg

    2016-01-01

    The article scrutinizes pressing issues of regulation in the domain of seismic construction. The existing code of rules SNIP II-7-81* "Construction in seismic areas" provides that earthquake-resistance calculations be performed for two levels of impact: the basic safety earthquake (BSE) and the maximum considered earthquake (MCE). However, the very nature of such calculation cannot be deemed well-founded and contradicts the fundamental standards of foreign countries. The authors of the article have identified the main problems of the conceptual foundation underlying the current regulation. The first and foremost step intended to overcome the discrepancy in question is renunciation of the K1 damage tolerance factor when calculating the BSE. The second measure to be taken is implementing the response spectrum method of calculation, with the β spectral curve of the dynamic response factor replaced by a spectrum of worst-case accelerograms for the particular structure, or a spectrum of simulated accelerograms obtained for the specific construction site. Application of the response spectrum method when calculating the MCE impact level makes it possible to proceed into the frequency domain and eventually obtain spectra of the accelerograms. As a result, the response of the building is known to some extent, i.e., the forces and the required reinforcement, and it can be checked whether the conditions of the ultimate limit state apply. Then the elements under the most intense load are excluded from the design model, as is done in progressive collapse calculations, because the assumption is that these elements are destroyed locally by the seismic load. This procedure is based on the already existing design practice of progressive collapse calculation.

  2. Short presentation on some research activities about near-field earthquakes

    International Nuclear Information System (INIS)

    Donald, John

    2002-01-01

    The major hazard posed by earthquakes is often thought to be due to moderate to large magnitude events. However, there have been many cases where earthquakes of moderate and even small magnitude have caused very significant destruction when they have coincided with population centres. Even though the area of intense ground shaking caused by such events is generally small, the epicentral motions can be severe enough to cause damage even in well-engineered structures. Two issues are addressed here, the first being the identification of the minimum earthquake magnitude likely to cause damage to engineered structures and the limits of the near field for small-to-moderate magnitude earthquakes. The second issue addressed is whether features of near-field ground motions such as directivity, which can significantly enhance the destructive potential, occur in small-to-moderate magnitude events. The accelerograms from the 1986 San Salvador (El Salvador) earthquake indicate that it may be non-conservative to assume that near-field directivity effects need only be considered for earthquakes of moment magnitude M 6.5 and greater. (author)

  3. Stability assessment of structures under earthquake hazard through GRID technology

    Science.gov (United States)

    Prieto Castrillo, F.; Boton Fernandez, M.

    2009-04-01

    This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes of the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
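
    The per-job computation — integrating the structural ODEs under one accelerogram — can be illustrated with a single-degree-of-freedom oscillator. The paper encapsulates this in Java classes; the sketch below is a Python stand-in with a placeholder record and a simple semi-implicit Euler integrator.

        import numpy as np

        def max_displacement(acc_g, dt, period=1.0, damping=0.05):
            """Integrate one damped SDOF oscillator under a ground
            acceleration record and return the peak displacement
            (semi-implicit Euler; illustrative, not the paper's engine)."""
            w = 2 * np.pi / period
            u = np.zeros(len(acc_g))
            v = 0.0
            for i in range(1, len(acc_g)):
                a = -acc_g[i - 1] - 2 * damping * w * v - w**2 * u[i - 1]
                v += a * dt
                u[i] = u[i - 1] + v * dt
            return np.max(np.abs(u))

        rng = np.random.default_rng(8)
        record = 0.5 * rng.standard_normal(2000)   # placeholder accelerogram, dt = 0.01 s
        print(f"max displacement: {max_displacement(record, 0.01):.4f} m")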

  4. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude

    Science.gov (United States)

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.

    2016-01-01

    We propose a simple approach to measure earthquake magnitude M using the time difference (T_op) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2 log T_op for earthquakes 5 ≤ Mw ≤ 7, which is the theoretical proportionality if T_op is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and M_Top (M estimated from T_op) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from the lower-frequency data. T_op depends weakly on epicentral distance, and this dependence can be ignored for distances of less than about 200 km. We conclude that T_op of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.

  5. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  6. Analyses on various parameters for the simulation of three-dimensional earthquake ground motions

    International Nuclear Information System (INIS)

    Watabe, M.; Tohdo, M.

    1979-01-01

    In these analyses, stochastic tools are extensively utilized with hundreds of strong motion accelerograms obtained in both Japan and the United States. Stochastic correlations between the maxima of the earthquake ground motions, such as maximum acceleration, velocity, displacement and spectral intensity, are introduced in the first part, together with equations correlating such maxima with the focal distance and magnitude of an earthquake. The meaning of effective peak acceleration for practical engineering purposes is then discussed. A new concept of a deterministic intensity function, derived from mathematical models and verified by run and chi-square tests, is introduced in the middle part. With this concept, deterministic intensity functions for the horizontal as well as the vertical component are obtained and shown. The relation between duration time and magnitude is also introduced here. (orig.)

  7. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2001-12-01

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for PGA only. It is shown that the MRI associated with such response spectra is not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes by using a large number of strong motion accelerograms recorded on rock sites. The sensitivity of the results to changes in various parameters has also been presented. This work is an extension of an earlier work for areal sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)
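
    The construction just described can be sketched briefly: for each spectral frequency, the ordinate whose recurrence interval equals the target MRI is read off that frequency's hazard curve, so every ordinate of the resulting spectrum has the same MRI. The hazard-curve numbers below are invented for illustration.

```python
import numpy as np

def uniform_hazard_spectrum(sa_grid, rate_curves, mri):
    """sa_grid: (n,) trial spectral accelerations [g];
    rate_curves: dict {frequency_Hz: (n,) annual exceedance rates};
    returns dict {frequency_Hz: SA with recurrence interval `mri` years}."""
    target = 1.0 / mri
    uhs = {}
    for f, rates in rate_curves.items():
        # rates decrease with SA, so reverse to interpolate log(rate) vs log(SA)
        uhs[f] = np.exp(np.interp(np.log(target),
                                  np.log(rates[::-1]), np.log(sa_grid[::-1])))
    return uhs

sa = np.logspace(-2, 0, 50)                    # 0.01 g .. 1 g
curves = {1.0: 1e-2 * (sa / 0.01) ** -1.8,     # illustrative hazard curves
          5.0: 2e-2 * (sa / 0.01) ** -1.6}
print(uniform_hazard_spectrum(sa, curves, mri=1000))
```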

  8. Nowcasting Earthquakes

    Science.gov (United States)

    Rundle, J. B.; Donnellan, A.; Grant Ludwig, L.; Turcotte, D. L.; Luginbuhl, M.; Gail, G.

    2016-12-01

    Nowcasting is a term originating from economics and finance. It refers to the process of determining the uncertain state of the economy or markets at the current time by indirect means. We apply this idea to seismically active regions, where the goal is to determine the current state of the fault system and its current level of progress through the earthquake cycle. In our implementation of this idea, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. Our method does not involve any model other than the idea of an earthquake cycle. Rather, we define a specific region and a specific large earthquake magnitude of interest, ensuring that we have enough data to span at least 20 or more large earthquake cycles in the region. We then compute the earthquake potential score (EPS), which is defined as the cumulative probability P(n < n(t)) for the number n of small earthquakes in the region. From the count n(t) of small earthquakes since the last large earthquake, we determine the value of EPS = P(n < n(t)), which characterizes the level of progress through the earthquake cycle in the defined region at the current time.
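
    A minimal sketch of the bookkeeping behind EPS, assuming a pre-compiled list of small-earthquake counts for past large-earthquake cycles (catalog selection and declustering are omitted); the counts below are made up for illustration:

```python
import numpy as np

def earthquake_potential_score(counts_past_cycles, n_current):
    """counts_past_cycles: small-quake counts between successive large events
    (ideally 20+ cycles, as the abstract requires); n_current: small quakes
    since the last large event. Returns the empirical EPS = P(n < n(t))."""
    counts = np.asarray(counts_past_cycles)
    return np.mean(counts < n_current)

past = [120, 340, 95, 210, 180, 400, 260, 150, 310, 220,
        130, 275, 190, 360, 240, 170, 300, 110, 250, 205]  # 20 past cycles
print(f"EPS = {earthquake_potential_score(past, n_current=280):.2f}")
```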

  9. Earthquake Facts

    Science.gov (United States)

    ... estimated 830,000 people. In 1976 another deadly earthquake struck in Tangshan, China, where more than 250,000 people were killed. Florida and North Dakota have the smallest number of earthquakes in the United States. The deepest earthquakes typically ...

  10. Using wavelet analysis to obtain characteristics of accelerograms

    Directory of Open Access Journals (Sweden)

    Mkrtychev Oleg Vartanovich

    2013-07-01

    Full Text Available Application of accelerograms to the analysis of structures exposed to seismic loads, and generation of synthetic accelerograms, is only possible if their various characteristics are available. Wavelet analysis may serve as a method for identifying these characteristics: it is an effective tool for detecting versatile regularities in signals, such as inflection points and extremes, and can also be used to filter signals. The authors discuss particular theoretical principles of wavelet analysis and multiresolution analysis, present formulas intended for practical application, and implement a wavelet transform of a specific accelerogram. The record analyzed was the horizontal component (N00E of the Spitak earthquake (Armenia, 1988. The accelerogram was considered as a non-stationary random process and decomposed into an envelope and a stationary part, i.e. the non-stationary random process was represented as the product of an envelope and a stationary random process. Parameters of the exposure of a construction site to seismic impact can then be used to synthesize accelerograms.
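
    A minimal sketch of this kind of wavelet processing, using the PyWavelets package on a synthetic stand-in trace (the Spitak N00E record itself is not reproduced here):

```python
import numpy as np
import pywt

fs = 100.0                                       # sampling rate [Hz]
t = np.arange(0.0, 30.0, 1.0 / fs)
acc = np.random.randn(t.size) * np.exp(-((t - 8.0) / 5.0) ** 2)  # toy accelerogram

# Multiresolution decomposition: one approximation + five detail bands
coeffs = pywt.wavedec(acc, "db4", level=5)

# Keep the approximation and the two coarsest detail bands, zero the rest:
# a crude low-pass reconstruction usable as an envelope/trend estimate
kept = [c if i <= 2 else np.zeros_like(c) for i, c in enumerate(coeffs)]
trend = pywt.waverec(kept, "db4")[: acc.size]
print(len(coeffs), trend.shape)
```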

  11. Analysis of strong-motion data of the 1990 Eastern Sicily earthquake

    Directory of Open Access Journals (Sweden)

    E. Boschi

    1995-06-01

    Full Text Available The strong motion accelerograms recorded during the 1990 Eastern Sicily earthquake have been analyzed to investigate source and attenuation parameters. Peak ground motions (peak acceleration, velocity and displacement) overestimate the values predicted by the empirical scaling law proposed for other Italian earthquakes, suggesting that local site response and propagation path effects play an important role in interpreting the observed time histories. The local magnitude, computed from the strong motion accelerograms by synthesizing the Wood-Anderson response, is ML = 5.9, which is appreciably larger than the local magnitude estimated at regional distances from broad-band seismograms (ML = 5.4). The standard omega-square source spectral model seems to be inadequate to describe the observed spectra over the entire frequency band from 0.2 to 20 Hz. The seismic moment estimated from the strong motion accelerogram recorded at the closest rock site (Sortino) is Mo = 0.8 x 10^24 dyne.cm, which is roughly 4.5 times lower than the value estimated at regional distances (Mo = 3.7 x 10^24 dyne.cm) from broad-band seismograms. The corner frequency estimated from the acceleration spectra is fc = 1.3 Hz, which is close to the inverse of the duration of the displacement pulses at the two closest recording sites. This value of corner frequency and the two values of seismic moment yield a Brune stress drop larger than 500 bars. However, a corner frequency value of fc = 0.6 Hz and the seismic moment resulting from regional data allow the acceleration spectra to be reproduced over the entire available frequency band, yielding a Brune stress drop of 210 bars. The ambiguity in the corner frequency value associated with this earthquake is due to the limited frequency bandwidth available on the strong motion recordings. Assuming the seismic moment estimated at regional distances from broad-band data, the moment magnitude for this earthquake is 5.7. The higher local magnitude (5
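
    The Brune-model arithmetic behind the stress-drop figures quoted above can be reproduced approximately with the relations r = 2.34β/(2πfc) and Δσ = 7Mo/(16r³); the shear-wave speed β below is an assumed value, so the results only bracket the ~500 and ~210 bar figures:

```python
import math

def brune_stress_drop(m0_dyne_cm, fc_hz, beta_cm_s=3.0e5):
    """Brune stress drop in dyne/cm^2 (1 bar = 1e6 dyne/cm^2), CGS units;
    beta is an assumed crustal shear-wave speed."""
    r = 2.34 * beta_cm_s / (2.0 * math.pi * fc_hz)   # source radius [cm]
    return 7.0 * m0_dyne_cm / (16.0 * r ** 3)

# The two alternatives discussed in the abstract:
print(brune_stress_drop(0.8e24, 1.3) / 1e6, "bar")   # strong-motion Mo, fc = 1.3 Hz
print(brune_stress_drop(3.7e24, 0.6) / 1e6, "bar")   # regional Mo, fc = 0.6 Hz
```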

  12. Selection of maximum design earthquake parameters for a dam safety project in British Columbia

    International Nuclear Information System (INIS)

    Smith, H.R.; Wightman, A.; Naumann, C.M.

    1991-01-01

    A study was carried out for the Greater Vancouver Water District to determine maximum design earthquake (MDE) parameters for dam safety projects. Three types of maximum credible earthquake (MCE) were investigated: a megathrust earthquake (interplate event) of magnitude 9 on the Richter scale under the west coast of Vancouver Island; a magnitude 7.5 earthquake (intraplate event) under the Strait of Georgia at a depth of about 60 km; and a local magnitude 6.5 shallow earthquake near the study site, on the north shore mountains near Vancouver. Conclusions of the study include the following. Strong motion records are obtained as three-component accelerograms, and considering the individual components rather than the maximum ground motion can result in underestimation of the seismic loading. It is recommended that the peak ground motions be defined by the envelope of the larger component, including peak ground acceleration, peak velocity and response spectra. The peak ground acceleration of the most critical earthquake, of magnitude 6.5 at 10 km, was estimated at 0.5 g. A variety of check methodologies yielded peak horizontal ground accelerations (PGA) ranging from 0.44 to 0.55 g. PGA values chosen for seismic analysis must consider the duration and direction of the peak as well as the type of analysis, failure mode and material types. 9 refs., 10 figs., 2 tabs

  13. Undead earthquakes

    Science.gov (United States)

    Musson, R. M. W.

    This short communication deals with the problem of fake earthquakes that keep returning into circulation. The particular events discussed are some very early earthquakes supposed to have occurred in the U.K., which all originate from a single enigmatic 18th century source.

  14. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  15. Slip-weakening distance and energy budget inferred from near-fault ground deformation during the 2016 Mw7.8 Kaikōura earthquake

    Science.gov (United States)

    Kaneko, Yoshihiro; Fukuyama, Eiichi; Hamling, Ian James

    2017-05-01

    The 2016 M7.8 Kaikōura (New Zealand) earthquake struck the east coast of the northern South Island, resulting in strong ground shaking and large surface fault slip. Since the earthquake was well recorded by a local strong-motion seismic network, near-fault data may provide direct measurements of dynamic parameters associated with the fault-weakening process. Here we estimate a proxy for slip-weakening distance, Dc″, defined as double the fault-parallel displacement at the time of peak ground velocity, from accelerograms recorded at a near-fault station. Three-component ground displacements were recovered from the double numerical integration of accelerograms, and the corresponding final displacements are validated against coseismic displacement from geodetic data. The estimated Dc″ is 4.9 m at seismic station KEKS, located ~2.7 km from a segment of the Kekerengu fault where large surface fault slip (~12 m) has been observed. The inferred Dc″ is the largest value ever estimated from near-fault strong motion data, yet it appears to follow the scaling of Dc″ with final slip for several large strike-slip earthquakes. The energy budget of the M7.8 Kaikōura earthquake inferred from the scaling of Dc″ with final slip indicates that a large amount of energy was dissipated by on- and off-fault inelastic deformation during the propagation of the earthquake rupture, resulting in a slower average rupture speed (≲2.0 km/s).
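
    A minimal sketch of the displacement-recovery chain described above (double numerical integration with simple detrending; real baseline-correction schemes are more elaborate), applied to a synthetic stand-in record since the KEKS accelerogram is not reproduced here:

```python
import numpy as np
from scipy.signal import detrend

def double_integrate(acc, dt):
    vel = detrend(np.cumsum(acc) * dt)    # acc -> vel, remove linear drift
    disp = detrend(np.cumsum(vel) * dt)   # vel -> disp, remove linear drift
    return vel, disp

dt = 0.01
t = np.arange(0.0, 40.0, dt)
# Stand-in fault-parallel record
acc = 2.0 * np.sin(2.0 * np.pi * 0.5 * t) * np.exp(-((t - 10.0) / 4.0) ** 2)
vel, disp = double_integrate(acc, dt)

# Proxy Dc'': double the fault-parallel displacement at the time of peak velocity
dc = 2.0 * abs(disp[np.argmax(np.abs(vel))])
print(f"Dc'' proxy for this stand-in record: {dc:.3f} m")
```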

  16. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes the generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed; prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
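
    A minimal sketch of the digital variant of this scheme: band-pass filter Gaussian white noise and impose a deterministic envelope. The filter band, envelope shape and scaling below are illustrative, and the iterative matching to a target response spectrum is omitted.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs, dur = 200.0, 20.0
t = np.arange(0.0, dur, 1.0 / fs)
noise = np.random.randn(t.size)                      # white noise

# Band-limit the noise to an earthquake-like frequency band (0.5-25 Hz)
b, a = butter(4, [0.5 / (fs / 2.0), 25.0 / (fs / 2.0)], btype="band")
filtered = filtfilt(b, a, noise)

# Deterministic envelope: build-up, strong phase, decay
envelope = (t / 2.0) ** 2 * np.exp(-t / 2.0)
envelope /= envelope.max()

# Scale to an illustrative 0.3 g peak acceleration
accelerogram = 0.3 * 9.81 * envelope * filtered / np.abs(filtered).max()
```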

  17. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  18. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  19. Effects of earthquake rupture shallowness and local soil conditions on simulated ground motions

    International Nuclear Information System (INIS)

    Apsel, Randy J.; Hadley, David M.; Hart, Robert S.

    1983-03-01

    The paucity of strong ground motion data in the Eastern U.S. (EUS), combined with well recognized differences in earthquake source depths and wave propagation characteristics between Eastern and Western U.S. (WUS) suggests that simulation studies will play a key role in assessing earthquake hazard in the East. This report summarizes an extensive simulation study of 5460 components of ground motion representing a model parameter study for magnitude, distance, source orientation, source depth and near-surface site conditions for a generic EUS crustal model. The simulation methodology represents a hybrid approach to modeling strong ground motion. Wave propagation is modeled with an efficient frequency-wavenumber integration algorithm. The source time function used for each grid element of a modeled fault is empirical, scaled from near-field accelerograms. This study finds that each model parameter has a significant influence on both the shape and amplitude of the simulated response spectra. The combined effect of all parameters predicts a dispersion of response spectral values that is consistent with strong ground motion observations. This study provides guidelines for scaling WUS data from shallow earthquakes to the source depth conditions more typical in the EUS. The modeled site conditions range from very soft soil to hard rock. To the extent that these general site conditions model a specific site, the simulated response spectral information can be used to either correct spectra to a site-specific environment or used to compare expected ground motions at different sites. (author)

  20. Fault location and source process of the Boumerdes, Algeria, earthquake inferred from geodetic and strong motion data

    Science.gov (United States)

    Semmane, Fethi; Campillo, Michel; Cotton, Fabrice

    2005-01-01

    The Boumerdes earthquake occurred on a fault whose precise location, offshore the Algerian coast, was unknown. Geodetic data are used to determine the absolute position of the fault. The fault might emerge at about 15 km offshore. Accelerograms are used to infer the space-time history of the rupture using a two-step inversion in the spectral domain. The observed strong motion records agree with the synthetics for the fault location inferred from geodetic data. The fault plane ruptured for about 18 seconds. The slip distribution on the fault indicates one asperity northwest of the hypocenter with maximum slip amplitude about 3 m. This asperity is probably responsible for most of the damage. Another asperity with slightly smaller slip amplitude is located southeast of the hypocenter. The rupture stops its westward propagation close to the Thenia fault, a structure almost perpendicular to the main fault.

  1. Darwin's earthquake.

    Science.gov (United States)

    Lee, Richard V

    2010-07-01

    Charles Darwin experienced a major earthquake in the Concepción-Valdivia region of Chile 175 years ago, in February 1835. His observations dramatically illustrated the geologic principles of James Hutton and Charles Lyell which maintained that the surface of the earth was subject to alterations by natural events, such as earthquakes, volcanoes, and the erosive action of wind and water, operating over very long periods of time. Changes in the land created new environments and fostered adaptations in life forms that could lead to the formation of new species. Without the demonstration of the accumulation of multiple crustal events over time in Chile, the biologic implications of the specific species of birds and tortoises found in the Galapagos Islands and the formulation of the concept of natural selection might have remained dormant.

  2. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  3. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  4. Predictable earthquakes?

    Science.gov (United States)

    Martini, D.

    2002-12-01

    acceleration) and global number of earthquakes for this period from the published literature, which together give a broad picture of the dynamical geophysical phenomena. Methodology: Computing linear correlation coefficients gives us a chance to characterise quantitatively the relation among the data series, if we suppose a linear dependence in the first step. The correlation coefficients among the Earth's rotational acceleration, the Z-orbit acceleration (perpendicular to the ecliptic plane) and the global number of earthquakes were compared. The results clearly demonstrate a common feature of both the Earth's rotation and the Earth's Z-acceleration around the Sun, and also between the Earth's rotational acceleration and the earthquake number. This might indicate a strong relation among these phenomena. The mentioned rather strong correlation (r = 0.75) and the 29-year period (Saturn's synodic period) were clearly shown in the computed cross-correlation function, which gives the dynamical characteristic of the correlation of the Earth's orbital (Z-direction) and rotational acceleration. This basic period (29 years) was also obvious in the earthquake number data sets, with clear common features in time. Conclusion: The core, which is involved in the secular variation of the Earth's magnetic field, is the only sufficiently mobile part of the Earth with sufficient mass to modify the rotation, which probably affects the global time distribution of earthquakes. This might mean that the secular variation of earthquakes is inseparable from the changes in the Earth's magnetic field, i.e. the interior processes of the Earth's core are tied to the dynamical state of the solar system. Therefore, if the described idea is correct, the global distribution of earthquakes in time is predictable.

  5. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti, quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  6. It's Our Fault: better defining earthquake risk in Wellington, New Zealand

    Science.gov (United States)

    Van Dissen, R.; Brackley, H. L.; Francois-Holden, C.

    2012-12-01

    increasing the region's resilience to earthquakes. We present the latest results on ground motion simulations for large plate-interface earthquakes under Wellington in terms of response spectra and acceleration time histories. We derive realistic broadband accelerograms based on a stochastic modelling technique. First we characterise the potential interface rupture area based on previous geodetically derived estimates of interface slip deficit. Then we entertain a suitable range of source parameters, including various rupture areas, moment magnitudes, stress drops, slip distributions and rupture propagation directions. The resulting rupture scenarios all produce long-duration shaking, and peak ground accelerations that typically range between 0.2-0.7 g in Wellington city. Many of these scenarios also produce long-period motions that are not captured by the current NZ design spectra.

  7. Deterministic modeling for microzonation of Bucharest: Case study for August 30, 1986, and May 30-31, 1990. Vrancea earthquakes

    International Nuclear Information System (INIS)

    Cioflan, C.O.; Apostol, B.F.; Moldoveanu, C.L.; Marmureanu, G.; Panza, G.F.

    2002-03-01

    The mapping of the seismic ground motion in Bucharest due to the strong Vrancea earthquakes is carried out using a complex hybrid waveform modeling method which combines the modal summation technique, valid for laterally homogeneous anelastic media, with the finite-difference technique, and optimizes the advantages of both methods. For recent earthquakes, it is possible to validate the modeling by comparing the synthetic seismograms with the records. As controlling records we consider the accelerograms of the Magurele station, low-pass filtered with a cut-off frequency of 1.0 Hz, for the last three major strong (Mw > 6) Vrancea earthquakes. Using the hybrid method with a double-couple seismic source approximation, scaled for the source dimensions, and relatively simple regional (bedrock) and local structure models, we succeeded in reproducing the recorded ground motion in Bucharest at a level satisfactory for seismic engineering. Extending the modeling to the whole territory of the Bucharest area, we construct a new seismic microzonation map in which five different zones are identified by their characteristic response spectra. (author)

  8. Instrumental shaking thresholds for seismically induced landslides and preliminary report on landslides triggered by the October 17, 1989, Loma Prieta, California earthquake

    Science.gov (United States)

    Harp, E.L.

    1993-01-01

    The generation of seismically induced landslides depends on the characteristics of shaking as well as the mechanical properties of geologic materials. A very important parameter in the study of seismically induced landslides is an intensity measure based on the strong-motion accelerogram: the Arias intensity, which is proportional to the duration of the shaking record as well as to its amplitude. Given a theoretical relationship between Arias intensity, magnitude and distance, it is possible to predict how far from the seismic source landslides are likely to occur for a given magnitude earthquake. Field investigations have established that the threshold level of Arias intensity also depends on site effects, particularly the fracture characteristics of the outcrops present. -from Author
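
    The intensity measure named above is straightforward to compute as Ia = (π/2g)∫a(t)²dt; a short sketch with a synthetic stand-in record:

```python
import numpy as np

def arias_intensity(acc_m_s2, dt):
    """Arias intensity Ia = (pi / 2g) * integral of a(t)^2 dt, in m/s.
    It grows with both the amplitude and the duration of the record."""
    g = 9.81
    return np.pi / (2.0 * g) * np.trapz(acc_m_s2 ** 2, dx=dt)

dt = 0.01
acc = 0.2 * 9.81 * np.random.randn(3000) * np.hanning(3000)  # toy 30 s record
print(f"Ia = {arias_intensity(acc, dt):.3f} m/s")
```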

  9. Computing broadband accelerograms using kinematic rupture modeling; Generation d'accelerogrammes synthetiques large-bande par modelisation cinematique de la rupture sismique

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz Paredes, J.A

    2007-05-15

    In order to make broadband kinematic rupture modeling more realistic with respect to dynamic modeling, physical constraints are added to the rupture parameters. To improve the modeling of the slip velocity function (SVF), an evolution of the k^-2 source model is proposed, which consists of decomposing the slip as a sum of sub-events by bands of k. This model yields SVFs close to the solution proposed by Kostrov for a crack, while preserving the spectral characteristics of the radiated wave field, i.e. an omega-square model with spectral amplitudes at high frequency scaled to the coefficient of directivity Cd. To better control directivity effects, a composite source description is combined with a scaling law defining the extent of the nucleation area for each sub-event. The resulting model allows the apparent coefficient of directivity to be reduced to a fraction of Cd, as well as reproducing the standard deviation of the new empirical attenuation relationships proposed for Japan. To make source models more realistic, a variable rupture velocity in agreement with the physics of rupture must be considered. The approach followed, based on an analytical relation between fracture energy, slip and rupture velocity, leads to higher values of peak ground acceleration in the vicinity of the fault. Finally, to better account for the interaction of the wave field with the geological medium, a semi-empirical methodology is developed combining a composite source model with empirical Green's functions, and is applied to the Yamaguchi Mw 5.9 earthquake. The modeled synthetics reproduce satisfactorily the main observed characteristics of ground motions. (author)
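
    A minimal sketch of a k^-2 slip distribution of the kind this work builds on: a random field on the fault plane whose spectrum falls off as k^-2 beyond a corner wavenumber. It is purely illustrative (the thesis adds the sub-event decomposition, nucleation-area scaling and rupture-velocity constraints), and Hermitian symmetry of the spectrum is not enforced, which is acceptable for a sketch.

```python
import numpy as np

def k2_slip(nx=128, nz=64, lx=30.0, lz=15.0, mean_slip=1.0, seed=0):
    """Random slip field [m] on an lx-by-lz km fault with a k^-2 spectral decay."""
    rng = np.random.default_rng(seed)
    kx = np.fft.fftfreq(nx, d=lx / nx)               # along-strike wavenumbers
    kz = np.fft.fftfreq(nz, d=lz / nz)               # down-dip wavenumbers
    k = np.sqrt(kx[None, :] ** 2 + kz[:, None] ** 2)
    kc = 1.0 / max(lx, lz)                           # corner wavenumber
    amp = 1.0 / (1.0 + (k / kc) ** 2)                # flat below kc, k^-2 above
    phase = np.exp(2j * np.pi * rng.random((nz, nx)))  # random phases
    slip = np.real(np.fft.ifft2(amp * phase))
    slip -= slip.min()                               # keep slip non-negative
    return slip * (mean_slip / slip.mean())          # rescale to the target mean

print(k2_slip().shape)
```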

  10. Earthquake friction

    Science.gov (United States)

    Mulargia, Francesco; Bizzarri, Andrea

    2016-12-01

    Laboratory friction slip experiments on rocks provide firm evidence that the static friction coefficient μ has values ∼0.7. This would imply large amounts of heat produced by seismically active faults, but no heat flow anomaly is observed, and mineralogic evidence of frictional heating is virtually absent. This argues for lower μ values ∼0.2, as also required by the observed orientation of faults with respect to the maximum compressive stress. We show that accounting for the thermal and mechanical energy balance of the system removes this inconsistency, implying a multi-stage strain release process. The first stage consists of small and slow aseismic slip at high friction on pre-existing stress concentrators within the fault volume, angled with respect to the main fault like Riedel cracks. This introduces a second stage dominated by frictional temperature increase, inducing local pressurization of pore fluids around the slip patches, which is in turn followed by a third stage in which thermal diffusion extends the frictionally heated zones, making them coalesce into a connected pressurized region oriented as the fault plane. The system then enters a state of equivalent low static friction in which it can undergo the fast elastic radiation slip prescribed by dislocation earthquake models.

  11. Characteristics of Earthquake Ground Motion in Tapachula, Chiapas (Mexico) from Empirical and Theoretical Methods

    Science.gov (United States)

    Vidal, F.; Alguacil, G.; Rodríguez, L.; Navarro, M.; Ruiz, A.; Aguirre, J.; Acosta, M.; Gonzalez, R.; Mora, J.; Reyes, M.

    2013-05-01

    The high seismic hazard level of Tapachula city (Chiapas, Mexico) requires a better understanding of the characteristics of earthquake ground motion in order to implement risk reduction policies in this urban area. A map of the predominant ground period estimated with the Nakamura technique already shows four different zones: the largest one in the downtown area with 0.2-0.4 s, two small zones (concentric with the previous one) of 0.4-0.7 s and 0.7-0.9 s, respectively, and the smallest zone (on the edge of the city) with the highest values, 0.9-1.1 s. Over 44 days, more than 220 events were recorded by a temporary seismic network installed by the UNAM at 6 sites distributed across Tapachula. The magnitudes Mw and hypocentral distances of the events were reassessed and range from 3.3 to 4.5 and from 60 to 190 km, respectively. After selecting the accelerograms with the best signal/noise ratio, a set of key engineering ground-motion parameters, such as peak values of strong motion, acceleration and velocity response spectra, Arias intensity, cumulative absolute velocity, relative significant duration, the Housner spectrum intensity, the energy input spectrum and the H/V spectral ratio, were calculated for the selected events. Ground-motion prediction equations (GMPE) for each parameter as a function of magnitude and distance were also estimated. In addition, synthetic seismic traces were obtained at each station site by modeling a seismic source of magnitude 7.2 using the empirical Green's function method. Thus, a shake-map scenario was generated for an earthquake similar to that of September 10, 1993. The parameters obtained here show different shaking levels and frequency content at each site. All sites present amplification at 0.25 and 0.5 s. The TACA, TAPP and TATC stations, located near the two rivers bordering Tapachula, are those with the largest ground amplification. The characteristics of strong ground motion obtained from the synthetic accelerograms are in agreement with those from the empirical
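
    One of the engineering parameters listed above, the 5%-damped response spectrum, can be sketched by stepping a linear single-degree-of-freedom oscillator through the record with the Newmark average-acceleration method; the record below is a synthetic stand-in:

```python
import numpy as np

def response_spectrum(acc, dt, periods, xi=0.05):
    """Pseudo-acceleration response spectrum of ground acceleration acc [m/s^2]."""
    sa = []
    for tn in periods:
        wn = 2.0 * np.pi / tn                      # natural circular frequency
        m, c, k = 1.0, 2.0 * xi * wn, wn ** 2      # unit-mass SDOF oscillator
        kh = k + 2.0 * c / dt + 4.0 * m / dt ** 2  # effective stiffness
        u, v, a = 0.0, 0.0, -acc[0]
        umax = 0.0
        for i in range(len(acc) - 1):
            dp = -(acc[i + 1] - acc[i])            # incremental load, p = -m*ag
            dph = dp + (4.0 * m / dt + 2.0 * c) * v + 2.0 * m * a
            du = dph / kh
            dv = 2.0 * du / dt - 2.0 * v
            da = 4.0 * (du - v * dt) / dt ** 2 - 2.0 * a
            u, v, a = u + du, v + dv, a + da
            umax = max(umax, abs(u))
        sa.append(wn ** 2 * umax)                  # Sa = wn^2 * max|u|
    return np.array(sa)

dt = 0.01
record = 0.3 * 9.81 * np.random.randn(2000) * np.hanning(2000)  # stand-in record
print(response_spectrum(record, dt, np.logspace(-1, 1, 20)))
```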

  12. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  13. Redefining Earthquakes and the Earthquake Machine

    Science.gov (United States)

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  14. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  15. Operational earthquake forecasting can enhance earthquake preparedness

    Science.gov (United States)

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).

  16. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology

    OpenAIRE

    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı

    2016-01-01

    Real-time seismology is a newly developing alternative approach in seismology to mitigate earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems to quickly transform data into earthquake information in real time, in order to reduce earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  17. The history of historical earthquake research in Germany

    Directory of Open Access Journals (Sweden)

    G. Grünthal

    2004-06-01

    Full Text Available The paper summarizes the history of collecting and evaluating information on earthquakes in Germany. A rich literature mentioning historical and contemporary earthquakes has existed since the 16th century. Early earthquake catalogues began to appear in the middle of the 16th century, some of which report earthquakes in Germany dating back to the 9th century. Modern seismological views were introduced in connection with intense philosophical analysis of the 1755 Lisbon earthquake, which was largely observed in Central Europe. The 19th century was characterized by a tremendous increase in detailed earthquake studies as well as earthquake compilations in the form of catalogues. The most comprehensive non-parametric catalogues were created in the middle of the 20th century, while the first digital parametric catalogues were published in the 1980s. This was also the time when critical studies on the re-interpretation of historical earthquakes began. Only in the 1990s was such analysis made in a systematic manner resulting in numerous publications and the current development of a modern earthquake catalogue.

  18. Topographic changes and their driving factors after 2008 Wenchuan Earthquake

    Science.gov (United States)

    Li, C.; Wang, M.; Xie, J.; Liu, K.

    2017-12-01

    The Wenchuan Ms 8.0 earthquake caused topographic change in the stricken areas through the formation of numerous coseismic landslides. The emergence of new landslides and debris flows, and the movement of loose materials under the driving force of heavy rainfall, could further shape the local topography. Dynamic topographic changes in mountainous areas stricken by major earthquakes have a strong linkage to the development and occurrence of secondary disasters. However, little attention has been paid to continuously monitoring mountain environment change after such earthquakes. A digital elevation model (DEM) is the main representation of the terrain surface; in our research, we extracted DEMs for 2013 and 2015 of a typical mountainous area severely impacted by the 2008 Wenchuan earthquake from ZY-3 stereo pair images, with validation by field measurement. Combined with the elevation datasets from 2002 and 2010, we quantitatively assessed elevation changes in different years and qualitatively analyzed the spatiotemporal variation of the terrain and mass movement across the study area. The results show that the earthquake-stricken area experienced substantial elevation changes caused by seismic forces and subsequent rainfalls. Deposits after the earthquake have accumulated mainly in river channels, deep gullies and on mountain ridges, which increases the risk of other geo-hazards. Heavy rainfalls after the earthquake have become the biggest driver of elevation reduction, overwhelming the elevation increase during the major earthquake. Our study provides a better understanding of the subsequent hazards and risks faced by residents and communities stricken by major earthquakes.

  19. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  20. Digital broadcasting

    International Nuclear Information System (INIS)

    Park, Ji Hyeong

    1999-06-01

    This book contains twelve chapters dealing with the digitization of broadcasting: digitization of the broadcast signal, such as digital open, video and sound signals; digitization of broadcasting equipment such as DTPP and digital VTR; digitization of transmission equipment such as digital STL, digital FPU and digital SNG; digitization of transmission for digital TV and radio; the necessity and advantages of a digital broadcasting system; digital broadcasting systems abroad and in Korea; an outline of digital broadcasting, the advantages of digital TV, the ripple effects of digital broadcasting and related considerations; terrestrial digital broadcasting (DVB-T in Europe, DTV in the U.S.A. and ISDB-T in Japan); HDTV broadcasting; satellite broadcasting; digital TV broadcasting in Korea; digital radio broadcasting; and new broadcasting services.

  1. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all of magnitude ≥ 7.0) in China, and the 1978 Izu-Oshima earthquake in Japan, is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  2. Discrimination between earthquakes and chemical explosions using artificial neural networks

    International Nuclear Information System (INIS)

    Kundu, Ajit; Bhadauria, Y.S.; Roy, Falguni

    2012-05-01

    An Artificial Neural Network (ANN) for discriminating between earthquakes and chemical explosions located at epicentral distances Δ < 5 deg from the Gauribidanur Array (GBA) has been developed using the short period digital seismograms recorded at GBA. For training the ANN, spectral amplitude ratios between the P and Lg phases, computed at 13 different frequencies in the 2-8 Hz frequency range for 20 earthquakes and 23 chemical explosions, were used along with other parameters such as magnitude, epicentral distance and the amplitude ratios Rg/P and Rg/Lg. After training and development, the ANN correctly identified a set of 21 test events, comprising 6 earthquakes and 15 chemical explosions. (author)
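
    A minimal sketch of such a discriminant, using a small feed-forward network from scikit-learn in place of the authors' ANN; the feature values are random placeholders, since the actual GBA measurements are not reproduced here:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
n_features = 13 + 4                        # 13 P/Lg spectral ratios + magnitude,
                                           # distance, Rg/P and Rg/Lg
X_train = rng.normal(size=(43, n_features))       # placeholder feature vectors
y_train = np.array([0] * 20 + [1] * 23)           # 0: earthquake, 1: explosion

clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)

# Classify a placeholder batch of 21 "test events"
print(clf.predict(rng.normal(size=(21, n_features))))
```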

  3. Earthquakes and faults in southern California (1970-2010)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish and Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. The Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  4. Predictive Control for Earthquake Response Mitigation of Buildings Using Semiactive Fluid Dampers

    Directory of Open Access Journals (Sweden)

    F. Oliveira

    2014-01-01

    Full Text Available A predictive control strategy, in conjunction with semiactive control algorithms, is proposed for damping control of base-isolated structures employing semiactive fluid dampers when subjected to earthquake loads. The controller accounts for the delays resulting from the device’s dynamics and includes an observer for state estimation. Twenty artificial accelerograms were generated according to Eurocode 8 for the Portuguese territory and used in the numerical simulations of a representative model of the base-isolated structure. The results of a parametric study on a single-degree-of-freedom model provide guidance for controller design in this type of problem. To evaluate the effectiveness of the proposed strategies, the response of a 10-storey base-isolated dual frame-wall building employing semiactive systems is compared with the original passive solution and with an optimal controller proposed earlier for this type of problem. It is shown that a well-tuned controller can outperform the original structure, the structural system with an optimized passive device, and the semiactive optimal controller, in terms of relative displacement and absolute acceleration reductions.

  5. Learning from Earthquakes: 2014 Napa Valley Earthquake Reconnaissance Report

    OpenAIRE

    Fischer, Erica

    2014-01-01

    Structural damage was observed during reconnaissance after the 2014 South Napa Earthquake, and included damage to wine storage and fermentation tanks, collapse of wine storage barrel racks, unreinforced masonry building partial or full collapse, and residential building damage. This type of damage is not unique to the South Napa Earthquake, and was observed after other earthquakes such as the 1977 San Juan Earthquake, and the 2010 Maule Earthquake. Previous research and earthquakes have demon...

  6. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  7. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  8. Mechanism of post-seismic floods after the Wenchuan earthquake in ...

    Indian Academy of Sciences (India)

    Ding Hairong

    2017-10-06

    Oct 6, 2017 ... and sediment variations and deforestation in the upper Minjiang River; Master thesis, Southwest University (in Chinese with English abstract). Zhou R J, Li Y and Densmore A L et al. 2011 The strong motion records of the Ms 8.0 Wenchuan Earthquake by the digital strong earthquake network in Sichuan ...

  9. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  10. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    Department of Earth Sciences, University of Roorkee. Her interest is in computer-based solutions to geophysical and other earth science problems. If we adopt the definition that an earthquake is shaking of the earth due to natural causes, then we may argue that earthquakes have been occurring since the very beginning.

  11. Bam Earthquake in Iran

    CERN Document Server

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  12. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  13. The Antiquity of Earthquakes

    Indian Academy of Sciences (India)

    there are few estimates about this earthquake as it probably occurred in that early period of the earth's history about which astronomers, physicists, chemists and earth scientists are still sorting out their ideas. Yet, the notion of the earliest earthquake excites interest. We explore this theme here partly also because.

  14. Spatiotemporal correlations of earthquakes

    International Nuclear Information System (INIS)

    Farkas, J.; Kun, F.

    2007-01-01

    Complete text of publication follows. An earthquake is the result of a sudden release of energy in the Earth's crust that creates seismic waves. At the present technological level, earthquakes of magnitude larger than three can be recorded all over the world. In spite of the apparent randomness of earthquake occurrence, long-term measurements have revealed interesting scaling laws of earthquake characteristics: the rate of aftershocks following major earthquakes has a power-law decay (Omori law); the magnitude distribution of earthquakes exhibits a power-law behavior (Gutenberg-Richter law); furthermore, it has recently been pointed out that epicenters form fractal networks in fault zones (Kagan law). The theoretical explanation of earthquakes is based on plate tectonics: the earth's crust has been broken into plates which slowly move under the action of the flowing magma. Neighboring plates touch each other along ridges (fault zones) where a large amount of energy is stored in deformation. Earthquakes occur when the stored energy exceeds a material-dependent threshold value and gets released in a sudden jump of the plate. The Burridge-Knopoff (BK) model of earthquakes represents the earth's crust as a coupled system of driven oscillators in which nonlinearity arises through a stick-slip frictional instability. Laboratory experiments have revealed that under high pressure the friction of rock interfaces exhibits weakening with increasing velocity. In the present project we extend recent theoretical studies of the BK model by taking into account a realistic velocity-weakening friction force between tectonic plates. Varying the strength of weakening, a broad spectrum of interesting phenomena is obtained: the model reproduces the Omori and Gutenberg-Richter laws of earthquakes and, furthermore, provides information on the correlation of earthquake sequences. We showed by computer simulations that the spatial and temporal correlations of consecutive earthquakes are very
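
    A minimal sketch of the driven spring-block picture described above, in a simplified cellular (Olami-Feder-Christensen-style) form rather than the full BK system of coupled oscillators with velocity-weakening friction; the threshold, loading rate and redistribution fraction are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_blocks, f_static, steps = 200, 1.0, 5000
force = rng.uniform(0.0, f_static, n_blocks)   # initial spring forces
events = []

for _ in range(steps):
    force += 1e-3                              # slow tectonic loading
    slipping = np.flatnonzero(force >= f_static)
    size = 0
    while slipping.size:                       # avalanche of block slips
        size += slipping.size
        for i in slipping:
            drop = force[i]
            force[i] = 0.0                     # block slips, force released
            for j in (i - 1, i + 1):           # part of it loads the neighbours
                if 0 <= j < n_blocks:
                    force[j] += 0.25 * drop
        slipping = np.flatnonzero(force >= f_static)
    if size:
        events.append(size)                    # event sizes show GR-like statistics

print(f"{len(events)} events, largest avalanche: {max(events)} block slips")
```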

  15. Earthquake responses of a beam supported by a mechanical snubber

    International Nuclear Information System (INIS)

    Ohmata, Kenichiro; Ishizu, Seiji.

    1989-01-01

    The mechanical snubber is an earthquake-proof device for piping systems under particular circumstances such as high temperature and radioactivity. It has nonlinearities in both load and frequency response. In this report, the resisting force characteristics of the snubber and the earthquake responses of a piping system (a simply supported beam) supported by the snubber are simulated using the Continuous System Simulation Language (CSSL). Digital simulations are carried out for various physical properties of the snubber. The restraint effect and the maximum resisting force of the snubber during earthquakes are discussed and compared with the case of an oil damper. The earthquake waves used here are El Centro N-S and Akita Harbour N-S (Nihonkai-Chubu earthquake). (author)

  16. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500.

  17. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed

  18. Implications of the Mw9.0 Tohoku-Oki earthquake for ground motion scaling with source, path, and site parameters

    Science.gov (United States)

    Stewart, Jonathan P.; Midorikawa, Saburoh; Graves, Robert W.; Khodaverdi, Khatareh; Kishida, Tadahiro; Miura, Hiroyuki; Bozorgnia, Yousef; Campbell, Kenneth W.

    2013-01-01

    The Mw9.0 Tohoku-oki Japan earthquake produced approximately 2,000 ground motion recordings. We consider 1,238 three-component accelerograms corrected with component-specific low-cut filters. The recordings have rupture distances between 44 km and 1,000 km, time-averaged shear wave velocities of VS30 = 90 m/s to 1,900 m/s, and usable response spectral periods of 0.01 sec to >10 sec. The data support the notion that the increase of ground motions with magnitude saturates at large magnitudes. High-frequency ground motions demonstrate faster attenuation with distance in backarc than in forearc regions, which is only captured by one of the four considered ground motion prediction equations for subduction earthquakes. Recordings within 100 km of the fault are used to estimate event terms, which are generally positive (indicating model underprediction) at short periods and zero or negative (overprediction) at long periods. We find site amplification to scale minimally with VS30 at high frequencies, in contrast with other active tectonic regions, but to scale strongly with VS30 at low frequencies.
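
    The event-term step mentioned in the abstract can be sketched with invented numbers: given ln-residuals (observed minus GMPE-predicted ground motion) and rupture distances, the event term is simply the mean residual over stations within the 100 km cutoff. The residual model below is fabricated purely for illustration:

```python
import numpy as np

# Event term = mean ln-residual (observed minus GMPE-predicted) over
# recordings near the fault.  Distances and residuals are invented here;
# a positive event term indicates model underprediction on average.
rng = np.random.default_rng(7)
r_rup = rng.uniform(20.0, 300.0, 500)            # rupture distances, km
resid = 0.3 - 0.001 * r_rup + 0.5 * rng.standard_normal(500)

near = r_rup <= 100.0                            # the 100 km cutoff above
print(f"event term from {near.sum()} near-fault records: "
      f"{resid[near].mean():+.3f} ln units")
```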

  19. Earthquakes and emergence

    Science.gov (United States)

    Earthquakes and emerging infections may not have a direct cause and effect relationship like tax evasion and jail, but new evidence suggests that there may be a link between the two human health hazards. Various media accounts have cited a massive 1993 earthquake in Maharashtra as a potential catalyst of the recent outbreak of plague in India that has claimed more than 50 lives and alarmed the world. The hypothesis is that the earthquake may have uprooted underground rat populations that carry the fleas infected with the bacterium that causes bubonic plague and can lead to the pneumonic form of the disease that is spread through the air.

  20. Seismic ground motion modelling and damage earthquake scenarios: A bridge between seismologists and seismic engineers

    International Nuclear Information System (INIS)

    Panza, G.F.; Romanelli, F.; Vaccari, F. (E-mails: Luis.Decanini@uniroma1.it; Fabrizio.Mollaioli@uniroma1.it)

    2002-07-01

    The input for seismic risk analysis can be expressed with a description of 'ground shaking scenarios', or with probabilistic maps of relevant parameters. The probabilistic approach, unavoidably based upon rough assumptions and models (e.g. recurrence and attenuation laws), can be misleading, as it cannot take into account, with satisfactory accuracy, some of the most important aspects like rupture process, directivity and site effects. This is evidenced by the comparison of recent recordings with the values predicted by probabilistic methods. We prefer a scenario-based, deterministic approach in view of the limited seismological data, of the local irregularity of the occurrence of strong earthquakes, and of the multiscale seismicity model, which is capable of reconciling two apparently conflicting ideas: the Characteristic Earthquake concept and the Self-Organized Criticality paradigm. Where the numerical modeling is successfully compared with records, the synthetic seismograms permit microzoning based upon a set of possible scenario earthquakes. Where no recordings are available, the synthetic signals can be used to estimate the ground motion without having to wait for a strong earthquake to occur (pre-disaster microzonation). In both cases the use of modeling is necessary, since the so-called local site effects can be strongly dependent upon the properties of the seismic source and can be properly defined only by means of envelopes. The joint use of reliable synthetic signals and observations permits the computation of advanced hazard indicators (e.g. damaging potential) that take into account local soil properties. The envelope of synthetic elastic energy spectra reproduces the distribution of the energy demand in the most relevant frequency range for seismic engineering. The synthetic accelerograms can be fruitfully used for design and strengthening of structures, also when innovative techniques, like seismic isolation, are employed. For these

  1. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  2. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  3. Earthquake education in California

    Science.gov (United States)

    MacCabe, M. P.

    1980-01-01

    In a survey of community response to the earthquake threat in southern California, Ralph Turner and his colleagues in the Department of Sociology at the University of California, Los Angeles, found that the public very definitely wants to be educated about the kinds of problems and hazards they can expect during and after a damaging earthquake; and they also want to know how they can prepare themselves to minimize their vulnerability. Decisionmakers, too, are recognizing this new wave of public concern. 

  4. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified; their activity is closely correlated with the energy of the electromagnetic field. As an explanation, a hypothesis about the cooperative character of these impulses is proposed.

  5. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available In a joint analysis of the results of recording the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified; their activity is closely correlated with the energy of the electromagnetic field. As an explanation, a hypothesis about the cooperative character of these impulses is proposed.

  6. Fault location and source process of the 2003 Boumerdes, Algeria, earthquake inferred from geodetic and strong motion data.

    Science.gov (United States)

    Semmane, F.; Campillo, M.; Cotton, F.

    2004-12-01

    The Boumerdes earthquake occurred on a fault whose precise location, offshore the Algerian coast, was unknown. Geodetic data consist of GPS measurements, levelling points and coastal uplifts. They are first used to determine the absolute position of the fault. We performed a series of inversions assuming different positions and chose the model giving the smallest misfit. According to this analysis, the fault emerges at about 15 km offshore. Accelerograms are then used to infer the space-time history of rupture on the fault plane using a two-step inversion in the spectral domain. The observed strong motion records are in good agreement with the synthetics for the fault location inferred from geodetic data. The fault plane ruptured for about 16 seconds. The slip distribution on the fault indicates one asperity north-west of the hypocenter with a maximum slip amplitude larger than 2.5 m. Another asperity with slightly smaller slip amplitude is located south-east of the hypocenter. The rupture seems to have stopped propagating westward where it encountered the Thenia fault, a structure almost perpendicular to the main fault. We computed the spatial distribution of ground motion predicted by this fault model and compared it with the observed damage.

  7. Injection-induced earthquakes.

    Science.gov (United States)

    Ellsworth, William L

    2013-07-12

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  8. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it occurred. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  9. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
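
    A minimal sketch of the counting idea described above, assuming a synthetic catalog and arbitrary magnitude thresholds: count "small" events since the last "large" one and place that count within the empirical distribution of small-event counts between past large events. None of the numbers below come from the cited work:

```python
import numpy as np

# Nowcasting sketch: "natural time" counts small (M >= m_small) events
# between successive large (M >= m_large) events; the earthquake potential
# score (EPS) is the empirical percentile of the current count.  The
# synthetic Gutenberg-Richter catalog below stands in for a real one.
rng = np.random.default_rng(42)
mags = 3.0 + rng.exponential(1.0 / np.log(10), 20_000)   # b-value ~ 1

m_small, m_large = 3.5, 6.0
counts, n = [], 0
for m in mags:
    if m >= m_large:
        counts.append(n)          # one large-event cycle closes
        n = 0
    elif m >= m_small:
        n += 1

current = n                        # small events since the last large one
eps = np.mean([c <= current for c in counts])
print(f"{len(counts)} cycles; current count {current}; EPS = {eps:.2f}")
```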

  10. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of experience from the Spitak earthquake (Armenia, December 1988), it was found that an earthquake causes intensive and prolonged radon splashes which, while rapidly dispersing in the open near-ground atmosphere, show up by contrast in covered premises (dwellings, schools, kindergartens) even at a considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes spans the period from the first foreshock to the last aftershock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times greater than the number of people injured in Spitak, Leninakan and other settlements (toll of injured: 25,000 people; radiation-induced diseases: over 300,000 people). The influence of radiation correlates directly with the earthquake force. This conclusion is underpinned by indoor radon monitoring data for Yerevan since 1987 (120 km from the epicenter; 5,450 measurements) and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable and unstable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of room radon concentration, the effective equivalent radiation dose, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  11. [Digital cephalometrics

    NARCIS (Netherlands)

    Ongkosuwito, E.M.; Katsaros, C.; Bodegom, J.C.; Kuijpers-Jagtman, A.M.

    2004-01-01

    There are different methods to produce digital head films and all have advantages and disadvantages. With a digital head film and a computer programme for digital cephalometry an analysis can be performed easily. All existing computer programmes for digital cephalometry use reference values to

  12. Digital squares

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Kim, Chul E

    1988-01-01

    Digital squares are defined and their geometric properties characterized. A linear time algorithm is presented that considers a convex digital region and determines whether or not it is a digital square. The algorithm also determines the range of the values of the parameter set of its preimages. The analysis involves transforming the boundary of a digital region into the parameter space of slope and y-intercept.

  13. Earthquake impact scale

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. On the basis of the estimated cost of damage, one is most suitable for domestic events; the other, on the basis of estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lend to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
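
    The dual thresholds described above reduce to a small lookup; the following sketch maps estimated fatalities and losses to the EIS colors, with the take-the-more-severe precedence rule being an assumption of this illustration, not part of the proposal:

```python
def eis_alert(fatalities=None, losses_usd=None):
    """Map impact estimates to the EIS alert colours described above
    (1/100/1,000 fatalities; $1M/$100M/$1B losses).  Taking the more
    severe of the two levels is an assumption of this sketch."""
    def level(x, thresholds):
        return sum(x >= t for t in thresholds)       # 0 (green) .. 3 (red)
    lf = level(fatalities, (1, 100, 1000)) if fatalities is not None else 0
    ll = level(losses_usd, (1e6, 1e8, 1e9)) if losses_usd is not None else 0
    return ("green", "yellow", "orange", "red")[max(lf, ll)]

print(eis_alert(fatalities=250))      # orange: 100 <= fatalities < 1,000
print(eis_alert(losses_usd=5e9))      # red: losses >= $1B
```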

  14. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
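
    The moment comparison the abstract describes can be sketched directly: Poisson counts have variance equal to the mean, skewness 1/sqrt(mean) and excess kurtosis 1/mean, so clustered (overdispersed) counts depart from all three. The synthetic gamma-mixed counts below, which are negative binomial by construction, are an assumption standing in for real catalogue counts:

```python
import numpy as np
from scipy import stats

# Compare observed moments of per-interval earthquake counts with the
# Poisson expectations.  The synthetic counts are gamma-mixed Poisson,
# i.e. negative binomial by construction -- a stand-in for real data.
rng = np.random.default_rng(3)
lam = rng.gamma(shape=2.0, scale=5.0, size=2000)   # varying "true" rates
counts = rng.poisson(lam)

mean = counts.mean()
print("mean               :", round(mean, 2))
print("variance  obs/Pois :", round(counts.var(), 2), "/", round(mean, 2))
print("skewness  obs/Pois :", round(stats.skew(counts), 3),
      "/", round(1 / np.sqrt(mean), 3))
print("ex.kurt.  obs/Pois :", round(stats.kurtosis(counts), 3),
      "/", round(1 / mean, 3))
```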

  15. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  16. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much of the former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study of rupture and wave dynamics, namely, the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable.

  17. Collecting, digitizing, and distributing historical seismological data

    Science.gov (United States)

    Michelini, Alberto; De Simoni, Bruno; Amato, Alessandro; Boschi, Enzo

    The digital preservation of the unique seismological heritage consisting of historical seismograms and earthquake bulletins, and of related documentation (e.g., observatory logbooks, station books, etc.), is critically important in order to avoid deterioration and loss over time [Kanamori, 1988]. Dissemination of this seismological material in digital form is of equal importance, to allow reanalysis of past earthquakes using modern techniques and the reevaluation of seismic hazard. This is of particular interest for those areas where little or no earthquake activity has occurred since the last significant historical earthquake. In 2001, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) started an innovative project, Progetto SISMOS (i.e., SISMOgrammi Storici), to scan (i.e., convert into digital form for storage on a computer), at very high resolution, and archive seismological paper records and related material. The Italian Ministry for the Environment originally funded the project to encompass the digitization of seismogram records of the Italian seismic observatories and of associated bulletins

  18. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes in primary schools is considered…

  19. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  20. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  1. PAGER--Rapid assessment of an earthquake's impact

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  2. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m. Beijing (Peking) local time and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  3. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over more than a century-time period. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  4. Digital Culture and Digital Library

    Directory of Open Access Journals (Sweden)

    Yalçın Yalçınkaya

    2016-12-01

    Full Text Available In this study, digital culture and digital library, which have a vital connection with each other, are examined together. The content of the research consists of the interaction of the concepts of culture, information, digital culture, intellectual technologies, and digital library. The study is an introductory work on the integrity of digital culture and digital library theories and aims to expand the symmetry between them. The purpose of the study is to emphasize the relation between digital culture and digital library theories at the intersection of the subjects examined. The perspective of the study is based on examining the literature and on analytical evaluation in both areas (digital culture and digital library). Within this context, the methodology of the study is essentially descriptive and serves the transmission and synthesis of distributed findings produced in the field of the research. According to the findings of the study, digital culture is an inclusive term that describes the effects of intellectual technologies in the field of information and communication. Information becomes energy and the spectrum of information expands in a vertical rise through digital culture. In this context, the digital library appears as a new living space in a new environment. In essence, the digital library is information-oriented; has intellectual technology support and a digital platform; is in a digital format; combines information resources and tools in relationship/communication/cooperation through connectedness; and it is the dynamic face of digital culture, independent of time and space. The study concludes that digital libraries are active and effective in the formation of global knowing and/or mass wisdom in the process of digital culture.

  5. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with data from experiments upon rocks, however, shows a large discrepancy with measurement — a fact that has been dubbed “the heat flow paradox”. For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper discusses the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in the earth's crust: Without taking the tectonic force into account, according to the rheological principle that “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time, thus no differential stress is to be expected in the original crustal rocks. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and is then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge are different from those of rocks, as it consists of granular matter. The elastic moduli of fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force increasing with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  6. Solution notches, earthquakes, and sea level, Haiti

    Science.gov (United States)

    Schiffman, C. R.; Mildor, B. S.; Bilham, R. G.

    2010-12-01

    Shortly after the 12 January 2010 Haiti earthquake, we installed an array of five tide gauges to determine sea level and its variability in the region of uplifted corals on the coast SW of Leogane, Haiti, which had been uplifted ≤30 cm during the earthquake. Each gauge consists of a pressure transducer bolted 50-80 cm below mean sea level, which samples the difference between atmospheric pressure and sea pressure every 10 minutes. The data are transmitted via the Iridium satellite and are publicly available with a latency of 10 minutes to 2 hours. The measurements reveal a maximum tidal range of ≈50 cm with 2-4 week oscillations in mean sea level of several cm. Sea slope, revealed by differences between adjacent gauges, varies 2-5 cm per 10 km at periods of 2-5 weeks, which imposes a disappointing limit on the utility of the gauges in estimating postseismic vertical motions. A parallel study of the form and elevation of coastal notches and mushroom rocks (rocks notched on all sides, hence forming a mushroom shape) along the coast west of Petit Goave suggests that these notches may provide an uplift history of the region over the past several hundred years. Notch sections in two areas were contoured, digitized, and compared to mean sea level. The notches mimic the histogram of sea level, suggesting that they are formed by dissolution by acidic surface waters. Notches formed at two distinct levels, one approximately 58 cm above mean sea level, and the other approximately 157 cm above mean sea level. Several landslide blocks fell into the sea during the 2010 earthquake, and we anticipate these are destined for conversion into future mushroom rocks. Surfaces have been prepared on these blocks to study the rate of notch formation in situ, and samples are being subjected to acid corrosion under laboratory conditions, in the hope that the depth of notches may provide an estimate of the time of fall of previous rocks and help constrain the earthquake history of this area.
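
    The water-column reading behind each 10-minute sample follows from hydrostatics, h = ΔP/(ρg); the density value and the example pressure below are illustrative assumptions:

```python
# Convert a tide-gauge sample (sea pressure minus atmospheric pressure)
# into water depth above the transducer: h = dP / (rho * g).  The density
# and the example pressure are illustrative assumptions.
RHO_SEAWATER = 1025.0   # kg/m^3
G = 9.81                # m/s^2

def depth_above_sensor(dp_pa):
    return dp_pa / (RHO_SEAWATER * G)

# 6.5 kPa of excess pressure corresponds to roughly 0.65 m of water column
print(f"{depth_above_sensor(6500.0):.3f} m")
```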

  7. An interactive program on digitizing historical seismograms

    Science.gov (United States)

    Xu, Yihe; Xu, Tao

    2014-02-01

    Retrieving information from analog seismograms is of great importance, since they are the unique sources providing quantitative information on historical earthquakes. We present an algorithm for automatic digitization of seismograms, posed as an inversion problem, implemented as an interactive program with a Matlab® GUI. The program integrates automatic digitization with manual digitization; users can easily switch between the two modalities and carry out different combinations of them for optimal results. Several examples of applying the interactive program are given to illustrate the merits of the method.
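
    The automatic step of such a digitizer can be sketched generically: scan the grayscale image column by column and take the intensity-weighted centroid of dark pixels as the trace ordinate. This is a common baseline approach, not the inversion scheme of the cited program; the threshold and the toy image are assumptions:

```python
import numpy as np

def digitize_trace(img, dark_thresh=0.5):
    """Column-wise trace extraction from a scanned seismogram.

    img: 2-D grayscale array in [0, 1], 0 = black ink.  Returns one trace
    ordinate (row coordinate) per column, NaN where no ink is found.
    """
    darkness = np.clip(dark_thresh - img, 0.0, None)   # ink weight per pixel
    rows = np.arange(img.shape[0], dtype=float)[:, None]
    weight = darkness.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        trace = (darkness * rows).sum(axis=0) / weight
    trace[weight == 0] = np.nan
    return trace

# toy image: a sine-shaped dark trace on a white background
H, W = 200, 1000
img = np.ones((H, W))
cols = np.arange(W)
centerline = (100 + 40 * np.sin(2 * np.pi * cols / 120)).astype(int)
img[centerline, cols] = 0.0
print(digitize_trace(img)[:5])
```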

  8. Homogeneous catalogs of earthquakes.

    Science.gov (United States)

    Knopoff, L; Gardner, J K

    1969-08-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967.
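
    One hedged way to implement the randomness idea is a Wald-Wolfowitz runs test on the sequence of successive magnitudes above a trial cutoff, keeping the lowest cutoff at which randomness is not rejected. The test-statistic choice and the synthetic catalog (whose reporting bias is constructed so the test has something to detect) are assumptions, not the authors' exact procedure:

```python
import numpy as np
from scipy import stats

def runs_test_p(x):
    """Two-sided p-value of the Wald-Wolfowitz runs test about the median."""
    s = x > np.median(x)
    n1, n2 = s.sum(), (~s).sum()
    runs = 1 + np.count_nonzero(s[1:] != s[:-1])
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (mu - 1.0) * (mu - 2.0) / (n1 + n2 - 1.0)
    return 2.0 * stats.norm.sf(abs(runs - mu) / np.sqrt(var))

# synthetic catalog: complete above M 4; below M 4, events are reported
# mainly in the aftermath of large shocks (a serial reporting bias)
rng = np.random.default_rng(0)
raw = 3.0 + rng.exponential(1.0 / np.log(10), 8000)     # b-value ~ 1
window = np.zeros(raw.size, dtype=bool)
for i in np.flatnonzero(raw >= 5.0):
    window[i:i + 11] = True             # next 10 events after a big shock
mags = raw[(raw >= 4.0) | window]

for cutoff in (3.0, 3.5, 4.0):
    sub = mags[mags >= cutoff]
    print(f"cutoff M{cutoff}: n = {sub.size:5d}, "
          f"runs-test p = {runs_test_p(sub):.3f}")
```

    With the bias constructed this way, low cutoffs tend to fail the test while the complete subcatalog above the cutoff passes, mirroring the completeness magnitudes quoted in the abstract.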

  9. HOMOGENEOUS CATALOGS OF EARTHQUAKES*

    Science.gov (United States)

    Knopoff, Leon; Gardner, J. K.

    1969-01-01

    The usual bias in earthquake catalogs against shocks of small magnitudes can be removed by testing the randomness of the magnitudes of successive shocks. The southern California catalog, 1933-1967, is found to be unbiased in the sense of the test at magnitude 4 or above; the cutoff is improved to M = 3 for the subcatalog 1953-1967. PMID:16578700

  10. Earthquake in Haiti

    DEFF Research Database (Denmark)

    Holm, Isak Winkel

    2012-01-01

    In the vocabulary of modern disaster research, Heinrich von Kleist's seminal short story "The Earthquake in Chile" from 1806 is a tale of disaster vulnerability. The story is not just about a natural disaster destroying the innocent city of Santiago but also about the ensuing social disaster...

  11. Earthquake-proof plants

    International Nuclear Information System (INIS)

    Francescutti, P.

    2008-01-01

    In the wake of the damage suffered by the Kashiwazaki-Kariwa nuclear power plant as a result of an earthquake last July, this article looks at the seismic risk affecting the Spanish plants and the safety measures in place to prevent it. (Author)

  12. Earthquakes and market crashes

    Indian Academy of Sciences (India)

    We find prominent similarities between the features of the time series for the overlap of two Cantor sets (the 'model earthquakes'), when one set moves with uniform relative velocity over the other, and time series of stock prices. An anticipation method for some of the crashes is proposed here, based on these observations.
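
    The overlap series described above is easy to generate: build a finite-generation middle-thirds Cantor set as a binary mask and slide a periodic copy across it, recording the number of coinciding occupied cells at each shift. The generation count and units are arbitrary choices of this sketch:

```python
import numpy as np

def cantor_mask(generations):
    """Binary occupancy mask of the middle-thirds Cantor set."""
    mask = np.array([1], dtype=np.int64)
    for _ in range(generations):
        mask = np.concatenate([mask, np.zeros_like(mask), mask])
    return mask

a = cantor_mask(8)                 # 3**8 = 6561 cells
b = np.concatenate([a, a])         # periodic copy to slide across
# overlap O(t): coinciding occupied cells at relative shift t; the large
# overlap values play the role of large "model earthquakes"
overlap = np.array([a @ b[t:t + a.size] for t in range(a.size)])
print("max overlap:", overlap.max(), "  mean:", overlap.mean().round(2))
```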

  13. Digital Tectonics

    DEFF Research Database (Denmark)

    Christiansen, Karl; Borup, Ruben; Søndergaard, Asbjørn

    2014-01-01

    Digital Tectonics treats the architectonical possibilities in digital generation of form and production. The publication is the first volume of a series in which aspects of the strategic focus areas of the Aarhus School of Architecture will be disseminated.

  14. Digital skrivedidaktik

    DEFF Research Database (Denmark)

    Digital skrivedidaktik (Digital Writing Didactics) consists of two parts. The first part presents theory on writing competence and digital writing. Digital writing is characterized by texts being written on a computer and with digital tools, which changes the traditional practice, products and processes of writing. What is digital... on the student's writing process) and Blog Writing (which strengthens students in using blogs in teaching).

  15. The HayWired earthquake scenario—Earthquake hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  16. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  17. DIGITAL FORGERY

    OpenAIRE

    Sarhan M. Musa

    2017-01-01

    Forgery is the criminal act of providing misleading information about a product or service. It is the process of making, adapting, or imitating documents or objects with the intent to deceive. Digital forgery (or digital tampering) is the process of manipulating documents or images with the intent of financial, social or political gain. This paper provides a brief introduction to digital forgery.

  18. Digital Citizenship

    Science.gov (United States)

    Isman, Aytekin; Canan Gungoren, Ozlem

    2014-01-01

    The era in which we live is known and referred to as the digital age. In this age technology changes and develops rapidly. In light of these technological advances in the 21st century, schools have the responsibility of training "digital citizens" as well as good citizens. Digital citizens must have extensive skills, knowledge, Internet and …

  19. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    Science.gov (United States)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and geophysics in particular, has grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) of Romania is the leading national institution for earthquake monitoring and research, having at the same time a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not too reactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 together with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to take educational activities focused on seismology, seismic hazard and Earth science on the road. The exhibition is mainly aimed at school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large markets and exciting hands-on experiments, 3D printed models and posters. This project is singular in Romania and aims to transmit properly reviewed, up-to-date information regarding the definition of earthquakes, the way natural hazards can affect people, buildings and the environment, and the measures to be taken to prevent an aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining processes that shape the dynamic features of the Earth. It also involves

  20. Digital preservation

    CERN Document Server

    Deegan, Marilyn

    2013-01-01

    Digital preservation is an issue of huge importance to the library and information profession right now. With the widescale adoption of the internet and the rise of the world wide web, the world has been overwhelmed by digital information. Digital data is being produced on a massive scale by individuals and institutions: some of it is born, lives and dies only in digital form, and it is the potential death of this data, with its impact on the preservation of culture, that is the concern of this book. So how can information professionals try to remedy this? Digital preservation is a complex iss

  1. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the past in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies; last but not least, as an example of a recently carried out case study, one of the strongest earthquakes in Austria's past, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  2. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs

  3. The 1930 Irpinia earthquake: collection and analysis of historical waveforms

    Science.gov (United States)

    Ferrari, G.; Megna, A.; Nardi, A.; Palombo, B.; Perniola, B.; Pino, N.

    2002-12-01

    The 1930 Irpinia earthquake is one of the most destructive events recorded by instruments in Italy. Several large events occurred in the same area before (1456, 1694, 1702, 1732, 1910) and after (1962, 1980, 1983) 1930. It has been hypothesized that significant differences characterized the source geometry. Early work carried out by several authors on macroseismic studies and a single-station waveform analysis suggests a quasi-strike-slip mechanism on an approximately EW-oriented fault plane. Conversely, all the major events in the area display normal fault mechanisms on Apennine-oriented (NW-SE) fault planes. In the present work we have collected about 45 waveforms for the 1930 earthquake, recorded at various European observatories, aiming to find valuable constraints on source geometry and kinematics. The seismograms have been rasterized, digitized and processed within the framework of the SISMOS project. The study of this earthquake is part of a wider ongoing research program on the 20th century Irpinia earthquakes (1910, 1930, 1962 and 1980) within the collaboration between the TROMOS and SISMOS projects of the National Institute of Geophysics and Volcanology. The search for and recovery of the historical recordings is a unique opportunity to shed light upon scientific aspects related to this kind of investigation. Preliminary results of the 1930 earthquake waveform analysis are presented here.

  4. Spatial Distribution of Landslides Triggered by the 2008 Wenchuan Earthquake: Implications for Landslide Fluvial Mobilization and Earthquake Source Mechanism

    Science.gov (United States)

    Li, G.; West, A.; Hilton, R. G.

    2013-12-01

    Assessing the spatial distribution of earthquake-induced landslides is important for quantifying the fluvial evacuation of landslide material [Hovius et al., 2011], for deriving information about earthquake sources [Meunier et al., 2013], and for understanding the role of earthquakes in shaping surface topography and in driving orogen evolution. The 2008 Mw 7.9 Wenchuan earthquake is characterized by large magnitude, widespread coseismic landsliding, typical mountainous ridge-and-valley topography of the region, and comprehensive geophysical observation. Previous work on landslides associated with the Wenchuan earthquake has focused on the occurrences of landslide-induced hazards and spatial relations between the landslide locations and the seismic features (i.e., the surface ruptures and the epicenter) [e.g., Dai et al., 2011; Gorum et al., 2011]. Little attention has been paid to how the landslide distribution determines the fluvial mobilization of landslide material or quantitative landslide-earthquake source mechanism inversion, even though the Wenchuan event provides an ideal case study to explore these problems for a larger magnitude earthquake than has yet been considered. We obtained a landslide inventory for the 2008 Wenchuan earthquake using high-resolution remote imagery and a semi-automated mapping algorithm. Here we report the results from spatial and statistical analysis of this landslide map using a digital elevation model (DEM) framework. We present the probability distribution of primary parameters (i.e., slope, aspect, elevation, and area density of all landslides) of the landslide inventory and discuss their relations to regional topographic features (i.e., river channels and mountain ridges). The landslide-river channel connectivity and landslide mobility were estimated using different hillslope-channel transition cutoffs. The landslide density and the probability of slope failure were calculated for all lithological units within the Longmen

  5. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
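
    For a binned rate forecast, the likelihood machinery the abstract describes reduces to a sum of independent Poisson log-likelihood terms, one per space-magnitude bin; the bin rates and observed counts below are invented for illustration:

```python
import numpy as np
from scipy import stats

def forecast_log_likelihood(rates, observed):
    """Joint log-likelihood of observed bin counts under a forecast giving
    the expected earthquake rate per space-magnitude bin, with independent
    Poisson counts per bin (the common assumption in such tests)."""
    return stats.poisson.logpmf(observed, rates).sum()

# invented example: two competing forecasts scored on the same 1000 bins
rng = np.random.default_rng(11)
truth = rng.gamma(0.5, 0.2, 1000)          # underlying expected rates
observed = rng.poisson(truth)
model_a = truth * 1.1                      # nearly right everywhere
model_b = np.full(1000, truth.mean())      # spatially uniform alternative

for name, rates in (("A", model_a), ("B", model_b)):
    print(name, round(forecast_log_likelihood(rates, observed), 1))
```

    Comparing such joint log-likelihoods between pairs of models is, in essence, the relative-consistency comparison mentioned above.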

  6. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  7. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to establish a sound earthquake forecasting method and, in turn, warn the public. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance, making it a high-quality insulator. Penetrative flow could not be corroborated either, because it was discovered that rock absorbs and confines electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air- or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February 2013. The antennae have mobility and observations were noted for

  8. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Introduction: On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives: This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods: 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results: A third of patients reported pain, a prevalence of 34.6%. More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions: This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  9. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  10. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  11. Tien Shan Geohazards Database: Earthquakes and landslides

    Science.gov (United States)

    Havenith, H. B.; Strom, A.; Torgoev, I.; Torgoev, A.; Lamair, L.; Ischuk, A.; Abdrakhmatov, K.

    2015-11-01

    In this paper we present new landslide and earthquake data, and review already existing data, for a large part of the Tien Shan, Central Asia. For the same area, only partial databases for sub-regions have been presented previously. They were compiled and new data were added to fill the gaps between the databases. Major new inputs are products of the Central Asia Seismic Risk Initiative (CASRI): a tentative digital map of active faults (even with indication of characteristic or possible maximum magnitude) and the earthquake catalogue of Central Asia until 2009, which has now been updated with USGS data (to May 2014). The new compiled landslide inventory contains existing records of 1600 previously mapped mass movements and more than 1800 new landslide data. Considering presently available seismo-tectonic and landslide data, a target region of 1200 km (E-W) by 600 km (N-S) was defined for the production of more or less continuous geohazards information. This target region includes the entire Kyrgyz Tien Shan, the South-Western Tien Shan in Tajikistan, the Fergana Basin (Kyrgyzstan, Tajikistan and Uzbekistan) as well as the Western part in Uzbekistan, the North-Easternmost part in Kazakhstan and a small part of the Eastern Chinese Tien Shan (for the zones outside Kyrgyzstan and Tajikistan, only limited information was available and compiled). On the basis of the new landslide inventory and the updated earthquake catalogue, the link between landslide and earthquake activity is analysed. First, size-frequency relationships are studied for both types of geohazards, in terms of the Gutenberg-Richter law for the earthquakes and in terms of probability density functions for the landslides. For several regions and major earthquake events, case histories are presented to outline further the close connection between earthquake and landslide hazards in the Tien Shan. From this study, we concluded first that a major hazard component is still insufficiently known for both types of geohazards
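
    The Gutenberg-Richter size-frequency analysis mentioned above can be sketched in a few lines. The sketch assumes the standard maximum-likelihood (Aki) b-value estimator; the synthetic catalog and all names are illustrative, not data from the study:

        import numpy as np

        def gutenberg_richter_b(magnitudes, mc):
            """Maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc)."""
            m = np.asarray(magnitudes, dtype=float)
            m = m[m >= mc]
            return np.log10(np.e) / (m.mean() - mc)

        # Synthetic catalog with a true b-value of 1.0 above completeness Mc = 4.0:
        rng = np.random.default_rng(0)
        mags = 4.0 + rng.exponential(scale=np.log10(np.e), size=5000)
        print(gutenberg_richter_b(mags, 4.0))  # close to 1.0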

  12. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and earthquake damage analysis, the equivalent static analysis method and its application, dynamic analysis methods such as time-history analysis by mode superposition and by direct integration, and design spectrum analysis, in view of earthquake-resistant design practice in Korea. Topics include the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.

  13. Earthquake Forecasting System in Italy

    Science.gov (United States)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at evaluating earthquake prediction and forecast models quantitatively through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models are used: the Epidemic-Type Aftershock Sequences (ETAS) model and the Short-Term Earthquake Probabilities (STEP) model. Here, we report the results from the OEF system's 24-hour earthquake forecasts during the main phases of the 2016-2017 sequence that occurred in the Central Apennines (Italy).

  14. Earthquake forecasting and its verification

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2005-01-01

    No proven method is currently available for the reliable short-time prediction of earthquakes (minutes to months). However, it is possible to make probabilistic hazard assessments for earthquake risk. In this paper we discuss a new approach to earthquake forecasting based on a pattern informatics (PI) method which quantifies temporal variations in seismicity. The output, which is based on an association of small earthquakes with future large earthquakes, is a map of areas in a seismogenic region ('hotspots') where earthquakes are forecast to occur in a future 10-year time span. This approach has been successfully applied to California, to Japan, and on a worldwide basis. Because a sharp decision threshold is used, these forecasts are binary: an earthquake is forecast either to occur or not to occur. The standard approach to the evaluation of a binary forecast is the use of the relative (or receiver) operating characteristic (ROC) diagram, which is a more restrictive test and less subject to bias than maximum likelihood tests. To test our PI method, we made two types of retrospective forecasts for California. The first is the PI method and the second is a relative intensity (RI) forecast based on the hypothesis that future large earthquakes will occur where most smaller earthquakes have occurred in the recent past. While both retrospective forecasts are for the ten-year period 1 January 2000 to 31 December 2009, we performed an interim analysis 5 years into the forecast. The PI method outperforms the RI method under most circumstances.
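
    The ROC evaluation of a binary hotspot forecast described above reduces to counting hits, misses, false alarms and correct negatives over the map cells. A minimal sketch (illustrative, not the authors' code):

        import numpy as np

        def roc_point(forecast_map, outcome_map):
            """Return (hit rate, false-alarm rate) for boolean forecast and outcome maps."""
            f = np.asarray(forecast_map, dtype=bool)
            o = np.asarray(outcome_map, dtype=bool)
            hit_rate = np.sum(f & o) / np.sum(o)
            false_alarm_rate = np.sum(f & ~o) / np.sum(~o)
            return hit_rate, false_alarm_rate

    Sweeping the decision threshold used to declare hotspots traces out the full ROC curve; a forecast with skill plots above the diagonal where the hit rate equals the false-alarm rate.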

  15. Digital mammography

    International Nuclear Information System (INIS)

    Bick, Ulrich; Diekmann, Felix

    2010-01-01

    This state-of-the-art reference book provides in-depth coverage of all aspects of digital mammography, including detector technology, image processing, computer-aided diagnosis, soft-copy reading, digital workflow, and PACS. Specific advantages and disadvantages of digital mammography in comparison to screen-film mammography are thoroughly discussed. By including authors from both North America and Europe, the book is able to outline variations in the use, acceptance, and quality assurance of digital mammography between the different countries and screening programs. Advanced imaging techniques and future developments such as contrast mammography and digital breast tomosynthesis are also covered in detail. All of the chapters are written by internationally recognized experts and contain numerous high-quality illustrations. This book will be of great interest both to clinicians who already use or are transitioning to digital mammography and to basic scientists working in the field. (orig.)

  16. Digital Signage

    OpenAIRE

    Fischer, Karl Peter

    2011-01-01

    Digital Signage for in-store advertising at gas stations/retail stores in Germany: a field study. Digital Signage networks provide a novel means of advertising with the advantage of easily changeable and highly customizable animated content. Despite the potential and increasing use of these media, empirical research is scarce. In a field study at 8 gas stations (with integrated convenience stores) we studied the effect of digital signage advertising on sales for different products and servi...

  17. Digital Forensics

    OpenAIRE

    Garfinkel, Simson L.

    2013-01-01

    A reprint from American Scientist, the magazine of Sigma Xi, The Scientific Research Society. Since the 1980s, computers have had increasing roles in all aspects of human life, including an involvement in criminal acts. This development has led to the rise of digital forensics, the uncovering and examination of evidence located on all things electronic with digital storage, including computers, cell phones, and networks. Digital forensics researchers and practitione...

  18. Digital Audiobooks

    DEFF Research Database (Denmark)

    Have, Iben; Pedersen, Birgitte Stougaard

    Audiobooks are rapidly gaining popularity with widely accessible digital downloading and streaming services. The paper frames how the digital audiobook expands and changes the target groups for book publications and how, as an everyday activity, it creates new reading experiences, places...

  19. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  20. Earthquake Preparedness Checklist for Schools.

    Science.gov (United States)

    1999

    A brochure provides a checklist highlighting the important questions and activities that should be addressed and undertaken as part of a school safety and preparedness program for earthquakes. It reminds administrators and other interested parties on what not to forget in preparing schools for earthquakes, such as staff knowledge needs, evacuation…

  1. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  2. Earthquake Loss Estimation Uncertainties

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Aleksander

    2013-04-01

    The paper addresses the reliability of loss assessment following strong earthquakes using worldwide systems in emergency mode. Timely and correct action just after an event can result in significant benefits in saving lives. In this case, information about possible damage and the expected number of casualties is critical for decisions about search and rescue operations and offers of humanitarian assistance. Such rough information may be provided, first of all, by global systems operating in emergency mode. The experience of earthquake disasters in different earthquake-prone countries shows that the officials in charge of emergency response at national and international levels often lack prompt and reliable information on the scope of a disaster. Uncertainties in the parameters used in the estimation process are numerous and large: knowledge about the physical phenomena and uncertainties in the parameters used to describe them; the global adequacy of modeling techniques to the actual physical phenomena; the actual distribution of the population at risk at the very time of the shaking (with respect to the immediate threat: buildings or the like); knowledge about the source of shaking, etc. One need not be a specialist to understand, for example, that the way a given building responds to a given shaking obeys mechanical laws which are poorly known (if not out of the reach of engineers for a large portion of the building stock); even if a carefully engineered modern building is approximately predictable, this is far from the case for older buildings, which make up the bulk of inhabited buildings. The way the population inside the buildings at the time of shaking is affected by the physical damage caused to the buildings is far from precisely known. The paper analyzes the influence of uncertainties in strong-event parameter determination by alert seismological surveys, and of the simulation models used at all stages, from estimating shaking intensity

  3. Digital mammography

    International Nuclear Information System (INIS)

    Cho, Nariya; Cha, Joo Hee; Moon, Woo Kyung

    2005-01-01

    Mammography is the best imaging modality for the detection of early breast cancer in asymptomatic women. However, 10-30% of cases are missed with current film-screen mammography. Digital mammography allows for the separate optimization of image acquisition and display. In addition to the obvious data storage, retrieval, and transmission advantages that digital mammography allows, additional advances such as computer-aided diagnosis, tomosynthesis and dual-energy mammography are in development. This review will discuss the technology of digital mammography, including detectors and displays, the results of clinical trials comparing film-screen and digital mammography, and the use of computer-aided detection. Digital mammography is a promising new technology for breast cancer detection in Korean women.

  4. Digital Humanities

    DEFF Research Database (Denmark)

    Brügger, Niels

    2016-01-01

    Digital humanities is an umbrella term for theories, methodologies, and practices related to humanities scholarship that use the digital computer as an integrated and essential part of its research and teaching activities. The computer can be used for establishing, finding, collecting, and preserving material to study, as an object of study in its own right, as an analytical tool, or for collaborating, and for disseminating results. The term "digital humanities" was coined around 2001, and gained currency within academia in the following years. However, computers had been used within

  5. Digital displacements

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2014-01-01

    In recent years digital reforms have been introduced in the municipal landscape of Denmark. The reforms address the interaction between citizen and local authority. The aim is that by 2015 at least 80 per cent of all correspondence between citizens and public authorities will be transmitted through a digital interface. However, the transformation of citizen services from traditional face-to-face interaction to digital self-service gives rise to new practices; some citizens need support to be able to manage self-service through digital tools. A mixture of support and teaching, named co... The paper examines digital reforms in Denmark and shows how citizen service is transformed from service to support. The frontline employee's classical tasks, such as casework, are being displaced into educational and support-oriented tasks with less professional content. Thus an unintended effect of digitisation is blurred

  6. Sports Digitalization

    DEFF Research Database (Denmark)

    Xiao, Xiao; Hedman, Jonas; Tan, Felix Ter Chian

    2017-01-01

    Ever since its first manifestation in Greece around 3000 years ago, sports as a field has accumulated a long history with strong traditions while, at the same time, going through tremendous changes toward professionalization and commercialization. The current waves of digitalization have intensified its evolution, as digital technologies are increasingly entrenched in a wide range of sporting activities and for applications beyond mere performance enhancement. Despite such trends, research on sports digitalization in the IS discipline is surprisingly still nascent. This paper aims at establishing a discourse on sports digitalization within the discipline. Toward this, we first provide an understanding of the institutional characteristics of the sports industry, establishing its theoretical importance and relevance in our discipline; second, we reveal the latest trends of digitalization in the sports

  7. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia, ˜25 stations; Azerbaijan, ˜35 stations; Armenia, ˜14 stations). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records, improved the locations of the events, and recalculated moment magnitudes in order to obtain a unified magnitude

  8. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  9. Early Earthquakes of the Americas

    Science.gov (United States)

    Ni, James

    2004-11-01

    Robert Kovach's second book looks at the interplay of earthquake and volcanic events, archeology, and history in the Americas. Throughout history, major earthquakes have caused the deaths of millions of people and have damaged countless cities. Earthquakes undoubtedly damaged prehistoric cities in the Americas, and evidence of these events could be preserved in archeological records. Kovach asks: did indigenous native cultures (Indians of the Pacific Northwest, Aztecs, Mayas, and Incas) document their natural history? Some events have been explicitly documented, for example, in Mayan codices, but many may have been recorded as myth and legend. Kovach's discussions of how early cultures dealt with fearful events such as earthquakes and volcanic eruptions are colorful, informative, and entertaining, and include, for example, a depiction of how the Maya would talk to maize plants in their fields during earthquakes to reassure them.

  10. Are Earthquakes a Critical Phenomenon?

    Science.gov (United States)

    Ramos, O.

    2014-12-01

    Earthquakes, granular avalanches, superconducting vortices, solar flares, and even stock markets are known to evolve through power-law distributed events. For decades, the formalism of equilibrium phase transitions has labeled these phenomena as critical, which implies that they are also unpredictable. This work revises these ideas and uses earthquakes as the paradigm to demonstrate that slowly driven systems evolving through uncorrelated and power-law distributed avalanches (UPLA) are not necessarily critical systems, and therefore not necessarily unpredictable. By linking the correlation length to the pdf of the distribution, and comparing it with the one obtained at a critical point, a condition of criticality is introduced. Simulations in the classical Olami-Feder-Christensen (OFC) earthquake model confirm the findings, showing that earthquakes are not a critical phenomenon. However, one single catastrophic earthquake may show critical properties and, paradoxically, the emergence of this temporal critical behaviour may eventually carry precursory signs of catastrophic events.
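
    The Olami-Feder-Christensen model referenced above is simple enough to sketch. In this minimal version, each site accumulates stress under uniform driving and, on reaching threshold, relaxes to zero and passes a fraction alpha of its stress to each of its four neighbors; the grid size, the value of alpha, and the open-boundary choice are standard but illustrative, not the paper's exact setup:

        import numpy as np

        def neighbor_share(release, alpha):
            """alpha * released stress passed to each 4-neighbor (open edges lose stress)."""
            out = np.zeros_like(release)
            out[1:, :] += alpha * release[:-1, :]
            out[:-1, :] += alpha * release[1:, :]
            out[:, 1:] += alpha * release[:, :-1]
            out[:, :-1] += alpha * release[:, 1:]
            return out

        def ofc_avalanche_sizes(n=32, alpha=0.2, n_events=2000, seed=0):
            rng = np.random.default_rng(seed)
            stress = rng.uniform(0.0, 1.0, size=(n, n))
            sizes = []
            for _ in range(n_events):
                stress += 1.0 - stress.max()        # slow uniform drive to threshold
                unstable = stress >= 1.0
                size = 0
                while unstable.any():
                    size += int(unstable.sum())
                    release = np.where(unstable, stress, 0.0)
                    stress[unstable] = 0.0          # toppled sites relax to zero
                    stress += neighbor_share(release, alpha)  # non-conservative for alpha < 0.25
                    unstable = stress >= 1.0
                sizes.append(size)
            return sizes

    Histogramming the returned avalanche sizes yields the power-law-like distributions whose criticality the paper puts to the test.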

  11. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    The lack of consistency and the errors in other frequently cited earthquake loss databases used in analyses were, in the view of the authors, major shortcomings that needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  12. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. The lack of consistency and the errors in other frequently cited earthquake loss databases used in analyses were, in the view of the authors, major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  13. Source finiteness of large earthquakes measured from long-period Rayleigh waves

    Science.gov (United States)

    Zhang, Jiajun; Kanamori, Hiroo

    1988-10-01

    The source-finiteness parameters of 11 large shallow earthquakes were determined from long-period Rayleigh waves recorded by the Global Digital Seismograph Network and International Deployment of Accelerometers Networks. The basic data sets are the seismic spectra of periods from 150 to 300 s. In the determination of source-process times, we used Furumoto's phase method and a linear inversion method, in which we simultaneously inverted the spectra and determined the source-process time that minimizes the error in the inversion. These two methods yielded consistent results. The source-process times of the Sumbawa (Indonesia), Colombia-Ecuador, Valparaiso (Chile) and Michoacan (Mexico) earthquakes were estimated to be 79, 118, 69 and 77 s, respectively, from the linear inversion method. The source-process times determined from long-period surface waves were in general longer than those obtained from body waves. Source finiteness of large shallow earthquakes with rupture on a fault plane with a large aspect ratio was modeled with the source-finiteness function introduced by Ben-Menahem. The spectra were inverted to find the extent and direction of the rupture of the earthquake that minimize the error in the inversion. For a rupture velocity of 2.5 km s⁻¹, the estimated rupture was unilateral, 100 km long and along the strike, N26°W, for the May 26, 1983 Akita-Oki, Japan earthquake; 165 km and S57°E for the September 19, 1985 Michoacan, Mexico earthquake; 256 km and N31°E for the December 12, 1979 Colombia-Ecuador earthquake; and 149 km and S15°W for the March 3, 1985 Valparaiso, Chile earthquake. The results for the August 19, 1977 Sumbawa, Indonesia earthquake showed that the rupture was bilateral and in the direction N60°E. These results are, in general, consistent with the rupture extent inferred from the aftershock area of these earthquakes.
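
    For orientation, the source-finiteness (directivity) function of Ben-Menahem for a unilateral rupture of length $L$ at rupture velocity $v$, observed at azimuth $\theta$ from the rupture direction in a wave of phase velocity $c$, takes the familiar sinc form (quoted here from standard theory, not from the paper itself):

        F(\omega) = \frac{\sin X}{X}\, e^{-iX},
        \qquad
        X = \frac{\omega L}{2}\left(\frac{1}{v} - \frac{\cos\theta}{c}\right).

    Inverting long-period spectra for $L$ and $\theta$ then amounts to fitting the amplitude nodes and the phase delay implied by this factor.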

  14. Digital radiography

    International Nuclear Information System (INIS)

    Coulomb, M.; Dal Soglio, S.; Pittet-Barbier, L.; Ranchoup, Y.; Thony, F.; Ferretti, G.; Robert, F.

    1992-01-01

    Digital projection radiography may some day replace conventional radiography, provided it can meet several requirements: diagnostic effectiveness equal to or better than that of screen-film systems; reasonable image cost; and real improvement in the productivity of imaging departments. All digital radiographic systems include an X-ray source, an image acquisition and formatting sub-system, a display and manipulation sub-system, an archiving sub-system, and a laser editing system, preferably shared with other sources of digital images. Three digitization processes are available: digitization of the radiographic film, digital fluorography, and phospholuminescent detectors with memory. The advantages of digital fluorography are appealing: real-time image acquisition and the suppression of cassettes; but its disadvantages are far from negligible: it cannot be applied to bedside radiography, the field of examination is limited, and the wide-field spatial resolution is poor. Phospholuminescent detectors with memory have great advantages: they can be used for bedside radiographs and on all common radiographic systems, and spatial resolution is satisfactory; their current disadvantages, however, are considerable. These two systems have common properties, which make up the underlying philosophy of digital radiology, and specific features that must guide our choice according to the application. Digital fluorography is best applied in pediatric radiology. However, evaluation studies have shown that it is applicable with sufficient quality to many indications of general radiology in which fluoroscopic control and fast acquisition of the images are essential; the time gained on the examination may be considerable, as well as the savings on film. Detectors with memory are required for bedside radiographs, in osteoarticular and thoracic radiology, in all cases of traumatic emergency and in resuscitation and intensive care departments

  15. Earthquakes and faults in the San Francisco Bay area (1970-2003)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.; Wong, Florence L.; Saucedo, George J.

    2004-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.0 in the greater San Francisco Bay area. Twenty-two earthquakes of magnitude 5.0 and greater are indicated on the map and listed chronologically in an accompanying table. The data are compiled from records from 1970-2003. The bathymetry was generated from a digital version of NOAA maps and hydrogeographic data for San Francisco Bay. Elevation data are from the USGS National Elevation Database. The Landsat satellite image is from seven Landsat 7 Enhanced Thematic Mapper Plus scenes. Fault data are reproduced with permission from the California Geological Survey. The earthquake data are from the Northern California Earthquake Catalog.

  16. Digital radiography

    International Nuclear Information System (INIS)

    Zani, M.L.

    2002-01-01

    X-ray radiography is a very common technique used to check the homogeneity of a material or the inside of a mechanical part. Generally, the radiation that passes through the material being checked produces an image on a sensitized film. This method takes time because the film needs to be developed; digital radiography no longer has this drawback. In digital radiography the film is replaced by digital data, which can be processed like any computer file. This new technique is promising, but its main drawback is that its resolution is not yet as good as that of film radiography. (A.C.)

  17. Digital electronics

    CERN Document Server

    Morris, John

    2013-01-01

    An essential companion to John C. Morris's 'Analogue Electronics', this clear and accessible text is designed for electronics students, teachers and enthusiasts who already have a basic understanding of electronics, and who wish to develop their knowledge of digital techniques and applications. Employing a discovery-based approach, the author covers fundamental theory before going on to develop an appreciation of logic networks, integrated circuit applications and analogue-digital conversion. A section on digital fault finding and useful IC data sheets completes th

  18. Digital Leadership

    DEFF Research Database (Denmark)

    Zupancic, Tadeja; Verbeke, Johan; Achten, Henri

    2016-01-01

    Leadership is an important quality in organisations. Leadership is needed to introduce change and innovation. In our opinion, in architectural and design practices, the role of leadership has not yet been sufficiently studied, especially when it comes to the role of digital tools and media. With this paper we intend to initiate a discussion in the eCAADe community to reflect and develop ideas in order to develop digital leadership skills amongst the membership. This paper introduces some important aspects, which may be valuable to look into when developing digital leadership skills.

  19. Digital radiography

    International Nuclear Information System (INIS)

    Kusano, Shoichi

    1993-01-01

    First, from a historical point of view, fundamental concepts of digital imaging are reviewed to provide a foundation for the discussion of digital radiography. Second, the review summarizes the results of ongoing research in computed radiography, which replaces the conventional film-screen system with a photo-stimulable phosphor plate. Third, image quality, radiation protection, and image processing techniques are discussed with emphasis on the picture archiving and communication system environment as our final goal. Finally, the future expansion of digital radiography is described based on the present utilization of computed tomography at the National Defense Medical College Hospital. (author) 60 refs

  20. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  1. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix of this paper, furthermore, seismic safety reviews for NPPs designed before publication of the Examination Guide are summarized with the Check Basis Earthquake. (J.P.N.)

  2. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix of this paper, furthermore, seismic safety reviews for NPPs designed before publication of the Examination Guide are summarized with the Check Basis Earthquake. (J.P.N.)

  3. Dynamic Inversion of Intraslab Intermediate Depth Earthquakes

    Science.gov (United States)

    Madariaga, R. I.; Ruiz, S.

    2011-12-01

    We perform kinematic and dynamic inversions of the 24 July 2008 (Mw=6.8) Iwate, northern Japan, and 16 December 2007 (Mw=6.8) Michilla, Chile, earthquakes using near-field strong-motion digital data. The data were filtered between 0.02 and 1 Hz. The rupture process is simulated with elliptical patches because we are looking for the average properties of the seismic rupture. The direct dynamic simulation problem was solved by a combination of finite difference modeling on a 32 km³ grid with 200 m spacing, and propagation from source to recorders using the AXITRA spectral program. For both earthquakes we used layered models of the structure. The Neighborhood algorithm and Monte Carlo methods are used to obtain the best-fitting solutions and to explore the solution space. The optimum solutions are found by comparing observed and synthetic records using an L2 norm. Both kinematic and dynamic inversions fit the observed data with misfits lower than 0.3. For both earthquakes, kinematic inversion shows a strong trade-off between rupture velocity and maximum slip, although the seismic moment remains invariant. Rupture velocities vary between sub-shear speeds and almost Rayleigh wave speeds. In the dynamic inversions, 10 seismic source parameters were inverted for the Michilla earthquake and 8 parameters for the Iwate event, among them stress, friction and geometrical parameters. For the Iwate event the properties of the initial asperity at the source were not inverted because they could not be resolved by the data. In the dynamic inversion we observed a strong trade-off among the friction law parameters. The best dynamic models form a family that shares similar values of seismic moment and kappa (the ratio of released strain energy to energy release rate for friction). Kinematic and dynamic inversions in the 0.02-1 Hz frequency range form a set of non-unique solutions controlled by specific combinations of seismic source parameters. We discuss the origin of the non-uniqueness of
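
    The L2 comparison of observed and synthetic records described above can be sketched as follows; the names are ours, and the authors' exact normalization may differ:

        import numpy as np

        def l2_misfit(observed, synthetic):
            """Normalized L2 misfit over a set of records; 0 means a perfect fit."""
            num = sum(np.sum((o - s) ** 2) for o, s in zip(observed, synthetic))
            den = sum(np.sum(o ** 2) for o in observed)
            return num / den

    A misfit below roughly 0.3, as quoted in the abstract, would then mark an acceptable kinematic or dynamic model.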

  4. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started just about 15 years ago has already pushed the global information storage capacity above 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task, and it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regrettably, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done in advance of claiming prediction of hazardous areas and/or times. The necessity and possibility of applying the simple tools of earthquake prediction strategies, in particular the Error Diagram, introduced by G. M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space, is evident. The set of errors, i.e., the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, which permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
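
    The Molchan error diagram mentioned above summarizes a prediction strategy by two numbers per alarm threshold: the miss rate nu and the alerted fraction tau of space-time. A minimal sketch with illustrative names; the optional weighting by seismicity rates corresponds to the Seismic Roulette null hypothesis:

        import numpy as np

        def molchan_point(alert_map, event_counts, weights=None):
            """Return (nu, tau): miss rate and alerted fraction of the space-time measure."""
            alert = np.asarray(alert_map, dtype=bool)
            events = np.asarray(event_counts)
            if weights is None:                       # uniform measure over cells
                weights = np.ones(alert.shape)
            weights = np.asarray(weights, dtype=float)
            nu = events[~alert].sum() / events.sum()      # fraction of target events missed
            tau = weights[alert].sum() / weights.sum()    # fraction of measure on alert
            return nu, tau

    Random guessing lies on the diagonal nu + tau = 1; a method with genuine skill plots significantly below it, and its distance from the diagonal can feed the cost-benefit optimization described above.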

  5. Experience database of Romanian facilities subjected to the last three Vrancea earthquakes

    International Nuclear Information System (INIS)

    1999-01-01

    The scope of this research project is to use past seismic experience of similar components from power and industrial facilities to establish the generic seismic resistance of nuclear power plant safe-shutdown equipment. The first part of the project provides information about the Vrancea earthquakes, which affect the Romanian territory and also the Kozloduy NPP site, as background for the investigations of the seismic performance of mechanical and electrical equipment in industrial facilities. The project has the following objectives: collect and process all available seismic information about Vrancea earthquakes; perform a probabilistic hazard analysis of the Vrancea earthquakes; determine attenuation laws and correlations between focal depth, earthquake power, soil conditions and the frequency characteristics of seismic ground motion; and investigate and collect information regarding the seismic behavior of mechanical and electrical components in industrial facilities during the 1977, 1986 and 1990 earthquakes. The seismic database used for the analysis of the Vrancea earthquakes includes digitized triaxial records as follows: March 4, 1977 - 1 station; August 30, 1986 - 42 stations; May 1990 - 54 stations. A catalogue of the Vrancea earthquakes that occurred during the period 1901-1994 is presented as well.

  6. PRELIMINARY RESULTS OF EARTHQUAKE-INDUCED BUILDING DAMAGE DETECTION WITH OBJECT-BASED IMAGE CLASSIFICATION

    Directory of Open Access Journals (Sweden)

    A. Sabuncu

    2016-06-01

    Earthquakes are the most destructive natural disasters, resulting in massive loss of life, infrastructure damage and financial losses. Earthquake-induced building damage detection is a very important step after an earthquake, since such damage is one of the most critical threats to cities and countries in terms of the area of damage, the rate of collapsed buildings, the damage grade near the epicenter, and the building damage types across all constructions. The Van-Ercis (Turkey) earthquake (Mw = 7.1) occurred on October 23rd, 2011, at 10:41 UTC (13:41 local time), centered at 38.75 N, 43.36 E, which places the epicenter about 30 kilometers north of the city of Van. It is recorded that 604 people died and approximately 4000 buildings collapsed or were seriously damaged by the earthquake. In this study, high-resolution satellite images of Van-Ercis, acquired by Quickbird-2 (© Digital Globe Inc.) after the earthquake, were used to detect debris areas using an object-based image classification. Two different land surfaces, having homogeneous and heterogeneous land covers, were selected as case study areas. As a first step of the object-based image processing, segmentation was applied with suitable scale and homogeneity criterion parameters. As a next step, condition-based classification was used. In the final step of this preliminary study, outputs were compared with street views/orthophotos for verification and evaluation of the classification accuracy.
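
    The two-step workflow (segmentation, then condition-based classification) can be sketched schematically. Here scikit-image's Felzenszwalb algorithm stands in for whatever segmentation software was actually used, and the file name, band choice and thresholds are hypothetical:

        import numpy as np
        from skimage import io, measure, segmentation

        image = io.imread("post_event_scene.tif")       # hypothetical input scene
        segments = segmentation.felzenszwalb(image, scale=100, sigma=0.8, min_size=50)

        # Condition-based step: flag bright, sufficiently large objects as debris.
        debris_labels = [
            r.label
            for r in measure.regionprops(segments + 1, intensity_image=image[..., 0])
            if r.mean_intensity > 140 and r.area > 200
        ]
        debris_mask = np.isin(segments + 1, debris_labels)

    In practice the rules would be tuned per land-cover type, which is why homogeneous and heterogeneous test surfaces are treated separately.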

  7. Designing plants to withstand earthquakes

    International Nuclear Information System (INIS)

    Nedderman, J.

    1995-01-01

    The approach used in Japan to design nuclear plants capable of withstanding earthquakes is described. Earthquakes are classified into two types, S1 and S2. In an S1 earthquake a nuclear plant must be capable of surviving essentially undamaged. In the more severe S2 earthquake, some damage may occur but there should be no release of radioactivity to the outside world. The starting point for the designer is the ground response spectrum of the earthquake, which shows both the ground acceleration and the frequencies of the vibrations. From the ground response spectra, synthetic seismic waves for S1 and S2 earthquakes are developed, which can then be used to analyse a "lumped-mass" model of the reactor building to arrive at floor response spectra. These spectra are then used in further analyses of the design of reactor equipment, piping systems and instrument racks and supports. When a plant is constructed, results from tests with a vibration exciter are used to verify the floor response spectra and principal building resonances. Much of the equipment can be tested on vibrating tables. One large table with a maximum loading capacity of 1000 t is used to test large-scale models of containment vessels, pressure vessels and steam generators. Such tests have shown that the plants have considerable safety margins in their ability to withstand the design basis earthquakes. (UK)

  8. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Caroline, Etivant

    2015-04-01

    Historical earthquakes are only known to us through written recollections, and so seismologists have long experience of interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered the digital nervous system, comprising digital veins and intertwined sensors that capture the pulse of our planet in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper will present the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We will show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We will also describe how public expectations within tens of seconds of ground shaking are at the basis of improved, diversified information tools which integrate this user-generated content. Special attention will be given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter to the public: felt and damaging earthquakes. In conclusion, we will demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.

  9. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT-06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT-06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1-4 days before "crust" earthquakes, and does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1-3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.
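
    For orientation, the stated proportionality follows from the standard relation between a layer's critical (plasma) frequency and its peak electron density (general ionospheric practice, not a formula from this paper):

        f_{c}\,[\mathrm{Hz}] \approx 8.98\,\sqrt{N_{e}\,[\mathrm{m^{-3}}]}
        \quad\Longleftrightarrow\quad
        N_{e}\,[\mathrm{m^{-3}}] \approx 1.24\times10^{10}\,\bigl(f_{c}\,[\mathrm{MHz}]\bigr)^{2},

    so a pre-earthquake decrease in foF2 or fbEs maps directly onto a decrease in electron density at the corresponding height.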

  10. Fracking, wastewater disposal, and earthquakes

    Science.gov (United States)

    McGarr, Arthur

    2016-03-01

    In the modern oil and gas industry, fracking of low-permeability reservoirs has resulted in a considerable increase in the production of oil and natural gas, but these fluid-injection activities can also induce earthquakes. Earthquakes induced by fracking are an inevitable consequence of the injection of fluid at high pressure, where the intent is to enhance permeability by creating a system of cracks and fissures that allow hydrocarbons to flow to the borehole. The micro-earthquakes induced during these highly-controlled procedures are generally much too small to be felt at the surface; indeed, the creation or reactivation of a large fault would be contrary to the goal of enhancing permeability evenly throughout the formation. Accordingly, the few case histories in which fracking has resulted in felt earthquakes have been due to unintended fault reactivation. Of greater consequence for inducing earthquakes, modern techniques for producing hydrocarbons, including fracking, have resulted in considerable quantities of coproduced wastewater, primarily formation brines. This wastewater is commonly disposed of by injection into deep aquifers having high permeability and porosity. As reported in many case histories, pore pressure increases due to wastewater injection were channeled from the target aquifers into fault zones that were, in effect, lubricated, resulting in earthquake slip. These fault zones are often located in the brittle crystalline rocks of the basement. Magnitudes of earthquakes induced by wastewater disposal often exceed 4, the threshold for structural damage. Even though only a small fraction of disposal wells induce earthquakes large enough to be of concern to the public, there are so many of these wells that this source of seismicity contributes significantly to the seismic hazard in the United States, especially east of the Rocky Mountains where standards of building construction are generally not designed to resist shaking from large earthquakes.

  11. Digital Snaps

    DEFF Research Database (Denmark)

    Sandbye, Mette; Larsen, Jonas

    The New Face of Snapshot Photography / Jonas Larsen and Mette Sandbye -- pt. I. IMAGES ON WEB 2.0 AND THE CAMERA PHONE -- ch. 1. Overlooking, Rarely Looking and Not Looking / Martin Lister -- ch. 2. The (Im)mobile Life of Digital Photographs: The Case of Tourist Photography / Jonas Larsen -- ch. 3. Distance as the New Punctum / Mikko Villi -- pt. II. FAMILY ALBUMS IN TRANSITION -- ch. 4. How Digital Technologies Do Family Snaps, Only Better / Gillian Rose -- ch. 5. Friendship Photography: Memory, Mobility and Social Networking / Joanne Garde-Hansen -- ch. 6. Play, Process and Materiality in Japanese Purikura Photography / Mette Sandbye -- ch. 7. 'Buying an Instrument Does Not Necessarily Make You a Musician': Studio Photography and the Digital Revolution / Sigrid Lien -- pt. III. NEW PUBLIC FORMS -- ch. 8. Paparazzi Photography, Seriality and the Digital Photo Archive / Anne Jerslev and Mette Mortensen

  12. Digital Discretion

    DEFF Research Database (Denmark)

    Busch, Peter Andre; Zinner Henriksen, Helle

    2018-01-01

    This study reviews 44 peer-reviewed articles on digital discretion published in the period from 1998 to January 2017. Street-level bureaucrats have traditionally had a wide ability to exercise discretion, stirring debate since they can add their personal footprint on public policies. Digital discretion is suggested to reduce this footprint by influencing or replacing their discretionary practices using ICT. What is less researched is whether digital discretion can cause changes in public policy outcomes, and under what conditions such changes can occur. Using the concept of public service values, we suggest that digital discretion can strengthen ethical and democratic values but weaken professional and relational values. Furthermore, we conclude that contextual factors such as considerations made by policy makers on the macro-level and the degree of professionalization of street-level bureaucrats…

  13. Digital fabrication

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 3) issue of the Nexus Network Journal features seven original papers dedicated to the theme “Digital Fabrication”. Digital fabrication is changing architecture in fundamental ways in every phase, from concept to artifact. Projects growing out of research in digital fabrication are dependent on software that is entirely surface-oriented in its underlying mathematics. Decisions made during design, prototyping, fabrication and assembly rely on codes, scripts, parameters, operating systems and software, creating the need for teams with multidisciplinary expertise and different skills, from IT to architecture, design, material engineering, and mathematics, among others. The papers grew out of a Lisbon symposium hosted by the ISCTE-Instituto Universitario de Lisboa entitled “Digital Fabrication – A State of the Art”. The issue is completed with four other research papers which address different mathematical instruments applied to architecture, including geometric tracing system...

  14. Becoming digital

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2015-01-01

    The purpose of this paper is to examine the impact of e-government reforms on street-level bureaucrats’ professionalism and relation to citizens, thus demonstrating how the bureaucratic encounter unfolds in the digital era. Design/methodology/approach: The paper is based on an ethnographic study. An ethnographic account of how digital reforms are implemented in practice shows how street-level bureaucrats’ classic tasks such as specialized casework are being reconfigured into educational tasks that promote the idea of “becoming digital”. In the paper, the author argues that the work of “becoming digital”… Originality/value: The study contributes to ethnographic research in public administration by combining two separate subfields, e-government and street-level bureaucracy, to discern recent transformations in public service delivery. In the digital era, tasks, control and equality are distributed in ways…

  15. An Interactive Program on Digitizing Historical Seismograms

    Science.gov (United States)

    Xu, Y.; Xu, T.

    2013-12-01

    Retrieving information from historical seismograms is of great importance, since they are the unique sources that provide quantitative information on historical earthquakes. Modern techniques of seismology require digital forms of seismograms, which are essentially sequences of time-amplitude pairs. However, historical seismograms, once scanned into computers, are two-dimensional arrays; each element of the array contains the grayscale value or RGB value of the corresponding pixel. The problem of digitizing historical seismograms, that is, converting historical seismograms to digital seismograms, can be formulated as an inverse problem: generating sequences of time-amplitude pairs from a two-dimensional array. This problem has infinitely many solutions. The algorithm presented for automatic digitization of historical seismograms uses several features of seismograms, including the continuity and smoothness of the seismic traces, as prior information, and assumes that the amplitude is a single-valued function of time. An interactive program based on the algorithm is also presented. The program is developed using the Matlab GUI and offers both automatic and manual digitization modes; users can easily switch between them and try different combinations to get optimal results. Several examples are given to illustrate the results of digitizing seismograms with the program, including a photographic record and a wide-angle reflection/refraction seismogram. (Figure: digitized results of the program, redrawn using Golden Software Surfer for a high-resolution image; (a) automatic digitization, (b) after manual correction.)
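
    The following is a minimal sketch of the column-by-column digitization idea, assuming a grayscale scan in which ink is dark; the band-limited search is a crude stand-in for the continuity and smoothness priors the abstract mentions, not the authors' Matlab algorithm.

    ```python
    import numpy as np

    def digitize_trace(image, max_jump=5):
        """Pick one amplitude (row of the darkest pixel) per time step (column).

        Restricting the search to a band around the previous pick is a crude
        stand-in for the continuity/smoothness priors; it also enforces the
        single-valued amplitude assumption.
        """
        n_rows, n_cols = image.shape
        trace = np.empty(n_cols, dtype=int)
        trace[0] = int(np.argmin(image[:, 0]))  # darkest pixel in column 0
        for col in range(1, n_cols):
            lo = max(trace[col - 1] - max_jump, 0)
            hi = min(trace[col - 1] + max_jump + 1, n_rows)
            trace[col] = lo + int(np.argmin(image[lo:hi, col]))
        return trace  # row index per column, i.e. time-amplitude pairs
    ```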

  16. Digital Bangladesh

    OpenAIRE

    Masudul Alam Choudhury

    2013-01-01

    The present fever to launch an extensive digitalization program is sweeping the Bangladesh political, business, and elitist minds. In the face of an overarching outlook of sustainable development the Bangladesh digitalization program runs into some grave questions. The paper points out that ethics as a strongly endogenous force in development is indispensable to keep in view the simultaneity of attaining growth and social justice. These targets are variously manifested in different sectors an...

  17. Digital Leadership

    OpenAIRE

    Zupanzic, Tadeja; Verbeke, Johan; Achten, Henri; Herneoja, Aulikki

    2016-01-01

    Leadership is an important quality in organisations. Leadership is needed to introduce change and innovation. In our opinion, in architectural and design practices, the role of leadership has not yet been sufficiently studied, especially when it comes to the role of digital tools and media. With this paper, we intend to initiate a discussion in the eCAADe community to reflect and develop ideas in order to develop digital leadership skills amongst the membership. This paper introduces some imp...

  18. Earthquakes, July-August 1992

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. One occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  19. Earthquake Zoning Maps of Turkey

    International Nuclear Information System (INIS)

    Pampal, S.

    2007-01-01

    Earthquake Zoning Maps (1945, 1947, 1963, 1972 and 1996) and Specifications for Construction in Disaster Areas (1947, 1953, 1962, 1968, 1975, 1996, 1997 and 2006) have been changed many times, following developments in engineering seismology, tectonic and seismo-tectonic investigation, and improved earthquake data collection. The aim of this study is to give information about these maps, which came into force at different dates since the introduction of the first official Earthquake Zoning Map published in 1945, and to assist in a better understanding of the development phases of these maps

  20. Seismology: dynamic triggering of earthquakes.

    Science.gov (United States)

    Gomberg, Joan; Johnson, Paul

    2005-10-06

    After an earthquake, numerous smaller shocks are triggered over distances comparable to the dimensions of the mainshock fault rupture, although they are rare at larger distances. Here we analyse the scaling of dynamic deformations (the stresses and strains associated with seismic waves) with distance from, and magnitude of, their triggering earthquake, and show that they can cause further earthquakes at any distance if their amplitude exceeds several microstrain, regardless of their frequency content. These triggering requirements are remarkably similar to those measured in the laboratory for inducing dynamic elastic nonlinear behaviour, which suggests that the underlying physics is similar.

  1. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  2. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
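
    A minimal sketch of a short-term-average/long-term-average detector applied to a tweet-frequency series is given below; the window lengths and trigger threshold are illustrative, not the tuned values used by the USGS.

    ```python
    import numpy as np

    def sta_lta_triggers(counts, sta=60, lta=3600, threshold=10.0):
        """Trigger when the short-term average of the tweet-frequency series
        exceeds the long-term average by a factor of `threshold`.

        counts: tweets containing the word "earthquake" per sample interval.
        sta, lta: window lengths in samples (illustrative values only).
        """
        counts = np.asarray(counts, dtype=float)
        triggers = []
        for t in range(lta, len(counts)):
            sta_val = counts[t - sta:t].mean()
            lta_val = counts[t - lta:t].mean()
            if lta_val > 0 and sta_val / lta_val >= threshold:
                triggers.append(t)  # candidate earthquake detection
        return triggers
    ```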

  3. Segmented seismicity of the Mw 6.2 Baladeh earthquake sequence (Alborz mountains, Iran) revealed from regional moment tensors

    DEFF Research Database (Denmark)

    Donner, Stefanie; Rössler, Dirk; Krüger, Frank

    2013-01-01

    The Mw 6.2 Baladeh earthquake occurred on 28 May 2004 in the Alborz Mountains, northern Iran. This earthquake was the first strong shock in this intracontinental orogen for which digital regional broadband data are available. The Baladeh event provides a rare opportunity to study fault geometry… model, regional waveform data of the mainshock and larger aftershocks (Mw ≥ 3.3) were inverted for moment tensors. For the Baladeh mainshock, this included inversion for kinematic parameters. All analysed earthquakes show dominant thrust mechanisms at depths between 14 and 26 km, with NW–SE striking…

  4. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
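
    As a sketch of the kind of fitting described, the snippet below fits exceedances over a high threshold to a generalized Pareto tail and block maxima to a Fréchet distribution (scipy's invweibull); the synthetic catalogue and thresholds are hypothetical stand-ins for the Chinese catalogue used in the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical catalogue of earthquake energies (J); real work would use
    # an actual catalogue such as the Chinese one cited in the abstract.
    rng = np.random.default_rng(0)
    energies = rng.pareto(1.5, 5000) * 1e12

    # Tail of large events: exceedances over a high threshold, fitted to a
    # generalized Pareto distribution with location fixed at zero.
    threshold = np.quantile(energies, 0.95)
    exceedances = energies[energies > threshold] - threshold
    c, loc, scale = stats.genpareto.fit(exceedances, floc=0)

    # Distribution of the maximum: block maxima fitted to a Frechet
    # distribution (called invweibull in scipy).
    maxima = energies.reshape(100, 50).max(axis=1)
    shape, mloc, mscale = stats.invweibull.fit(maxima)
    print(f"Pareto tail index: {c:.2f}, Frechet shape: {shape:.2f}")
    ```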

  5. Preparing Haitian youth for digital jobs | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Preparing Haitian youth for digital jobs. Haiti is currently the poorest country in the western hemisphere, with 80% of the population living under the poverty line and 54% living in abject poverty. Unemployment, at 40%, is the highest in the region, and the 2010 earthquake further inflicted $7.8 billion in damage, causing the ...

  6. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system (GIS) of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake intensity attenuation relationship, anomalies in the damage distribution of the earthquake are found, and the relationship of this anomalous distribution with tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.
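
    A comparison against a standard intensity attenuation relationship could look like the sketch below; the functional form I = a + b·M − c·ln(R + R0) is a generic one and the coefficients are hypothetical placeholders, not the relationship used in the study.

    ```python
    import math

    def predicted_intensity(m, r_km, a=1.0, b=1.5, c=1.8, r0=10.0):
        """Generic attenuation form I = a + b*M - c*ln(R + R0); the
        coefficients here are hypothetical placeholders, not the study's."""
        return a + b * m - c * math.log(r_km + r0)

    def intensity_anomaly(observed, m, r_km):
        """Positive: damage heavier than the attenuation model predicts."""
        return observed - predicted_intensity(m, r_km)

    # e.g. a site 150 km from the 1303 M=8 epicenter, observed intensity IX
    print(intensity_anomaly(9.0, 8.0, 150.0))
    ```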

  7. Haiti Earthquake: Crisis and Response

    Science.gov (United States)

    2010-02-19

    CRS Report for Congress, prepared for Members and Committees of Congress: Haiti Earthquake: Crisis and Response / Rhoda Margesson. …years ago, in 1860. Haitian ministries are addressing issues such as long-term housing for those left homeless by the earthquake as they operate out…

  8. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  9. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10⁻⁵ has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  10. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  11. Earthquake damage to underground facilities

    Energy Technology Data Exchange (ETDEWEB)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells, and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  12. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells, and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository

  13. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
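
    A back-of-the-envelope version of the nonstationary Poisson argument: with the rate proportional to population, expected counts scale with the time-integral of population. The population anchor points below are rough interpolations of my own, so the result only approximates the paper's 8.7.

    ```python
    import numpy as np

    # Rough world-population anchor points (billions); a hypothetical
    # interpolation, not the population curves integrated in the paper.
    years_20 = np.linspace(1900, 2000, 101)
    pop_20 = np.interp(years_20, [1900, 1950, 2000], [1.6, 2.5, 6.1])
    years_21 = np.linspace(2000, 2100, 101)
    pop_21 = np.interp(years_21, [2000, 2050, 2100], [6.1, 9.0, 10.1])

    # Nonstationary Poisson rate proportional to population: expected counts
    # scale with the time-integral (mean population x duration).
    exposure_20 = pop_20.mean() * 100.0  # billion person-years
    exposure_21 = pop_21.mean() * 100.0
    n_20 = 4  # observed >100,000-fatality earthquakes in the 20th century
    print(n_20 * exposure_21 / exposure_20)  # crude 21st-century expectation
    ```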

  14. Digital evidence

    Directory of Open Access Journals (Sweden)

    Lukić Tatjana

    2012-01-01

    Full Text Available Although the computer makes human activities faster and easier, innovating and creating new forms of work and other kinds of activities, it has also influenced criminal activity. The development of information technology directly affects the development of computer forensics, without which one cannot imagine discovering and proving computer offences and apprehending the perpetrators. Information technology and computer forensics allow us to detect and prove crimes committed by computer and capture the perpetrators. Computer forensics is a type of forensics which can be defined as the process of collecting, preserving, analyzing and presenting digital evidence in court proceedings. Bearing in mind that the fight against crime in which computers appear as an asset or object of the offense requires knowledge of digital evidence as well as specific rules and procedures, the author in this article specifically addresses the issues of digital evidence, forensic (computer) investigation, specific rules and procedures for detecting, fixing and collecting digital evidence, and the use of this type of evidence in criminal proceedings. The author also deals with international standards regarding digital evidence and cyber-space investigation.

  15. Digital watermark

    Directory of Open Access Journals (Sweden)

    Jasna Maver

    2000-01-01

    Full Text Available The huge amount of multimedia content available on the World-Wide Web is beginning to raise the question of its protection. Digital watermarking is a technique which can serve various purposes, including intellectual property protection, authentication and integrity verification, as well as visible or invisible content labelling of multimedia content. Due to the diversity of digital watermarking applications, there are many different techniques, which can be categorised according to different criteria. A digital watermark can be categorised as visible or invisible and as robust or fragile. In contrast to the visible watermark, where a visible pattern or image is embedded into the original image, the invisible watermark does not change the visual appearance of the image. The existence of such a watermark can be determined only through a watermark extraction or detection algorithm. The robust watermark is used for copyright protection, while the fragile watermark is designed for authentication and integrity verification of multimedia content. A watermark must be detectable or extractable to be useful. In some watermarking schemes a watermark can be extracted in its exact form; in other cases, we can detect only whether a specific given watermarking signal is present in an image. Digital libraries, through which cultural institutions will make multimedia contents available, should support a wide range of service models for intellectual property protection, where digital watermarking may play an important role.
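
    To make the invisible/fragile category concrete, here is a deliberately simple least-significant-bit scheme; it is a toy example, since a robust watermark that survives compression or cropping requires transform-domain methods not shown here.

    ```python
    import numpy as np

    def embed_lsb(image, bits):
        """Hide bits in the least-significant bits of the first len(bits)
        pixels. Invisible (pixel values change by at most 1) and fragile
        (any edit to those pixels destroys the mark), so it suits
        authentication and integrity verification."""
        flat = image.flatten()  # copy, original image is untouched
        flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits
        return flat.reshape(image.shape)

    def extract_lsb(image, n_bits):
        """Read the embedded bits back out."""
        return image.flatten()[:n_bits] & 1

    img = np.random.randint(0, 256, (64, 64), dtype=np.uint8)
    bits = np.random.randint(0, 2, 128, dtype=np.uint8)
    assert (extract_lsb(embed_lsb(img, bits), 128) == bits).all()
    ```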

  16. Digital Snaps

    DEFF Research Database (Denmark)

    Sandbye, Mette; Larsen, Jonas

    The New Face of Snapshot Photography / Jonas Larsen and Mette Sandbye -- pt. I. IMAGES ON WEB 2.0 AND THE CAMERA PHONE -- ch. 1. Overlooking, Rarely Looking and Not Looking / Martin Lister -- ch. 2. The (Im)mobile Life of Digital Photographs: The Case of Tourist Photography / Jonas Larsen -- ch. 3. Distance as the New Punctum / Mikko Villi -- pt. II. FAMILY ALBUMS IN TRANSITION -- ch. 4. How Digital Technologies Do Family Snaps, Only Better / Gillian Rose -- ch. 5. Friendship Photography: Memory, Mobility and Social Networking / Joanne Garde-Hansen -- ch. 6. Play, Process and Materiality in Japanese Purikura Photography / Mette Sandbye -- ch. 9. Retouch Yourself: The Pleasures and Politics of Digital Cosmetic Surgery / Tanya Sheehan -- ch. 10. Virtual Selves: Art and Digital Autobiography / Louise Wolthers -- ch. 11. Mobile-Media Photography: New Modes of Engagement / Michael Shanks and Connie Svabo.

  17. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  18. Digital Relationships

    DEFF Research Database (Denmark)

    Ledborg Hansen, Richard

    Given the inherent enhancement possibilities of technology, our expectations for entertainment-rich information and highly interesting communication are sky-high and rising. With a continuous increase in digitized communication follows a decrease in face-to-face encounters, and our ability to engage in inter-personal relationships suffers for it (Davis, 2013). The behavior described in this paper … of residual deposits from technology in organizations and its effect on individuals' ability to connect to one another. Based on the case study, the paper describes indications and suggests potential implications hereof. …(-Jones, 2011) for increases in effectiveness and efficiency, we indiscriminately embrace digital communication and digitized information dissemination with enthusiasm – at the risk of ignoring the potentially dark side of technology. However, technology also holds a promise for better understanding precisely…

  19. Digital "X"

    DEFF Research Database (Denmark)

    Baiyere, Abayomi; Grover, Varun; Gupta, Alok

    2017-01-01

    Interest in using digital before existing research concepts seems to be on the rise in the IS field. This panel is positioned to explore what value lies in labelling our research as digital “x” as opposed to the well-established IT “x” (where “x” can be strategy, infrastructure, innovation, artifact, capability, etc.). The question this raises is how much this contributes novel insight to IS scholarship versus how much it is merely a relabeling of old wine in new bottles. The panel is expected to provide conceptual clarity on the use of the digital “x” concept and provide a delineation…

  20. Digital Creativity

    DEFF Research Database (Denmark)

    Petersson Brooks, Eva; Brooks, Anthony Lewis

    2014-01-01

    This paper reports on a study exploring the outcomes of children’s play with technology in early childhood learning practices. The paper addresses questions related to how digital technology can foster creativity in early childhood learning environments. It consists of an analysis of children’s interaction with the KidSmart furniture, focusing on digital creativity potentials and the play values suggested by the technology. The study applied a qualitative approach and included 125 children (aged three to five), 10 pedagogues, and two librarians. The results suggest that educators should sensitively consider intervening when children are interacting with technology, and rather put emphasis on integrating the technology into the environment and the curriculum in order to shape playful structures for children’s digital creativity.

  1. Digital radiography

    International Nuclear Information System (INIS)

    Rath, M.; Lissner, J.; Rienmueller, R.; Haendle, J.; Siemens A.G., Erlangen

    1984-01-01

    Using a prototype of an electronic, universal examination unit equipped with a special X-ray TV installation, spot-film exposures and digital angiographies with high spatial resolution and wide-range contrast could be made in the clinic for the first time. With transvenous contrast medium injection, the clinical results of digital angiography show excellent image quality in the region of the carotids and renal arteries as well as the arteries of the extremities. The electronic series exposures have an image quality almost comparable to the quality obtained with cut-film changers in conventional angiography. There are certain limitations due to the 25 cm input field of the X-ray image intensifier used. With respect to the digital angiography imaging technique, the electronic universal unit is fully suitable for clinical application. (orig.) [de

  2. Digital photogrammetry

    CERN Document Server

    Egels, Yves

    2003-01-01

    Photogrammetry is primarily the use of photography for surveying and is used for the production of maps from aerial photographs. Along with remote sensing, it represents the primary means of generating data for Geographic Information Systems (GIS). As technology develops, it is becoming easier to gain access to it. The cost of digital photogrammetric workstations is falling quickly, and these new tools are therefore becoming accessible to more and more users. Digital Photogrammetry is particularly useful as a text for graduate students in geomatics and is also suitable for people with a good basic scientific knowledge who need to understand photogrammetry, and who wish to use the book as a reference.

  3. Digital voltmeter

    International Nuclear Information System (INIS)

    Yohannes Kamadi; Soekarno.

    1976-01-01

    An electrical voltage measuring instrument with a digital display has been built. The instrument uses a four-digit display, single-polarity measurement and an integrating system. Pulses from the oscillator are counted and converted to a staircase voltage, which is compared to the voltage being measured. When balance is achieved, a pulse appears at the comparator circuit. This pulse is used to trigger a univibrator circuit. The univibrator output is used as the signal to stop the counting, and when the reading time T ends, the counting system is reset. (authors)
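
    A small simulation of the staircase-comparison principle described above; the 1 mV step and four-digit limit are assumptions for illustration, not the instrument's actual specification.

    ```python
    def digital_voltmeter(v_in, v_step=0.001, max_count=9999):
        """Count oscillator pulses, convert the count to a staircase voltage,
        and stop counting when the staircase reaches the measured voltage.
        The 1 mV step and four-digit limit are assumed values."""
        count = 0
        staircase = 0.0
        while staircase < v_in and count < max_count:
            count += 1                    # one oscillator pulse counted
            staircase = count * v_step    # staircase voltage after this pulse
        return count                      # value shown on the 4-digit display

    print(digital_voltmeter(1.234))       # -> 1234, i.e. 1.234 V
    ```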

  4. Digital caliper

    Science.gov (United States)

    Cable, Louella E.

    1967-01-01

    The large number of measurements needed to describe fully the characteristics of biological specimens and other objects has always been tedious and time consuming. The work can be done much more rapidly and with greater accuracy with a digital caliper recently developed by us. The digital caliper is a new electronic instrument built to measure objects precisely throughout the range of 0.1 mm to 1.0 m. Calipers of several different discrete sizes make it possible to select the most convenient unit for the particular range of length and degree of accuracy desired.

  5. Digital literacies

    CERN Document Server

    Hockly, Nicky; Pegrum, Mark

    2014-01-01

    Dramatic shifts in our communication landscape have made it crucial for language teaching to go beyond print literacy and encompass the digital literacies which are increasingly central to learners' personal, social, educational and professional lives. By situating these digital literacies within a clear theoretical framework, this book provides educators and students alike with not just the background for a deeper understanding of these key 21st-century skills, but also the rationale for integrating these skills into classroom practice. This is the first methodology book to address not jus

  6. Digital communication

    CERN Document Server

    Das, Apurba

    2010-01-01

    ""Digital Communications"" presents the theory and application of the philosophy of Digital Communication systems in a unique but lucid form. This book inserts equal importance to the theory and application aspect of the subject whereby the authors selected a wide class of problems. The Salient features of the book are: the foundation of Fourier series, Transform and wavelets are introduces in a unique way but in lucid language; the application area is rich and resemblance to the present trend of research, as we are attached with those areas professionally; a CD is included which contains code

  7. Digital radiography

    International Nuclear Information System (INIS)

    Elander, S.

    1986-01-01

    The report deals with a project for the development of digital cerebral angiography competence in Norway. An IIS image processor and a DIGITAL VAX 11/750 were used for the processing of the X-ray pictures. The pictures were scanned on an OPTRON C4100 and photographed on a MATRIX INSTR. 3000 videoprinter. The highpass functions Laplace, Roberts, and Sobel were utilized to enhance edges. Further, the spatially variant contrast-stretch method WALLIS and the Local Adaptive Histogram Equalization (LAHE) from the SPIDER software package were applied
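
    As an illustration of the highpass edge-enhancement step mentioned above, a Sobel-based sharpening sketch might look as follows; the weighting is arbitrary and this is not the SPIDER implementation.

    ```python
    import numpy as np
    from scipy import ndimage

    def sobel_edge_enhance(image, weight=0.5):
        """Add a normalized Sobel gradient magnitude back to the image,
        loosely mirroring the highpass edge enhancement described above.
        The weight is an arbitrary illustrative choice."""
        img = image.astype(float)
        edges = np.hypot(ndimage.sobel(img, axis=0),
                         ndimage.sobel(img, axis=1))
        return img + weight * img.max() * edges / max(edges.max(), 1e-9)
    ```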

  8. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s

  9. Digital Radiography

    Science.gov (United States)

    1986-01-01

    System One, a digital radiography system, incorporates a reusable image medium (RIM) which retains an image. No film is needed; the RIM is read with a laser scanner, and the information is used to produce a digital image on an image processor. The image is stored on an optical disc. System allows the radiologist to "dial away" unwanted images to compare views on three screens. It is compatible with existing equipment and cost efficient. It was commercialized by a Stanford researcher from energy selective technology developed under a NASA grant.

  10. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), the statistical characteristics of an earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) changes appreciably after the time series is rearranged. This suggests that SOC theory should not be used to oppose the efforts of earthquake prediction
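
    The shuffle test the Letter describes can be sketched as follows: compute first-return times for events above a magnitude cutoff, rearrange the catalog at random, and compare the two interval distributions. The catalog arrays here are placeholders for the Southern California Earthquake Catalog.

    ```python
    import numpy as np

    def first_return_times(times, magnitudes, m_min):
        """Intervals T between successive events with magnitude >= m_min."""
        return np.diff(np.sort(times[magnitudes >= m_min]))

    def shuffle_test(times, magnitudes, m_min, seed=0):
        """Compare P_M(T) before and after randomly rearranging the catalog.

        Under SOC-style invariance the two interval distributions should
        match; a clear difference is the Letter's argument against SOC.
        """
        rng = np.random.default_rng(seed)
        original = first_return_times(times, magnitudes, m_min)
        shuffled = first_return_times(times, rng.permutation(magnitudes), m_min)
        return original, shuffled  # e.g. compare with a two-sample K-S test
    ```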

  11. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  12. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks are fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  13. Protracted fluvial recovery from medieval earthquakes, Pokhara, Nepal

    Science.gov (United States)

    Stolle, Amelie; Bernhardt, Anne; Schwanghart, Wolfgang; Andermann, Christoff; Schönfeldt, Elisabeth; Seidemann, Jan; Adhikari, Basanta R.; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver

    2016-04-01

    River response to strong earthquake shaking in mountainous terrain often entails the flushing of sediments delivered by widespread co-seismic landsliding. Detailed mass-balance studies following major earthquakes in China, Taiwan, and New Zealand suggest fluvial recovery times ranging from several years to decades. We report a detailed chronology of earthquake-induced valley fills in the Pokhara region of western-central Nepal, and demonstrate that rivers continue to adjust to several large medieval earthquakes to the present day, thus challenging the notion of transient fluvial response to seismic disturbance. The Pokhara valley features one of the largest and most extensively dated sedimentary records of earthquake-triggered sedimentation in the Himalayas, and independently augments paleo-seismological archives obtained mainly from fault trenches and historic documents. New radiocarbon dates from the catastrophically deposited Pokhara Formation document multiple phases of extremely high geomorphic activity between ~700 and ~1700 AD, preserved in thick sequences of alternating fluvial conglomerates, massive mud and silt beds, and cohesive debris-flow deposits. These dated fan-marginal slackwater sediments indicate pronounced sediment pulses in the wake of at least three large medieval earthquakes in ~1100, 1255, and 1344 AD. We combine these dates with digital elevation models, geological maps, differential GPS data, and sediment logs to estimate the extent of these three pulses that are characterized by sedimentation rates of ~200 mm yr⁻¹ and peak rates as high as 1,000 mm yr⁻¹. Some 5.5 to 9 km³ of material infilled the pre-existing topography, and is now prone to ongoing fluvial dissection along major canyons. Contemporary river incision into the Pokhara Formation is rapid (120-170 mm yr⁻¹), triggering widespread bank erosion, channel changes, and very high sediment yields of the order of 10³ to 10⁵ t km⁻² yr⁻¹, that by far outweigh bedrock denudation rates

  14. Earthquake fault superhighways

    Science.gov (United States)

    Robinson, D. P.; Das, S.; Searle, M. P.

    2010-10-01

    Motivated by the observation that the rare earthquakes which propagated for significant distances at supershear speeds occurred on very long straight segments of faults, we examine every known major active strike-slip fault system on land worldwide and identify those with long (> 100 km) straight portions capable not only of sustained supershear rupture speeds but having the potential to reach compressional wave speeds over significant distances, and call them "fault superhighways". The criteria used for identifying these are discussed. These superhighways include portions of the 1000 km long Red River fault in China and Vietnam passing through Hanoi, the 1050 km long San Andreas fault in California passing close to Los Angeles, Santa Barbara and San Francisco, the 1100 km long Chaman fault system in Pakistan north of Karachi, the 700 km long Sagaing fault connecting the first and second cities of Burma, Rangoon and Mandalay, the 1600 km Great Sumatra fault, and the 1000 km Dead Sea fault. Of the 11 faults so classified, nine are in Asia and two in North America, with seven located near areas of very dense populations. Based on the current population distribution within 50 km of each fault superhighway, we find that more than 60 million people today have increased seismic hazards due to them.

  15. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.
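
    As a worked check of the quoted numbers, combining the standard moment-magnitude relation with the Eshelby circular-crack stress-drop formula gives a source radius of a few millimetres for an M -6 event at a mid-range 3 MPa stress drop, consistent with the mm-scale patches described; the sketch below assumes only those two textbook relations.

    ```python
    import math

    def moment_from_magnitude(mw):
        """Seismic moment M0 in N*m from moment magnitude (standard relation
        M0 = 10**(1.5*Mw + 9.1))."""
        return 10 ** (1.5 * mw + 9.1)

    def source_radius(m0, stress_drop):
        """Radius of a circular crack from Eshelby's relation
        stress_drop = (7/16) * M0 / r**3, solved for r."""
        return (7.0 * m0 / (16.0 * stress_drop)) ** (1.0 / 3.0)

    m0 = moment_from_magnitude(-6.0)           # ~1.3 N*m
    print(source_radius(m0, 3e6) * 1e3, "mm")  # ~6 mm source patch
    ```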

  16. Overview of Historical Earthquake Document Database in Japan and Future Development

    Science.gov (United States)

    Nishiyama, A.; Satake, K.

    2014-12-01

    In Japan, damage and disasters from historical large earthquakes have been documented and preserved. Compilation of historical earthquake documents started in the early 20th century, and 33 volumes of historical document source books (about 27,000 pages) have been published. However, these source books are not used effectively by researchers, due to contamination by low-reliability historical records and the difficulty of keyword searching by characters and dates. To overcome these problems and to promote historical earthquake studies in Japan, construction of a text database started in the 21st century. For historical earthquakes from the beginning of the 7th century to the early 17th century, the "Online Database of Historical Documents in Japanese Earthquakes and Eruptions in the Ancient and Medieval Ages" (Ishibashi, 2009) has already been constructed. Its authors investigated the source books or original texts of the historical literature, emended the descriptions, and assigned a reliability to each historical document on the basis of its written age. Another database compiled the historical documents for seven damaging earthquakes that occurred along the Sea of Japan coast of Honshu, central Japan, in the Edo period (from the beginning of the 17th century to the middle of the 19th century) into a text database and a seismic intensity database; these are now available on the web (written only in Japanese). However, only about 9 % of the earthquake source books have been digitized so far. We therefore plan to digitize all of the remaining historical documents in a research program that started in 2014. The specification of the database will be similar to the previous ones. We also plan to combine this database with a liquefaction-traces database, which will be constructed by another research program, by adding the location information described in the historical documents. The constructed database would be utilized to estimate the distributions of seismic intensities and tsunami

  17. 15 CFR 950.5 - National Geophysical and Solar-Terrestrial Data Center (NGSDC).

    Science.gov (United States)

    2010-01-01

    .... Ionosphere data, including ionograms, frequency plots, riometer and field-strength strip charts, and... accelerograms; earthquake data list (events since January 1900); earthquake data service with updates on a...

  18. Digital Tidbits

    Science.gov (United States)

    Kumaran, Maha; Geary, Joe

    2011-01-01

    Technology has transformed libraries. There are digital libraries, electronic collections, online databases and catalogs, ebooks, downloadable books, and much more. With free technology such as social websites, newspaper collections, downloadable online calendars, clocks and sticky notes, online scheduling, online document sharing, and online…

  19. Digital Forensics

    Science.gov (United States)

    Harron, Jason; Langdon, John; Gonzalez, Jennifer; Cater, Scott

    2017-01-01

    The term forensic science may evoke thoughts of blood-spatter analysis, DNA testing, and identifying molds, spores, and larvae. A growing part of this field, however, is that of digital forensics, involving techniques with clear connections to math and physics. This article describes a five-part project involving smartphones and the investigation…

  20. Digital books.

    Science.gov (United States)

    Wink, Diane M

    2011-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes digital books.

  1. Digital Methods

    NARCIS (Netherlands)

    Rogers, R.

    2013-01-01

    In Digital Methods, Richard Rogers proposes a methodological outlook for social and cultural scholarly research on the Web that seeks to move Internet research beyond the study of online culture. It is not a toolkit for Internet research, or operating instructions for a software package; it deals

  2. Digital Humanities and networked digital media

    Directory of Open Access Journals (Sweden)

    Niels Ole Finnemann

    2014-10-01

    This article discusses digital humanities and the growing diversity of digital media, digital materials and digital methods. The first section describes the humanities computing tradition formed around the interpretation of computation as a rule-based process connected to a concept of digital materials centred on the digitisation of non-digital, finite works, corpora and oeuvres. The second section discusses “the big tent” of contemporary digital humanities. It is argued that there can be no unifying interpretation of digital humanities above the level of studying digital materials with the help of software-supported methods. This is so, in part, because of the complexity of the world and, in part, because digital media remain open to the projection of new epistemologies onto the functional architecture of these media. The third section discusses the heterogeneous character of digital materials and proposes that the study of digital materials should be established as a field in its own right.

  4. Measuring co-seismic deformation of the Sichuan earthquake by satellite differential INSAR

    Science.gov (United States)

    Zhang, Yonghong; Gong, Wenyu; Zhang, Jixian

    2008-12-01

    The Sichuan Earthquake of May 12, 2008 was the strongest earthquake to hit China since the 1976 Tangshan earthquake. It had a magnitude of M 8.0 and caused surface deformation of more than 3 meters. This paper presents work on measuring the co-seismic deformation of the earthquake with the satellite differential interferometric SAR technique. Four L-band SAR images were used to form the interferogram, with two pre-event scenes imaged on Feb 17, 2008 and two post-event scenes on May 19, 2008. Digital Elevation Models extracted from the 1:50,000-scale national geo-spatial database were used to remove the topographic contribution and form a differential interferogram. The interferogram shows very high coherence in most areas, although the pre- and post-event images were acquired 92 days apart; this indicates that the L-band PALSAR sensor is very powerful for interferometry applications. The baseline error is regarded as the main source of phase error in the differential interferogram. Because field work was difficult immediately after the earthquake, only one deformation measurement, recorded by a permanent GPS station, was available for this research. An approximation method is proposed to eliminate the orbital phase error using this one control point. The derived deformation map shows a spatial pattern and deformation magnitude similar to the deformation field generated by seismic inversion.
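
    With a single control point, the simplest orbital correction one can apply is a constant shift of the unwrapped deformation field; the sketch below illustrates that idea under stated assumptions and is not a reproduction of the paper's approximation method (function and array names are hypothetical).

    ```python
    import numpy as np

    # Hedged sketch of a single-control-point correction: with one GPS
    # measurement, the simplest approximation is to shift the unwrapped LOS
    # deformation field by a constant so it matches the GPS value at that pixel.
    # This is an illustration, not the paper's method; names are hypothetical.

    def correct_with_one_point(los_def, row, col, gps_los_value):
        """Remove a constant offset using one control point.

        los_def       : 2-D array of line-of-sight deformation from DInSAR (m)
        (row, col)    : pixel containing the permanent GPS station
        gps_los_value : GPS displacement projected into the radar LOS (m)
        """
        return los_def - (los_def[row, col] - gps_los_value)

    # Synthetic example: a field with a spurious constant bias of 5 cm.
    field = np.random.normal(0.0, 0.01, size=(100, 100)) + 0.05
    corrected = correct_with_one_point(field, 50, 50, 0.0)
    print(corrected[50, 50])  # matches the GPS value exactly at the control point
    ```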

  5. Indonesian earthquake: earthquake risk from co-seismic stress.

    Science.gov (United States)

    McCloskey, John; Nalbant, Suleyman S; Steacy, Sandy

    2005-03-17

    Following the massive loss of life caused by the Sumatra-Andaman earthquake in Indonesia and its tsunami, the possibility of a triggered earthquake on the contiguous Sunda trench subduction zone is a real concern. We have calculated the distributions of co-seismic stress on this zone, as well as on the neighbouring, vertical strike-slip Sumatra fault, and find an increase in stress on both structures that significantly boosts the already considerable earthquake hazard posed by them. In particular, the increased potential for a large subduction-zone event in this region, with the concomitant risk of another tsunami, makes the need for a tsunami warning system in the Indian Ocean all the more urgent.

  6. Using earthquake intensities to forecast earthquake occurrence times

    Directory of Open Access Journals (Sweden)

    J. R. Holliday

    2006-01-01

    It is well known that earthquakes do not occur randomly in space and time. Foreshocks, aftershocks, precursory activation, and quiescence are just some of the patterns recognized by seismologists. Using the Pattern Informatics technique along with relative intensity analysis, we create a scoring method based on time-dependent relative operating characteristic diagrams and show that the occurrences of large earthquakes in California correlate with time intervals where fluctuations in small earthquakes are suppressed relative to the long-term average. We estimate a probability of less than 1% that this coincidence is due to random clustering. Furthermore, we show that the methods used to obtain these results may be applicable to other parts of the world.

  7. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  8. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    Science.gov (United States)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred in Indonesia during the last decade. These experiences offer important lessons for people around the world who live in earthquake and tsunami country. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and the characteristic physical behavior of tsunamis near the coast. We study two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster, but only at a few scattered locations, so tsunami behavior elsewhere remained unknown. In this study, we tried to collect extensive information about tsunami behavior at many places and over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no dramatic photographs. To collect detailed information about the evacuation process, we devised an interview method that produces pictures of the tsunami experience drawn from the scenes of the victims' stories. In the 2004 Aceh case, no survivors knew about tsunami phenomena: because there had been no large tsunamigenic earthquakes in the Sumatra region for a hundred years, the public had no knowledge of tsunamis. This situation had greatly improved by the 2010 Mentawai event; TV programs and NGO or governmental education programs about tsunami evacuation are now widespread in Indonesia, and many people have basic knowledge of earthquake and tsunami disasters. We made a drill book based on the victims' stories, with painted scenes of the two events, and used it in a disaster-education event for a school committee in West Java. About 80% of the students and teachers evaluated the contents of the drill book as useful for correct understanding.

  9. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  10. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  11. Global Earthquake Hazard Distribution - Peak Ground Acceleration

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-Peak Ground Acceleration is a 2.5 by 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  12. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 by 2.5 minute global grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  13. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
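
    For readers unfamiliar with the construction, one standard route to such a distribution (not necessarily this paper's exact model) combines Poisson occurrence with an i.i.d. magnitude distribution:

    ```latex
    % One standard construction (not necessarily this paper's exact model):
    % magnitudes above a threshold m_0 are i.i.d. with CDF F_M, and the number
    % of events in time t is Poisson with rate \lambda. Then
    \[
      F_{M_{\max}}(m;t) = \exp\!\left[-\lambda t \left(1 - F_M(m)\right)\right],
    \]
    % and with the Gutenberg-Richter form F_M(m) = 1 - 10^{-b(m - m_0)} this
    % becomes the doubly exponential (Gumbel-type) distribution
    \[
      F_{M_{\max}}(m;t) = \exp\!\left[-\lambda t \, 10^{-b(m - m_0)}\right].
    \]
    ```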

  14. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  15. Global Earthquake Proportional Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Proportional Economic Loss Risk Deciles is a 2.5 minute grid of earthquake hazard economic loss as proportions of Gross Domestic Product (GDP) per...

  16. Global Earthquake Total Economic Loss Risk Deciles

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Total Economic Loss Risk Deciles is a 2.5 minute grid of global earthquake total economic loss risks. A process of spatially allocating Gross...

  17. Global Earthquake Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Mortality Risks and Distribution is a 2.5 minute grid of global earthquake mortality risks. Gridded Population of the World, Version 3 (GPWv3) data...

  18. Global Earthquake Mortality Risks and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  19. Global Earthquake Hazard Distribution - Peak Ground Acceleration

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Distribution-peak ground acceleration is a 2.5 minute grid of global earthquake hazards developed using Global Seismic Hazard Program...

  20. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    A lot of people die because of earthquakes every year, so it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about earthquake precursors. The relationships between earthquakes and the ionosphere are examined in order to guide new research toward novel prediction methods.

  1. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction, and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
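
    As a quick arithmetic check of the quoted figure, assuming typical crustal values of density and shear modulus (assumptions of this note, not values stated in the abstract):

    ```latex
    % Order-of-magnitude check of P/\mu at 10 km depth, assuming typical crustal
    % density \rho \approx 2.7\times10^{3} kg/m^3 and shear modulus
    % \mu \approx 3\times10^{10} Pa:
    \[
      P \approx \rho g h \approx (2.7\times10^{3})(9.8)(10^{4})
        \approx 2.6\times10^{8}\ \mathrm{Pa},
      \qquad
      \frac{P}{\mu} \approx \frac{2.6\times10^{8}}{3\times10^{10}} \approx 0.01 .
    \]
    ```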

  2. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  3. Associating an ionospheric parameter with major earthquake ...

    Indian Academy of Sciences (India)

    Over time, analysis of ionospheric variations has been gaining ground over lithospheric monitoring as a source of precursors for earthquake forecasting. The current paper highlights the association of major (Ms ≥ 6.0) and medium (4.0 ≤ Ms < 6.0) earthquake occurrences throughout the world with different ranges of the Ionospheric Earthquake ...

  4. Earthquakes: A Teacher's Package for K-6.

    Science.gov (United States)

    National Science Teachers Association, Washington, DC.

    Like rain, an earthquake is a natural occurrence which may be mild or catastrophic. Although an earthquake may last only a few seconds, the processes that cause it have operated within the earth for millions of years. Until recently, the cause of earthquakes was a mystery and the subject of fanciful folklore to people all around the world. This…

  5. Earthquakes in the New Zealand Region.

    Science.gov (United States)

    Wallace, Cleland

    1995-01-01

    Presents a thorough overview of earthquakes in New Zealand, discussing plate tectonics, seismic measurement, and historical occurrences. Includes 10 figures illustrating such aspects as earthquake distribution, intensity, and fissures in the continental crust. Tabular data includes a list of most destructive earthquakes and descriptive effects…

  6. Earthquakes in Zimbabwe | Clark | Zimbabwe Science News

    African Journals Online (AJOL)

    Earthquakes are one of the most destructive natural forces, in both human and economic terms. For example, since 1900, 10 earthquakes have occurred that each killed over 50 000 people. Earthquakes in modern industrialized areas can also be very costly, even if well designed and constructed buildings save many ...

  7. Can Dams and Reservoirs Cause Earthquakes?

    Indian Academy of Sciences (India)

    Indirect investigations of these regions are subject to inevitable multiple interpretations. Still, a measure of understanding about reservoir-induced earthquakes has been achieved. It is my aim to put the phenomenon in perspective on this basis. I saw the Koyna earthquake recorded. The Koyna earthquake of December 10, ...

  8. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  9. Technical features of a low-cost earthquake alert system

    International Nuclear Information System (INIS)

    Harben, P.

    1991-01-01

    The concept and features of an Earthquake Alert System (EAS) involving a distributed network of strong-motion sensors are discussed. The EAS analyzes real-time data telemetered to a central facility and issues an area-wide warning of a large earthquake in advance of the spreading elastic wave energy. As a low-cost alternative to the high estimated cost of installing and maintaining a dedicated EAS, we present a solution that makes use of existing microseismic stations. Using the San Francisco Bay area as an example, we show that existing US Geological Survey microseismic monitoring stations are of sufficient density to form the elements of a prototype EAS. By installing strong-motion instrumentation and a specially developed switching device, strong ground motion can be telemetered in real time to the central microseismic station on the existing communication channels. When a large earthquake occurs, a dedicated real-time central processing unit at the central microseismic station digitizes and analyzes the incoming data and issues a warning containing location and magnitude estimates. A 50-station EAS of this type in the San Francisco Bay area should cost under $70,000 to install and less than $5,000 annually to maintain
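
    The lead time such a system can offer follows from simple wave-speed arithmetic; the sketch below uses an assumed shear-wave speed and an assumed detection-plus-telemetry latency (illustrative numbers, not figures from the report).

    ```python
    # Hedged sketch of the warning-time arithmetic behind an EAS. The shear-wave
    # speed and the detection + telemetry + processing latency are illustrative
    # assumptions, not figures from the report.

    V_S = 3.5        # km/s, approximate damaging shear-wave speed
    LATENCY = 5.0    # s, assumed detection + telemetry + processing delay

    def warning_time(epicentral_distance_km):
        """Approximate lead time (s) at a site before the S waves arrive."""
        return epicentral_distance_km / V_S - LATENCY

    for d in (20, 50, 100):
        print(f"site at {d:3d} km: ~{warning_time(d):.0f} s of warning")
    # -> ~1 s at 20 km (essentially none), ~9 s at 50 km, ~24 s at 100 km
    ```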

  10. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M.J.S.; Linde, A.T.; Gladwin, M.T.; Borcherdt, R.D.

    1987-01-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10⁻⁸), with borehole dilatometers (resolution 10⁻¹⁰) and a 3-component borehole strainmeter (resolution 10⁻⁹). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure. © 1987.

  11. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10⁻⁸), with borehole dilatometers (resolution 10⁻¹⁰) and a 3-component borehole strainmeter (resolution 10⁻⁹). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  12. Becoming digital

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2015-01-01

    … government, and draws on empirical material generated through observations, field notes, interviews and policy documents. The material documents how service is performed by frontline agents in the ‘bureaucratic encounter’ with citizens who need assistance to use digital self-service in order to apply … online for a public benefit. Findings: The paper shows that e-government technology changes the mode of professionalism in citizen service from service to support. The paper gives an empirical account of recent Danish digital reforms and shows how the reforms both enable and constrain the work … of ‘becoming digital’ by frontline agents. Overall, the street-level bureaucrat’s classical tasks such as specialized casework are being displaced into promoting and educational tasks. An implication of this is blurred distinctions between the professional skills and personal competences of the frontline agent …

  13. Digital resources

    Directory of Open Access Journals (Sweden)

    Redazione Reti Medievali (a cura di

    2005-12-01

    Bibliotheca Latinitatis Mediaevalis (circa VII sec. - XIV sec. IntraText Digital Library [01/06] Corpus Scriptorum Latinorum. A digital library of Latin literature by David Camden [01/06] Fonti disponibili online concernenti la vita religiosa medievale Rete Vitae Religiosae Mediaevalis Studia Conectens [01/06] Fuentes del Medievo Hispanico Instituto de Historia, Consejo Superior de Investigaciones Científicas [01/06] Latin Literature Forum Romanum [01/06] Ludovico Antonio Muratori, Dissertazioni sopra le antichità italiane, 1751 Biblioteca dei Classici Italiani di Giuseppe Bonghi [01/06] Medieval Latin The Latin Library [01/06] Médiévales Presses Universitaires de Vincennes - Revues.org [01/06] Regesta imperii Deutsche Kommission für die Bearbeitung der Regesta Imperii e.V. [01/06] Suda On Line Byzantine Lexicography [01/06

  14. Digital produktion

    DEFF Research Database (Denmark)

    The book focuses on digital production, a powerful form of learning that facilitates students' learning processes and strengthens their academic learning outcomes. This happens when teachers develop didactic frame designs, within which students work independently, and where goals ...

  15. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  16. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  17. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Science.gov (United States)

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  18. Testing hypotheses of earthquake occurrence

    Science.gov (United States)

    Kagan, Y. Y.; Jackson, D. D.; Schorlemmer, D.; Gerstenberger, M.

    2003-12-01

    We present a relatively straightforward likelihood method for testing those earthquake hypotheses that can be stated as vectors of earthquake rate density in defined bins of area, magnitude, and time. We illustrate the method as it will be applied to the Regional Earthquake Likelihood Models (RELM) project of the Southern California Earthquake Center (SCEC). Several earthquake forecast models are being developed as part of this project, and additional contributed forecasts are welcome. Various models are based on fault geometry and slip rates, seismicity, geodetic strain, and stress interactions. We would test models in pairs, requiring that both forecasts in a pair be defined over the same set of bins. Thus we offer a standard "menu" of bins and ground rules to encourage standardization. One menu category includes five-year forecasts of magnitude 5.0 and larger. Forecasts would be in the form of a vector of yearly earthquake rates on a 0.05 degree grid at the beginning of the test. Focal mechanism forecasts, when available, would also be archived and used in the tests. The five-year forecast category may be appropriate for testing hypotheses of stress shadows from large earthquakes. Interim progress will be evaluated yearly, but final conclusions would be made on the basis of cumulative five-year performance. The second category includes forecasts of earthquakes above magnitude 4.0 on a 0.05 degree grid, evaluated and renewed daily. Final evaluation would be based on cumulative performance over five years. Other types of forecasts with different magnitude, space, and time sampling are welcome and will be tested against other models with shared characteristics. All earthquakes would be counted, and no attempt made to separate foreshocks, main shocks, and aftershocks. Earthquakes would be considered as point sources located at the hypocenter. For each pair of forecasts, we plan to compute alpha, the probability that the first would be wrongly rejected in favor of
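
    The core of such a paired test is a Poisson likelihood computed over the common bins; the sketch below shows that step under stated assumptions (illustrative rates and counts, and without the simulation needed to compute the alpha statistic described above).

    ```python
    import math

    # Hedged sketch of the core of a paired forecast test: each forecast is a
    # vector of expected earthquake counts per bin, and the observed catalog is
    # a vector of counts in the same bins. Rates and counts are illustrative.

    def poisson_log_likelihood(rates, counts):
        """Joint log-likelihood of observed bin counts under a rate forecast."""
        ll = 0.0
        for lam, k in zip(rates, counts):
            lam = max(lam, 1e-12)  # guard against zero forecast rates
            ll += k * math.log(lam) - lam - math.lgamma(k + 1)
        return ll

    rates_a = [0.2, 0.5, 0.1, 1.0]   # forecast A: expected counts per bin
    rates_b = [0.4, 0.3, 0.2, 0.9]   # forecast B: same bins
    observed = [0, 1, 0, 2]          # catalog counts in those bins

    print("A:", poisson_log_likelihood(rates_a, observed))
    print("B:", poisson_log_likelihood(rates_b, observed))
    # The forecast with the higher log-likelihood is preferred at this step.
    ```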

  19. Digital citizens Digital nations: the next agenda

    NARCIS (Netherlands)

    A.W. (Bert) Mulder; M.W. (Martijn) Hartog

    2015-01-01

    DIGITAL CITIZENS CREATE A DIGITAL NATION Citizens will play the lead role as they – in the next phase of the information society – collectively create a digital nation. Personal adoption of information and communication technology will create a digital infrastructure that supports individual and

  20. Digital Humanities and networked digital media

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2014-01-01

    This article discusses digital humanities and the growing diversity of digital media, digital materials and digital methods. The first section describes the humanities computing tradition formed around the interpretation of computation as a rule-based process connected to a concept of digital … of software-supported methods. This is so, in part, because of the complexity of the world and, in part, because digital media remain open to the projection of new epistemologies onto the functional architecture of these media. The third section discusses the heterogeneous character of digital materials …

  1. A critical history of British earthquakes

    OpenAIRE

    R. M. W. Musson

    2004-01-01

    This paper reviews the history of the study of historical British earthquakes. The publication of compendia of British earthquakes goes back as early as the late 16th Century. A boost to the study of earthquakes in Britain was given in the mid 18th Century as a result of two events occurring in London in 1750 (analogous to the general increase in earthquakes in Europe five years later after the 1755 Lisbon earthquake). The 19th Century saw a number of significant studies, culminating in th...

  2. Detection of collapsed buildings from lidar data due to the 2016 Kumamoto earthquake in Japan

    Science.gov (United States)

    Moya, Luis; Yamazaki, Fumio; Liu, Wen; Yamada, Masumi

    2018-01-01

    The 2016 Kumamoto earthquake sequence was triggered by an Mw 6.2 event at 21:26 on 14 April. Approximately 28 h later, at 01:25 on 16 April, an Mw 7.0 event (the mainshock) followed. The epicenters of both events were located near the residential area of Mashiki and affected the region nearby. Due to very strong seismic ground motion, the earthquake produced extensive damage to buildings and infrastructure. In this paper, collapsed buildings were detected using a pair of digital surface models (DSMs), taken before and after the 16 April mainshock by airborne light detection and ranging (lidar) flights. Different methods were evaluated to identify collapsed buildings from the DSMs. The change in average elevation within a building footprint was found to be the most important factor. Finally, the distribution of collapsed buildings in the study area was presented, and the result was consistent with that of a building damage survey performed after the earthquake.
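
    The elevation-change measure described can be sketched in a few lines; the version below is a hedged illustration rather than the authors' code, and the 1 m collapse threshold and array names are assumptions.

    ```python
    import numpy as np

    # Hedged sketch of the footprint-based change measure described above: for
    # each building footprint, average the post-minus-pre DSM heights and flag
    # the building as collapsed if the mean elevation dropped enough. The 1 m
    # threshold and the array names are illustrative assumptions.

    def mean_elevation_change(dsm_pre, dsm_post, footprint_mask):
        """Mean post-minus-pre elevation change (m) inside one footprint."""
        return float(np.mean(dsm_post[footprint_mask] - dsm_pre[footprint_mask]))

    def flag_collapsed(dsm_pre, dsm_post, footprints, drop_threshold=-1.0):
        """Indices of footprints whose mean elevation change fell below threshold."""
        return [i for i, mask in enumerate(footprints)
                if mean_elevation_change(dsm_pre, dsm_post, mask) < drop_threshold]

    # Tiny synthetic example: one building loses ~3 m of height, one is intact.
    pre = np.full((20, 20), 10.0)
    post = pre.copy()
    post[5:10, 5:10] -= 3.0
    masks = [np.zeros((20, 20), bool), np.zeros((20, 20), bool)]
    masks[0][5:10, 5:10] = True    # damaged building footprint
    masks[1][12:16, 12:16] = True  # intact building footprint
    print(flag_collapsed(pre, post, masks))  # -> [0]
    ```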

  3. ARMA models for earthquake ground motions. Seismic Safety Margins Research Program

    International Nuclear Information System (INIS)

    Chang, Mark K.; Kwiatkowski, Jan W.; Nau, Robert F.; Oliver, Robert M.; Pister, Karl S.

    1981-02-01

    This report contains an analysis of four major California earthquake records using a class of discrete linear time-domain processes commonly referred to as ARMA (Autoregressive/Moving-Average) models. It has been possible to analyze these different earthquakes, identify the order of the appropriate ARMA model(s), estimate parameters and test the residuals generated by these models. It has also been possible to show the connections, similarities and differences between the traditional continuous models (with parameter estimates based on spectral analyses) and the discrete models with parameters estimated by various maximum likelihood techniques applied to digitized acceleration data in the time domain. The methodology proposed in this report is suitable for simulating earthquake ground motions in the time domain and appears to be easily adapted to serve as inputs for nonlinear discrete time models of structural motions. (author)
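
    A time-domain ARMA simulation of the kind proposed is straightforward to sketch; the ARMA(2,1) coefficients below are illustrative placeholders rather than the values fitted in the report.

    ```python
    import numpy as np

    # Hedged sketch of time-domain simulation with a low-order ARMA model, as
    # proposed in the report for ground-motion generation. The ARMA(2,1)
    # coefficients are illustrative placeholders, not the report's fitted values.

    def simulate_arma(phi, theta, n, sigma=1.0, seed=0):
        """Simulate y_t = sum_i phi_i*y_{t-1-i} + e_t + sum_j theta_j*e_{t-1-j}."""
        rng = np.random.default_rng(seed)
        e = rng.normal(0.0, sigma, n)
        y = np.zeros(n)
        for t in range(n):
            ar = sum(p * y[t - 1 - i] for i, p in enumerate(phi) if t - 1 - i >= 0)
            ma = sum(q * e[t - 1 - j] for j, q in enumerate(theta) if t - 1 - j >= 0)
            y[t] = ar + e[t] + ma
        return y

    # A stationary ARMA(2,1) example; a time-varying envelope would normally be
    # applied on top to mimic the build-up and decay of real accelerograms.
    acc = simulate_arma(phi=[1.2, -0.5], theta=[0.4], n=2000, sigma=0.05)
    print(acc[:5])
    ```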

  4. PAGER-CAT: A composite earthquake catalog for calibrating global fatality models

    Science.gov (United States)

    Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.

    2009-01-01

    We have described the compilation and contents of PAGER-CAT, an earthquake catalog developed principally for calibrating earthquake fatality models. It brings together information from a range of sources in a comprehensive, easy to use digital format. Earthquake source information (e.g., origin time, hypocenter, and magnitude) contained in PAGER-CAT has been used to develop an Atlas of Shake Maps of historical earthquakes (Allen et al. 2008) that can subsequently be used to estimate the population exposed to various levels of ground shaking (Wald et al. 2008). These measures will ultimately yield improved earthquake loss models employing the uniform hazard mapping methods of ShakeMap. Currently PAGER-CAT does not consistently contain indicators of landslide and liquefaction occurrence prior to 1973. In future PAGER-CAT releases we plan to better document the incidence of these secondary hazards. This information is contained in some existing global catalogs but is far from complete and often difficult to parse. Landslide and liquefaction hazards can be important factors contributing to earthquake losses (e.g., Marano et al. unpublished). Consequently, the absence of secondary hazard indicators in PAGER-CAT, particularly for events prior to 1973, could be misleading to some users concerned with ground-shaking-related losses. We have applied our best judgment in the selection of PAGER-CAT's preferred source parameters and earthquake effects. We acknowledge the creation of a composite catalog always requires subjective decisions, but we believe PAGER-CAT represents a significant step forward in bringing together the best available estimates of earthquake source parameters and reports of earthquake effects. All information considered in PAGER-CAT is stored as provided in its native catalog so that other users can modify PAGER preferred parameters based on their specific needs or opinions. As with all catalogs, the values of some parameters listed in PAGER-CAT are

  5. Predecessors of the giant 1960 Chile earthquake.

    Science.gov (United States)

    Cisternas, Marco; Atwater, Brian F; Torrejón, Fernando; Sawai, Yuki; Machuca, Gonzalo; Lagos, Marcelo; Eipert, Annaliese; Youlton, Cristián; Salgado, Ignacio; Kamataki, Takanobu; Shishikura, Masanobu; Rajendran, C P; Malik, Javed K; Rizal, Yan; Husni, Muhammad

    2005-09-15

    It is commonly thought that the longer the time since last earthquake, the larger the next earthquake's slip will be. But this logical predictor of earthquake size, unsuccessful for large earthquakes on a strike-slip fault, fails also with the giant 1960 Chile earthquake of magnitude 9.5 (ref. 3). Although the time since the preceding earthquake spanned 123 years (refs 4, 5), the estimated slip in 1960, which occurred on a fault between the Nazca and South American tectonic plates, equalled 250-350 years' worth of the plate motion. Thus the average interval between such giant earthquakes on this fault should span several centuries. Here we present evidence that such long intervals were indeed typical of the last two millennia. We use buried soils and sand layers as records of tectonic subsidence and tsunami inundation at an estuary midway along the 1960 rupture. In these records, the 1960 earthquake ended a recurrence interval that had begun almost four centuries before, with an earthquake documented by Spanish conquistadors in 1575. Two later earthquakes, in 1737 and 1837, produced little if any subsidence or tsunami at the estuary and they therefore probably left the fault partly loaded with accumulated plate motion that the 1960 earthquake then expended.

  6. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  7. Earthquake design for controlled structures

    Directory of Open Access Journals (Sweden)

    Nikos G. Pnevmatikos

    2017-04-01

    An alternative design philosophy is described for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to respond elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and design forces. The maximum capacity of the control devices is then compared to the required control force. If the capacity of the control devices is larger than the required control force, the control devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, a scale factor, α, reducing the elastic forces to new design forces is calculated. The structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost, excluding that of buying and installing the control devices.
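
    The acceptance-or-rescaling logic described above can be summarized in a short sketch; the force values and device capacity below are illustrative assumptions, not numbers from the paper.

    ```python
    # Hedged sketch of the acceptance-or-rescaling procedure described above.
    # Force values and device capacity are illustrative, not from the paper.

    def design_with_control(f_elastic, f_design, device_capacity):
        """Return (structural design force, reduction factor alpha on elastic forces)."""
        f_control_required = f_elastic - f_design   # control system's share
        if device_capacity >= f_control_required:
            # Devices can carry the difference: keep the code design forces.
            return f_design, f_design / f_elastic
        # Otherwise the structure must carry whatever the devices cannot.
        f_design_new = f_elastic - device_capacity
        alpha = f_design_new / f_elastic             # scale factor on elastic forces
        return f_design_new, alpha

    # Example: 1000 kN elastic demand, 250 kN code design force, 500 kN devices.
    print(design_with_control(1000.0, 250.0, 500.0))  # -> (500.0, 0.5)
    ```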

  8. Control rod behaviour in earthquakes

    International Nuclear Information System (INIS)

    Kawakami, S.; Akiyama, H.; Shibata, H.; Watabe, M.; Ichikawa, T.; Fujita, K.

    1990-01-01

    For some years the Japanese have been working on a major research programme to determine the likely effects of an earthquake on nuclear plant internals. One aspect of this was a study of the behaviour of Pressurized Water Reactor control rods as they are being inserted in the core, which is reported here. (author)

  9. Building Resilient Mountain Communities: Earthquake ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2015-04-25

    A powerful magnitude-7.8 earthquake hit central Nepal on April 25, 2015, causing over 8,700 deaths and more than 22,000 injuries. Hundreds of thousands of homes were flattened; some 15,000 government buildings and 288,797 residential buildings were destroyed, along with some 8,000 schools and 1,000 health ...

  10. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.

  11. Earthquake swarms in South America

    Science.gov (United States)

    Holtkamp, S. G.; Pritchard, M. E.; Lohman, R. B.

    2011-10-01

    We searched for earthquake swarms in South America between 1973 and 2009 using the global Preliminary Determination of Epicenters (PDE) catalogue. Seismicity rates vary greatly over the South American continent, so we employ a manual search approach that aims to be insensitive to spatial and temporal scales or to the number of earthquakes in a potential swarm. We identify 29 possible swarms involving 5-180 earthquakes each (with total swarm moment magnitudes between 4.7 and 6.9) within a range of tectonic and volcanic locations. Some of the earthquake swarms on the subduction megathrust occur as foreshocks and delineate the limits of main shock rupture propagation for large earthquakes, including the 2010 Mw 8.8 Maule, Chile and 2007 Mw 8.1 Pisco, Peru earthquakes. Also, subduction megathrust swarms commonly occur at the location of subduction of aseismic ridges, including areas of long-standing seismic gaps in Peru and Ecuador. The magnitude-frequency relationship of swarms we observe appears to agree with previously determined magnitude-frequency scaling for swarms in Japan. We examine geodetic data covering five of the swarms to search for an aseismic component. Only two of these swarms (at Copiapó, Chile, in 2006 and near Ticsani Volcano, Peru, in 2005) have suitable satellite-based Interferometric Synthetic Aperture Radar (InSAR) observations. We invert the InSAR geodetic signal and find that the ground deformation associated with these swarms does not require a significant component of aseismic fault slip or magmatic intrusion. Three swarms in the vicinity of the volcanic arc in southern Peru appear to be triggered by the Mw = 8.5 2001 Peru earthquake, but predicted static Coulomb stress changes due to the main shock were very small at the swarm locations, suggesting that dynamic triggering processes may have had a role in their occurrence. Although we identified few swarms in volcanic regions, we suggest that particularly large volcanic swarms (those that

  12. Digital fluorimeter

    International Nuclear Information System (INIS)

    Mello, H.A. de.

    1980-11-01

    The specifications of a digital fluorimeter are described; with adequate analytical techniques, it permits the determination of trace amounts of fluorescent materials in samples. The fluorimeter is of the reflection type; it uses fluorescent lamps for excitation and an optical system, coupled to a photomultiplier, that permits the measurement of the light intensity. At IEN (Instituto de Engenharia Nuclear) the equipment is used to determine the uranium content in sample materials to be exported. The precision of the instrument is about ±1% on the 0.1 scale, the one normally used in current research. (E.G.)

  13. Earthquake sources near Uturuncu Volcano

    Science.gov (United States)

    Keyson, L.; West, M. E.

    2013-12-01

    Uturuncu, located in southern Bolivia near the border with Chile and Argentina, is a dacitic volcano that was last active 270 ka. It is part of the Altiplano-Puna Volcanic Complex, which spans 50,000 km² and comprises a series of ignimbrite flare-ups since ~23 Ma. Two sets of evidence suggest that the region is underlain by a significant magma body. First, seismic velocities show a low velocity layer consistent with a magmatic sill below depths of 15-20 km. This inference is corroborated by high electrical conductivity between 10 km and 30 km. This magma body, the so-called Altiplano-Puna Magma Body (APMB), is the likely source of volcanic activity in the region. InSAR studies show that during the 1990s, the volcano experienced an average uplift of about 1 to 2 cm per year. The deformation is consistent with an expanding source at depth. Though the Uturuncu region exhibits high rates of crustal seismicity, any connection between the inflation and the seismicity is unclear. We investigate the root causes of these earthquakes using a temporary network of 33 seismic stations - part of the PLUTONS project. Our primary approach is based on hypocenter locations and magnitudes paired with correlation-based relative relocation techniques. We find a strong tendency toward earthquake swarms that cluster in space and time. These swarms often last a few days and consist of numerous earthquakes with similar source mechanisms. Most seismicity occurs in the top 10 kilometers of the crust and is characterized by well-defined phase arrivals and significant high frequency content. The frequency-magnitude relationship of this seismicity demonstrates b-values consistent with tectonic sources. There is a strong clustering of earthquakes around the Uturuncu edifice. Earthquakes elsewhere in the region align in bands striking northwest-southeast, consistent with regional stresses.

  14. An open repository of earthquake-triggered ground-failure inventories

    Science.gov (United States)

    Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.

    2017-12-20

    Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.

  15. Effective Use of Earthquake Data,

    Science.gov (United States)

    1983-01-01

    … exchange of digital data should be used. [Item needs immediate action.] … The National Research Council Committee on Seismology should … the … collections be kept indefinitely as part of an active, accessible national seismological data base. Specifically, with regard to the analog and digital magnetic

  16. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity is a special concern. Therefore, in the design of nuclear power plants safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes are naturally incorporated in the safety review. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the damage they cause can be remarkably large. In nuclear plants, damage to the facilities must be prevented and their functions maintained at the time of an earthquake. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of future earthquakes, the properties of the ground, and the position of the plant should be examined. After the site has been decided, the design earthquake is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification of components according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads, and the allowable stresses are explained. (Kako, I.)

  17. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley, which makes it a good candidate for studying what social networks, wearable devices, and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the proportion of people publishing tweets and the proportion visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up shown by the Jawbone data. If so, this supports the premise that all of these methods provide a reliable image of the relative proportion of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, as was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking, as observed through Twitter and EMSC website analysis, is tool-specific or whether it reflects people's actual reactions.

  18. Estimation of source parameters of Chamoli Earthquake, India

    Indian Academy of Sciences (India)

    R. Narasimhan, Krishtel eMaging Solutions

    experienced two more devastating earthquakes of magnitude greater than 6.0 in the last decade, namely the Uttarkashi earthquake in 1991 and the Chamoli earthquake in 1999 (Rajendran et al 2000; Rastogi 2000). The effects of these earthquakes were felt up to approximately 300 km away, in the city of Delhi. In the recent earthquake ...

  19. Regulatory point of view on Hengchun earthquake

    International Nuclear Information System (INIS)

    Niu, H.C.; Hsu, M.T.; Chen, Y.B.

    2008-01-01

    On the night of December 26th, 2006, a series of earthquakes struck the Hengchun area where Maanshan NPS (MNPS) is located. Two main earthquakes with magnitudes of 7.0 (Richter scale) occurred at 20:26 and 20:34, respectively. The epicenter of the 20:34 earthquake, which was closer to the seashore than that of the 20:26 earthquake, was located 33.5 km west of MNPS at a depth of 50.2 km below the surface. Before the earthquake, both MNPS units were operating at rated power. The unit no. 2 operators tripped the reactor manually due to high-vibration alarms from the reactor coolant pumps and main turbine. While the unit no. 1 operators had decided to take the same action, the intensity of the earthquake diminished, so the shift supervisor decided to keep unit no. 1 in operation. The maximum peak ground acceleration recorded by the MNPS seismic monitoring system was 0.16 g, which was still below the MNPS seismic design bases, the safe shutdown earthquake (SSE: 0.4 g) and the operating basis earthquake (OBE: 0.2 g). The post-earthquake inspection of both units showed that there was no major damage to any SSCs. It was nevertheless the strongest earthquake ever recorded at a Taiwanese NPS site since 1978, when the first nuclear power station declared commercial operation. From the regulatory point of view, it is important to take account of the experience and lessons learned from the Hengchun earthquake. In particular, the training requirements for operators and the standard operating procedures during and after an earthquake need to be re-evaluated to enhance the ability to prevent hazards during an earthquake event. (author)

  20. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) is still, 25 years on, one of the most painful events dug into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured, and 3000 families who lost their homes). Nowadays, the most frequent and important question that arises is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo including economic and social losses. Overall, the earthquake risk assessment clearly indicates that the losses and damage may increase two- or threefold in Cairo compared to the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  1. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) the remaining risk. The guiding principle of the revision required a new evaluation method for earthquake vibration using fault models, and an evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are endangered when an earthquake stronger than the design earthquake occurs; accordingly, this scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulses of the 1995 Hyogo-Nanbu earthquake, the relation between the lengths of surface earthquake faults and hypocentral faults, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  2. Analysis of the enhanced negative correlation between electron density and electron temperature related to earthquakes

    Science.gov (United States)

    Shen, X. H.; Zhang, X.; Liu, J.; Zhao, S. F.; Yuan, G. P.

    2015-04-01

    Ionospheric perturbations in plasma parameters have been observed before large earthquakes, but the correlation between different parameters has been less studied in previous research. The present study focuses on the relationship between electron density (Ne) and temperature (Te) observed by the DEMETER (Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions) satellite during local nighttime, in which a positive correlation has been revealed near the equator and a weak correlation at mid and low latitudes over both hemispheres. Based on this normal background analysis, the negative correlation with the lowest percentage among all Ne and Te points is studied before and after large earthquakes at mid and low latitudes. The multiparameter observations exhibited typical synchronous disturbances before the Chile M8.8 earthquake in 2010 and the Pu'er M6.4 earthquake in 2007, and Te varied inversely with Ne over the epicentral areas. Moreover, a statistical analysis has been done by selecting the orbits within a distance of 1000 km and ±7 days before and after global earthquakes. Enhanced negative correlation coefficients between Ne and Te, lower than -0.5, are found, with 42% of such points connected with earthquakes. The correlation median values at different seismic levels show a clear decrease for earthquakes larger than magnitude 7. Finally, the electric-field-coupling model is discussed; furthermore, a digital simulation has been carried out with SAMI2 (Sami2 is Another Model of the Ionosphere), which illustrates that an external electric field in the ionosphere can strengthen the negative correlation between Ne and Te at a lower latitude relative to the disturbed source due to the effects of the geomagnetic field. Although seismic activity is not the only source that can cause inverse Ne-Te variations, the present results demonstrate one possibly useful tool in seismo-electromagnetic anomaly differentiation, and a comprehensive analysis with multiple parameters helps to
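
    The correlation measure at the core of this record is easy to reproduce. Below is a minimal sketch, not the authors' pipeline: it computes the Pearson correlation between Ne and Te samples along an orbit segment and flags the enhanced negative correlation (coefficient below -0.5) quoted in the abstract; the synthetic numbers stand in for real DEMETER plasma data.

        import numpy as np

        def ne_te_correlation(ne, te, threshold=-0.5):
            # Pearson correlation between electron density and temperature;
            # the flag is True when the correlation is more negative than the
            # -0.5 threshold quoted in the abstract.
            r = np.corrcoef(ne, te)[0, 1]
            return r, r < threshold

        # Synthetic orbit segment in which Te varies inversely with Ne.
        rng = np.random.default_rng(0)
        ne = rng.lognormal(mean=23.0, sigma=0.3, size=200)    # ~1e10 m^-3
        te = 3000.0 - 1e-7 * ne + rng.normal(0.0, 50.0, 200)  # Kelvin

        r, anomalous = ne_te_correlation(ne, te)
        print(f"r = {r:.2f}, enhanced negative correlation: {anomalous}")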

  3. Digital demodulator

    Science.gov (United States)

    Shull, T. A. (Inventor)

    1982-01-01

    A digital demodulator for converting pulse code modulated data from phase shift keyed (PSK) form to non-return-to-zero (NRZ) and to biphase data is described. The demodulator is composed of standard integrated logic circuits. The key to the demodulation function is a pair of cross-coupled one-shot multivibrators which, together with a flip-flop, produce the NRZ-L data. When only NRZ-L output is required, the circuitry is greatly simplified and the 2× bit-rate constraint can be removed from the carrier. A flip-flop, an OR gate, an AND gate, and a binary counter generate the bit-rate clock (BTCK) for the NRZ-L data. The remainder of the circuitry converts the NRZ-L data and BTCK into biphase data. The device was designed for use in the space shuttle bay environment measurements.

  4. Focus: Digital

    DEFF Research Database (Denmark)

    Technology has been an all-important and defining element within the arts throughout the 20th century, and it has fundamentally changed the ways in which we produce and consume music. With this Focus we investigate the latest developments in the digital domain – their pervasiveness and rapid … production and reception of contemporary music and sound art. With ‘Digital’ we present four composers' very different answers to how technology impacts their work. To Juliana Hodkinson it has become an integral part of her sonic writing. Rudiger Meyer analyses the relationships between art and design and how technology affects our habits of consumption. Risto Holopainen presents a notion of autonomous instruments and automated composition that, in the end, cannot escape the human, while Jøren Rudi reflects on aesthetic elements and artistic approaches to sound in computer games. This focus is edited by Sanne

  5. Digital radiography

    DEFF Research Database (Denmark)

    Precht, H; Gerke, O; Rosendahl, K

    2012-01-01

    BACKGROUND: New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults … with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provide an objective image-quality assessment. RESULTS: Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. Software impact on image quality was found significant for dose (mAs), dynamic range dark region, and frequency band. CONCLUSION: By optimizing …

  6. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    Science.gov (United States)

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  7. Great East Japan Earthquake Tsunami

    Science.gov (United States)

    Iijima, Y.; Minoura, K.; Hirano, S.; Yamada, T.

    2011-12-01

    The 11 March 2011, Mw 9.0 Great East Japan Earthquake, already among the most destructive earthquakes in modern history, emanated from a fault rupture that extended an estimated 500 km along the Pacific coast of Honshu. This earthquake is the fourth among the five strongest temblors since AD 1900 and the largest in Japan since modern instrumental recording began 130 years ago. The earthquake triggered a huge tsunami, which invaded the seaside areas of the Pacific coast of East Japan, causing devastating damage along the coast. Artificial structures were destroyed and planted forests were thoroughly eroded. The inrush of turbulent flows washed over backshore areas and dunes. Coastal materials including beach sand were transported into inland areas by the run-up currents. Just after the occurrence of the tsunami, we started a field investigation measuring the thickness and distribution of the sediment layers left by the tsunami and the inundation depth of water in the Sendai plain. Ripple marks showing the direction of sediment transport were an important object of observation. We used a soil auger for collecting sediments in the field, and sediment samples were submitted for analysis of grain size and interstitial water chemistry. Satellite images and aerial photographs are very useful for estimating the hydrogeological effects of tsunami inundation. We checked the correspondence of micro-topography, vegetation, and sediment cover before and after the tsunami. The most conspicuous phenomenon is the damage to pine forests planted for the purpose of preventing sand shifting. About ninety-five percent of the vegetation coverage was lost during the period of rapid currents following the first wave. The landward slopes of seawalls were mostly damaged and destroyed. Some aerial photographs preserve detailed records of wave destruction just behind seawalls, which shows the occurrence of supercritical flows. The large-scale erosion of the backshore behind seawalls is interpreted to have been caused by

  8. The 2016 Kumamoto earthquake sequence.

    Science.gov (United States)

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  9. Earthquake lights and rupture processes

    Directory of Open Access Journals (Sweden)

    T. V. Losseva

    2005-01-01

    Full Text Available A physical model of earthquake lights is proposed. It is suggested that magnetic diffusion from the source region of the electric and magnetic fields is the dominant process, explaining the rather high localization of the light flashes. A 3D numerical code has been developed that takes into account an arbitrary distribution of currents caused by ground motion, and the conductivity in the ground and at its surface, including the existence of sea water above the epicenter or (and) near the ruptured segments of the fault. Simulations for the 1995 Kobe earthquake were conducted taking into account the existence of sea water with a realistic geometry of the shores. The results do not contradict the eyewitness reports and the scarce measurements of the electric and magnetic fields at large distances from the epicenter.

  10. The 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  11. Digital work in a digitally challenged organization

    NARCIS (Netherlands)

    Davison, R.M.; Ou, Carol

    Digitally literate employees are accustomed to having free access to digital media technologies. However, some organizations enact information technology (IT) governance structures that explicitly proscribe access to these technologies, resulting in considerable tension between employees and the

  12. Dim prospects for earthquake prediction

    Science.gov (United States)

    Geller, Robert J.

    I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: “I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a ‘proven nonscience’ [Geller, 1997a] is a paradigm for others to copy.” Readers are invited to verify for themselves that neither “proven nonscience” nor any similar phrase was used by Geller [1997a].

  13. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors, and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  14. On the plant operators performance during earthquake

    International Nuclear Information System (INIS)

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There is little data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness of switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if the chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required for them. Although there were differences between the simulated earthquake motions used and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers

  15. Earthquake Hazard Mitigation Strategy in Indonesia

    Science.gov (United States)

    Karnawati, D.; Anderson, R.; Pramumijoyo, S.

    2008-05-01

    Because of the active tectonic setting of the region, the risks of geological hazards inevitably increase in the Indonesian archipelago and other Asian countries. Encouraging communities living in vulnerable areas to adapt to the geological conditions will be the most appropriate strategy for earthquake risk reduction. Updating the earthquake hazard maps, enhancement of existing land-use management, establishment of public education strategies and methods, strengthening linkages among the stakeholders of disaster mitigation institutions, as well as establishment of continuous public consultation are the main strategic programs for community resilience in earthquake-vulnerable areas. This paper highlights some important achievements of earthquake hazard mitigation programs in Indonesia, together with the difficulties in implementing such programs. Case examples of the Yogyakarta and Bengkulu earthquake mitigation efforts will also be discussed as lessons learned. A new approach to developing earthquake hazard maps, initiated by mapping the psychological aspects of the people living in vulnerable areas, will be addressed as well.

  16. Digital Marketer: Facing Digital Marketing Opportunities

    OpenAIRE

    NEGRICEA, Costel Iliuta; PURCAREA, Ioan Matei

    2015-01-01

    We are witnessing the emergence of new ecosystems thanks to digital disruption, with marketers challenged to bring marketing operations into the digital era, enhance the customer journey, and shift consumer behavior with the help of digital tools, while actively encouraging feedback from users and building a circle of trust with the company's audience. Recent findings showed clear differences between consumers' preferences and what marketers say they're doing with digital technology. Respo...

  17. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    OpenAIRE

    Trugman, Daniel T

    2017-01-01

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the dynamical source properties of small and large earthquakes obey self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical feature...

  18. 3. Waveform and Spectral Features of Earthquake Swarms and Foreshocks: in Special Reference to Earthquake Prediction

    OpenAIRE

    Tsujiura, Masaru

    1983-01-01

    Through the analyses of waveforms and spectra for earthquake swarm, foreshock, and ordinary seismic activities, some differences in the activity mode are found among those activities. The most striking difference is the “similarity of waveform”. The earthquake swarm activity which occurred in a certain short time interval mainly consists of events with similar waveforms, belonging to the event group called “similar earthquakes” or an “earthquake family”. On the other hand, the foresh...

  19. Local earthquake tomography of Scotland

    Science.gov (United States)

    Luckett, Richard; Baptie, Brian

    2015-03-01

    Scotland is a relatively aseismic region in which to apply local earthquake tomography, but 40 yr of earthquakes recorded by a good and growing network make it possible. A careful selection is made from the earthquakes located by the British Geological Survey (BGS) over the last four decades to provide a data set maximising arrival-time accuracy and ray-path coverage of Scotland. A large number of 1-D velocity models with different layer geometries are considered and differentiated by employing quarry blasts as ground-truth events. Then, SIMULPS14 is used to produce a robust 3-D tomographic P-wave velocity model for Scotland. In areas of high resolution the model shows good agreement with previously published interpretations of seismic refraction and reflection experiments. However, the model shows relatively little lateral variation in seismic velocity except at shallow depths, where sedimentary basins such as the Midland Valley are apparent. At greater depths, higher velocities in the northwest parts of the model suggest that the thickness of the crust increases towards the south and east. This observation is also in agreement with previous studies. Quarry blasts used as ground-truth events and relocated with the preferred 3-D model are shown to be markedly more accurate than when located with the existing BGS 1-D velocity model.
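
    The 1-D model selection step that precedes the 3-D inversion can be sketched compactly. The following is an illustrative toy, not the BGS/SIMULPS14 workflow: it grid-searches a two-layer crustal model against first-arrival picks from a quarry blast with known location and origin time (all picks and search ranges are invented for the example).

        import numpy as np

        def first_arrival(x, v1, v2, h):
            # First P arrival at offset x (km) for a layer of velocity v1 and
            # thickness h over a half-space of velocity v2 > v1, with source
            # and receiver at the surface: direct wave vs. head wave.
            t_direct = x / v1
            t_head = x / v2 + 2.0 * h * np.sqrt(1.0 / v1**2 - 1.0 / v2**2)
            return np.minimum(t_direct, t_head)

        # Hypothetical quarry-blast picks: offsets (km) and travel times (s).
        offsets = np.array([5.0, 20.0, 50.0, 90.0, 140.0])
        picks = np.array([0.86, 3.45, 8.05, 14.2, 21.9])

        best = None
        for v1 in np.arange(5.4, 6.2, 0.1):
            for v2 in np.arange(6.3, 7.1, 0.1):
                for h in np.arange(2.0, 12.0, 1.0):
                    rms = np.sqrt(np.mean((picks - first_arrival(offsets, v1, v2, h)) ** 2))
                    if best is None or rms < best[0]:
                        best = (rms, v1, v2, h)

        print("best rms %.3f s at v1=%.1f, v2=%.1f km/s, h=%.0f km" % best)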

  20. Pre-earthquake Magnetic Pulses

    Science.gov (United States)

    Scoville, J.; Heraud, J. A.; Freund, F. T.

    2015-12-01

    A semiconductor model of rocks is shown to describe unipolar magnetic pulses, a phenomenon that has been observed prior to earthquakes. These pulses are suspected to be generated deep in the Earth's crust, in and around the hypocentral volume, days or even weeks before earthquakes. Their extremely long wavelength allows them to pass through kilometers of rock. Interestingly, when the sources of these pulses are triangulated, the locations coincide with the epicenters of future earthquakes. We couple a drift-diffusion semiconductor model to a magnetic field in order to describe the electromagnetic effects associated with electrical currents flowing within rocks. The resulting system of equations is solved numerically and it is seen that a volume of rock may act as a diode that produces transient currents when it switches bias. These unidirectional currents are expected to produce transient unipolar magnetic pulses similar in form, amplitude, and duration to those observed before earthquakes, and this suggests that the pulses could be the result of geophysical semiconductor processes.

  1. Earthquake-protective pneumatic foundation

    Science.gov (United States)

    Shustov, Valentin

    2000-04-01

    The main objective of the research in progress is to evaluate the applicability of an innovative earthquake-protective system called a pneumatic foundation to building construction and industrial equipment. The system represents a kind of seismic soil isolation. The research is analytical and accompanied by limited testing on a shake table. The concept of partial suppression of the seismic energy flow into a structure is known as seismic or base isolation. Normally, this technique needs pads to be inserted into all major load-carrying elements in the base of the building. It also requires creating additional rigidity diaphragms in the basement and a moat around the building, as well as making additional provisions against overturning and/or the P-Δ effect. Besides, the potential benefits of base isolation techniques should not be taken for granted: they depend on many internal and external factors. The author developed a new earthquake-protective technique called a pneumatic foundation. Its main components are a horizontal protective layer located under the footing at a certain depth, and a vertical one installed along the perimeter of the horizontal protective layer. The first experiments proved a sizable screening effect of the pneumatic foundation: two identical models of a steel frame building, put simultaneously on the same vibrating support simulating an earthquake, performed in strikingly different manners: while the regular building model shook vigorously, the model on a pneumatic foundation only slightly trembled.

  2. USNA DIGITAL FORENSICS LAB

    Data.gov (United States)

    Federal Laboratory Consortium — To enable Digital Forensics and Computer Security research and educational opportunities across majors and departments. Lab Mission: Establish and maintain a Digital...

  3. Understanding Great Earthquakes in Japan's Kanto Region

    Science.gov (United States)

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  4. If pandas scream, an earthquake is coming

    Energy Technology Data Exchange (ETDEWEB)

    Magida, P.

    Feature article: The use of animal behavior to predict weather has spanned several ages and dozens of countries. While animals may behave in diverse ways to indicate weather changes, they all tend to behave in more or less the same way before earthquakes. The geophysical community in the U.S. has begun testing animal behavior before earthquakes. It has been determined that animals have the potential to act as accurate geosensors for detecting earthquakes before they occur. (5 drawings)

  5. Social Media as Seismic Networks for the Earthquake Damage Assessment

    Science.gov (United States)

    Meletti, C.; Cresci, S.; La Polla, M. N.; Marchetti, A.; Tesconi, M.

    2014-12-01

    The growing popularity of online platforms based on user-generated content is gradually creating a digital world that mirrors the physical world. In the paradigm of crowdsensing, the crowd becomes a distributed network of sensors that allows us to understand real-life events at a quasi-real-time rate. The SoS-Social Sensing project [http://socialsensing.it/] exploits opportunistic crowdsensing, involving users in the sensing process in a minimal way, for social-media emergency management, in order to obtain a very fast, but still reliable, assessment of the dimension of the emergency to be faced. First of all, we designed and implemented a decision support system for the detection and damage assessment of earthquakes. Our system exploits the messages shared in real time on Twitter. In the detection phase, data mining and natural language processing techniques are first adopted to select meaningful and comprehensive sets of tweets. We then applied a burst detection algorithm in order to promptly identify outbreaking seismic events. Using georeferenced tweets and reported locality names, a rough epicentral determination is also possible. The results, compared to official Italian INGV reports, show that the system is able to detect, within seconds, events of a magnitude in the region of 3.5 with a precision of 75% and a recall of 81.82%. We then focused our attention on the damage assessment phase. We investigated the possibility of exploiting social media data to estimate earthquake intensity. We designed a set of predictive linear models and evaluated their ability to map the intensity of worldwide earthquakes. The models build on a dataset of almost 5 million tweets, exploited to compute our earthquake features, and data from more than 7,000 globally distributed earthquakes, acquired in a semi-automatic way from USGS and serving as ground truth. We extracted 45 distinct features falling into four categories: profile, tweet, time, and linguistic. We run diagnostic tests and
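
    The detection phase hinges on a burst detector running over the tweet stream. The sketch below is a simplified stand-in for the project's algorithm (the window length, the threshold, and the synthetic Poisson stream are all assumptions): it flags a burst onset when the per-second count jumps well above the trailing baseline.

        import numpy as np

        def detect_bursts(counts, window=60, k=5.0):
            # Flag burst onsets in a per-second count series: a burst starts
            # when the current count exceeds the trailing-window mean by k
            # standard deviations.
            counts = np.asarray(counts, dtype=float)
            onsets = []
            for t in range(window, len(counts)):
                base = counts[t - window:t]
                mu, sigma = base.mean(), base.std() + 1e-9
                if counts[t] > mu + k * sigma and (not onsets or t - onsets[-1] > window):
                    onsets.append(t)
            return onsets

        # Synthetic stream: background chatter, then a felt event at t = 300 s.
        rng = np.random.default_rng(1)
        stream = rng.poisson(2.0, 600)
        stream[300:330] += rng.poisson(40.0, 30)
        print("burst onsets (s):", detect_bursts(stream))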

  6. Methodology to determine the parameters of historical earthquakes in China

    Science.gov (United States)

    Wang, Jian; Lin, Guoliang; Zhang, Zhe

    2017-12-01

    China is one of the countries with the longest cultural tradition. At the same time, China has suffered very heavy earthquake disasters, so there are abundant earthquake recordings. In this paper, we try to sketch out the historical earthquake sources and research achievements in China. We introduce some basic information about the collections of historical earthquake sources, the establishment of an intensity scale, and the editions of historical earthquake catalogues. The spatial-temporal and magnitude distributions of historical earthquakes are analyzed briefly. Besides traditional methods, we also illustrate a new approach to amend the parameters of historical earthquakes or even identify candidate zones for large historical or palaeo-earthquakes. In the new method, a relationship between instrumentally recorded small earthquakes and strong historical earthquakes is built up. The abundant historical earthquake sources and the achievements of historical earthquake research in China are a valuable cultural heritage for the world.

  7. Global Significant Earthquake Database, 2150 BC to present

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  8. The 2015 Gorkha (Nepal) earthquake: unfinished business Large ...

    Indian Academy of Sciences (India)

    jaj2

    The 2015 Gorkha (Nepal) earthquake: unfinished business. Large earthquakes in the Himalaya and India. James Jackson, Bullard Laboratories, University of Cambridge. [Slide fragments only: Jackson, GSA Today, 2002; Sloan et al., GJI, 2011; Moho depth; earthquake depth.]

  9. Smoking prevalence increases following Canterbury earthquakes.

    Science.gov (United States)

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch, semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for smoking more. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  10. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes up to Ms7.0 by using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall performance of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and different characteristic periods and magnitudes of abnormal thermal radiation for each case. (3) The thermal radiation anomalies are closely related to the geological structure. (4) The thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we can be confident that earthquake thermal infrared anomalies, as a useful earthquake precursor, can be used in earthquake prediction and forecasting.
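
    The abstract does not spell out how the "time-frequency relative power spectrum" is computed, but one plausible reading can be sketched as follows (the STFT parameters, the median-baseline normalization, and the synthetic series are all assumptions made for illustration, not the authors' exact procedure):

        import numpy as np
        from scipy.signal import stft

        def relative_power_spectrum(series, fs=1.0, nperseg=64):
            # Short-time power spectrum normalized, per frequency band, by the
            # long-term median power of that band, so sustained anomalies
            # stand out against the background.
            f, t, Z = stft(series, fs=fs, nperseg=nperseg)
            power = np.abs(Z) ** 2
            baseline = np.median(power, axis=1, keepdims=True) + 1e-12
            return f, t, power / baseline

        # Synthetic brightness-temperature anomaly with a transient boost.
        rng = np.random.default_rng(2)
        series = rng.normal(0.0, 1.0, 1024)
        series[600:700] += 4.0 * np.sin(2 * np.pi * 0.1 * np.arange(100))

        f, t, rel = relative_power_spectrum(series)
        peak_t = t[rel.max(axis=0).argmax()]
        print("max relative power %.1f near t = %.0f samples" % (rel.max(), peak_t))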

  11. Impact- and earthquake- proof roof structure

    International Nuclear Information System (INIS)

    Shohara, Ryoichi.

    1990-01-01

    Building roofs are constituted of roof slabs, an earthquake-proof layer on their upper surface, and an impact-proof layer made of iron-reinforced concrete disposed further above. Since the roofs constitute an earthquake-proof structure, loading building dampers on the upper surface of the slabs via the concrete layer, the seismic inputs of earthquakes to the buildings can be moderated, and the impact-proof layer is formed to ensure safety against external events such as earthquakes or airplane crashes at important facilities such as reactor buildings. (T.M.)

  12. Evaluation and cataloging of Korean historical earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kew Hwa; Han, Young Woo; Lee, Jun Hui; Park, Ji Eok; Na, Kwang Wooing; Shin, Byung Ju [The Research Institute of Basic Sciences, Seoul National Univ., Seoul (Korea, Republic of)]

    1998-03-15

    In order to systematically collect and analyze the historical earthquake data of the Korean peninsula, which are very important for analyzing the seismicity and seismic risk of the peninsula, seismologists and historians investigated extensive governmental and private historical documents and examined the relative reliabilities of these documents. This research unearthed about 70 new earthquake records and revealed the change over time in the cultural, political, and social effects of earthquakes in Korea. Also, the results of a vibration test of a Korean traditional wooden house were obtained in order to better estimate the intensities of the historical earthquakes.

  13. Smoking Prevalence Increases following Canterbury Earthquakes

    Directory of Open Access Journals (Sweden)

    Nick Erskine

    2013-01-01

    Full Text Available Background. A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. Aim. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch. Methods. Semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. Results. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased reported the earthquake and subsequent lifestyle changes as a reason for smoking more. Conclusion. 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence. Tobacco consumption levels increased in around one-third of current smokers.

  14. Thermal Infrared Anomalies of Several Strong Earthquakes

    Directory of Open Access Journals (Sweden)

    Congxin Wei

    2013-01-01

    Full Text Available In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes up to Ms7.0 by using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall performance of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of “time-frequency relative power spectrum.” (2) There exist evident and different characteristic periods and magnitudes of abnormal thermal radiation for each case. (3) The thermal radiation anomalies are closely related to the geological structure. (4) The thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we can be confident that earthquake thermal infrared anomalies, as a useful earthquake precursor, can be used in earthquake prediction and forecasting.

  15. A minimalist model of characteristic earthquakes

    Directory of Open Access Journals (Sweden)

    M. Vázquez-Prada

    2002-01-01

    Full Text Available In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. This model, which has no parameters, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.
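
    The automaton is simple enough to simulate directly. The sketch below implements one set of rules consistent with the abstract's description (an asperity cell at one end of a 1-D array whose failure relaxes the contiguous loaded block); the exact published rules may differ, so treat this as an illustrative toy rather than the authors' model.

        import numpy as np

        def asperity_automaton(n_cells=64, n_events=20000, seed=0):
            # Cells 1..n-1 become loaded when hit at random; cell 0 is the
            # asperity.  When cell 0 is hit, an earthquake relaxes cell 0 plus
            # the maximal run of loaded cells after it; the event size is the
            # number of relaxed cells.
            rng = np.random.default_rng(seed)
            loaded = np.zeros(n_cells, dtype=bool)
            sizes = []
            while len(sizes) < n_events:
                i = rng.integers(n_cells)
                if i > 0:
                    loaded[i] = True
                else:
                    size = 1
                    while size < n_cells and loaded[size]:
                        size += 1
                    loaded[:size] = False
                    sizes.append(size)
            return np.array(sizes)

        sizes = asperity_automaton()
        # Characteristic-earthquake signature: many small events plus a
        # secondary concentration of system-size events.
        for s in (1, 2, 4, 8, 16, 32, 64):
            print(f"P(size = {s:2d}) = {np.mean(sizes == s):.4f}")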

  16. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. This model, which has no parameters, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.

  17. Statistical tests of simple earthquake cycle models

    Science.gov (United States)

    Devries, Phoebe M. R.; Evans, Eileen

    2016-01-01

    A central goal of observing and modeling the earthquake cycle is to forecast when a particular fault may generate an earthquake: a fault late in its earthquake cycle may be more likely to generate an earthquake than a fault early in its earthquake cycle. Models that can explain geodetic observations throughout the entire earthquake cycle may be required to gain a more complete understanding of relevant physics and phenomenology. Previous efforts to develop unified earthquake models for strike-slip faults have largely focused on explaining both preseismic and postseismic geodetic observations available across a few faults in California, Turkey, and Tibet. An alternative approach leverages the global distribution of geodetic and geologic slip rate estimates on strike-slip faults worldwide. Here we use the Kolmogorov-Smirnov test for similarity of distributions to infer, in a statistically rigorous manner, viscoelastic earthquake cycle models that are inconsistent with 15 sets of observations across major strike-slip faults. We reject a large subset of two-layer models incorporating Burgers rheologies at a significance level of α = 0.05 (those with long-term Maxwell viscosities ηM ~ 4.6 × 10^20 Pa s) but cannot reject models on the basis of transient Kelvin viscosity ηK. Finally, we examine the implications of these results for the predicted earthquake cycle timing of the 15 faults considered and compare these predictions to the geologic and historical record.
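
    The core statistical operation is a two-sample Kolmogorov-Smirnov comparison between observed and model-predicted quantities. A minimal sketch with invented numbers (the 15 observed ratios and the model sample are placeholders, not the paper's data):

        import numpy as np
        from scipy.stats import ks_2samp

        rng = np.random.default_rng(3)
        # Hypothetical geodetic-to-geologic slip-rate ratios for 15 faults,
        # and ratios predicted by one candidate viscoelastic cycle model
        # sampled at many epochs through the earthquake cycle.
        observed_ratios = rng.normal(1.0, 0.25, 15)
        model_ratios = rng.normal(1.4, 0.25, 500)

        stat, p = ks_2samp(observed_ratios, model_ratios)
        alpha = 0.05
        verdict = "rejected" if p < alpha else "not rejected"
        print(f"KS statistic {stat:.2f}, p = {p:.3g}: model {verdict} at alpha = {alpha}")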

  18. Thermal Infrared Anomalies of Several Strong Earthquakes

    Science.gov (United States)

    Wei, Congxin; Guo, Xiao; Qin, Manzhong

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes up to Ms7.0 by using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall performance of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of “time-frequency relative power spectrum.” (2) There exist evident and different characteristic periods and magnitudes of abnormal thermal radiation for each case. (3) The thermal radiation anomalies are closely related to the geological structure. (4) The thermal radiation has obvious characteristics in abnormal duration, range, and morphology. In summary, we can be confident that earthquake thermal infrared anomalies, as a useful earthquake precursor, can be used in earthquake prediction and forecasting. PMID:24222728

  19. Insurance Stock Prices Following the 1994 Los Angeles Earthquake

    OpenAIRE

    Thomas A. Aiuppa; Thomas M. Krueger

    1995-01-01

    This study examines the changes in insurance firm value following the 1994 Los Angeles earthquake. While prior studies found that the 1989 San Francisco earthquake was associated with an increase in earthquake insurers’ firm value, the findings of this study indicate that earthquake firms sustained their value following the 1994 earthquake. These results and their implications provide insight for investors, regulators, and other policymakers as regards future earthquakes.

  20. Digital forensics digital evidence in criminal investigations

    CERN Document Server

    Marshall, Angus McKenzie

    2009-01-01

    The vast majority of modern criminal investigations involve some element of digital evidence, from mobile phones, computers, CCTV and other devices. Digital Forensics: Digital Evidence in Criminal Investigations provides the reader with a better understanding of how digital evidence complements "traditional" scientific evidence and examines how it can be used more effectively and efficiently in a range of investigations. Taking a new approach to the topic, this book presents digital evidence as an adjunct to other types of evidence and discusses how it can be deployed effectively in s

  1. Digital Disruption

    DEFF Research Database (Denmark)

    Rosenstand, Claus Andreas Foss

    Until the end of 2016, disruption was a word in Denmark that few knew and even fewer had an opinion about. Now, however, the concept has come into focus at the highest national level, as the government has taken the initiative to establish what Prime Minister Lars Løkke Rasmussen has so far called a "disruption council". The council is in fact written into the 2016 government platform of the VLK government. Disruption of organizations is not a new phenomenon, but the speed at which it happens keeps accelerating. The cause is the global megatrend of digitalization, and that is why digital disruption in particular is a matter for us all. It is therefore also too important a topic to be treated exclusively in elite scientific, industrial, and political circles. There is a need for a broader public debate, and this book is a research-based contribution to that debate. To qualify the debate on disruption in …

  2. Digital Copies and Digital Museums in a Digital Cultural Policy

    Directory of Open Access Journals (Sweden)

    Ole Marius Hylland

    2017-09-01

    Full Text Available This article investigates how a digital turn and digital copies have influenced ideas, roles and authorities within a national museum sector. It asks whether digital museums and their digital reproductions expand and/or challenge a traditional cultural policy. Two specific cases are highlighted to inform the discussion on these questions - the Norwegian digital museum platform DigitaltMuseum and Google Art Project. The article argues that there is a certain epochalism at play when the impact of a digital turn is analysed. At the same time, some clear major changes are taking place, even if their impact on cultural policies might be less than expected. I propose that one of the changes is the replacing of authenticity with accessibility as the primary legitimating value of museum objects.

  3. Digital platforms as enablers for digital transformation

    DEFF Research Database (Denmark)

    Hossain, Mokter; Lassen, Astrid Heidemann

    Digital platforms offer new ways for organizations to collaborate with the external environment for ideas, technologies, and knowledge. They provide new possibilities and competence but they also bring new challenges for organizations. Understanding the role of these platforms in digital transformation is crucial. This study aims at exploring how organizations are driven towards transformation in various ways to embrace digital platforms for ideas, technologies, and knowledge. It shows the opportunities and challenges digital platforms bring in organizations. It also highlights underlying mechanisms and potential outcomes of various digital platforms. The contribution of the submission is valuable for scholars to understand and further explore this area. It provides insight for practitioners to capture value through digital platforms and accelerate the pace of organizations' digital …

  4. Educational Applications for Digital Cameras.

    Science.gov (United States)

    Cavanaugh, Terence; Cavanaugh, Catherine

    1997-01-01

    Discusses uses of digital cameras in education. Highlights include advantages and disadvantages, digital photography assignments and activities, camera features and operation, applications for digital images, accessory equipment, and comparisons between digital cameras and other digitizers. (AEF)

  5. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
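
    The second mapping method lends itself to a compact sketch. The code below spreads each simulated, fault-confined epicenter over a test grid with a power-law kernel of epicentral distance, in the spirit of the ETAS-based smoothing described in the abstract; the kernel form and the d and q values are illustrative assumptions, not the calibrated ones.

        import numpy as np

        def powerlaw_rate_map(event_xy, grid_xy, d=5.0, q=1.5):
            # Spread each epicenter over the whole region with a kernel that
            # decays as (r + d)**-q, then normalize into a rate map.
            rates = np.zeros(len(grid_xy))
            for ex, ey in event_xy:
                r = np.hypot(grid_xy[:, 0] - ex, grid_xy[:, 1] - ey)
                rates += (r + d) ** (-q)
            return rates / rates.sum()

        # Toy example: two simulated epicenters on a 100 km x 100 km grid.
        xs, ys = np.meshgrid(np.arange(0.0, 100.0, 5.0), np.arange(0.0, 100.0, 5.0))
        grid = np.column_stack([xs.ravel(), ys.ravel()])
        rates = powerlaw_rate_map([(30.0, 40.0), (70.0, 60.0)], grid)
        print("cells sum to %.3f; max cell rate %.4f" % (rates.sum(), rates.max()))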

  6. Stress Drops for Potentially Induced Earthquake Sequences

    Science.gov (United States)

    Huang, Y.; Beroza, G. C.; Ellsworth, W. L.

    2015-12-01

    Stress drop, the difference between the shear stress acting across a fault before and after an earthquake, is a fundamental parameter of the earthquake source process and the generation of strong ground motions. Higher stress drops usually lead to more high-frequency ground motion. Hough [2014 and 2015] observed low intensities in "Did You Feel It?" data for injection-induced earthquakes and interpreted them to be a result of low stress drops. It is also possible that the low recorded intensities could be a result of propagation effects. Atkinson et al. [2015] show that the shallow depth of injection-induced earthquakes can lead to a lack of high-frequency ground motion as well. We apply the spectral ratio method of Imanishi and Ellsworth [2006] to analyze stress drops of injection-induced earthquakes, using smaller earthquakes with similar waveforms as empirical Green's functions (eGfs). The effects of both path and linear site response should cancel out through the spectral ratio analysis. We apply this technique to the Guy-Greenbrier earthquake sequence in central Arkansas. The earthquakes migrated along the Guy-Greenbrier Fault while nearby injection wells were operating in 2010-2011. Huang and Beroza [GRL, 2015] improved the magnitude of completeness to about -1 using template matching and found that the earthquakes deviated from Gutenberg-Richter statistics during the operation of nearby injection wells. We identify 49 clusters of highly similar events in the Huang and Beroza [2015] catalog and calculate stress drops using the source model described in Imanishi and Ellsworth [2006]. Our results suggest that stress drops of the Guy-Greenbrier sequence are similar to those of tectonic earthquakes at Parkfield, California (the attached figure). We will also present stress drop analysis of other suspected induced earthquake sequences using the same method.
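
    The spectral-ratio idea can be sketched with synthetic spectra. Below, the ratio of two Brune omega-square sources (path and site terms cancel) is fit by grid search to recover the target event's corner frequency; the Brune ratio form is standard, but the fitting details here are simplified assumptions rather than the paper's inversion.

        import numpy as np

        def brune_ratio(f, m0_ratio, fc_big, fc_small):
            # Spectral ratio of two Brune omega-square sources that share
            # path and site terms (target corner fc_big < eGf corner fc_small).
            return m0_ratio * (1 + (f / fc_small) ** 2) / (1 + (f / fc_big) ** 2)

        def fit_corners(f, ratio):
            best = None
            for fc_big in np.logspace(-0.5, 1.5, 60):        # target corner, Hz
                for fc_small in np.logspace(0.0, 2.0, 60):   # eGf corner, Hz
                    if fc_small <= fc_big:
                        continue
                    m0 = np.median(ratio / brune_ratio(f, 1.0, fc_big, fc_small))
                    resid = np.log(ratio) - np.log(brune_ratio(f, m0, fc_big, fc_small))
                    misfit = np.sum(resid ** 2)
                    if best is None or misfit < best[0]:
                        best = (misfit, fc_big, fc_small, m0)
            return best

        # Synthetic test: target fc = 2 Hz, eGf fc = 20 Hz, moment ratio 100.
        f = np.logspace(-0.5, 1.8, 120)
        noise = np.exp(np.random.default_rng(4).normal(0.0, 0.05, f.size))
        obs = brune_ratio(f, 100.0, 2.0, 20.0) * noise
        _, fc_big, fc_small, m0 = fit_corners(f, obs)
        print(f"recovered target fc ~ {fc_big:.2f} Hz (true 2.0), moment ratio ~ {m0:.0f}")

    With a corner frequency in hand, Brune's relations give a source radius r = 0.37*beta/fc for shear-wave speed beta, and a stress drop of 7*M0/(16*r**3).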

  7. Estimation of Maximum Magnitudes of Subduction Earthquakes

    Science.gov (United States)

    Muldashev, Iskander; Sobolev, Stephan

    2017-04-01

    Even though methods of instrumentally observing earthquakes at subduction zones have rapidly improved in recent decades, the characteristic recurrence interval of giant subduction earthquakes (Mw>8.5) is much longer than the currently available observational record, and therefore the necessary conditions for giant earthquakes are not clear. However, statistical studies have recognized the importance of the slab shape and its surface roughness, the state of strain of the upper plate, and the thickness of sediments filling the trenches. Here we apply a cross-scale seismic cycle modeling technique (Sobolev and Muldashev, under review) to study key factors controlling the maximum magnitudes of earthquakes in subduction zones. Our models employ elasticity, non-linear transient viscous rheology, and rate-and-state friction. They generate spontaneous earthquake sequences and, by using an adaptive time-step algorithm, recreate the deformation process as observed naturally during single and multiple seismic cycles. We explore the effects of slab geometry, megathrust friction coefficients, and convergence rates on the magnitude of earthquakes. We find that low-angle subduction (largest effect) and low static friction, likely caused by thick sediments in the subduction channel (smaller effect), are the key factors controlling the magnitude of great earthquakes, while changing the subduction velocity from 10 to 3.5 cm/yr has a much smaller effect. Modeling results also suggest that thick sediments in the subduction channel, by causing low static friction, result in neutral or compressive deformation in the overriding plate for low-angle subduction zones, in agreement with observations for the giant earthquakes. The model also predicts the magnitudes of the largest possible earthquakes for subduction zones of given dipping angles. We demonstrate that our predictions are consistent with all known giant subduction earthquakes of the 20th and 21st centuries and with estimations for historical
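
    The rate-and-state ingredient mentioned above can be illustrated with the standard quasi-dynamic spring-slider, which produces spontaneous stick-slip "earthquake" sequences when the spring is softer than the critical stiffness. This is a single-degree-of-freedom toy, not the cross-scale subduction model of the abstract, and every parameter value below is an illustrative assumption.

      import numpy as np
      from scipy.integrate import solve_ivp

      a, b, mu0 = 0.008, 0.012, 0.6    # velocity-weakening friction (b > a)
      Dc, vl = 0.01, 1e-9              # state-evolution distance (m), load speed (m/s)
      sigma, k = 50e6, 1e9             # normal stress (Pa), spring stiffness (Pa/m)
      eta = 30e9 / (2 * 3500.0)        # radiation damping, G/(2*cs) (Pa s/m)

      def rhs(t, y):
          v, theta = y
          dtheta = 1.0 - v * theta / Dc                     # aging-law state evolution
          # quasi-dynamic balance: k*(vl - v) = sigma*dmu/dt + eta*dv/dt
          dv = (k * (vl - v) - sigma * b * dtheta / theta) / (sigma * a / v + eta)
          return [dv, dtheta]

      sol = solve_ivp(rhs, [0.0, 3e7], [1.1 * vl, Dc / vl],
                      method="LSODA", rtol=1e-6, atol=[1e-12, 1e-2])
      # sol.y[0] (slip speed) shows long slow loading punctuated by fast slip
      # events because k is below the critical stiffness sigma*(b - a)/Dc.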

  8. Dual beam vidicon digitizer

    International Nuclear Information System (INIS)

    Evans, T.L.

    1976-01-01

    A vidicon waveform digitizer which can simultaneously digitize two independent signals has been developed. Either transient or repetitive waveforms can be digitized with this system. A dual beam oscilloscope is used as the signal input device. The light from the oscilloscope traces is optically coupled to a television camera, where the signals are temporarily stored prior to digitizing

  9. H. Sapiens Digital: From Digital Immigrants and Digital Natives to Digital Wisdom

    Science.gov (United States)

    Prensky, Marc

    2009-01-01

    As we move further into the 21st century, the digital native/digital immigrant paradigm created by Marc Prensky in 2001 is becoming less relevant. In this article, Prensky suggests that we should focus instead on the development of what he calls "digital wisdom." Arguing that digital technology can make us not just smarter but truly wiser, Prensky…

  10. Contributions to the European workshop on investigation of strong motion processing procedures

    International Nuclear Information System (INIS)

    Mohammadioun, B.; Goula, X.; Hamaide, D.

    1985-11-01

    The first paper is one contribution to a joint study program in the numerical processing of accelerograms from strong earthquakes. A method is proposed for generating an analytic signal having characteristics similar to those of an actual ground displacement. From this signal, a simulated accelerogram is obtained analytically. Various numerical processing techniques are to be tested using this signal: the ground displacements they yield will be compared with the original analytic signal. The second contribution deals with a high-performance digitization complex, custom-designed to stringent technical criteria by CISI Petrole Services, which has recently been put into service at the Bureau d'Evaluation des Risques Sismiques pour la Surete des Installations Nucleaires. Specially tailored to cope with the problems raised by the sampling of strong-motion photographic recordings, it offers considerable flexibility, owing to its self-teaching design, constant monitoring of work in progress, and numerous preprocessing options. In the third contribution, a critical examination of several processing techniques applicable to photographic recordings of SMA-1 type accelerometers is conducted. The basis for comparison was a set of two accelerograms drawn from synthetic signals, the characteristics of which were already well known
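
    The test logic of the first contribution, stripped to its essentials: start from a displacement defined in closed form, differentiate it to obtain a reference accelerogram, corrupt that record the way a digitized film trace is corrupted (offset and drift), apply a candidate correction, double-integrate, and compare the result with the original displacement. The signal, the noise model, and the simple linear baseline correction below are deliberately minimal placeholder choices, not the procedures evaluated in the workshop.

      import numpy as np

      dt = 0.01
      t = np.arange(0.0, 20.0, dt)

      # Analytic "ground displacement" and its numerical second derivative,
      # used here as the reference accelerogram.
      disp = 1e-3 * t ** 2 * np.exp(-0.2 * t) * np.sin(2 * np.pi * 1.5 * t)
      acc = np.gradient(np.gradient(disp, dt), dt)

      # Corrupt like a digitized record (static offset plus slow drift) ...
      rec = acc + 2e-4 + 1e-5 * t
      # ... then apply a simple baseline correction (remove the best-fit line).
      rec = rec - np.polyval(np.polyfit(t, rec, 1), t)

      # Double integration and comparison against the known displacement.
      vel = np.cumsum(rec) * dt
      recovered = np.cumsum(vel) * dt
      print("RMS displacement error (m):",
            np.sqrt(np.mean((recovered - disp) ** 2)))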

  11. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    Science.gov (United States)

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  12. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    Science.gov (United States)

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327

  13. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region, in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
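
    The standard scoring machinery behind such comparisons treats each cell as an independent Poisson variable with the forecast rate as its mean; the joint log-likelihood then ranks the forecasts. The sketch below uses synthetic numbers: the cell and event counts match the abstract, but the rates and locations are random stand-ins, not any submitted forecast.

      import numpy as np
      from scipy.special import gammaln

      def poisson_loglik(rates, counts):
          # Joint log-likelihood of observed cell counts under Poisson forecasts.
          lam = np.asarray(rates, dtype=float)
          n = np.asarray(counts, dtype=float)
          return float(np.sum(-lam + n * np.log(lam) - gammaln(n + 1)))

      rng = np.random.default_rng(0)
      rates = rng.uniform(1e-4, 1e-2, size=7682)           # hypothetical forecast
      counts = np.zeros(7682)
      counts[rng.choice(7682, size=31, replace=False)] = 1  # 31 observed events
      print(poisson_loglik(rates, counts))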

  14. Enabling Digital Literacy

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne

    2010-01-01

    There are some tensions between high-level policy definitions of “digital literacy” and actual teaching practice. We need to find workable definitions of digital literacy; obtain a better understanding of what digital literacy might look like in practice; and identify pedagogical approaches, which...... support teachers in designing digital literacy learning. We suggest that frameworks such as Problem Based Learning (PBL) are approaches that enable digital literacy learning because they provide good settings for engaging with digital literacy. We illustrate this through analysis of a case. Furthermore......, these operate on a meso-level mediating between high-level concepts of digital literacy and classroom practice....

  15. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    International Nuclear Information System (INIS)

    2001-01-01

    It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog with widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniform-distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one having a higher magnitude within a short time frame and within a close distance are at least as high as those found with regional catalogs. These catalogs have completeness thresholds two to three units higher in magnitude than the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also preceded by foreshocks. The
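
    The comparison against a simulated catalog described above can be sketched as follows: build a synthetic catalog with Poisson-distributed times, uniformly distributed locations, and Gutenberg-Richter magnitudes, then count "foreshocks" with the same space-time-magnitude rule applied to the real catalog. The window sizes and catalog dimensions below are placeholder assumptions, not the study's criteria.

      import numpy as np

      def count_foreshocks(times, mags, xy, dt_days=3.0, dr_km=5.0):
          # Count events followed by a larger event close in time and space.
          n = 0
          for i in range(len(times)):
              later = (times > times[i]) & (times <= times[i] + dt_days)
              near = np.hypot(xy[:, 0] - xy[i, 0], xy[:, 1] - xy[i, 1]) <= dr_km
              if np.any(later & near & (mags > mags[i])):
                  n += 1
          return n

      rng = np.random.default_rng(1)
      n_ev = 2000
      times = np.sort(rng.uniform(0.0, 5 * 365.0, n_ev))  # Poisson-like times (days)
      xy = rng.uniform(0.0, 100.0, (n_ev, 2))             # uniform epicenters (km)
      mags = -np.log10(rng.uniform(size=n_ev))            # Gutenberg-Richter, b = 1
      print("synthetic-catalog foreshock count:", count_foreshocks(times, mags, xy))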

  16. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  17. Earthquakes, detecting and understanding them

    International Nuclear Information System (INIS)

    2008-05-01

    The signature at the surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements are the result of earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of the ocean waves on the coasts, the wind in the trees and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events which are then discriminated, analyzed and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates'. The study of the propagation of seismic waves has made it possible to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates which are colliding, diverging or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960's. It explains that convection inside the Earth is the source of the forces required for these movements. As well as explaining these great movements, this science has provided a coherent, unifying and quantitative framework, which unites the explanations for all the geophysical phenomena under one mechanism. (authors)

  18. Forecasting characteristic earthquakes in a minimalist model

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; Pacheco, A.; González, Á.

    2003-01-01

    Using error diagrams, we quantify the forecasting of characteristic-earthquake occurrence in a recently introduced minimalist model. Initially we connect the earthquake alarm at a fixed time after the occurrence of a characteristic event. The evaluation of this strategy leads to a one...

  19. Refresher Course on Physics of Earthquakes -98 ...

    Indian Academy of Sciences (India)

    The objective of this course is to help teachers gain an understanding of the earthquake phenomenon and the physical processes involved in its genesis, as well as of the earthquake waves which propagate the energy released by the earthquake rupture outward from the source. The Course will begin with mathematical ...

  20. Wood-framed houses for earthquake zones

    DEFF Research Database (Denmark)

    Hansen, Klavs Feilberg

    Wood-framed houses with a sheathing are suitable for use in earthquake zones. The Direction describes a method of determining the earthquake forces in a house and shows how these forces can be resisted by diaphragm action in the walls, floors, and roof, of the house. An appendix explains how...

  1. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. It was the largest earthquake in the SF Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Wine makers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes potentially raise water-pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent pollution of groundwater and surface water by earthquakes could be helpful. This research gives a clear view of the drinking-water system in California and of pollution of river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River delta (close to Napa) is the center of the state's water-distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can liquefy saturated, loose, sandy soils and could damage major delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  2. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid re-starting of a nuclear reactor after earthquakes by informing operators of the properties of the earthquake encountered and by properly displaying the state of damage in comparison with the design-basis values of the facilities. Constitution: Even in a case where the maximum accelerations of an encountered earthquake exceed design-basis values, the equipment may still remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system that compares the floor response spectrum of the encountered earthquake with the design floor response spectrum used in the design of the equipment, and a system that indicates the equipment requiring inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant undergoing an earthquake and to improve power saving and safety by clearly defining the portions to be inspected after the earthquake. (Kawakami, Y.)

  3. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    -earthquake behaviour of some seismic faults. This model, that has no parameter, is amenable to an algebraic description as a Markov Chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time...

  4. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example seismic zoning and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference for assessing earthquake hazard accurately and quickly, since only limited time is available for sound decision-making shortly after a disaster. Exposed areas and areas possibly vulnerable to earthquake hazards can be analyzed easily using remote sensing. Technological developments in remote sensing, such as GeoEye-1, provide added value and strength to the use of remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to inform the design of disaster-management policy and to reduce the risk of natural disasters such as earthquakes in Indonesia.

  5. Rapid Inventory of Earthquake Damage (RIED)

    NARCIS (Netherlands)

    Duque, Adriana; Hack, Robert; Montoya, L.; Scarpas, Tom; Slob, Siefko; Soeters, Rob; van Westen, Cees

    2001-01-01

    The 25 January 1999 Quindío earthquake in Colombia was a major disaster for the country's coffee-growing region. Most of the damage occurred in the city of Armenia and surrounding villages. Damage due to earthquakes is strongly related to topographic and subsurface geotechnical conditions.

  6. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    The stress drop values are quite large compared to other similar-size Indian intraplate earthquakes, which can be attributed ... Keywords: earthquake source parameters; crustal Q-value; simultaneous inversion; S-wave spectra; aftershocks. J. Earth Syst. Sci.

  7. The Earthquake Preparedness Task Force Report. Revised.

    Science.gov (United States)

    Roybal-Allard, Lucille

    A report on Earthquake Preparedness presents California school districts with direction for complying with existing earthquake preparedness planning laws. It first contains two sets of recommendations. The first set requires state action and is presented to the Legislature for consideration. The second set consists of policy statements and…

  8. Earthquake effect on the geological environment

    International Nuclear Information System (INIS)

    Kawamura, Makoto

    1999-01-01

    Acceleration caused by earthquakes, changes in water pressure, and rock-mass strain were monitored for a series of 344 earthquakes from 1990 to 1998 at the Kamaishi In Situ Test Site. The largest acceleration, 57.14 gal, was recorded during the 'North coast of Iwate Earthquake' (M4.4) of June 1996. Changes of water pressure were recorded for 27 earthquakes; the largest change was −0.35 kgf/cm². The water-pressure change caused by an earthquake was, however, usually smaller than that caused by rainfall in this area. No change in the electric conductivity or pH of ground water was detected before and after earthquakes throughout the entire period of monitoring. The rock-mass strain was measured with an extensometer whose detection limit was of the order of 10⁻⁸ to 10⁻⁹, and a remaining strain of about 2.5×10⁻⁹ was detected following the 'Offshore Miyagi Earthquake' (M5.1) in October 1997. (H. Baba)

  9. Earthquake engineering research program in Chile

    Science.gov (United States)

    Saragoni, G. R.

    1982-01-01

    Earthquake engineering research in Chile has been carried out for more than 30 years. Systematic research is done at the university of Chile in Santiago. Other universities such as the Catholic University, university of Concepcion, and the Federico Santa Maria Technical University have begun to teach and conduct research in earthquake engineering in recent years. 

  10. Elevated Tank Due to Earthquake Event

    Directory of Open Access Journals (Sweden)

    Kotrasová Kamila

    2017-12-01

    Full Text Available Elevated reservoirs are mainly used for storing water. During earthquake activity the fluid exerts impulsive and convective (sloshing) effects on the walls and bottom of the tank. This paper provides the theoretical background for the analytical calculation of an elevated water tank under an earthquake event and deals with simplified seismic design procedures for elevated tanks.
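
    The impulsive/convective split referred to above is usually handled with a mechanical-analog model; for a rigid circular tank, linear sloshing theory gives the first-mode convective mass and period in closed form. The sketch below keeps only the first sloshing mode and lumps the remainder as impulsive; the tank dimensions are arbitrary examples, not values from the paper.

      import numpy as np

      g = 9.81
      xi1 = 1.841                    # first root of J1' (circular-tank sloshing mode)

      def sloshing_split(R, H, rho=1000.0):
          # First-mode convective/impulsive split for a rigid circular tank
          # (linear potential-flow mechanical analog; higher modes neglected).
          m = rho * np.pi * R ** 2 * H                     # total liquid mass
          mc = m * 2.0 * np.tanh(xi1 * H / R) / (xi1 * (xi1 ** 2 - 1.0) * H / R)
          mi = m - mc                                      # lumped remainder
          wc = np.sqrt(g * xi1 / R * np.tanh(xi1 * H / R)) # sloshing frequency
          return mi, mc, 2 * np.pi / wc

      mi, mc, Tc = sloshing_split(R=4.0, H=6.0)
      print(f"impulsive {mi:.3e} kg, convective {mc:.3e} kg, Tc = {Tc:.2f} s")
      # A base-shear estimate then combines mi*Sa(T_impulsive) and mc*Sa(Tc),
      # with spectral accelerations Sa read from the design spectrum.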

  11. Tutorial on earthquake rotational effects: historical examples

    Czech Academy of Sciences Publication Activity Database

    Kozák, Jan

    2009-01-01

    Roč. 99, 2B (2009), s. 998-1010 ISSN 0037-1106 Institutional research plan: CEZ:AV0Z30120515 Keywords: rotational seismic models * earthquake rotational effects * historical earthquakes Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.860, year: 2009

  12. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    This paper presents the simultaneous estimation of source parameters and crustal Q values for small to moderate-size aftershocks (Mw 2.1–5.1) of the Mw 7.7 2001 Bhuj earthquake. The horizontal-component S-waves of 144 well located earthquakes (2001–2010) recorded at 3–10 broadband seismograph sites in the ...

  13. Post-earthquake inspection of utility buildings

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, E.; Cluff, L.; Savage, W. [Pacific Gas and Electric Co., San Francisco, CA (United States). Geosciences Dept.; Poland, C. [Degenkolb Engineers, San Francisco, CA (United States)

    1995-12-31

    The evacuation of safe buildings and the inability to reoccupy them immediately after earthquakes can have significant impacts on lifeline utilities, including delays in the restoration of essential services. For many of Pacific Gas and Electric Company's 3400 buildings, the potential for unnecessary evacuations and delays in reentry was judged unacceptable. A Post-Earthquake Investigation Program, developed jointly by PG and E and Degenkolb Engineers, facilitates the post-earthquake use of essential buildings. The details of the program were developed taking into consideration the effects of high-likelihood scenario earthquakes on PG and E facilities, and the potential disruption of transportation and communication systems. Qualified engineers were pre-assigned to inspect key facilities following prescribed earthquakes. The inspections will be facilitated by pre-earthquake evaluations and post-earthquake manuals. Building department personnel support the program, because it promotes the timely and accurate assessment of essential buildings within their jurisdiction. The program was developed for a gas and electric utility; however, it is applicable to other organizations in earthquake regions.

  14. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    This paper presents the simultaneous estimation of source parameters and crustal Q values for small to moderate-size aftershocks (Mw 2.1–5.1) of the Mw 7.7 2001 Bhuj earthquake. The horizontal-component. S-waves of 144 well located earthquakes (2001–2010) recorded at 3–10 broadband seismograph sites in.

  15. Bayesian exploration of recent Chilean earthquakes

    Science.gov (United States)

    Duputel, Zacharie; Jiang, Junle; Jolivet, Romain; Simons, Mark; Rivera, Luis; Ampuero, Jean-Paul; Liang, Cunren; Agram, Piyush; Owen, Susan; Ortega, Francisco; Minson, Sarah

    2016-04-01

    The South-American subduction zone is an exceptional natural laboratory for investigating the behavior of large faults over the earthquake cycle. It is also a playground to develop novel modeling techniques combining different datasets. Coastal Chile was impacted by two major earthquakes in the last two years: the 2015 M 8.3 Illapel earthquake in central Chile and the 2014 M 8.1 Iquique earthquake that ruptured the central portion of the 1877 seismic gap in northern Chile. To gain better understanding of the distribution of co-seismic slip for those two earthquakes, we derive joint kinematic finite fault models using a combination of static GPS offsets, radar interferograms, tsunami measurements, high-rate GPS waveforms and strong motion data. Our modeling approach follows a Bayesian formulation devoid of a priori smoothing thereby allowing us to maximize spatial resolution of the inferred family of models. The adopted approach also attempts to account for major sources of uncertainty in the Green's functions. The results reveal different rupture behaviors for the 2014 Iquique and 2015 Illapel earthquakes. The 2014 Iquique earthquake involved a sharp slip zone and did not rupture to the trench. The 2015 Illapel earthquake nucleated close to the coast and propagated toward the trench with significant slip apparently reaching the trench or at least very close to the trench. At the inherent resolution of our models, we also present the relationship of co-seismic models to the spatial distribution of foreshocks, aftershocks and fault coupling models.

  16. Designing an Earthquake-Resistant Building

    Science.gov (United States)

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth-grade…

  17. Digital information management

    OpenAIRE

    Sridhar, M. S.

    2007-01-01

    Digital libraries and digital information are exciting everyone. Accessing and using digital information needs to be understood in the correct perspective. Many feel that digital libraries and the Internet are enough and that traditional libraries are no longer required. The pros and cons of using digital information, and the appropriateness as well as the limitations of the Internet, are highlighted in this slide presentation to participants of the training programme.

  18. Digital Sensor Technology

    Energy Technology Data Exchange (ETDEWEB)

    Ted Quinn; Jerry Mauck; Richard Bockhorst; Ken Thomas

    2013-07-01

    The nuclear industry has been slow to incorporate digital sensor technology into nuclear plant designs due to concerns with digital qualification issues. However, the benefits of digital sensor technology for nuclear plant instrumentation are substantial in terms of accuracy, reliability, availability, and maintainability. This report demonstrates these benefits in direct comparisons of digital and analog sensor applications. It also addresses the qualification issues that must be addressed in the application of digital sensor technology.

  19. Digital preservation for heritages

    CERN Document Server

    Lu, Dongming

    2011-01-01

    ""Digital Preservation for Heritages: Technologies and Applications"" provides a comprehensive and up-to-date coverage of digital technologies in the area of cultural heritage preservation, including digitalization, research aiding, conservation aiding, digital exhibition, and digital utilization. Processes, technical frameworks, key technologies, as well as typical systems and applications are discussed in the book. It is intended for researchers and students in the fields of computer science and technology, museology, and archaeology. Dr. Dongming Lu is a professor at College of Computer Sci

  20. Deep long-period earthquakes beneath Washington and Oregon volcanoes

    Science.gov (United States)

    Nichols, M.L.; Malone, S.D.; Moran, S.C.; Thelen, W.A.; Vidale, J.E.

    2011-01-01

    Deep long-period (DLP) earthquakes are an enigmatic type of seismicity occurring near or beneath volcanoes. They are commonly associated with the presence of magma, and found in some cases to correlate with eruptive activity. To more thoroughly understand and characterize DLP occurrence near volcanoes in Washington and Oregon, we systematically searched the Pacific Northwest Seismic Network (PNSN) triggered earthquake catalog for DLPs occurring between 1980 (when PNSN began collecting digital data) and October 2009. Through our analysis we identified 60 DLPs beneath six Cascade volcanic centers. No DLPs were associated with volcanic activity, including the 1980-1986 and 2004-2008 eruptions at Mount St. Helens. More than half of the events occurred near Mount Baker, where the background flux of magmatic gases is greatest among Washington and Oregon volcanoes. The six volcanoes with DLPs (counts in parentheses) are Mount Baker (31), Glacier Peak (9), Mount Rainier (9), Mount St. Helens (9), Three Sisters (1), and Crater Lake (1). No DLPs were identified beneath Mount Adams, Mount Hood, Mount Jefferson, or Newberry Volcano, although (except at Hood) that may be due in part to poorer network coverage. In cases where the DLPs do not occur directly beneath the volcanic edifice, the locations coincide with large structural faults that extend into the deep crust. Our observations suggest the occurrence of DLPs in these areas could represent fluid and/or magma transport along pre-existing tectonic structures in the middle crust. © 2010 Elsevier B.V.

  1. The 2010 Haiti earthquake response.

    Science.gov (United States)

    Raviola, Giuseppe; Severe, Jennifer; Therosme, Tatiana; Oswald, Cate; Belkin, Gary; Eustache, Eddy

    2013-09-01

    This article presents an overview of the mental health response to the 2010 Haiti earthquake. Discussion includes consideration of complexities that relate to emergency response, mental health and psychosocial response in disasters, long-term planning of systems of care, and the development of safe, effective, and culturally sound mental health services in the Haitian context. This information will be of value to mental health professionals and policy specialists interested in mental health in Haiti, and in the delivery of mental health services in particularly resource-limited contexts in the setting of disasters. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Earthquake Source Depths in the Zagros Mountains: A "Jelly Sandwich" or "Creme Brulee" Lithosphere?

    Science.gov (United States)

    Adams, A. N.; Nyblade, A.; Brazier, R.; Rodgers, A.; Al-Amri, A.

    2006-12-01

    The Zagros Mountain Belt of southwestern Iran is one of the most seismically active mountain belts in the world. Previous studies of the depth distribution of earthquakes in this region have shown conflicting results. Early seismic studies of teleseismically recorded events found that earthquakes in the Zagros Mountains nucleated within both the upper crust and upper mantle, indicating that the lithosphere underlying the Zagros Mountains has a strong upper crust and a strong lithospheric mantle, separated by a weak lower crust. Such a model of lithospheric structure is called the "Jelly Sandwich" model. More recent teleseismic studies, however, found that earthquakes in the Zagros Mountains occur only within the upper crust, thus indicating that the strength of the Zagros Mountains' lithosphere is primarily isolated to the upper crust. This model of lithospheric structure is called the "crème brûlée" model. Analysis of regionally recorded earthquakes nucleating within the Zagros Mountains is presented here. Data primarily come from the Saudi Arabian National Digital Seismic Network, although data sources include many regional open and closed networks. The use of regionally recorded earthquakes facilitates the analysis of a larger dataset than has been used in previous teleseismic studies. Regional waveforms have been inverted for source parameters using a range of potential source depths to determine the best fitting source parameters and depths. Results indicate that earthquakes nucleate in two distinct zones. One seismogenic zone lies at shallow, upper crustal depths. The second seismogenic zone lies near the Moho. Due to uncertainty in the source and Moho depths, further study is needed to determine whether these deeper events are nucleating within the lower crust or the upper mantle.

  3. Development of an earthquake catalog management program

    International Nuclear Information System (INIS)

    Eum, H. S.; Choi, I. K.

    1999-01-01

    An Earthquake Catalog Management Program was developed for earthquake engineering and research. The program is composed of a catalog database and an application program. The catalog database currently has more than 720 catalog records from earthquake data recorded between 1994/12 and 1998/5 in Korea. Each record consists of 17 parameters derived from the earthquake data, including information on the triggering event, the recording station, and station-specific recorded values. The catalog database also has information on 12 recording stations. The application program is a tool for accessing and managing the catalog database and the recorded earthquake data files. The program provides various functions such as search, sort, and display of catalog subsets, file retrieval from hard disks or CD-ROM, file type conversion, and multiple output options including computer screen, printer, and disk files.
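
    Since the record does not enumerate the 17 parameters, the schema below is purely hypothetical; it only illustrates the kind of record structure and search/sort functionality such a program provides. All field names are invented stand-ins.

      from dataclasses import dataclass

      @dataclass
      class CatalogRecord:
          # Hypothetical stand-ins for the event/station/recording parameters.
          event_id: str
          origin_time: str        # e.g. "1996-12-13T12:10:00"
          magnitude: float
          station_code: str
          epicentral_km: float
          peak_accel_gal: float
          data_file: str          # path to the recorded waveform file

      def search(records, min_mag=0.0, station=None):
          # Return a catalog subset matching simple criteria, sorted by time.
          hits = [r for r in records
                  if r.magnitude >= min_mag
                  and (station is None or r.station_code == station)]
          return sorted(hits, key=lambda r: r.origin_time)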

  4. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis on the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  5. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    Full Text Available The paper presents a novel approach to predict the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum-volume ellipsoid containing all points is constructed. Ellipsoidal models of uncertainty, pertinent to earthquake excitation, are developed. The maximum response of a structure subjected to earthquake excitation, within ellipsoidal modeling of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
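
    In the notation common to the convex-models literature (e.g., Elishakoff, 1991), the construction reads roughly as follows; the symbols here are generic rather than taken from the paper. With the excitation expanded as c(t) = \sum_k c_k \phi_k(t), the coefficient vectors of the recorded accelerograms are enclosed in the minimum-volume ellipsoid

      \mathcal{E} = \{\, \mathbf{c} \in \mathbb{R}^{N} :
          (\mathbf{c}-\mathbf{c}_{0})^{\mathsf T}\,\mathbf{W}\,(\mathbf{c}-\mathbf{c}_{0}) \le 1 \,\},

    and, for a response linear in the coefficients, R(\mathbf{c}) = \mathbf{r}^{\mathsf T}\mathbf{c}, the least favorable (antioptimal) response over the ellipsoid has the closed form

      \max_{\mathbf{c}\in\mathcal{E}} \mathbf{r}^{\mathsf T}\mathbf{c}
          = \mathbf{r}^{\mathsf T}\mathbf{c}_{0} + \sqrt{\mathbf{r}^{\mathsf T}\,\mathbf{W}^{-1}\,\mathbf{r}}\,.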

  6. Earthquake Prediction Techniques: Their Application in Japan

    Science.gov (United States)

    Kisslinger, Carl

    Japan is serious about solving the earthquake prediction problem. A well-organized and well-funded program of research has been under way for almost 20 years in pursuit of the national goal of protecting the dense population of this earthquake-prone country through reliable predictions.This rather amazing book, edited by Toshi Asada, retired director of the Geophysical Institute of the University of Tokyo, has been written by 10 scientists, each of whom has made important contributions to earthquake science, but who have not been known in the past as principal spokesmen for the Japanese earthquake prediction program. The result is a combination of a very readable tutorial presentation of basic earthquake science that will make the book understandable to the nonspecialist, a good summary of Japanese data and research conclusions, and a bare-knuckles appraisal of current philosophy and strategy for prediction in Japan.

  7. Low cost earthquake resistant ferrocement small house

    International Nuclear Information System (INIS)

    Saleem, M.A.; Ashraf, M.; Ashraf, M.

    2008-01-01

    The greatest humanitarian challenge faced even today, one year after the Kashmir-Hazara earthquake, is that of providing shelter. Currently, one in seven people on the globe lives in a slum or refugee camp. The earthquake of October 2005 resulted in a great loss of life and property. This research work is mainly focused on developing a design for a small, low-cost, earthquake-resistant house. Ferrocement panels are recommended as the main structural elements, with a lightweight truss roofing system. Earthquake resistance is ensured by analyzing the structure in ETABS for zone 4 seismic activity. The behavior of the structure is found to be satisfactory under earthquake loading. An estimate of cost is also presented, which shows that it is an economical solution. (author)

  8. Ionospheric Anomaly before Kyushu, Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    Full Text Available GIM data released by IGS are used in this article, and a new method combining the sliding time window method with the ionospheric TEC correlation analysis of adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and earthquakes. By analyzing the abnormal changes of TEC in the 5 grid points around the seismic region, abnormal changes of ionospheric TEC are found before the earthquake, and the correlation between the TEC sequences of grid points is significantly affected by the earthquake. Based on analysis of the spatial distribution of the TEC anomaly, anomalies lasting 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to perform a tomographic inversion of electron density, and the distribution of electron density in the ionospheric anomaly is further analyzed.

  9. Earthquake activity along the Himalayan orogenic belt

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2017-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxis. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-deep earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalaya orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depth less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  10. Cognitive Hacking and Digital Government: Digital Identity

    OpenAIRE

    Paul Thompson

    2004-01-01

    Recently the National Center for Digital Government held a workshop on "The Virtual Citizen: Identity, Autonomy, and Accountability: A Civic Scenario Exploration of the Role of Identity in On-Line. Discussions at the workshop focused on five scenarios for future authentication policies with respect to digital identity. The underlying technologies considered for authentication were: biometrics: cryptography, with a focus on digital signatures; secure processing/computation; and reputation syst...

  11. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Stephenson, D.E.; Zandt, G.; Bouchon, M.; Hustrulid, W.A.

    1980-01-01

    In order to assess the seismic risk for an underground facility, a data base was established and analyzed to evaluate the potential for seismic disturbance. Substantial damage to underground facilities is usually the result of displacements primarily along pre-existing faults and fractures, or at the surface entrance to these facilities. Evidence of this comes from both earthquakes and large explosions. Therefore, the displacement due to earthquakes as a function of depth is important in the evaluation of the hazard to underground facilities. To evaluate potential displacements due to seismic effects of block motions along pre-existing or induced fractures, the displacement fields surrounding two types of faults were investigated. Analytical models were used to determine relative displacements of shafts and near-surface displacement of large rock masses. Numerical methods were used to determine the displacement fields associated with pure strike-slip and vertical normal faults. Results are presented as displacements for various fault lengths as a function of depth and distance. This provides input to determine potential displacements in terms of depth and distance for underground facilities, important for assessing potential sites and design parameters

  12. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

    Science.gov (United States)

    Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    We treat the Earth's crust and mantle as large-scale discrete matter based on the principles of granular physics and existing experimental observations. The main outcomes are: A granular model of the structure and movement of the Earth's crust and mantle is established. The formation mechanism of the tectonic forces that cause earthquakes, and a model of propagation for precursory information, are proposed. The properties of seismic precursory information and its relevance to earthquake occurrence are illustrated, and the principle of ways to detect effective seismic precursors is elaborated. The mechanism of deep-focus earthquakes is also explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena that were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the Earth's crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and a new understanding, different from the traditional seismological viewpoint, is obtained.

  13. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

    Full Text Available The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the concerned anomaly or precursor, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihood obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows varying between a few centuries and a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus provides innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the

  14. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    Science.gov (United States)

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.

  15. 78 FR 64973 - Scientific Earthquake Studies Advisory Committee (SESAC)

    Science.gov (United States)

    2013-10-30

    DEPARTMENT OF THE INTERIOR, Geological Survey [GX14GG009950000], Scientific Earthquake Studies... Public Law 106-503, the Scientific Earthquake Studies Advisory Committee (SESAC) will hold its next... Meetings of the Scientific Earthquake Studies Advisory... warning and national earthquake hazard mapping.

  16. Economic consequences of earthquakes: bridging research and practice with HayWired

    Science.gov (United States)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  17. The 1989 Ms 7.1 Loma Prieta, California, Magnetic Earthquake Precursor Revisited

    Science.gov (United States)

    Thomas, J. N.; Love, J. J.; Johnston, M. J.

    2007-12-01

    Repeatable prediction of individual large earthquakes on the basis of quantitative geophysical data has proven to be frustratingly difficult and fraught with controversy. Still, some claims of success have been published, and among these are reports of identifiable precursory changes in magnetic-field activity as measured by ground-based magnetometers. By far the most prominent of such claims is that of Fraser-Smith et al., GRL, 17, 1465-1468, 1990, who identified changes in Ultra Low Frequency (ULF, 0.01-10 Hz) magnetic noise prior to the 18 October 1989 Ms 7.1 Loma Prieta, California earthquake. The Fraser-Smith et al. result has been frequently cited in the literature, and it has been a major motivational influence for new research programs involving large arrays of ground-based instruments and even some satellite-based systems. We re-examine the data of the reported precursor, comparing them against independent data collected by magnetometers located in Japan and in the United States at the time of the Loma Prieta earthquake. From our analysis we infer that the key components of the precursory signal identified by Fraser-Smith et al. can be explained by minor corruption of the data in the form of a gain enhancement and time-stamp misassignment, possibly due to digital processing errors or inadvertent post-acquisitional treatment. We conclude that the reported magnetic anomaly is not related to the Loma Prieta earthquake.

  18. A long source area of the 1906 Colombia-Ecuador earthquake estimated from observed tsunami waveforms

    Science.gov (United States)

    Yamanaka, Yusuke; Tanioka, Yuichiro; Shiina, Takahiro

    2017-12-01

    The 1906 Colombia-Ecuador earthquake, the most destructive in the history of the Colombia-Ecuador subduction zone, induced both strong seismic motion and a tsunami. The tsunami propagated across the Pacific Ocean, and its waveforms were observed at tide gauge stations in countries including Panama, Japan, and the USA. This study conducted a slip inverse analysis for the 1906 earthquake using these waveforms. A digital dataset of observed tsunami waveforms at the Naos Island (Panama) and Honolulu (USA) tide gauge stations, where the tsunami was clearly observed, was first produced by consulting archival documents. Next, the two waveforms were used as the target waveforms in an inverse analysis. The results of this analysis indicated that the moment magnitude of the 1906 earthquake ranged from 8.3 to 8.6. Moreover, the dominant slip occurred in the northern part of the assumed source region near the coast of Colombia, where little significant seismicity has occurred, rather than in the southern part. The results also indicated that the source area, with significant slip, covered a long distance, including the southern, central, and northern parts of the region.
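
    Such a slip inversion is, at its core, a linear least-squares problem relating precomputed tsunami Green's functions to the observed tide gauge records. Below is a minimal sketch in Python, with entirely synthetic matrices standing in for the Green's functions and observations; the subfault size, rigidity, and all numerical values are assumptions for illustration, not the paper's.

        # Linear tsunami-waveform slip inversion sketch (all values synthetic).
        # Columns of G would hold the waveform predicted at the gauges for unit
        # slip on each subfault; non-negative least squares forbids back-slip.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        n_samples, n_subfaults = 500, 12             # assumed dimensions
        G = rng.standard_normal((n_samples, n_subfaults))  # placeholder Green's functions
        true_slip = np.abs(rng.standard_normal(n_subfaults))
        d = G @ true_slip + 0.1 * rng.standard_normal(n_samples)  # synthetic "observations"

        slip, _ = nnls(G, d)                         # min ||G m - d||, subject to m >= 0
        mu, sub_area = 3e10, 20e3 * 20e3             # rigidity (Pa), 20 km subfaults (assumed)
        M0 = mu * sub_area * slip.sum()              # seismic moment, N m
        Mw = (2.0 / 3.0) * (np.log10(M0) - 9.1)      # moment magnitude
        print(f"Total slip {slip.sum():.1f} m, Mw ~ {Mw:.1f}")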

  19. Estimating the Probability of Earthquake-Induced Landslides

    Science.gov (United States)

    McRae, M. E.; Christman, M. C.; Soller, D. R.; Sutter, J. F.

    2001-12-01

    The development of a regionally applicable, predictive model for earthquake-triggered landslides is needed to improve mitigation decisions at the community level. The distribution of landslides triggered by the 1994 Northridge earthquake in the Oat Mountain and Simi Valley quadrangles of southern California provided an inventory of failures against which to evaluate the significance of a variety of physical variables in probabilistic models of static slope stability. Through a cooperative project, the California Division of Mines and Geology provided 10-meter resolution data on elevation, slope angle, coincidence of bedding plane and topographic slope, distribution of pre-Northridge landslides, internal friction angle, and cohesive strength of individual geologic units. Hydrologic factors were not evaluated, since failures in the study area were dominated by shallow, disrupted landslides in dry materials. Previous studies indicate that 10-meter digital elevation data are required to properly characterize the short, steep slopes on which many earthquake-induced landslides occur. However, to explore the robustness of the model at different spatial resolutions, models were developed at 10-, 50-, and 100-meter resolutions using classification and regression tree (CART) analysis and logistic regression techniques. Multiple resampling algorithms were tested for each variable in order to observe how resampling affects the statistical properties of each grid, and how relationships between variables within the model change with increasing resolution. Various transformations of the independent variables were used to see which had the strongest relationship with the probability of failure. These transformations were based on deterministic relationships in the factor-of-safety equation. Preliminary results were similar for all spatial scales. Topographic variables dominate the predictive capability of the models. The distribution of prior landslides and the coincidence of slope
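
    As a rough illustration of the kind of logistic-regression susceptibility model described above, the following Python sketch fits failure probability to two synthetic predictors (slope angle and internal friction angle); the data, coefficients, and variable choices are hypothetical, not the study's.

        # Logistic-regression slope-failure sketch on synthetic grid cells
        # (predictors, coefficients, and data are hypothetical).
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 2000
        slope = rng.uniform(0, 60, n)                # slope angle (degrees)
        friction = rng.uniform(20, 40, n)            # internal friction angle (degrees)
        p_fail = 1.0 / (1.0 + np.exp(-0.15 * (slope - friction)))
        failed = rng.random(n) < p_fail              # synthetic failure labels

        X = np.column_stack([slope, friction])
        model = LogisticRegression().fit(X, failed)
        p = model.predict_proba([[45.0, 30.0]])[0, 1]
        print(f"P(failure | slope 45 deg, friction 30 deg) = {p:.2f}")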

  20. Digital multilayer tomography

    International Nuclear Information System (INIS)

    Dueber, C.; Klose, K.J.; Thelen, M.

    1991-01-01

    With digital multilayer tomography, a sequence of projection images is recorded by an image-intensifier television system and stored as digital data during a linear run of a layer sequence. Using this data record, tomograms of the examined body region can later be computed for any layer thickness at a digital workstation by shifting and superimposing the single projections. The quality of digital and conventional tomograms is basically comparable. A drawback of digital tomography is its lower local resolution (512 x 512 image matrix); advantages are a lower radiation exposure, a shorter patient examination time, and the facilities of digital image processing (post-processing, archiving, transmission). (orig.)

  1. Playtesting The Digital Playground

    DEFF Research Database (Denmark)

    Majgaard, Gunver; Jessen, Carsten

    2009-01-01

    Being able to be absorbed in play in the digital playground is motivating for children who are used to digital computer games. The children can play and exercise outdoors while using the same literacy as in indoor digital games. This paper presents a new playground product in which an outdoor playground has been combined with digital games. The playground was tested in natural surroundings in a school yard, and the findings about the interplay between digital and analog play are described here. Finally, the balance between digital and analog games is discussed.

  2. Logic of the digital

    CERN Document Server

    Evens, Aden

    2015-01-01

    Building a foundational understanding of the digital, Logic of the Digital reveals a unique digital ontology. Beginning from formal and technical characteristics, especially the binary code at the core of all digital technologies, Aden Evens traces the pathways along which the digital domain of abstract logic encounters the material, human world. How does a code using only 0s and 1s give rise to the vast range of applications and information that constitutes a great and growing portion of our world? Evens' analysis shows how any encounter between the actual and the digital must cross an ontolo

  3. Experience with digital mammography

    Directory of Open Access Journals (Sweden)

    G. P. Korzhenkova

    2011-01-01

    The use of digital techniques in mammography has become the last step in completing the digitization of diagnostic imaging. It is assumed that digital mammography will require the same spatial resolution as the high-resolution intensifying screen-film systems used in conventional mammography, and that digital techniques will be limited by the digitizer pixel size in detecting minor structures such as microcalcifications. The introduction of digital technologies in mammography involves tight control over the image and assures its high quality.

  4. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community due to their vast number and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake-table tests to show that these accelerometers are also suitable for recording large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake motion from daily human activity in the recordings made by the accelerometers in personal smartphones, and upload trigger information/waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications, such as earthquake early warning. In this talk I will lay out the method we use to recognize earthquake-like movement on a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
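
    MyShake itself relies on a trained classifier over accelerometer features; as a simpler stand-in for the single-phone triggering idea, the sketch below applies a classical STA/LTA trigger to a synthetic acceleration trace (the sampling rate and thresholds are assumptions).

        # Classical STA/LTA trigger on a synthetic acceleration trace: a simple
        # stand-in for on-phone earthquake/non-earthquake discrimination.
        import numpy as np

        def sta_lta(accel, fs, sta_win=1.0, lta_win=10.0):
            """Short-term over long-term moving average of squared acceleration."""
            sta_n, lta_n = int(sta_win * fs), int(lta_win * fs)
            energy = accel ** 2
            sta = np.convolve(energy, np.ones(sta_n) / sta_n, mode="same")
            lta = np.convolve(energy, np.ones(lta_n) / lta_n, mode="same")
            return sta / (lta + 1e-12)

        fs = 50.0                                    # assumed phone sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        accel = 0.01 * np.random.default_rng(7).standard_normal(t.size)   # background
        accel[1500:1800] += 0.2 * np.sin(2 * np.pi * 3 * t[1500:1800])    # synthetic event
        print("Triggered:", bool((sta_lta(accel, fs) > 5.0).any()))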

  5. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions.
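
    One common energy-based definition, in the spirit of the definitions surveyed in this report, is the 5-75% significant duration from the normalized cumulative Arias intensity. A minimal Python sketch on a toy record (the record and time step are synthetic):

        # 5-75% significant duration from normalized cumulative Arias intensity,
        # one common energy-based strong-motion duration definition.
        import numpy as np

        def significant_duration(accel, dt, lo=0.05, hi=0.75):
            """Time between the lo and hi fractions of cumulative Arias intensity."""
            ia = np.cumsum(accel ** 2) * dt          # proportional to Arias intensity
            ia /= ia[-1]                             # normalize to 1
            return (np.searchsorted(ia, hi) - np.searchsorted(ia, lo)) * dt

        dt = 0.01                                    # 100 samples/s (assumed)
        accel = np.random.default_rng(8).standard_normal(4000) * np.exp(-np.linspace(0, 4, 4000))
        print(f"5-75% significant duration: {significant_duration(accel, dt):.2f} s")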

  6. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  7. Strong motion duration and earthquake magnitude relationships

    Energy Technology Data Exchange (ETDEWEB)

    Salmon, M.W.; Short, S.A. [EQE International, Inc., San Francisco, CA (United States); Kennedy, R.P. [RPK Structural Mechanics Consulting, Yorba Linda, CA (United States)

    1992-06-01

    Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report will concentrate on energy-based strong motion duration definitions.

  8. Stress triggering and the Canterbury earthquake sequence

    Science.gov (United States)

    Steacy, Sandy; Jiménez, Abigail; Holden, Caroline

    2014-01-01

    The Canterbury earthquake sequence, which includes the devastating Christchurch event of 2011 February, has to date led to losses of around 40 billion NZ dollars. The location and severity of the earthquakes were a surprise to most inhabitants, as the seismic hazard model was dominated by an expected Mw > 8 earthquake on the Alpine fault and an Mw 7.5 earthquake on the Porters Pass fault, 150 and 80 km to the west of Christchurch, respectively. The sequence to date has included an Mw = 7.1 earthquake and 3 Mw ≥ 5.9 events which migrated from west to east. Here we investigate whether the later events are consistent with stress triggering and whether a simple stress map produced shortly after the first earthquake would have accurately indicated the regions where the subsequent activity occurred. We find that 100 per cent of M > 5.5 earthquakes occurred in positive stress areas computed using a slip model for the first event that was available within 10 days of its occurrence. We further find that the stress changes at the starting points of major slip patches of post-Darfield main events are consistent with triggering, although this is not always true at the hypocentral locations. Our results suggest that Coulomb stress changes contributed to the evolution of the Canterbury sequence, and we note additional areas of increased stress in the Christchurch region and on the Porters Pass fault.
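
    The Coulomb stress change evaluated in such triggering studies is conventionally written as follows (a standard formulation given for context; the paper's exact parameter choices are an assumption):

        \Delta \mathrm{CFS} = \Delta\tau + \mu' \, \Delta\sigma_n

    where \Delta\tau is the shear stress change resolved in the slip direction of the receiver fault, \Delta\sigma_n is the normal stress change (positive when the fault is unclamped), and \mu' is the effective friction coefficient, often taken near 0.4. Faults in regions where \Delta\mathrm{CFS} > 0 are moved closer to failure, which is the sense in which the M > 5.5 Canterbury events fell in "positive stress areas".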

  9. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models, and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release.

  10. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

    Crustal earthquakes with magnitude 6.0 > M ≥ 5.5 observed in Italy for the period 1979-2009, including the most recent at L'Aquila on 6 April 2009, were considered to check whether the relationships previously obtained for ionospheric precursors of strong Japanese earthquakes are valid for moderate Italian earthquakes. The ionospheric precursors are based on the observed variations of the sporadic E-layer parameters (h'Es, fbEs) and foF2 at the ionospheric station Rome. Empirical dependencies for the seismo-ionospheric disturbances relating the earthquake magnitude and the epicenter distance are obtained, and they have been shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate the process of spreading of the disturbance from the epicenter towards the periphery during the earthquake preparation process. Large lead times for the precursor occurrence (up to 34 days for M = 5.8-5.9) indicate a prolonged preparation period. A possibility of using the obtained relationships for earthquake prediction is discussed.

  11. Losses Associated with Secondary Effects in Earthquakes

    Directory of Open Access Journals (Sweden)

    James E. Daniell

    2017-06-01

    The number of earthquakes with high damage and high losses has been limited to around 100 events since 1900. Looking at historical losses from 1900 onward, we see that around 100 key earthquakes (around 1% of damaging earthquakes) have caused around 93% of fatalities globally. What is indeed interesting about this statistic is that within these events, secondary effects have played a major role, causing around 40% of economic losses and fatalities as compared to shaking effects. Disaggregation of the economic losses and fatalities due to secondary effects, demonstrating the relative influence of historical losses from direct earthquake shaking in comparison to tsunami, fire, landslide, liquefaction, fault rupture, and other loss types, is important if we are to understand the key causes post-earthquake. The trends and major event impacts of secondary effects are explored in terms of their historic impact, as well as looking at improved ways to disaggregate them, through two case studies: the Tohoku 2011 event for earthquake, tsunami, liquefaction, fire, and the nuclear impact; and the Chilean 1960 earthquake and tsunami event.

  12. A critical history of British earthquakes

    Directory of Open Access Journals (Sweden)

    R. M. W. Musson

    2004-06-01

    This paper reviews the history of the study of historical British earthquakes. The publication of compendia of British earthquakes goes back as early as the late 16th century. A boost to the study of earthquakes in Britain was given in the mid 18th century as a result of two events occurring in London in 1750 (analogous to the general increase in interest in earthquakes in Europe five years later, after the 1755 Lisbon earthquake). The 19th century saw a number of significant studies, culminating in the work of Davison, whose book-length catalogue was finally published in 1924. After that there is a gap, until interest in the subject was renewed in the mid 1970s. The expansion of the U.K. nuclear programme in the 1980s led to a series of large-scale investigations of historical British earthquakes, all based almost completely on primary historical data and conducted to high standards. The catalogue published by BGS in 1994 is a synthesis of these studies, and presents a parametric catalogue in which historical earthquakes are assessed from intensity data points based on primary source material. Since 1994, revisions to parameters have been minor, and newly discovered events have been restricted to a few small ones.

  13. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
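
    A minimal sketch of pool-based active learning with uncertainty sampling, the general strategy described above; here the "expert" labels are simulated, and all features, model choices, and batch sizes are assumptions rather than the authors' pipeline.

        # Pool-based active learning with uncertainty sampling; the expert's
        # labels are simulated (all data and thresholds are assumptions).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(2)
        X_pool = rng.standard_normal((5000, 10))                 # unlabeled feature vectors
        y_true = (X_pool[:, 0] + X_pool[:, 1] > 0).astype(int)   # hidden labels

        labeled = list(rng.choice(len(X_pool), 20, replace=False))  # small seed set
        for _ in range(5):                                       # five query rounds
            clf = RandomForestClassifier(n_estimators=100, random_state=0)
            clf.fit(X_pool[labeled], y_true[labeled])
            proba = clf.predict_proba(X_pool)[:, 1]
            uncertainty = np.abs(proba - 0.5)                    # near 0.5 = least certain
            uncertainty[labeled] = np.inf                        # skip already-labeled samples
            query = np.argsort(uncertainty)[:10]                 # ask the "expert" for 10 labels
            labeled.extend(query.tolist())
        print("Labels used:", len(labeled))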

  14. Source Mechanisms of Recent Earthquakes occurred in the Fethiye-Rhodes Basin and Anaximander Seamounts (SW Turkey)

    Science.gov (United States)

    Yolsal-Çevikbilen, Seda; Taymaz, Tuncay

    2015-04-01

    Understanding the active tectonics of southern Turkey involves integrating earthquake source parameters with the regional tectonics. In this respect, seismological studies have played important roles in deciphering tectonic deformations and existing stress accumulations in the region. This study is concerned with the source mechanism parameters and spatio-temporal finite-fault slip distributions of recent earthquakes that occurred along the Pliny-Strabo Trench (PST), which constitutes the eastern part of the Hellenic subduction zone in the Eastern Mediterranean Sea region, and along the Fethiye-Burdur Fault Zone (SW Turkey). The study area is located at the junction of the Hellenic and Cyprus arcs, along which the African plate plunges northwards beneath the Aegean Sea and the Anatolian block. Bathymetry and topography, including large-scale tectonic structures such as the Rhodes Basin, Anaximander Seamounts, the Florence Rise, the Isparta Angle, the Taurus Mountains, and the Kyrenia Range, also reflect the tectonic complexities of the region. In this study, we performed point-source inversions using teleseismic long-period P- and SH- and broad-band P-waveforms recorded by the Federation of Digital Seismograph Networks (FDSN) and the Global Digital Seismograph Network (GDSN) stations. We obtained source mechanism parameters and finite-fault slip distributions of recent Fethiye-Rhodes earthquakes (Mw ≥ 5.0) by comparing the shapes and amplitudes of long-period P- and SH-waveforms, recorded in the distance range of 30-90 degrees, with synthetic waveforms. We further obtained rupture histories of the earthquakes to determine the fault area (fault length and width), maximum displacement, rupture duration and stress drop. Inversion results show that recent earthquakes have left-lateral strike-slip faulting mechanisms with relatively deep focal depths (h > 40 km), consistent with the tectonic characteristics of the region, for example, the June 10, 2012 Fethiye earthquake (Mw

  15. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    Science.gov (United States)

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  16. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research, and a longer time span, 461 B.C. to 1990 (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive in the whole world, and hence that our catalogue could be of interest to a wider international readership, Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release, therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  17. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively high among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. It is concluded that researchers need to pay more attention to children and adolescents. The government should pay more attention to these people and provide more economic support.

  18. ASSESSMENT OF EARTHQUAKE HAZARDS ON WASTE LANDFILLS

    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Yiannis; Psarropoulos, Prodromos

    Earthquake hazards may arise as a result of: (a) transient ground deformation, which is induced by seismic wave propagation, and (b) permanent ground deformation, which is caused by abrupt fault dislocation. Since the adequate performance of waste landfills after an earthquake is of utmost importance, the current study examines the impact of both types of earthquake hazards by performing efficient finite-element analyses. These also took into account the potential slip displacement development along the geosynthetic interfaces of the composite base liner. At first, the development of permanent

  19. Sismosima: A pioneer project for earthquake detection

    International Nuclear Information System (INIS)

    Echague, C. de

    2015-01-01

    At present one can only study how earthquakes occur and minimize their consequences, but the Sismosima project studies earthquakes with the aim of issuing a pre-alert where possible. The Geological and Mining Institute of Spain (IGME) launched this project, which in tests has already found, in the instrumented caves, an increase of carbon dioxide (CO2) coinciding with the triggering of an earthquake. It now remains to check whether the gas emission occurs simultaneously, before, or after. If it were before, a couple of minutes would be enough to give an early warning with which to save lives and secure facilities. (Author)

  20. Wave-equation Based Earthquake Location

    Science.gov (United States)

    Tong, P.; Yang, D.; Yang, X.; Chen, J.; Harris, J.

    2014-12-01

    Precisely locating earthquakes is fundamentally important for studying earthquake physics, fault orientations and Earth's deformation. In industry, accurately determining hypocenters of microseismic events triggered in the course of a hydraulic fracturing treatment can help improve the production of oil and gas from unconventional reservoirs. We develop a novel earthquake location method based on solving full wave equations to accurately locate earthquakes (including microseismic earthquakes) in complex and heterogeneous structures. Traveltime residuals or differential traveltime measurements with the waveform cross-correlation technique are iteratively inverted to obtain the locations of earthquakes. The inversion process involves the computation of the Fréchet derivative with respect to the source (earthquake) location via the interaction between a forward wavefield emitting from the source to the receiver and an adjoint wavefield reversely propagating from the receiver to the source. When there is a source perturbation, the Fréchet derivative measures not only the influence of source location but also the effects of heterogeneity, anisotropy and attenuation of the subsurface structure on the arrival of seismic waves at the receiver. This is essential for the accuracy of earthquake location in complex media. In addition, to reduce the computational cost, we can first assume that seismic waves propagate only in a vertical plane passing through the source and the receiver. The forward wavefield, adjoint wavefield and Fréchet derivative with respect to the source location are all computed in a 2D vertical plane. By transferring the Fréchet derivative along the horizontal direction of the 2D plane into derivatives along latitude and longitude coordinates or local 3D Cartesian coordinates, the source location can be updated in a 3D geometry. The earthquake location obtained with this combined 2D-3D approach can then be used as the initial location for a true 3D wave
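
    The full-waveform approach described above generalizes the classical traveltime-based (Geiger) location update; the sketch below shows that simpler building block for a homogeneous half-space, with synthetic stations, velocity, and picks, and the origin time assumed known for brevity.

        # Geiger-style traveltime location update, the classical building block
        # that the full-wave method generalizes (homogeneous half-space; origin
        # time assumed known; all values synthetic).
        import numpy as np

        v = 5.0                                      # assumed P velocity, km/s
        stations = np.array([[0., 0., 0.], [30., 5., 0.], [10., 40., 0.], [-20., 25., 0.]])
        true_src = np.array([8.0, 12.0, 10.0])       # x, y, depth (km)
        t_obs = np.linalg.norm(stations - true_src, axis=1) / v

        x = np.array([0.0, 0.0, 5.0])                # initial guess
        for _ in range(10):
            d = np.linalg.norm(stations - x, axis=1)
            G = (x - stations) / (v * d[:, None])    # Frechet derivatives dt/dx
            dx, *_ = np.linalg.lstsq(G, t_obs - d / v, rcond=None)
            x += dx
        print("Located source (km):", np.round(x, 2))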

  1. Earthquake consequences and measures for reduction of seismic risk.

    Science.gov (United States)

    Jurukovski, D

    1997-09-01

    Earthquakes are one of the most destructive of all natural disasters. This article discusses the consequences of earthquakes on material property. In addition, measures for the control and reduction of the consequences of earthquakes are described. Emphasis is placed on appropriate preparation by the general population and the need for a rapid and efficient response of governmental agencies. Finally, the experience of the staff of the Institute of Earthquake Engineering and Engineering Seismology in minimizing the consequences of earthquakes is described.

  2. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 

  3. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    Science.gov (United States)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large say M_{λ } ≥ 4, and one small say M_{σ } ≥ 2. The method utilizes the number of small earthquakes that occurs between pairs of large earthquakes. The cumulative probability distribution of these values is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that has occurred since the last large earthquake, the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts, between events rather than clock time. Thus, it is not necessary to decluster aftershocks and the results are applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of the seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling. The increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_{σ } ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_{λ } ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
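
    The EPS computation reduces to counting small events between successive large ones in "natural time" and placing the current count on the empirical cumulative distribution. A sketch on a synthetic Gutenberg-Richter catalog (the magnitude thresholds follow the abstract; everything else is made up):

        # Nowcast earthquake potential score (EPS) on a synthetic catalog.
        import numpy as np

        rng = np.random.default_rng(3)
        mags = 2.75 - np.log10(rng.random(3000))     # GR-like magnitudes, in time order

        counts, n = [], 0
        for m in mags:
            if m >= 4.0:                             # a "large" event closes an interval
                counts.append(n)
                n = 0
            else:                                    # a "small" (M >= 2.75) event
                n += 1
        current = n                                  # small events since the last large one

        counts = np.sort(counts)
        eps = np.searchsorted(counts, current) / len(counts)
        print(f"Current count = {current}, EPS = {eps:.2f}")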

  4. Coastal California Digital Imagery

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This digital ortho-imagery dataset is a survey of coastal California. The project area consists of approximately 3774 square miles. The project design of the digital...

  5. Ambient noise as the new source for urban engineering seismology and earthquake engineering: a case study from Beijing metropolitan area

    Science.gov (United States)

    Liu, Lanbo; Chen, Qi-fu; Wang, Weijun; Rohrbach, Eric

    2014-02-01

    In highly populated urban centers, traditional seismic survey sources can no longer be properly applied due to restrictions imposed by modern civilian life styles. Ambient vibration noise, including both microseisms and microtremors, though generally weak, is available anywhere and anytime and can be an ideal supplementary source for conducting seismic surveys for engineering seismology and earthquake engineering. This is fundamentally supported by advanced digital signal processing techniques for effectively extracting useful information from the noise. Thus, it can essentially be regarded as a passive seismic method. In this paper we first make a brief survey of ambient vibration noise, followed by a quick summary of digital signal processing for passive seismic surveys. Then the applications of ambient noise in engineering seismology and earthquake engineering for urban settings are illustrated with examples from the Beijing metropolitan area. For engineering seismology the example is the assessment of site effects over a large area via microtremor observations. For earthquake engineering the example is the structural characterization of a typical reinforced concrete high-rise building using background vibration noise. By showing these examples we argue that ambient noise can be treated as a new source that is economical, practical, and particularly valuable to engineering seismology and earthquake engineering projects for seismic hazard mitigation in urban areas.
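
    The core passive-seismic operation alluded to above is cross-correlating ambient noise recorded at two receivers to estimate the travel time between them. A minimal Python sketch on synthetic data (a real survey would band-pass, whiten, and stack many windows; all parameters here are assumptions):

        # Cross-correlate ambient noise at two receivers to recover the
        # inter-receiver delay (synthetic data).
        import numpy as np

        fs = 100.0                                   # sampling rate, samples/s
        n = 2 ** 14
        rng = np.random.default_rng(6)
        source = rng.standard_normal(n)              # diffuse noise wavefield
        delay = 50                                   # true lag in samples
        rec_a = source + 0.5 * rng.standard_normal(n)
        rec_b = np.roll(source, delay) + 0.5 * rng.standard_normal(n)

        # Frequency-domain circular cross-correlation
        xcorr = np.fft.irfft(np.fft.rfft(rec_a) * np.conj(np.fft.rfft(rec_b)))
        lag = int(np.argmax(xcorr))
        if lag > n // 2:                             # unwrap negative lags
            lag -= n
        print(f"Estimated delay: {lag / fs:.2f} s (true {delay / fs:.2f} s)")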

  6. Developing Dynamic Digital Image Techniques with Continuous Parameters to Detect Structural Damage

    Science.gov (United States)

    Sung, Wen-Pei

    2013-01-01

    Several earthquakes of strong magnitude have occurred at various locations globally, most notably the unforgettable tsunami disasters caused by the earthquakes in Indonesia and Japan. If the characteristics of structures can be well understood with new technology, the damage caused by most natural disasters can be significantly alleviated. In this research, a dynamic digital image correlation method with continuous parameters is applied to develop a low-cost digital image correlation coefficient method using advanced digital cameras and high-speed computers. An experimental study using a cantilever test object with controlled defects confirms that the vibration modes calculated using the proposed method clearly reveal the defect locations. The proposed method, combined with the sensitivity of the Inter-Story Drift Mode Shape (IDMS), can also reveal the degree of damage of a damaged structure. These test and analysis results indicate that the proposed method is accurate enough to achieve the objective of real-time online monitoring of structures. PMID:24023530

  7. Developing Dynamic Digital Image Techniques with Continuous Parameters to Detect Structural Damage

    Directory of Open Access Journals (Sweden)

    Ming-Hsiang Shih

    2013-01-01

    Several earthquakes of strong magnitude have occurred at various locations globally, most notably the unforgettable tsunami disasters caused by the earthquakes in Indonesia and Japan. If the characteristics of structures can be well understood with new technology, the damage caused by most natural disasters can be significantly alleviated. In this research, a dynamic digital image correlation method with continuous parameters is applied to develop a low-cost digital image correlation coefficient method using advanced digital cameras and high-speed computers. An experimental study using a cantilever test object with controlled defects confirms that the vibration modes calculated using the proposed method clearly reveal the defect locations. The proposed method, combined with the sensitivity of the Inter-Story Drift Mode Shape (IDMS), can also reveal the degree of damage of a damaged structure. These test and analysis results indicate that the proposed method is accurate enough to achieve the objective of real-time online monitoring of structures.
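
    At the heart of any digital image correlation coefficient method is a normalized cross-correlation between a reference subset and candidate subsets of the deformed image. The sketch below illustrates this with synthetic arrays (real DIC adds subpixel interpolation and deformation shape functions; all sizes are assumptions):

        # Zero-normalized cross-correlation (ZNCC) template search, the core of
        # a digital image correlation coefficient method (synthetic images).
        import numpy as np

        def zncc(a, b):
            a = a - a.mean()
            b = b - b.mean()
            return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

        rng = np.random.default_rng(4)
        image = rng.random((200, 200))
        ref = image[50:70, 80:100].copy()            # 20x20 reference subset
        deformed = np.roll(image, (3, -2), axis=(0, 1))  # rigid shift: 3 px down, 2 px left

        best, best_rc = -1.0, None
        for r in range(181):
            for c in range(181):
                s = zncc(ref, deformed[r:r + 20, c:c + 20])
                if s > best:
                    best, best_rc = s, (r, c)
        print("Subset relocated at", best_rc, "ZNCC =", round(best, 3))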

  8. Digital Privacy Legislation Awareness

    OpenAIRE

    Henry Foulds; Magda Huisman; Gunther R. Drevin

    2013-01-01

    Privacy is regarded as a fundamental human right and it is clear that the study of digital privacy is an important field. Digital privacy is influenced by new and constantly evolving technologies and this continuous change makes it hard to create legislation to protect people's privacy from being exploited by misuse of these technologies. This study aims to benefit digital privacy legislation efforts by evaluating the awareness and perceived importance of digital privacy legislation among...

  9. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

    Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, the discrete Fourier transform, signal energy, and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second-order infinite impulse response (IIR) sections.

  10. Digital asset management.

    Science.gov (United States)

    Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David

    2010-05-01

    Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.

  11. Digital Inkjet Textile Printing

    OpenAIRE

    Wang, Meichun

    2017-01-01

    Digital inkjet textile printing is an emerging technology developed with the rise of the digital world. It offers the possibility to print high-resolution images with unlimited color selection on fabrics. Digital inkjet printing brings a revolutionary opportunity to the textile printing industry. The history of textile printing shows how new technology replaces traditional ways of printing. This indicates that the future of digital inkjet textile printing is relatively positive. Differen...

  12. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise for automatically extracting earthquake-induced damage from high-resolution aerial/satellite imagery.

  13. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  14. A detailed analysis of some local earthquakes at Somma-Vesuvius

    Directory of Open Access Journals (Sweden)

    C. Troise

    1999-06-01

    In this paper, we analyze local earthquakes which occurred at Somma-Vesuvius during two episodes of intense seismic swarms, in 1989 and 1995 respectively. For the selected earthquakes we have computed accurate hypocentral locations, focal mechanisms and spectral parameters. We have also studied the ground acceleration produced by the largest events of the sequences (ML 3.0) at various digital stations installed in the area during the periods of higher seismic activity. The main result is that seismicity during the two swarm episodes presents similar features in both locations and focal mechanisms. Strong site-dependent effects are evident in the seismic radiation: strong amplifications in the frequency band 10-15 Hz are seen at stations located on the younger Vesuvius structure with respect to one located on the ancient Somma structure. Furthermore, seismic stations show peak accelerations for the same events that differ by more than one order of magnitude.

  15. Digital voltage discriminator

    International Nuclear Information System (INIS)

    Zhou Zhicheng

    1992-01-01

    A digital voltage discriminator is described, which is synthesized from a digital comparator and an ADC. The threshold is program-controllable with high stability. The digital region of confusion is approximately equal to 1.5 LSB. This discriminator has a single-channel-analyzer function mode with a channel width of 1.5 LSB

  16. Behandlingseffekt af Digital Dermatitis

    DEFF Research Database (Denmark)

    Krogh, Kenneth; Thomsen, Peter

    2008-01-01

    of claw disorders, in particular digital dermatitis. The claw recordings show that there is great dynamism and many new infections of digital dermatitis, corresponding to the problem seen with mastitis. The treatment effect for digital dermatitis is high (around 90%) for the treatment carried out. The treatment consisted

  17. Digital Language Death

    Science.gov (United States)

    Kornai, András

    2013-01-01

    Of the approximately 7,000 languages spoken today, some 2,500 are generally considered endangered. Here we argue that this consensus figure vastly underestimates the danger of digital language death, in that less than 5% of all languages can still ascend to the digital realm. We present evidence of a massive die-off caused by the digital divide. PMID:24167559

  18. Digital language death.

    Directory of Open Access Journals (Sweden)

    András Kornai

    Of the approximately 7,000 languages spoken today, some 2,500 are generally considered endangered. Here we argue that this consensus figure vastly underestimates the danger of digital language death, in that less than 5% of all languages can still ascend to the digital realm. We present evidence of a massive die-off caused by the digital divide.

  19. Reconceptualising Critical Digital Literacy

    Science.gov (United States)

    Pangrazio, Luciana

    2016-01-01

    While it has proved a useful concept during the past 20 years, the notion of "critical digital literacy" requires rethinking in light of the fast-changing nature of young people's digital practices. This paper contrasts long-established notions of "critical digital literacy" (based primarily around the critical consumption of…

  20. Digitization in Maritime Industry

    DEFF Research Database (Denmark)

    Constantiou, Ioanna; Shollo, Arisa; Kreiner, Kristian

    2017-01-01

    Digitization in the maritime industry is expected to transform businesses. The recently introduced mobile technologies in inter-organizational processes are an example of digitization in an industry which is moving very slowly towards digital transformation. We focus on the influence of mobile

  1. Preparing collections for digitization

    CERN Document Server

    Bulow, Anna E

    2010-01-01

    Most libraries, archives and museums are confronting the challenges of providing digital access to their collections. This guide offers guidance covering the end-to-end process of digitizing collections, from selecting records for digitization to choosing suppliers and equipment and dealing with documents that present individual problems.

  2. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach provides a new approach for predicting long-period strong ground motion.

  3. DYFI data for Induced Earthquake Studies

    Data.gov (United States)

    Department of the Interior — The significant rise in seismicity rates in Oklahoma and Kansas (OK–KS) in the last decade has led to an increased interest in studying induced earthquakes. Although...

  4. Coping with earthquakes induced by fluid injection

    Science.gov (United States)

    McGarr, Arthur F.; Bekins, Barbara; Burkardt, Nina; Dewey, James W.; Earle, Paul S.; Ellsworth, William L.; Ge, Shemin; Hickman, Stephen H.; Holland, Austin F.; Majer, Ernest; Rubinstein, Justin L.; Sheehan, Anne

    2015-01-01

    Large areas of the United States long considered geologically stable with little or no detected seismicity have recently become seismically active. The increase in earthquake activity began in the mid-continent starting in 2001 (1) and has continued to rise. In 2014, the rate of occurrence of earthquakes with magnitudes (M) of 3 and greater in Oklahoma exceeded that in California (see the figure). This elevated activity includes larger earthquakes, several with M > 5, that have caused significant damage (2, 3). To a large extent, the increasing rate of earthquakes in the mid-continent is due to fluid-injection activities used in modern energy production (1, 4, 5). We explore potential avenues for mitigating effects of induced seismicity. Although the United States is our focus here, Canada, China, the UK, and others confront similar problems associated with oil and gas production, whereas quakes induced by geothermal activities affect Switzerland, Germany, and others.

  5. Masonry infill performance during the Northridge earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Flanagan, R.D. [Lockheed Martin Energy Systems, Oak Ridge, TN (United States); Bennett, R.M.; Fischer, W.L. [Univ. of Tennesee, Knoxville, TN (United States); Adham, S.A. [Agbabian Associates, Pasadena, CA (United States)

    1996-03-08

    The response of masonry infills during the 1994 Northridge, California, earthquake is described in terms of three categories: (1) low-rise and mid-rise structures experiencing large near-field seismic excitations, (2) low-rise and mid-rise structures experiencing moderate far-field excitation, and (3) high-rise structures experiencing moderate far-field excitation. In general, the infills had a positive, beneficial effect on the performance of the buildings, even those experiencing large peak accelerations near the epicenter. Varying types of masonry infills, structural frames, design conditions, and construction deficiencies were observed, and their performance during the earthquake is indicated. A summary of observations of the performance of infills in other recent earthquakes is given. Comparison with the Northridge earthquake is made, and the expected response of infill structures in lower-seismicity regions of the central and eastern United States is discussed.

  6. SHOCK WAVE IN IONOSPHERE DURING EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    V.V. Kuznetsov

    2016-11-01

    A fundamentally new model of shock wave (SW) generation in the atmosphere and ionosphere during an earthquake is proposed. The model proceeds from the idea of cooperative shock water crystallization in a cloud

  7. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  8. NGA Nepal Earthquake Support Data Services

    Data.gov (United States)

    National Geospatial Intelligence Agency — In support of the Spring 2015 Nepal earthquake response, NGA is providing to the public and humanitarian disaster response community these Nepal data services. They...

  9. The 15 April 1909 Taipei Earthquake

    Directory of Open Access Journals (Sweden)

    Jeen-Hwa Wang

    2011-01-01

    In the very early morning, at 03 h 53.7 m local time on 15 April 1909, a large earthquake occurred in northern Taiwan. In all, 9 persons were killed and 51 injured; 122 houses collapsed, along with damage to another 1050 houses. This earthquake was one of the largest and most damaging events of the 20th century for the Taipei Metropolitan Area. The epicenter estimated by Hsu (1971) was 25°N, 121.53°E, and its focal depth and earthquake magnitude as evaluated by Gutenberg and Richter (1954) were ~80 km and MGR = 7.3, respectively. The event took place underneath the Taipei Metropolitan Area and might be located at the western edge of the subduction zone of the Philippine Sea plate. In this study, the magnitudes of the earthquake determined by others will also be described.

  10. Drinking Water Earthquake Resilience Paper Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data for the 9 figures contained in the paper, A SOFTWARE FRAMEWORK FOR ASSESSING THE RESILIENCE OF DRINKING WATER SYSTEMS TO DISASTERS WITH AN EXAMPLE EARTHQUAKE...

  11. The 5th July 1930 earthquake at Montilla (S Spain). Use of regionally recorded smoked paper seismograms

    Science.gov (United States)

    Batlló, J.; Stich, D.; Macià, R.; Morales, J.

    2009-04-01

    On the night of 5 July 1930 a damaging earthquake struck the town of Montilla (near Córdoba, S Spain) and its surroundings. The magnitude estimate for this earthquake is M = 5, and its epicentral intensity has been evaluated as VIII (MSK). Even though it is an earthquake of moderate size, it is the largest instrumentally recorded in this region, which makes it of interest for a better definition of the regional seismicity. For this reason we decided to study its source anew through analysis of the available contemporary seismograms and related documents. A total of 25 seismograms from 11 seismic stations have been collected and digitized. Processing some of the records has been difficult because they were obtained from microfilm or contemporary reproductions in journals. Most of them are on smoked paper and recorded at regional distances. This poses a good opportunity to test the limits of the use of such low-frequency, low-dynamic-range records for the study of regional events. The results are promising: using such regional seismograms, the event has been relocated, its magnitude recalculated (Mw 5.1), and an inversion of waveforms performed to elucidate its focal mechanism. We present the results of this research and its consequences for the regional seismicity, and we compare them with recent smaller earthquakes that occurred at the same place and with results obtained for earthquakes of similar size that occurred farther east in 1951.

  12. Field Imaging Spectroscopy. Applications in Earthquake Geology

    Science.gov (United States)

    Ragona, D.; Minster, B.; Rockwell, T. K.; Fialko, Y.; Jussila, J.; Blom, R.

    2005-12-01

    Field Imaging Spectroscopy in the visible and infrared sections of the spectrum can be used as a technique to assist paleoseismological studies. Submeter-range hyperspectral images of paleoseismic excavations can assist the analysis and interpretation of the earthquake history of a site. They also provide an excellent platform for storage of the stratigraphic and structural information collected from such a site. At present, most field data are collected descriptively, with the descriptions documented on hand-drawn field logs and/or photomosaics constructed from individual photographs. Recently developed portable hyperspectral sensors acquire high-quality spectroscopic information at high spatial resolution (pixel size ~ 0.5 mm at 50 cm) over frequencies ranging from the visible band to short-wave infrared, which greatly enhances the range of information that can be recorded in the field. The new data collection and interpretation methodology that we are developing (Field Imaging Spectroscopy) makes available, for the first time, a tool to quantitatively analyze paleoseismic and stratigraphic information. The reflectance spectra of each sub-millimeter portion of the material are stored in a 3-D matrix (hyperspectral cube) that can be analyzed by visual inspection or by using a large variety of algorithms. The reflectance spectrum is related to the chemical composition and physical properties of the surface; therefore, hyperspectral images are capable of revealing subtle changes in texture, composition and weathering. For paleoseismic studies, we are primarily interested in distinguishing changes between layers at a given site (spectral stratigraphy) rather than the precise composition of the layers, although this is an added benefit. We have experimented with push-broom (panoramic) portable scanners and acquired data from portions of fault exposures and cores. These images were processed using well-known image processing algorithms, and the results have been

  13. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs easy to access by both scholars and the public in general. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, allows access by deploying tabular and cartographic data related to earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred and on the description being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented in the format of table and map. Data analysis allowed us to identify the following sources of error in the location of the epicenters in existing catalogs: • incorrect coordinate entry • erroneous or mistaken place names • data too general to allow locating the epicenter, mainly for older earthquakes • inconsistency between the earthquake and the reported tsunami occurrence: an earthquake epicenter located too far inland reported as tsunamigenic. The process of completing the catalogs directly depends on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and

  14. Constraining subducted slab properties with deep earthquakes

    Science.gov (United States)

    Zhan, Z.; Yang, T.; Gurnis, M.; Shen, Z.; Wu, F.

    2017-12-01

    The discovery of deep earthquakes and the Wadati-Benioff zone was a critical piece in the early history of plate tectonics. Today, deep earthquakes continue to serve as important markers/probes of subducted slab geometry, structure, and stress state. Here we discuss three examples in which we have recently used deep earthquakes to provide new insights into subducted slab properties. In the first application, we investigate the slab morphology and stress regimes under different trench motion histories with geodynamic models. We find that the isolation of the 2015 Mw 7.9 Bonin Islands deep earthquake from the background Wadati-Benioff zone may be explained as a result of Pacific slab buckling in response to the slow trench retreat. Additionally, subducted slabs are inherently heterogeneous due to non-linear viscosity, contributing to the occurrence of isolated deep earthquakes. In the second application, we quantify the coda waveform differences from nearby deep earthquakes to image fine-scale slab structures. We find that the large metastable olivine wedge suggested by several previous studies cannot fit our observations; therefore, the effects of metastable olivine on slab dynamics should be re-assessed. In the third application, we take advantage of P and S differential travel times from deep earthquake clusters to isolate signatures of Vp/Vs ratios within slabs from the ambient mantle. We observe substantial deviations of slab Vp/Vs from that in 1D reference Earth models, and even possible lateral variations. This sheds light on potential differences in slab temperature or water content. All three applications underscore that deep earthquakes are still incredibly useful in informing us about subducted slabs.

  15. Natural Gas Extraction, Earthquakes and House Prices

    OpenAIRE

    Hans R.A. Koster; Jos N. van Ommeren

    2015-01-01

    The production of natural gas is strongly increasing around the world. Long-run negative external effects of extraction are understudied and often ignored in (social) cost-benefit analyses. One important example is that natural gas extraction leads to soil subsidence and subsequent induced earthquakes that may occur only after a couple of decades. We show that induced earthquakes that are noticeable to residents generate substantial non-monetary economic effects, as measured by their effects o...

  16. Earthquake geology of the Bulnay Fault (Mongolia)

    Science.gov (United States)

    Rizza, Magali; Ritz, Jean-Franciois; Prentice, Carol S.; Vassallo, Ricardo; Braucher, Regis; Larroque, Christophe; Arzhannikova, A.; Arzhanikov, S.; Mahan, Shannon; Massault, M.; Michelot, J-L.; Todbileg, M.

    2015-01-01

    The Bulnay earthquake of July 23, 1905 (Mw 8.3-8.5), in north-central Mongolia, is one of the world's largest recorded intracontinental earthquakes and one of four great earthquakes that occurred in the region during the 20th century. The 375-km-long surface rupture of the left-lateral, strike-slip, N095°E trending Bulnay Fault associated with this earthquake is remarkable for its pronounced expression across the landscape and for the size of features produced by previous earthquakes. Our field observations suggest that in many areas the width and geometry of the rupture zone is the result of repeated earthquakes; however, in those areas where it is possible to determine that the geomorphic features are the result of the 1905 surface rupture alone, the size of the features produced by this single earthquake is singular in comparison to most other historical strike-slip surface ruptures worldwide. Along the 80 km stretch between 97.18°E and 98.33°E, the fault zone is several meters wide, and the mean left-lateral 1905 offset is 8.9 ± 0.6 m, with two measured cumulative offsets that are twice the 1905 slip. These observations suggest that the displacement produced during the penultimate event was similar to the 1905 slip. Morphotectonic analyses carried out at three sites along the eastern part of the Bulnay fault allow us to estimate a mean horizontal slip rate of 3.1 ± 1.7 mm/yr over the Late Pleistocene-Holocene period. In parallel, paleoseismological investigations show evidence for two earthquakes prior to the 1905 event, with recurrence intervals of ~2700-4000 years.

  17. Evaluation of near-field earthquake effects

    Energy Technology Data Exchange (ETDEWEB)

    Shrivastava, H.P.

    1994-11-01

    Structures and equipment which are qualified for the design basis earthquake (DBE) and have anchorage designed for the DBE loading do not require an evaluation of near-field earthquake (NFE) effects. However, Safety Class 1 acceleration-sensitive equipment such as electrical relays must be evaluated for both the NFE and the DBE, since such equipment is known to malfunction when excited by high-frequency seismic motions.

  18. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damages may have influenced the pattern of discharge for stroke patients. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Digital Living at Home

    DEFF Research Database (Denmark)

    Andersen, Pernille Viktoria Kathja; Christiansen, Ellen Tove

    2013-01-01

    Does living with digital technology inevitably lead to digital living? Users talking about a digital home control system they have had in their homes for eight years indicate that there is more to living with digital technology than a functional-operational grip on regulation. Our analysis of these user voices has directed us towards a ‘home-keeping’ design discourse, which opens new horizons for design of digital home control systems by allowing users to perform as self-determined controllers and groomers of their habitat. The paper concludes by outlining the implications of a ‘home-keeping’ design discourse...

  20. La radio digital.

    OpenAIRE

    Cortés, Carlos

    2005-01-01

    Digital radio is a product of so-called digital convergence. Since the 1990s, new electronic devices for digital reception and playback, including certain cell phones, have communicated with one another in networked environments through simple interfaces. For this reason, they offer advantages that did not previously exist in analog media. Starting from digital acquisition and production systems, which began as simple digital audio tape (DAT), the technological evolution...

  1. Digital disruption 'syndromes'.

    Science.gov (United States)

    Sullivan, Clair; Staib, Andrew

    2017-05-18

    The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia's largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and

  2. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we will review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when the earthquake occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the big Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters of seismicity that move along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730; this is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  3. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator are described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure.

  4. Relationship of heat and cold to earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.

    1980-06-26

    An analysis of 54 earthquakes of magnitude 7 and above, including 13 of magnitude 8 and above, between 780 BC and the present, shows that the vast majority of them fell in the four major cool periods during this time span, or on the boundaries of these periods. Between 1800 and 1876, four periods of earthquake activity in China can be recognized, and these tend to correspond to relatively cold periods over that time span. An analysis of earthquakes of magnitude 6 or above over the period 1951 to 1965 gives the following results: earthquakes in north and southwest China tended to occur when the preceding year had an above-average annual temperature and winter temperature; in the northeast they tended to occur in a year after a year with an above-average winter temperature; in the northwest there was also a connection with a preceding warm winter, but to a less pronounced degree. The few earthquakes in South China seemed to follow cold winters. Both the Tangshan and Yongshan Pass earthquakes were preceded by unusually warm years and relatively high winter temperatures.

  5. Do weak global stresses synchronize earthquakes?

    Science.gov (United States)

    Bendick, R.; Bilham, R.

    2017-08-01

    Insofar as slip in an earthquake is related to the strain accumulated near a fault since a previous earthquake, and this process repeats many times, the earthquake cycle approximates an autonomous oscillator. Its asymmetric slow accumulation of strain and rapid release is quite unlike the harmonic motion of a pendulum and need not be time predictable, but still resembles a class of repeating systems known as integrate-and-fire oscillators, whose behavior has been shown to demonstrate a remarkable ability to synchronize to either external or self-organized forcing. Given sufficient time and even very weak physical coupling, the phases of sets of such oscillators, with similar though not necessarily identical period, approach each other. Topological and time series analyses presented here demonstrate that earthquakes worldwide show evidence of such synchronization. Though numerous studies demonstrate that the composite temporal distribution of major earthquakes in the instrumental record is indistinguishable from random, the additional consideration of event renewal interval serves to identify earthquake groupings suggestive of synchronization that are absent in synthetic catalogs. We envisage the weak forces responsible for clustering originate from lithospheric strain induced by seismicity itself, by finite strains over teleseismic distances, or by other sources of lithospheric loading such as Earth's variable rotation. For example, quasi-periodic maxima in rotational deceleration are accompanied by increased global seismicity at multidecadal intervals.
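
    The integrate-and-fire picture above lends itself to a small numerical illustration. The following sketch (not from the paper; the oscillator count, coupling strength and periods are illustrative assumptions) advances a population of weakly pulse-coupled integrate-and-fire oscillators and tracks a Kuramoto-style order parameter, the kind of toy model in which phases of similar-period oscillators gradually draw together:

```python
import numpy as np

rng = np.random.default_rng(0)

n_osc = 20                                          # number of oscillators ("faults")
periods = 1.0 + 0.05 * rng.standard_normal(n_osc)   # similar but not identical periods
eps = 0.02                                          # weak coupling: nudge received per firing
phases = rng.uniform(0.0, 1.0, n_osc)               # state in [0, 1); an oscillator "fires" at 1

def step(phases, dt=1e-3):
    """Advance all oscillators; when any fires, pull the others slightly forward."""
    phases = phases + dt / periods
    fired = phases >= 1.0
    if fired.any():
        phases[fired] -= 1.0                                      # reset the firing oscillators
        phases[~fired] = np.minimum(phases[~fired] + eps, 1.0)    # weak pull on the rest
    return phases

def synchrony(phases):
    """Kuramoto order parameter: ~1 means phase-locked, ~0 means incoherent."""
    return abs(np.exp(2j * np.pi * phases).mean())

for _ in range(200_000):
    phases = step(phases)

print(f"order parameter after run: {synchrony(phases):.2f}")
```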

  6. Slope instabilities triggered by the 2011 Lorca earthquake (Mw 5.1): a comparison and revision of hazard assessments of earthquake-triggered landslides in Murcia

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez-Peces, M. J.; Garcia-Mayordomo, J.; Martinez-Diaz, J. J.; Tsige, M.

    2012-11-01

    The Lorca basin has been the object of recent research aimed at studying the phenomenon of earthquake-induced landslides and their assessment within the context of different seismic scenarios, bearing in mind the influence of soil and topographical amplification effects. Nevertheless, it was not until the Lorca earthquakes of 11 May 2011 that it became possible to adopt a systematic approach to the problem. We provide here an inventory of slope instabilities triggered by the Lorca earthquakes comprising 100 cases, mainly small rock and soil falls (1 to 100 m³). The distribution of these instabilities is compared to two different earthquake-triggered landslide hazard maps: one considering the occurrence of the most probable earthquake for a 475-yr return period in the Lorca basin (Mw = 5.0), which was previously published on the basis of a low-resolution digital elevation model (DEM), and a second one matching the occurrence of the Mw = 5.1 2011 Lorca earthquake, which was undertaken using a higher-resolution DEM. The most frequent Newmark displacement values related to the slope failures triggered by the 2011 Lorca earthquakes are smaller than 2 cm in both hazard scenarios and coincide with areas where significant soil and topographical seismic amplification effects have occurred.

  7. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
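
    Because the abstract says these services are compatible with the equivalent IRIS DMC web services, a client can follow the common FDSN web-service conventions. Below is a minimal sketch in Python; the endpoint URL and parameter names are assumptions based on the FDSN specification rather than details taken from the abstract:

```python
import requests

# FDSN-style event query; endpoint and parameters assumed per the FDSN web
# service specification that the abstract says these services follow.
BASE = "https://service.ncedc.org/fdsnws/event/1/query"

params = {
    "starttime": "2012-01-01",
    "endtime": "2012-12-31",
    "minmagnitude": 4.0,
    "minlatitude": 36.0, "maxlatitude": 42.0,      # roughly northern California
    "minlongitude": -125.0, "maxlongitude": -119.0,
    "format": "xml",                               # QuakeML
}

resp = requests.get(BASE, params=params, timeout=30)
resp.raise_for_status()
print(resp.text[:500])   # start of the QuakeML document describing matching earthquakes
```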

  8. AUTOMATIC BLOCKED ROADS ASSESSMENT AFTER EARTHQUAKE USING HIGH RESOLUTION SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    H. Rastiveis

    2015-12-01

    Full Text Available In 2010, an earthquake struck the city of Port-au-Prince, Haiti, and killed over 300,000 people. According to historical data, such an earthquake had not previously occurred in the area. The unpredictability of earthquakes has necessitated comprehensive mitigation efforts to minimize deaths and injuries. Blocked roads, caused by the debris of destroyed buildings, may increase the difficulty of rescue activities. In this case, a damage map that specifies blocked and unblocked roads can be definitely helpful for a rescue team. In this paper, a novel method for providing such a destruction map, based on a pre-event vector map and post-event high-resolution WorldView-2 satellite images, is presented. For this purpose, image quality improvement and co-registration of the image and map are first carried out in a pre-processing step. Then, after extraction of texture descriptors from the post-event image and SVM classification, different terrain classes are detected in the image. Finally, considering the classification results, specifically objects belonging to the "debris" class, damage analysis is performed to estimate the damage percentage; in this step, not only the area but also the shape of the objects in the "debris" class is taken into account. This process is applied to all the roads in the road layer. In this research, a pre-event digital vector map and a post-event high-resolution satellite image of Port-au-Prince, Haiti's capital, acquired by WorldView-2, were used to evaluate the proposed method. The algorithm was executed on a 1200 × 800 m² portion of the data set, including 60 roads, and all the roads were labelled correctly. Visual examination confirmed the ability of this method to assess damage to urban road networks after an earthquake.
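
    The abstract names texture descriptors and SVM classification without specifying the features used. Below is a minimal sketch of that general pipeline, using grey-level co-occurrence (GLCM) statistics from scikit-image and an SVC from scikit-learn; the random arrays merely stand in for tiles cut from the post-event image:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops   # scikit-image >= 0.19
from sklearn.svm import SVC

def texture_features(tile):
    """GLCM contrast/homogeneity/energy for one 8-bit image tile."""
    glcm = graycomatrix(tile, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return np.hstack([graycoprops(glcm, p).ravel()
                      for p in ("contrast", "homogeneity", "energy")])

# Hypothetical training tiles and labels (e.g., 0 = intact surface, 1 = debris);
# in practice these would be cut from the post-event WorldView-2 image.
rng = np.random.default_rng(1)
tiles = rng.integers(0, 256, size=(40, 32, 32), dtype=np.uint8)
labels = rng.integers(0, 2, size=40)

X = np.array([texture_features(t) for t in tiles])
clf = SVC(kernel="rbf").fit(X, labels)

new_tile = rng.integers(0, 256, size=(32, 32), dtype=np.uint8)
print("predicted class:", clf.predict(texture_features(new_tile)[None, :])[0])
```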

  9. Visualizing the 2009 Samoan and Sumatran Earthquakes using Google Earth-based COLLADA models

    Science.gov (United States)

    de Paor, D. G.; Brooks, W. D.; Dordevic, M.; Ranasinghe, N. R.; Wild, S. C.

    2009-12-01

    Earthquake hazards are generally analyzed by a combination of graphical focal mechanism or centroid moment tensor solutions (aka geophysical beach balls), contoured fault plane maps, and shake maps or tsunami damage surveys. In regions of complex micro-plate tectonics, it can be difficult to visualize spatial and temporal relations among earthquakes, aftershocks, and associated tectonic and volcanic structures using two-dimensional maps and cross sections alone. Developing the techniques originally described by D.G. De Paor & N.R. Williams (EOS Trans. AGU S53E-05, 2006), we can view the plate tectonic setting, geophysical parameters, and societal consequences of the 2009 Samoan and Sumatran earthquakes on the Google Earth virtual globe. We use XML-based COLLADA models to represent the subsurface structure and standard KML to overlay map data on the digital terrain model. Unlike traditional geophysical beach ball figures, our models are three-dimensional and located at the correct depth, and they optionally show nodal planes, which are useful in relating the orientation of one earthquake to the hypocenters of its neighbors. With the aid of the new Google Earth application program interface (GE API), we can use web page-based Javascript controls to lift structural models from the subsurface in Google Earth and generate serial sections along strike. Finally, we use the built-in features of the Google Earth web browser plug-in to create a virtual tour of damage sites with hyperlinks to web-based field reports. These virtual globe visualizations may help complement existing KML and HTML resources of the USGS Earthquake Hazards Program and The Global CMT Project.

  10. Monitoring Geologic Hazards and Vegetation Recovery in the Wenchuan Earthquake Region Using Aerial Photography

    Directory of Open Access Journals (Sweden)

    Zhenwang Li

    2014-03-01

    Full Text Available On 12 May 2008, the magnitude-8.0 Wenchuan earthquake occurred in Sichuan Province, China, triggering thousands of landslides, debris flows, and barrier lakes, and leading to a substantial loss of life and damage to the local environment and infrastructure. This study aimed to monitor the status of geologic hazards and vegetation recovery in the post-earthquake disaster area using high-resolution aerial photography from 2008 to 2011, acquired from the Center for Earth Observation and Digital Earth (CEODE), Chinese Academy of Sciences. The distribution and range of hazards were identified in 15 large, representative geologic hazard areas triggered by the Wenchuan earthquake. After conducting an overlay analysis, the variations of these hazards between successive years were analyzed to reflect the geologic hazard development and vegetation recovery. The results showed that in the first year after the Wenchuan earthquake, debris flows occurred frequently and with high intensity. Subsequently, as the source material became less available and the slope structure stabilized, the intensity and frequency of debris flows gradually decreased with time. The development rate of debris flows between 2008 and 2011 was 3% per year. The lithology played a dominant role in the formation of debris flows, and the topography and hazard size in the earthquake-affected area also influenced the debris flow development process. Meanwhile, the overall geologic hazard area decreased by 12% per year, and vegetation recovery on the landslide mass was 15% to 20% per year between 2008 and 2011. The outcomes of this study provide supporting data for ecological recovery as well as for debris flow control and prevention projects in hazard-prone areas.

  11. Progress in digital radiography

    International Nuclear Information System (INIS)

    Cappelle, A.

    2016-01-01

    Because of its practical advantages, digital radiography is used more and more in the industrial sector. There are two kinds of digital radiography. The first, 'computed radiography', uses a photon-stimulated screen; after radiation exposure this screen must be read by an analyser to obtain a digital image. The second, 'direct radiography', yields a digital radiograph of the object directly. Digital radiography uses the same radioactive nuclides as radiography with silver films: cobalt, iridium or selenium. The spatial resolution of digital radiography is not as good as that of classical silver-film radiography, but digital radiography offers better visual contrast. (A.C.)

  12. Digital positron annihilation spectrometer

    International Nuclear Information System (INIS)

    Cheng Bin; Weng Huimin; Han Rongdian; Ye Bangjiao

    2010-01-01

    With the rapid development of digital signal processing, the digitization and processing of signals have been applied to a broad class of nuclear techniques. The digital positron lifetime spectrometer (DPLS) is more promising than the conventional positron lifetime spectrometer built from nuclear instrument modules, and it has many advantages, such as low noise, long-term stability, flexible online or offline digital processing, simple setup, low cost, easy configuration, and more physical information. Digital constant-fraction discrimination is used for timing. A new method of optimizing the energy window settings for the digital positron lifetime spectrometer, employing simulated annealing, has also been developed for convenient use. The time resolution is 220 ps and the count rate is 200 cps. (authors)

  13. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).
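
    The 'characteristic earthquake' forecasts mentioned above condition the probability of the next event on the time elapsed since the last one. Below is a minimal sketch under an assumed lognormal renewal model; the distribution choice and parameter values are illustrative, not those used by the government committee:

```python
from scipy.stats import lognorm

# Assumed lognormal renewal model for recurrence intervals: median ~100 yr,
# shape parameter 0.5 (both illustrative, not values from the paper).
dist = lognorm(s=0.5, scale=100.0)

def conditional_prob(elapsed, window, dist=dist):
    """P(next event within `window` years | `elapsed` years since the last one)."""
    survive = dist.sf(elapsed)                 # P(T > elapsed)
    if survive == 0.0:
        return 1.0
    return (dist.cdf(elapsed + window) - dist.cdf(elapsed)) / survive

for elapsed in (50, 100, 150):
    print(f"{elapsed:>3} yr elapsed -> 30-yr probability "
          f"{conditional_prob(elapsed, 30.0):.2f}")
```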

  14. Digital Banking: Risks of Financial Digitalization

    Directory of Open Access Journals (Sweden)

    Kornіvska Valerіa O.

    2017-09-01

    Full Text Available The paper presents the results of research into the development of the digital economy in the global financial space, shows the contradictions of these processes in the context of the growing financial inclusion of households, and identifies the risks of the active introduction of digital banking in poor countries of the world. The processes by which banking institutions gain growing influence over the operational activity and daily life of households are characterized, and it is argued that under the conditions of digital banking, having no alternative to non-cash turnover for their economic activity, clients are forced to accept the unlimited presence of financial operators in social systems. It is substantiated that in Ukraine the global risk of the digitalization of society is gradually developing as a result of the creation of unified information systems for the control of global financial flows.

  15. Radon anomalies prior to earthquakes (2). Atmospheric radon anomaly observed before the Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    Before the 1995 Hyogoken-Nanbu earthquake, various geochemical precursors were observed in the aftershock area: chloride ion concentration, groundwater discharge rate, groundwater radon concentration and so on. Kobe Pharmaceutical University (KPU) is located about 25 km northeast from the epicenter and within the aftershock area. Atmospheric radon concentration had been continuously measured from 1984 at KPU, using a flow-type ionization chamber. The radon concentration data were analyzed using the smoothed residual values which represent the daily minimum of radon concentration with the exclusion of normalized seasonal variation. The radon concentration (smoothed residual values) demonstrated an upward trend about two months before the Hyogoken-Nanbu earthquake. The trend can be well fitted to a log-periodic model related to earthquake fault dynamics. As a result of model fitting, a critical point was calculated to be between 13 and 27 January 1995, which was in good agreement with the occurrence date of earthquake (17 January 1995). The mechanism of radon anomaly before earthquakes is not fully understood. However, it might be possible to detect atmospheric radon anomaly as a precursor before a large earthquake, if (1) the measurement is conducted near the earthquake fault, (2) the monitoring station is located on granite (radon-rich) areas, and (3) the measurement is conducted for more than several years before the earthquake to obtain background data. (author)
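
    A log-periodic fit of the kind described can be sketched with a standard least-squares routine. The functional form below is a commonly used log-periodic expression with a critical time tc; it is an assumption for illustration on synthetic data, not necessarily the exact model fitted in the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_periodic(t, tc, A, B, m, C, omega, phi):
    """Common log-periodic form: A + B*(tc-t)^m * [1 + C*cos(omega*ln(tc-t) + phi)]."""
    dt = np.clip(tc - t, 1e-6, None)            # keep the power-law argument positive
    return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

# Synthetic "radon residual" series with a critical point at t = 100 days.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 95.0, 200)
true = (100.0, 1.0, -0.5, 0.4, 0.1, 8.0, 0.0)   # tc, A, B, m, C, omega, phi
y = log_periodic(t, *true) + 0.02 * rng.standard_normal(t.size)

p0 = (105.0, 0.8, -0.4, 0.5, 0.05, 7.0, 0.5)    # rough initial guess
popt, _ = curve_fit(log_periodic, t, y, p0=p0, maxfev=20000)
print(f"estimated critical time tc = {popt[0]:.1f} days (true 100.0)")
```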

  16. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake.

    Science.gov (United States)

    Hayes, Gavin P; Herman, Matthew W; Barnhart, William D; Furlong, Kevin P; Riquelme, Sebástian; Benz, Harley M; Bergman, Eric; Barrientos, Sergio; Earle, Paul S; Samsonov, Sergey

    2014-08-21

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ∼8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March-April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  17. The Nankai Trough earthquake tsunamis in Korea: Numerical studies of the 1707 Hoei earthquake and physics-based scenarios

    Science.gov (United States)

    Kim, S.; Saito, T.; Fukuyama, E.; Kang, T. S.

    2016-12-01

    Historical documents in Korea and China report abnormal waves in the sea and rivers close to the date of the 1707 Hoei earthquake, which occurred in the Nankai Trough, off southwestern Japan. This indicates that the tsunami caused by the Hoei earthquake might have reached Korea and China, which suggests a potential hazard in Korea from large earthquakes in the Nankai Trough. We conducted tsunami simulations to study the details of tsunamis in Korea caused by large earthquakes. We employed the 1707 Hoei earthquake source model and physics-based scenarios of anticipated earthquakes in the Nankai subduction zone. We also considered the effect of horizontal displacement on tsunami generation. Our simulation results from the Hoei earthquake model and the anticipated earthquake models showed that the maximum tsunami height along the Korean coast was less than 0.5 m. Even though such a tsunami is not life-threatening, the effect of larger earthquakes should still be considered.

  18. Soil structure interactions of eastern U.S. type earthquakes

    International Nuclear Information System (INIS)

    Chang Chen; Serhan, S.

    1991-01-01

    Two types of earthquakes have occurred in the eastern US in the past. One type was the infrequent major events, such as the 1811-1812 New Madrid earthquakes or the 1886 Charleston earthquake. The other was the frequent shallow earthquakes with high-frequency content, short duration and high accelerations. Two eastern US nuclear power plants, V.C. Summer and Perry, went through extensive licensing efforts to obtain fuel load licenses after earthquakes of this type were recorded on site and exceeded the design bases in the region beyond 10 hertz. This paper discusses the soil-structure interactions of the latter type of earthquake.

  19. Earthquakes - a danger to deep-lying repositories?

    International Nuclear Information System (INIS)

    2012-03-01

    This booklet, issued by the Swiss National Cooperative for the Disposal of Radioactive Waste (NAGRA), examines geological factors concerning earthquakes and the safety of deep-lying repositories for nuclear waste. The geological processes involved in the occurrence of earthquakes are briefly reviewed, and the definitions of earthquake magnitude and intensity are discussed. Examples of damage caused by earthquakes are given. The earthquake situation in Switzerland is then considered, along with the effects of earthquakes on sub-surface structures and deep-lying repositories. Finally, the concepts proposed for deep-lying geological repositories for nuclear waste are discussed.

  20. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to arrive at a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined here in the context of the seismic design of nuclear power plants vis-à-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines, to a great extent, the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  1. Analysis of the enhanced negative correlation between electron density and electron temperature related to earthquakes

    Directory of Open Access Journals (Sweden)

    X. H. Shen

    2015-04-01

    Full Text Available Ionospheric perturbations in plasma parameters have been observed before large earthquakes, but the correlation between different parameters has been less studied in previous research. The present study focuses on the relationship between electron density (Ne) and electron temperature (Te) observed by the DEMETER (Detection of Electro-Magnetic Emissions Transmitted from Earthquake Regions) satellite during local nighttime, for which a positive correlation is revealed near the equator and a weak correlation at mid- and low latitudes over both hemispheres. Against this normal background, negative correlations, the least common among all Ne and Te points, are studied before and after large earthquakes at mid- and low latitudes. The multiparameter observations exhibited typical synchronous disturbances before the 2010 Chile M8.8 earthquake and the 2007 Pu'er M6.4 earthquake, with Te varying inversely with Ne over the epicentral areas. Moreover, a statistical analysis has been carried out by selecting orbits within 1000 km and ±7 days of global earthquakes. Enhanced negative correlations, with coefficients between Ne and Te lower than −0.5, are found in 42% of the points connected with earthquakes. The median correlation values at different seismic levels show a clear decrease for earthquakes larger than magnitude 7. Finally, the electric-field-coupling model is discussed, and a digital simulation has been carried out with SAMI2 (Sami2 is Another Model of the Ionosphere), which illustrates that an external electric field in the ionosphere can strengthen the negative correlation between Ne and Te at latitudes lower than the disturbed source, due to the effects of the geomagnetic field. Although seismic activity is not the only source that can cause inverse Ne-Te variations, the present results demonstrate one potentially useful tool in seismo-electromagnetic anomaly differentiation, and a comprehensive analysis with multiple
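
    The anomaly criterion used above reduces to a Pearson correlation between Ne and Te samples along orbit segments dropping below −0.5. A minimal sketch follows; the array contents are synthetic stand-ins for DEMETER samples:

```python
import numpy as np

def ne_te_correlation(ne, te):
    """Pearson correlation between electron density and temperature samples."""
    return np.corrcoef(ne, te)[0, 1]

# Hypothetical along-orbit samples near an epicentre.
rng = np.random.default_rng(3)
te = 1500.0 + 100.0 * rng.standard_normal(50)
ne = 2e11 - 5e7 * te + 1e9 * rng.standard_normal(50)   # built to anti-correlate

r = ne_te_correlation(ne, te)
print(f"r = {r:.2f} -> {'anomalous' if r < -0.5 else 'background'}")
```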

  2. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (MGR = 7.2; Mw = 7.5) caused 1,570 deaths and more than 11,300 injuries (90% of the victims in Bucharest), and seismic losses were estimated at more than USD 2 billion. The 1977 earthquake was a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 revealed, for the first time, the spectral content of long-period seismic motions of Vrancea earthquakes, the duration, the number of cycles and the values of actual accelerations, with important overloading effects on flexible structures. The seismic coefficients k_s, the spectral curve (the dynamic coefficient β_r), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered on the basis of hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements up to 1977, as well as seismological and structural lessons since 1977; - recent aspects of the implementation of the Earthquake Code P.100/1992 and harmonization with the Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  3. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

    Science.gov (United States)

    Martín-González, Fidel

    2018-01-01

    Studies that provide information concerning the seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) that affected architectonic elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, depending on the authors, the parameters that the literature claims can be obtained are contradictory (the epicenter location, the orientation of the P waves, the orientation of the compressional strain and the fault kinematics have all been proposed), and some authors even question these relations with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 presented an opportunity to measure systematically a large number and wide variety of earthquake damage in historical buildings (the same structures that are used in historical and archaeological studies). The damage pattern orientation has been compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic pattern of the earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace): the EDO is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in studies of historical earthquakes where no instrumental data are available.
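
    Damage orientations are axial (a strike of 10° and one of 190° are the same orientation), so a mean EDO is usually computed after doubling the angles. Below is a sketch of that calculation and of the comparison with the fault-normal direction; the angle values are illustrative, not measurements from the paper:

```python
import numpy as np

def mean_orientation(angles_deg):
    """Mean of axial (0-180 deg) orientation data via the angle-doubling trick."""
    a = np.radians(2.0 * np.asarray(angles_deg, dtype=float))
    mean2 = np.arctan2(np.sin(a).sum(), np.cos(a).sum())
    return (np.degrees(mean2) / 2.0) % 180.0

# Hypothetical strikes of toppled columns / dropped keystones near a fault.
damage = [80, 85, 95, 100, 90, 88, 92, 97, 83, 91]
fault_trend = 0.0                                  # assumed fault strike, degrees

edo = mean_orientation(damage)
normal = (fault_trend + 90.0) % 180.0              # fault-normal orientation
offset = min(abs(edo - normal), 180.0 - abs(edo - normal))
print(f"mean damage orientation {edo:.1f} deg; {offset:.1f} deg from fault-normal")
```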

  4. Earthquake response of inelastic structures

    International Nuclear Information System (INIS)

    Parulekar, Y.M.; Vaity, K.N.; Reddy, .R.; Vaze, K.K.; Kushwaha, H.S.

    2004-01-01

    The most commonly used method in the seismic analysis of structures is the response spectrum method. For the seismic re-evaluation of existing facilities, the elastic response spectrum method cannot be used directly, as large deformations beyond yield may be observed under the Safe Shutdown Earthquake (SSE). The plastic deformation, i.e. the hysteretic behaviour of the various elements of the structure, causes dissipation of energy; hence the damping values given by the code, which do not account for hysteretic energy dissipation, cannot be used directly. In this paper, appropriate damping values are evaluated for 5-storey, 10-storey and 15-storey shear beam structures which deform beyond their yield limit. Linear elastic analysis is performed for the same structures using these damping values, and the storey forces are compared with those obtained using inelastic time history analysis. A damping model which relates the ductility of the structure to damping is developed. Using this damping model, a practical structure is analysed, the results are compared with inelastic time history analysis, and the agreement is found to be good.
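
    One standard relation of the kind the paper develops, offered here only as a sketch and not as the paper's actual model, is the equivalent viscous damping of an elastic-perfectly-plastic oscillator, zeta_eq = zeta_0 + (2/pi)(1 - 1/mu), where mu is the displacement ductility:

```python
import math

def equivalent_damping(mu, zeta0=0.05):
    """Elastic damping plus the hysteretic term of an elastic-perfectly-plastic
    system: zeta_eq = zeta0 + (2/pi) * (1 - 1/mu), with ductility mu >= 1."""
    mu = max(float(mu), 1.0)
    return zeta0 + (2.0 / math.pi) * (1.0 - 1.0 / mu)

for mu in (1.0, 2.0, 4.0, 8.0):
    print(f"ductility {mu:.0f}: zeta_eq = {equivalent_damping(mu):.2f}")
```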

  5. Development of an Earthquake Impact Scale

    Science.gov (United States)

    Wald, D. J.; Marano, K. D.; Jaiswal, K. S.

    2009-12-01

    With the advent of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, domestic (U.S.) and international earthquake responders are reconsidering their automatic alert and activation levels as well as their response procedures. To help facilitate rapid and proportionate earthquake response, we propose and describe an Earthquake Impact Scale (EIS) founded on two alerting criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is more appropriate for most global events. Simple thresholds, derived from the systematic analysis of past earthquake impact and response levels, turn out to be quite effective in communicating predicted impact and response level of an event, characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses exceeding $1M, $10M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness dominate in countries where vernacular building practices typically lend themselves to high collapse and casualty rates, and it is these impacts that set prioritization for international response. In contrast, it is often financial and overall societal impacts that trigger the level of response in regions or countries where prevalent earthquake resistant construction practices greatly reduce building collapse and associated fatalities. Any newly devised alert protocols, whether financial or casualty based, must be intuitive and consistent with established lexicons and procedures. In this analysis, we make an attempt
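
    The thresholds quoted above translate directly into a small alert classifier. The sketch below hard-codes exactly the fatality and loss cut-offs from the abstract and, when both criteria are supplied, reports the more severe alert:

```python
def eis_alert(fatalities=None, loss_usd=None):
    """Map PAGER-style impact estimates to the proposed alert colours.

    Fatality thresholds: 1 / 100 / 1000 for yellow / orange / red;
    loss thresholds: $1M / $10M / $1B (taken from the abstract).
    """
    def level(value, cuts):
        for colour, cut in zip(("red", "orange", "yellow"), reversed(cuts)):
            if value >= cut:
                return colour
        return "green"

    levels = []
    if fatalities is not None:
        levels.append(level(fatalities, (1, 100, 1000)))
    if loss_usd is not None:
        levels.append(level(loss_usd, (1e6, 1e7, 1e9)))
    order = ("green", "yellow", "orange", "red")
    return max(levels, key=order.index) if levels else "green"

print(eis_alert(fatalities=250))                 # -> orange
print(eis_alert(loss_usd=5e9, fatalities=3))     # -> red (loss criterion dominates)
```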

  6. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
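
    The Kalman-filter fusion described above can be illustrated in one dimension: high-rate accelerometer samples drive the state prediction, and occasional noisy GPS positions correct it. All noise levels and rates in this sketch are assumptions for illustration:

```python
import numpy as np

# 1-D Kalman filter: state x = [position, velocity], acceleration as control input.
dt = 0.01                  # 100 Hz accelerometer (assumed)
q_acc = 0.5                # accelerometer noise std (m/s^2), assumed
r_gps = 1.0                # raw C/A-code GPS position noise std (m), assumed

F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition
B = np.array([0.5 * dt**2, dt])            # control (acceleration) input
H = np.array([[1.0, 0.0]])                 # GPS observes position only
Q = q_acc**2 * np.outer(B, B)              # process noise from accelerometer errors
R = np.array([[r_gps**2]])

x = np.zeros(2)
P = np.eye(2)

def kalman_step(x, P, accel, gps_pos=None):
    # Predict with the accelerometer sample.
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    # Correct whenever a (slower) GPS fix arrives.
    if gps_pos is not None:
        y = gps_pos - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + (K @ y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Minimal synthetic run: sinusoidal ground acceleration, 1 Hz GPS fixes.
rng = np.random.default_rng(4)
true_x = np.zeros(2)
for i in range(1000):
    a_true = 0.2 * np.sin(2 * np.pi * 0.5 * i * dt)
    true_x = F @ true_x + B * a_true
    a_meas = a_true + q_acc * rng.standard_normal()
    gps = true_x[0] + r_gps * rng.standard_normal() if i % 100 == 0 else None
    x, P = kalman_step(x, P, a_meas, gps)
print(f"true position {true_x[0]:.3f} m vs fused estimate {x[0]:.3f} m")
```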

  7. Fractals and Forecasting in Earthquakes and Finance

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

    2011-12-01

    It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article, titled "A Richter Scale for the Markets" [1] summarized the emerging viewpoint that stock market crashes can be described with similar ideas as large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics did not account for the excessive number of booms and busts that characterize such markets. Here we show that both earthquakes and financial crashes can both be described by a common Landau-Ginzburg-type free energy model, involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for the financial markets, it is financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)
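
    The Gutenberg-Richter relation log10 N(>=M) = a - b*M mentioned above has a standard maximum-likelihood estimator for the b-value (Aki, 1965). A minimal sketch on a synthetic catalogue:

```python
import numpy as np

LOG10_E = np.log10(np.e)

def b_value(mags, m_c, dm=0.0):
    """Aki (1965) maximum-likelihood b of the Gutenberg-Richter relation
    log10 N(>=M) = a - b*M; dm is the magnitude bin width (0 for continuous)."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= m_c]
    return LOG10_E / (m.mean() - (m_c - dm / 2.0))

# Synthetic catalogue: GR with b = 1 means magnitudes above the completeness
# threshold m_c are exponentially distributed with scale log10(e)/b.
rng = np.random.default_rng(5)
mags = 3.0 + rng.exponential(scale=LOG10_E / 1.0, size=5000)

print(f"estimated b = {b_value(mags, 3.0):.2f}")   # close to 1.0
```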

  8. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel 1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique, which is well suited to observing surface deformation. The database clusters information on the consequences of the earthquakes into groups such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformations, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of earthquakes of this type. In the future, we can enrich this database with more regions and broaden the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  9. Remote sensing and earthquake risk: A (re)insurance perspective

    Science.gov (United States)

    Smolka, Anselm; Siebert, Andreas

    2013-04-01

    The insurance sector is faced with two issues regarding earthquake risk: the estimation of rarely occurring losses from large events and the assessment of the average annual net loss. For this purpose, knowledge is needed of actual event losses, of the distribution of exposed values, and of their vulnerability to earthquakes. To what extent can remote sensing help the insurance industry fulfil these tasks, and what are its limitations? As a consequence of more regular and higher-resolution satellite coverage, earth observation and remote sensing methods have developed over the past years to a stage where they appear to offer great potential for addressing some shortcomings of the data underlying risk assessment, namely lack of statistical representativeness and lack of topicality. Here, remote sensing can help in the following areas:
    • Inventories of exposed objects (pre- and post-disaster)
    • Projection of small-scale ground-based vulnerability classification surveys to a full inventory
    • Post-event loss assessment
    But especially from an insurance point of view, challenges remain. The strength of airborne remote sensing techniques lies in outlining heavily damaged areas where damage is caused by easily discernible structural failure, i.e. total or partial building collapse. Examples are the Haiti earthquake (with minimal insured loss) and the tsunami-stricken areas of the Tohoku district of Japan. What counts for insurers, however, is the sum of monetary losses. The Chile, Christchurch and Tohoku earthquakes each caused insured losses in the two-digit billion dollar range. By far the greatest proportion of these insured losses was due to non-structural damage to buildings, machinery and equipment. Even in the Tohoku event, no more than 30% of the total material damage was caused by the tsunami according to preliminary surveys, and this figure includes damage due to earthquake shock that was unrecognisable after the passage of the tsunami.

  10. Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Hearne, Mike

    2009-01-01

    We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on grouping countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for the generation of automated earthquake alerts. These alerts could help rapid-earthquake-response agencies and governments respond more effectively and reduce earthquake fatalities. Fatality estimates are also useful for stimulating earthquake preparedness planning and disaster mitigation. The proposed model has several advantages compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
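
    The published form of this empirical model expresses the fatality rate at shaking intensity S as a two-parameter lognormal function, nu(S) = Phi(ln(S/theta)/beta), with Phi the standard normal cumulative distribution, and sums the rate over the exposed population in each intensity bin. A minimal sketch; the theta and beta values below are placeholders, not the calibrated country-specific coefficients:

      from math import log

      from scipy.stats import norm

      def fatality_rate(mmi, theta=13.2, beta=0.25):
          """Fatality rate nu(S) = Phi(ln(S/theta)/beta); placeholder parameters."""
          return norm.cdf(log(mmi / theta) / beta)

      def expected_fatalities(pop_by_mmi, theta=13.2, beta=0.25):
          """pop_by_mmi maps shaking intensity (MMI) to exposed population."""
          return sum(pop * fatality_rate(mmi, theta, beta)
                     for mmi, pop in pop_by_mmi.items())

      # e.g. expected_fatalities({6.5: 2e6, 7.5: 5e5, 8.5: 1e5})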

  11. Clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake.

    Science.gov (United States)

    Inatomi, Yuichiro; Nakajima, Makoto; Yonehara, Toshiro; Ando, Yukio

    2017-06-01

    To investigate the clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake, we retrospectively studied patients with seizure admitted to our hospital during the 12 weeks following the earthquake. We compared the clinical backgrounds and characteristics of the patients before the earthquake (the same period in each of the previous 3 years) and after it, and between the early (first 2 weeks) and late (subsequent 10 weeks) phases. A total of 60 patients with seizure were admitted to the emergency room after the earthquake, versus 175 (58.3/year) before the earthquake. Of these, 35 patients with seizure were hospitalized in the Department of Neurology after the earthquake, versus 96 (32/year) before it. After the earthquake, male patients and non-cerebrovascular epileptogenic diseases were seen more frequently than before. During the early phase after the earthquake, female, first-attack, and non-focal-type patients were seen more frequently than during the late phase. These characteristics of patients with seizure during the early phase after the earthquake suggest that many patients had non-epileptic seizures. To prevent seizures following earthquakes, the mental stress and physical status of evacuees must be assessed.

  12. Theory of Digital Automata

    CERN Document Server

    Borowik, Bohdan; Lahno, Valery; Petrov, Oleksandr

    2013-01-01

    This book serves a dual purpose: firstly, to combine the treatment of circuits and digital electronics, and secondly, to establish a strong connection with the contemporary world of digital systems. The need for this approach arises from the observation that introducing digital electronics through a course in traditional circuit analysis is fast becoming obsolete. Our world has gone digital. Automata theory helps with the design of digital circuits such as parts of computers, telephone systems and control systems. A complete perspective is emphasized, because even the most elegant computer architecture will not function without adequate supporting circuits. The focus is on explaining the real-world implementation of complete digital systems. In doing so, the reader is prepared to immediately begin design and implementation work. This work serves as a bridge to take readers from the theoretical world to the everyday design world where solutions must be complete to be successful.

  13. Digital radiology and ultrasound

    International Nuclear Information System (INIS)

    Todd-Pokropek, A.

    1991-01-01

    With access to digital methods for handling and processing images, many medical imaging modalities are now handled more effectively in digital form. This applies in particular to inherently digital techniques such as CT and MR, but now also includes Nuclear Medicine (NM), Ultrasound (US) and a variety of radiological procedures such as Digital Subtraction Angiography (DSA) and Digital Fluoroscopy (DF). Access to conventional projection images via photostimulable plates (CR) or the digitization of film makes all of radiology potentially accessible, and the management of such images over a network is the basic aim of Picture Archiving and Communication Systems (PACS). However, it is suggested that for such systems to be of greater value, the way in which such images are treated needs to change; that is, digital images can be used to derive additional clinical value by appropriate processing.

  14. Digital cine-imaging

    International Nuclear Information System (INIS)

    Masuda, Kazuhiro

    1992-01-01

    Digitization of fluoroscopic images has developed into the digital cine imaging system as a result of advances in computer technology and television technology and the popularization of interventional radiology. The present digital cine imaging system can offer images similar to cine film, with higher operability and better image quality, following the development of interventional radiology. Its usefulness for diagnostic catheter examinations beyond interventional radiology has also been reported, and the possibility of filmless cine is close to becoming a reality. However, several problems have been pointed out, such as spatial resolution, temporal resolution, storage and exchangeability of data, and unconsolidated viewing functions. The digital cine imaging system thus still has unresolved points that need to be discussed. Digitization is the trend of the times, and we have to promote the study of a more useful digital cine imaging system for patient-centered team medical care. (author)

  15. Automatic recognition of seismic intensity based on RS and GIS: a case study in Wenchuan Ms8.0 earthquake of China.

    Science.gov (United States)

    Zhang, Qiuwen; Zhang, Yan; Yang, Xiaohong; Su, Bin

    2014-01-01

    In recent years, earthquakes have occurred frequently all over the world, causing huge casualties and economic losses. It is necessary and urgent to obtain the seismic intensity map promptly so as to grasp the distribution of the disaster and support quick earthquake relief. Compared with traditional methods of drawing seismic intensity maps, which require extensive field investigation in the earthquake area or depend heavily on empirical formulas, spatial information technologies such as Remote Sensing (RS) and Geographical Information Systems (GIS) provide a fast and economical way to recognize seismic intensity automatically. Through the integrated application of RS and GIS, this paper proposes an RS/GIS-based approach for automatic recognition of seismic intensity, in which RS is used to retrieve and extract information on the damage caused by the earthquake, and GIS is applied to manage and display the seismic intensity data. The case study of the Wenchuan Ms8.0 earthquake in China shows that information on seismic intensity can be extracted automatically from remotely sensed images very soon after an earthquake occurs, and that the Digital Intensity Model (DIM) can be used to visually query and display the distribution of seismic intensity.
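
    A hypothetical sketch of the workflow the paper describes: per-cell damage information is extracted from pre-/post-event imagery, then binned into intensity classes for management and display in GIS. All thresholds, breaks, and names here are illustrative assumptions, not the paper's calibrated values:

      import numpy as np

      def damage_ratio(pre, post, change_threshold=0.3):
          """Fraction of changed pixels per cell; inputs are (cells, h, w) stacks."""
          change = np.abs(post.astype(float) - pre.astype(float)) / 255.0
          return (change > change_threshold).mean(axis=(1, 2))

      def intensity_class(ratio):
          """Bin damage ratio into intensity classes VI-X (illustrative breaks)."""
          breaks = [0.05, 0.15, 0.35, 0.60]
          return np.digitize(ratio, breaks) + 6   # integers 6..10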

  16. On a report that the 2012 M 6.0 earthquake in Italy was predicted after seeing an unusual cloud formation

    Science.gov (United States)

    Thomas, J.N.; Masci, F.; Love, Jeffrey J.

    2015-01-01

    Several recently published reports have suggested that semi-stationary linear-cloud formations might be causally precursory to earthquakes. We examine the report of Guangmeng and Jie (2013), who claim to have predicted the 2012 M 6.0 earthquake in the Po Valley of northern Italy after seeing a satellite photograph (a digital image) showing a linear-cloud formation over the eastern Apennine Mountains of central Italy. From inspection of 4 years of satellite images we find numerous examples of linear-cloud formations over Italy. A simple test shows no obvious statistical relationship between the occurrence of these cloud formations and earthquakes that occurred in and around Italy. All of the linear-cloud formations we have identified in satellite images, including that which Guangmeng and Jie (2013) claim to have used to predict the 2012 earthquake, appear to be orographic – formed by the interaction of moisture-laden wind flowing over mountains. Guangmeng and Jie (2013) have not clearly stated how linear-cloud formations can be used to predict the size, location, and time of an earthquake, and they have not published an account of all of their predictions (including any unsuccessful predictions). We are skeptical of the validity of the claim by Guangmeng and Jie (2013) that they have managed to predict any earthquakes.

  17. Historical tsunami earthquakes in the Southwest Pacific: an extension to Δ > 80° of the energy-to-moment parameter Θ

    Science.gov (United States)

    Okal, Emile A.; Saloor, Nooshin

    2017-08-01

    We extend to distances beyond 80° the computation of the energy-to-moment slowness parameter Θ introduced by Newman and Okal, by defining a regional empirical correction based on recordings at distant stations for events otherwise routinely studied. In turn, this procedure allows the study of earthquakes in a similar source-station geometry, but for which the only available data are located beyond the original distance threshold, notably in the case of historical earthquakes predating the development of dense networks of short-period seismometers. This methodology is applied to the twin 1947 earthquakes off the Hikurangi coast of New Zealand for which we confirm slowness parameters characteristic of tsunami earthquakes. In addition, we identify as such the large aftershock of 1934 July 21 in the Santa Cruz Islands, which took place in the immediate vicinity of the more recent 2013 shock, which also qualifies as a tsunami earthquake. In that subduction zone, the systematic compilation of Θ for both recent and pre-digital events shows a diversity in slowness correlating with local tectonic regimes controlled by the subduction of fossil structures. Our methodology is also well adapted to the case of analogue records of large earthquakes for which short-period seismograms at conventional distances are often off-scale.
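
    The slowness parameter of Newman and Okal is Theta = log10(E/M0), where E is the radiated seismic energy and M0 the seismic moment (both in SI units, so Theta is dimensionless). Ordinary events cluster near Theta = -4.9, while tsunami earthquakes are deficient by roughly one unit or more. A minimal sketch; the regional empirical distance correction introduced in the paper is not reproduced here:

      from math import log10

      def theta(energy_j, moment_nm):
          """Energy-to-moment slowness parameter Theta = log10(E / M0)."""
          return log10(energy_j / moment_nm)

      def looks_slow(energy_j, moment_nm, deficiency=1.0):
          """Flag a tsunami-earthquake candidate vs. the -4.9 global average."""
          return theta(energy_j, moment_nm) <= -4.9 - deficiency

      # e.g. theta(2.0e13, 4.0e20) -> about -7.3, strongly deficient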

  18. Automatic Recognition of Seismic Intensity Based on RS and GIS: A Case Study in Wenchuan Ms8.0 Earthquake of China

    Directory of Open Access Journals (Sweden)

    Qiuwen Zhang

    2014-01-01

    In recent years, earthquakes have occurred frequently all over the world, causing huge casualties and economic losses. It is necessary and urgent to obtain the seismic intensity map promptly so as to grasp the distribution of the disaster and support quick earthquake relief. Compared with traditional methods of drawing seismic intensity maps, which require extensive field investigation in the earthquake area or depend heavily on empirical formulas, spatial information technologies such as Remote Sensing (RS) and Geographical Information Systems (GIS) provide a fast and economical way to recognize seismic intensity automatically. Through the integrated application of RS and GIS, this paper proposes an RS/GIS-based approach for automatic recognition of seismic intensity, in which RS is used to retrieve and extract information on the damage caused by the earthquake, and GIS is applied to manage and display the seismic intensity data. The case study of the Wenchuan Ms8.0 earthquake in China shows that information on seismic intensity can be extracted automatically from remotely sensed images very soon after an earthquake occurs, and that the Digital Intensity Model (DIM) can be used to visually query and display the distribution of seismic intensity.

  19. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  20. Normal Fault Type Earthquakes Off Fukushima Region - Comparison of the 1938 Events and Recent Earthquakes -

    Science.gov (United States)

    Murotani, S.; Satake, K.

    2017-12-01

    Off the Fukushima region, Mjma 7.4 (event A) and 6.9 (event B) events occurred on November 6, 1938, following the thrust fault type earthquakes of Mjma 7.5 and 7.3 on the previous day. These earthquakes were estimated to be normal fault earthquakes by Abe (1977, Tectonophysics). An Mjma 7.0 earthquake occurred on July 12, 2014 near event B, and an Mjma 7.4 earthquake occurred on November 22, 2016 near event A. These recent events are the only M 7 class earthquakes to have occurred off Fukushima since 1938. Apart from the two 1938 events, no normal fault earthquakes occurred there until the many aftershocks of the 2011 Tohoku earthquake. We compared the observed tsunami and seismic waveforms of the 1938, 2014, and 2016 earthquakes to examine the normal fault earthquakes occurring off the Fukushima region. It is difficult to compare the tsunami waveforms of the 1938, 2014 and 2016 events because there were only a few observations at the same stations. The teleseismic body wave inversion of the 2016 earthquake yielded a focal mechanism with strike 42°, dip 35°, and rake -94°. Other source parameters were as follows: source area 70 km x 40 km, average slip 0.2 m, maximum slip 1.2 m, seismic moment 2.2 x 10^19 Nm, and Mw 6.8. A large slip area is located near the hypocenter, which is compatible with the tsunami source area estimated from tsunami travel times. The 2016 tsunami source area is smaller than that of the 1938 event, consistent with the difference in Mw: 7.7 for event A as estimated by Abe (1977) and 6.8 for the 2016 event. Although the 2014 epicenter is very close to that of event B, the teleseismic waveforms of the 2014 event are similar to those of event A and the 2016 event. While Abe (1977) assumed that the mechanism of event B was the same as that of event A, the initial motions at some stations are opposite, indicating that the focal mechanisms of events A and B are different and that more detailed examination is needed. The normal fault type earthquake seems to occur following the
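
    The quoted seismic moment and magnitude are mutually consistent under the standard moment magnitude relation Mw = (2/3)(log10 M0 - 9.1), with M0 in N m; a quick check:

      from math import log10

      def moment_magnitude(m0_nm):
          """Standard moment magnitude relation, M0 in newton metres."""
          return (2.0 / 3.0) * (log10(m0_nm) - 9.1)

      print(round(moment_magnitude(2.2e19), 1))   # -> 6.8, matching the abstract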