WorldWideScience

Sample records for earthquake accelerograms digitized

  1. Strong Motion Earthquake Data Values of Digitized Strong-Motion Accelerograms, 1933-1994

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Strong Motion Earthquake Data Values of Digitized Strong-Motion Accelerograms is a database of over 15,000 digitized and processed accelerograph records from...

  2. BASLIKO. A program for baseline-correction of earthquake-accelerograms

    International Nuclear Information System (INIS)

    Koschmieder, D.; Altes, J.

    1978-12-01

    In the following report a program for baseline correction of earthquake accelerograms is presented. The program eliminates errors introduced into the curves by the chronographs and digitizers. (orig.)

  3. Phase characteristics of earthquake accelerogram and its application

    International Nuclear Information System (INIS)

    Ohsaki, Y.; Iwasaki, R.; Ohkawa, I.; Masao, T.

    1979-01-01

    As the input earthquake motion for the seismic design of nuclear power plant structures and equipment, an artificial time history compatible with a smoothed design response spectrum is frequently used. This paper deals with a wave generation technique based on the phase characteristics of earthquake accelerograms, as an alternative to an envelope time function. The concept of a 'phase-difference distribution' is defined to represent the phase characteristics of earthquake motion. The procedure proposed in this paper consists of the following steps: (1) Specify a design response spectrum and derive a corresponding initial modal amplitude. (2) Determine a phase-difference distribution corresponding to an envelope function, the shape of which depends on the magnitude and epicentral distance of an earthquake. (3) Derive the phase angles at all modal frequencies from the phase-difference distribution. (4) Generate a time history by inverse Fourier transform on the basis of the amplitudes and the phase angles thus determined. (5) Calculate the response spectrum. (6) Compare the specified and calculated response spectra, and correct the amplitude at each frequency so that the response spectrum becomes consistent with the specified one. (7) Repeat steps 4 through 6 until the specified and calculated response spectra agree with sufficient accuracy. (orig.)
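The seven steps above lend themselves to a compact numerical sketch. The fragment below (an illustration only, not the authors' code) carries out steps (2)-(4): phase angles are accumulated from an assumed phase-differences distribution and combined with modal amplitudes in an inverse FFT. The iterative spectral correction of steps (5)-(7) is omitted, and all names and the beta-shaped distribution are hypothetical.

```python
import numpy as np

def generate_history(amplitudes, phase_diffs):
    """Steps (2)-(4): turn modal amplitudes and a phase-differences
    distribution into a time history via inverse FFT. (Sketch only;
    the iterative correction of steps (5)-(7) is not included.)"""
    # Step (3): phase angles are the running sum of the phase differences.
    phases = np.cumsum(phase_diffs)
    # Step (4): build a one-sided complex spectrum and invert it.
    n = len(amplitudes)
    spectrum = amplitudes * np.exp(1j * phases)
    return np.fft.irfft(spectrum, n=2 * (n - 1))

# Hypothetical inputs: flat amplitudes, phase differences drawn so that
# their histogram mimics a rising-then-decaying envelope (step (2)).
rng = np.random.default_rng(0)
amps = np.ones(513)
dphi = -2 * np.pi * rng.beta(2.0, 5.0, size=513)  # skewed distribution
acc = generate_history(amps, dphi)
```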

  4. Coherency analysis of accelerograms recorded by the UPSAR array during the 2004 Parkfield earthquake

    DEFF Research Database (Denmark)

    Konakli, Katerina; Kiureghian, Armen Der; Dreger, Douglas

    2014-01-01

    Spatial variability of near-fault strong motions recorded by the US Geological Survey Parkfield Seismograph Array (UPSAR) during the 2004 Parkfield (California) earthquake is investigated. Behavior of the lagged coherency for two horizontal and the vertical components is analyzed by separately...

  5. The near-source strong-motion accelerograms recorded by an experimental array in Tangshan, China

    Science.gov (United States)

    Peng, K.; Xie, Lingtian; Li, S.; Boore, D.M.; Iwan, W.D.; Teng, T.L.

    1985-01-01

    A joint research project on strong-motion earthquake studies between the People's Republic of China and the United States is in progress. As a part of this project, an experimental strong-motion array, consisting of twelve Kinemetrics PDR-1 Digital Event Recorders, was deployed in the meizoseismal area of the Ms = 7.8 Tangshan earthquake of July 28, 1976. These instruments have automatic gain ranging, a specified dynamic range of 102 dB, a 2.5 s pre-event memory, and programmable triggering, and are equipped with TCG-1B Time Code Generators with a stability of 3 parts in 10^7 over a range of 0-50 °C. In two years of operation beginning July 1982, a total of 603 near-source 3-component accelerograms were gathered from 243 earthquakes of magnitude M_L = 1.2-5.3. Most of these accelerograms recorded the initial P-wave. The configuration of the experimental array and a representative set of near-source strong-motion accelerograms are presented in this paper. The accelerograms exhibited were obtained during the M_L = 5.3 Lulong earthquake of October 19, 1982, when the digital event recorders were triggered. The epicentral distances ranged from 4 to 41 km and the corresponding peak horizontal accelerations ranged from 0.232g to 0.009g. A preliminary analysis of the data indicates that, compared to motions in the western United States, peak acceleration attenuates much more rapidly in the Tangshan area. The scaling of peak acceleration with magnitude, however, is similar in the two regions. Data at more distant sites are needed to confirm the more rapid attenuation.

  6. Proposed guidelines for synthetic accelerogram generation methods

    International Nuclear Information System (INIS)

    Shaw, D.E.; Rizzo, P.C.; Shukla, D.K.

    1975-01-01

    With the advent of high-speed digital computation and discrete structural analysis techniques, it has become attractive to use synthetically generated accelerograms as input in the seismic design and analysis of structures. Several procedures are currently available which can generate accelerograms that match a given design response spectrum while paying little attention to other properties of seismic accelerograms. This paper studies currently available artificial time-history generation techniques from the standpoint of various properties of seismic time histories: (1) response spectra; (2) peak ground acceleration; (3) total duration; (4) time-dependent enveloping functions defining the rise time to strong motion, the duration of significant shaking, and the decay of the significant-shaking portion of the seismic record; (5) Fourier amplitude and phase spectra; (6) ground motion parameters; (7) apparent frequency. The aim is to provide guidelines for the time-history parameters based on historic strong-motion seismic records. (Auth.)

  7. Spectral Shapes for accelerograms recorded at rock sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Muralidharan, N.; Sharma, R.D.

    1986-01-01

    Earthquake accelerograms recorded on rock sites have been analysed to develop site-specific response spectra for use in aseismic design. Normalized pseudo-absolute acceleration spectra for various values of damping, pertinent in particular to nuclear power plant design, are presented. Various ground motion parameters, viz. peak displacement, velocity and acceleration (including v/a, ad/v² and the ratios of the three orthogonal components), are examined for fifty-four accelerograms through motion time histories to be used in structural response analysis. The analysis presented in this paper aims at specifying site-specific response spectra for the earthquake-resistant design of structures and the generation of spectrum-compatible accelerograms. The salient features of the data set are discussed. (author)

  8. Synthesis of artificial spectrum-compatible seismic accelerograms

    International Nuclear Information System (INIS)

    Vrochidou, E; Alvanitopoulos, P F; Andreadis, I; Mallousi, K; Elenas, A

    2014-01-01

    The Hilbert–Huang transform is used to generate artificial seismic signals compatible with the acceleration spectra of natural seismic records. Artificial spectrum-compatible accelerograms are utilized instead of natural earthquake records for the dynamic response analysis of many critical structures such as hospitals, bridges, and power plants. The realistic estimation of the seismic response of structures involves nonlinear dynamic analysis. Moreover, it requires seismic accelerograms representative of the actual ground acceleration time histories expected at the site of interest. Unfortunately, not many actual records of different seismic intensities are available for many regions. In addition, a large number of seismic accelerograms are required to perform a series of nonlinear dynamic analyses for a reliable statistical investigation of the structural behavior under earthquake excitation. These are the main motivations for generating artificial spectrum-compatible seismic accelerograms and could be useful in earthquake engineering for dynamic analysis and design of buildings. According to the proposed method, a single natural earthquake record is deconstructed into amplitude and frequency components using the Hilbert–Huang transform. The proposed method is illustrated by studying 20 natural seismic records with different characteristics such as different frequency content, amplitude, and duration. Experimental results reveal the efficiency of the proposed method in comparison with well-established and industrial methods in the literature. (paper)

  9. A new technique for generating spectrum compatible accelerogram

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Muralidharan, N.

    1985-01-01

    A new technique for generating spectrum compatible earthquake accelerogram is presented. Simplified linearised schemes are used to determine the weights of the modulated sinewaves used to represent the ground acceleration in conformity with the instants of time of attaining the maximum responses of the SDOFs. Some typical numerical results are presented in the paper. (orig.)

  10. On the Relationships Between the Fundamental Parameters of Calculation Accelerograms

    Energy Technology Data Exchange (ETDEWEB)

    Savich, A. I., E-mail: office@geodyn.ru; Burdina, N. A., E-mail: nina-burdina@mail.ru [Center of the Office of Geodynamic Observations in the Power Sector, an affiliate of JSC “Institut Gidroproekt,” (Russian Federation)

    2016-05-15

    Analysis of published data on the fundamental parameters of actual accelerograms of strong earthquakes having peak ground acceleration A_max, predominant period T_pr, and duration τ_0.5 at 0.5·A_max determined that, for earthquakes of intensity greater than 6.5-7.0, the relationship between these quantities is sufficiently well described by the parameters B = A·T·τ and C = A·τ·T^(-1.338), the former of which depends little on earthquake intensity I and is almost completely determined by the earthquake magnitude, while the latter, on the contrary, depends weakly on magnitude and is determined principally by the quantity I. Methods are proposed for using the parameters B and C to improve the reliability of determining parameters of accelerograms used to calculate the seismic resistance of hydraulic engineering facilities.
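Taken at face value, the two parameters are simple algebraic combinations of the three accelerogram quantities. A minimal check with hypothetical values (only the exponent -1.338 comes from the abstract; the numbers below are invented for illustration):

```python
# Hypothetical record: A_max in g, T_pr and tau_0.5 in seconds.
A_max, T_pr, tau = 0.35, 0.40, 12.0

B = A_max * T_pr * tau             # B = A*T*tau, magnitude-dependent
C = A_max * tau * T_pr ** -1.338   # C = A*tau*T^(-1.338), intensity-dependent
```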

  11. Spectral shapes for accelerograms recorded at soil sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Sharma, R.D.

    1987-01-01

    Earthquake accelerograms recorded on soil sites have been analysed to develop site-specific response spectra. This report presents the normalised pseudo-absolute acceleration spectra for various values of damping and ground motion parameters, viz. v/a, ad/v² and the ratios of peak accelerations in the three orthogonal directions. These results will be useful in the earthquake-resistant design of structures. 4 tables, 14 figures. (author)

  12. Compilation, assessment and expansion of the strong earthquake ground motion data base. Seismic Safety Margins Research Program (SSMRP)

    International Nuclear Information System (INIS)

    Crouse, C.B.; Hileman, J.A.; Turner, B.E.; Martin, G.R.

    1980-09-01

    A catalog has been prepared which contains information for: (1) world-wide, ground-motion accelerograms (2) the accelerograph sites where these records were obtained, and (3) the seismological parameters of the causative earthquakes. The catalog is limited to data for those accelerograms which have been digitized and published. In addition, the quality and completeness of these data are assessed. This catalog is unique because it is the only publication which contains comprehensive information on the recording conditions of all known digitized accelerograms. However, information for many accelerograms is missing. Although some literature may have been overlooked, most of the missing data has not been published. Nevertheless, the catalog provides a convenient reference and useful tool for earthquake engineering research and applications. (author)

  13. Seismic Safety Margins Research Program, Phase I. Project II: seismic input. Compilation, assessment and expansion of the strong earthquake ground motion data base

    Energy Technology Data Exchange (ETDEWEB)

    Crouse, C B; Hileman, J A; Turner, B E; Martin, G R

    1980-04-01

    A catalog has been prepared which contains information for: (1) world-wide, ground-motion accelerograms, (2) the accelerograph sites where these records were obtained, and (3) the seismological parameters of the causative earthquakes. The catalog is limited to data for those accelerograms which have been digitized and published. In addition, the quality and completeness of these data are assessed. This catalog is unique because it is the only publication which contains comprehensive information on the recording conditions of all known digitized accelerograms. However, information for many accelerograms is missing. Although some literature may have been overlooked, most of the missing data has not been published. Nevertheless, the catalog provides a convenient reference and useful tool for earthquake engineering research and applications.

  14. Studies on Fourier amplitude spectra of accelerograms recorded on rock sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Rao, K.S.

    1990-01-01

    Fourier spectra of 54 earthquake accelerograms recorded on rock sites in the U.S.A. have been analysed. These could be used in the generation of synthetic accelerograms for seismic design. (author). 19 figs., 1 tab., 1 appendix, 19 refs.

  15. On Drift Effects in Velocity and Displacement of Greek Uncorrected Digital Strong Motion Data

    Science.gov (United States)

    Skarlatoudis, A.; Margaris, B.

    2005-12-01

    Fifty years after the first installation of analog accelerographs, digital strong-motion instruments came into operation. Their advantages compared to the analog ones are obvious and have been described in detail in several works. Nevertheless, it has been pointed out that velocity and displacement values derived from several accelerograms recorded by digital instruments in various strong earthquakes worldwide (e.g. 1999 Chi-Chi, Taiwan; Hector Mine; 2002 Denali) are plagued by drifts when only a simple baseline correction, derived from the pre-event portion of the record, is removed. In Greece a significant number of accelerographic networks and arrays have been deployed, covering the whole area, and digital accelerographs now constitute a significant part of the country's National Strong Motion network. Detailed analyses of the processing of accelerograms recorded by digital instruments showed that the same drifts exist in the Greek strong-motion database. In this work, a methodology proposed and described in various articles (Boore, 2001; 2003; 2005) for removing the aforementioned drifts is applied. A careful look at the nature of the drifts is also attempted, in order to understand the noise characteristics relative to the signal. The intrinsic behaviour of the signal-to-noise ratio is crucial for the adequacy of baseline corrections applied to uncorrected digital accelerograms. Velocities and displacements of the uncorrected and corrected accelerograms are compared, and the drift effects in the Fourier and response spectra are presented.
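A baseline correction "derived from the pre-event portion of the record", as described above, can be sketched in a few lines: subtract the pre-event mean, then integrate. This illustrates only the simple correction whose residual drifts the abstract discusses, not the Boore methodology itself (which fits piecewise baselines to the velocity trace); all names and values are hypothetical.

```python
import numpy as np

def baseline_correct(acc, dt, pre_event_samples):
    """Remove the mean of the pre-event window and integrate to
    velocity and displacement (crude rectangle-rule integration)."""
    acc = np.asarray(acc, float) - np.mean(acc[:pre_event_samples])
    vel = np.cumsum(acc) * dt
    disp = np.cumsum(vel) * dt
    return acc, vel, disp

# Hypothetical record: a constant instrument offset plus a short pulse.
dt = 0.01
acc = np.full(1000, 0.02)   # offset present through the whole record
acc[300:320] += 1.0         # the "signal"
a, v, d = baseline_correct(acc, dt, pre_event_samples=200)
```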

  16. Development of spectral shapes and attenuation relations from accelerograms recorded on rock and soil sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Rao, K.S.; Kushwaha, H.S.

    1998-06-01

    Earthquake accelerograms recorded on rock and soil sites have been analysed. Site-specific response spectra and peak ground acceleration attenuation relations have been developed. This report presents the normalised pseudo-absolute acceleration spectra for various values of damping and for various confidence levels. Scaling laws have been developed for the response spectra. The present results are based on a large database and comparison has been made with earlier results. These results will be useful in the earthquake resistant design of structures. (author)

  17. Development of spectral shapes and attenuation relations from accelerograms recorded on rock and soil sites

    Energy Technology Data Exchange (ETDEWEB)

    Ghosh, A K; Rao, K S; Kushwaha, H S [Reactor Safety Div., Bhabha Atomic Research Centre, Mumbai (India)

    1998-06-01

    Earthquake accelerograms recorded on rock and soil sites have been analysed. Site-specific response spectra and peak ground acceleration attenuation relations have been developed. This report presents the normalised pseudo-absolute acceleration spectra for various values of damping and for various confidence levels. Scaling laws have been developed for the response spectra. The present results are based on a large database and comparison has been made with earlier results. These results will be useful in the earthquake resistant design of structures. (author) 22 refs., 7 figs., 5 tabs.

  18. Generation of artificial accelerograms using neural networks for data of Iran

    International Nuclear Information System (INIS)

    Bargi, Kh.; Loux, C.; Rohani, H.

    2002-01-01

    A new method for the generation of artificial earthquake accelerograms from response spectra using neural networks was proposed by Ghaboussi and Lin in 1997. In this paper the methodology has been extended and enhanced for data of Iran. For this purpose, 40 Iranian acceleration records were first chosen; then an RBF neural network, called a generalized regression neural network, learns the inverse mapping directly from the response spectrum to the Discrete Cosine Transform of the accelerograms. The Discrete Cosine Transform has been used as an assisting device to extract the frequency-domain content. Training of the network is tractable: a generalized regression neural network learns the mapping in a few seconds. Outputs are presented to demonstrate the performance of this method and show its capabilities.
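A generalized regression neural network is essentially Nadaraya-Watson kernel regression: the prediction is a distance-weighted average of the training targets, which is why "training" takes only seconds. A minimal sketch with toy data (real inputs would be response-spectrum ordinates paired with DCT coefficients; all names and values here are hypothetical):

```python
import numpy as np

def grnn_predict(X_train, Y_train, x, sigma):
    """GRNN prediction: Gaussian-kernel-weighted average of the
    training outputs, with bandwidth sigma."""
    d2 = np.sum((X_train - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return w @ Y_train / w.sum()

# Toy stand-ins for (response spectrum -> DCT coefficients) pairs.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = np.array([[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]])
# With a small bandwidth, a query at a training point recovers
# (approximately) that point's target.
pred = grnn_predict(X, Y, np.array([1.0, 0.0]), sigma=0.05)
```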

  19. SISMA (Site of Italian Strong Motion Accelerograms): a Web-Database of Ground Motion Recordings for Engineering Applications

    International Nuclear Information System (INIS)

    Scasserra, Giuseppe; Lanzo, Giuseppe; D'Elia, Beniamino; Stewart, Jonathan P.

    2008-01-01

    The paper describes a new website called SISMA, i.e. Site of Italian Strong Motion Accelerograms, an Internet portal intended to provide natural records for use in engineering applications for dynamic analyses of structural and geotechnical systems. SISMA contains 247 three-component corrected motions recorded at 101 stations from 89 earthquakes that occurred in Italy in the period 1972-2002. The database of strong motion accelerograms was developed in the framework of a joint project between Sapienza University of Rome and the University of California at Los Angeles (USA) and is described elsewhere. Acceleration histories and pseudo-acceleration response spectra (5% damping) are available for download from the website. Recordings can be located using simple search parameters related to the seismic source and the recording station (e.g., magnitude, V_s30, etc.) as well as ground motion characteristics (e.g., peak ground acceleration, peak ground velocity, peak ground displacement, Arias intensity, etc.).

  20. Digital radiography of crush thoracic trauma in the Sichuan earthquake

    Science.gov (United States)

    Dong, Zhi-Hui; Shao, Heng; Chen, Tian-Wu; Chu, Zhi-Gang; Deng, Wen; Tang, Si-Shi; Chen, Jing; Yang, Zhi-Gang

    2011-01-01

    AIM: To investigate the features of crush thoracic trauma in Sichuan earthquake victims using chest digital radiography (CDR). METHODS: We retrospectively reviewed 772 CDR of 417 females and 355 males who had suffered crush thoracic trauma in the Sichuan earthquake. Patient age ranged from 0.5 to 103 years. CDR was performed between May 12, 2008 and June 7, 2008. We looked for injury to the thoracic cage, pulmonary parenchyma and the pleura. RESULTS: Antero-posterior (AP) and lateral CDR were obtained in 349 patients, the remaining 423 patients underwent only AP CDR. Thoracic cage fractures, pulmonary contusion and pleural injuries were noted in 331 (42.9%; 95% CI: 39.4%-46.4%), 67 and 135 patients, respectively. Of the 256 patients with rib fractures, the mean number of fractured ribs per patient was 3. Rib fractures were mostly distributed from the 3rd through to the 8th ribs and the vast majority involved posterior and lateral locations along the rib. Rib fractures had a significant positive association with non-rib thoracic fractures, pulmonary contusion and pleural injuries (P < 0.001). The number of rib fractures and pulmonary contusions were significant factors associated with patient death. CONCLUSION: Earthquake-related crush thoracic trauma has the potential for multiple fractures. The high number of fractured ribs and pulmonary contusions were significant factors which needed appropriate medical treatment. PMID:22132298

  1. Comparative analysis of accelerogram processing methods

    International Nuclear Information System (INIS)

    Goula, X.; Mohammadioun, B.

    1986-01-01

    The work described hereinafter is a short account of an on-going research project concerning high-quality processing of strong-motion recordings of earthquakes. Several processing procedures have been tested, applied to synthetic signals simulating ground motion designed for this purpose. The correction methods operating in the time domain are seen to be strongly dependent upon the sampling rate. Two methods of low-frequency filtering followed by an integration of the accelerations yielded satisfactory results.

  2. Ground amplification determined from borehole accelerograms

    International Nuclear Information System (INIS)

    Archuleta, R.J.; Seale, S.H.

    1991-01-01

    The Garner Valley downhole array (GVDA) consists of one surface accelerometer and four downhole accelerometers at depths of 6 m, 15 m, 22 m, and 220 m. The five three-component, dual-gain accelerometers form a vertical array capable of measuring accelerations from 3 × 10^-6 g to 2.0 g over a frequency range from 0.0 Hz (0.025 Hz, high-gain) to 100 Hz. The site (33° 41.60′ N, 116° 40.20′ W) is only seven kilometers off the trace of the San Jacinto fault, the most active strand of the San Andreas fault system in southern California, and only about 35 km from the San Andreas fault itself. Analysis of individual spectra and spectral ratios for the various depths shows that the zone of weathered granite has a pronounced effect on the spectral amplitudes for frequencies greater than 40 Hz. The soil layer impedance may amplify the high frequencies more than it attenuates them. This result must be checked more thoroughly, with special consideration of the spectra of the P-wave coda on the horizontal components. Analysis of the P-wave spectra and the spectral ratios shows an increased amplification in the same frequency range (60-90 Hz) where the S-wave spectral ratios imply a change in the attenuation. Comparison of acceleration spectra from two earthquakes, M_L 4.2 and M_L 2.5, that have nearly the same hypocenter shows that the near-surface amplification and attenuation are nearly the same for both earthquakes. However, the earthquakes themselves are different, if we can assume that the recording at 220 m reflects the source spectra with slight attenuation: the M_L 2.5 earthquake has significantly greater high-frequency content if the spectra are normalized at low frequency, i.e., normalized by seismic moment.
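The spectral ratios discussed above are amplitude-spectrum ratios between a surface record and a downhole record, the basic measure of site amplification. A minimal sketch (smoothing, windowing and instrument correction omitted; all names and signals are hypothetical):

```python
import numpy as np

def spectral_ratio(surface, borehole, dt):
    """Ratio of Fourier amplitude spectra, surface over borehole."""
    freqs = np.fft.rfftfreq(len(surface), dt)
    ratio = np.abs(np.fft.rfft(surface)) / np.abs(np.fft.rfft(borehole))
    return freqs, ratio

# Hypothetical signals: the "surface" record is the "borehole" record
# amplified uniformly by a factor of 3, so the ratio is flat at 3.
rng = np.random.default_rng(1)
bh = rng.standard_normal(2048)
sf = 3.0 * bh
f, r = spectral_ratio(sf, bh, dt=0.005)
```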

  3. Computing broadband accelerograms using kinematic rupture modeling

    International Nuclear Information System (INIS)

    Ruiz Paredes, J.A.

    2007-05-01

    In order to make broadband kinematic rupture modeling more realistic with respect to dynamic modeling, physical constraints are added to the rupture parameters. To improve the slip velocity function (SVF) modeling, an evolution of the k^-2 source model is proposed, which consists of decomposing the slip as a sum of sub-events by bands of k. This model yields SVFs close to the solution proposed by Kostrov for a crack, while preserving the spectral characteristics of the radiated wave field, i.e. an ω² model with spectral amplitudes at high frequency scaled to the coefficient of directivity C_d. To better control the directivity effects, a composite source description is combined with a scaling law defining the extent of the nucleation area for each sub-event. The resulting model allows the apparent coefficient of directivity to be reduced to a fraction of C_d, as well as reproducing the standard deviation of the new empirical attenuation relationships proposed for Japan. To make source models more realistic, a variable rupture velocity in agreement with the physics of rupture must be considered. The approach followed, based on an analytical relation between the fracture energy, the slip and the rupture velocity, leads to higher values of the peak ground acceleration in the vicinity of the fault. Finally, to better account for the interaction of the wave field with the geological medium, a semi-empirical methodology is developed combining a composite source model with empirical Green functions, and is applied to the Yamaguchi M_w 5.9 earthquake. The modeled synthetics reproduce satisfactorily the main observed characteristics of the ground motions. (author)

  4. How many accelerograms to use and how to deal with scattering for transient non-linear seismic computations?

    International Nuclear Information System (INIS)

    Viallet, E.; Heinfling, G.

    2005-01-01

    once again but using a statistical approach in order to get a conservative evaluation of the value of the non-linear parameter with a reduced number of calculations. The Student statistical estimator is used for this purpose. The previous results are then used to estimate the average value of the characteristic non-linear parameter with a confidence level of 95%. The evolution of the estimate is then compared, depending on (i) the sets of accelerograms used and (ii) the number of accelerograms used. These results show that a conservative estimate of the average value of the non-linearity can be determined. This value remains conservative (with a confidence level of 95%) and allows a relatively low number of accelerograms to be used (typically 5 to 10). This method leaves relative freedom to the designer for engineering purposes. This study will be extended in the future to other earthquake characteristics and other types of non-linearities (uplift, sliding and rocking of rigid bodies ...). (authors)
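The Student estimator mentioned above amounts to an upper confidence bound on the mean of the non-linear response parameter, computed from a small sample of time-history analyses. A hedged sketch (the demand values and the hardcoded critical value are illustrative assumptions; in practice scipy.stats.t.ppf would supply the critical value):

```python
import math
import statistics

def upper_confidence_mean(samples, t_crit):
    """One-sided upper confidence bound on the sample mean:
    mean + t_crit * s / sqrt(n), with s the sample std deviation."""
    n = len(samples)
    m = statistics.mean(samples)
    s = statistics.stdev(samples)
    return m + t_crit * s / math.sqrt(n)

# Hypothetical non-linear demands from 5 accelerograms; 2.132 is the
# one-sided 95 % Student-t critical value for 4 degrees of freedom.
demands = [2.1, 2.6, 1.9, 2.4, 2.2]
bound = upper_confidence_mean(demands, t_crit=2.132)
```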

  5. Generation of spectrum compatible accelerograms for seismic analysis of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Selvaraj, T.; Chellapandi, P.; Chetal, S.C.

    2003-01-01

    For the seismic design of nuclear power plants, time histories of earthquake ground motion are required primarily to generate time histories at the various floors of the nuclear island as well as at the component support locations. From such time histories, floor response spectra (FRS) can be generated. The basic input is specified as site-dependent response spectra (SDRS), from which a set of uncorrelated time histories is generated whose own response spectra match the design response spectra. These time histories have a great impact on the structural design and economy. For Kalpakkam, the site for PFBR, the seismic input is defined in terms of SDRS for various damping values, and their shapes have already been arrived at. Synthetic accelerograms have been generated such that the time-history-generated response spectrum (THRS) closely matches the SDRS for 5% of critical damping. The time histories have been developed using CASTEM 2000, a multi-purpose FE code. This paper deals with the generation methodology and its compliance with ASCE 4-98. (author)

  6. Design spectrums based on earthquakes recorded at tarbela

    International Nuclear Information System (INIS)

    Rizwan, M.; Ilyas, M.; Masood, A.

    2008-01-01

    The first seismological network in Pakistan was set up in early 1969 at Tarbela, the location of the largest water reservoir of the country. The network consisted of analog accelerographs and seismographs. Since the installation, many seismic events of different magnitudes have occurred and been recorded by the installed instruments. The analog records have been digitized, and data from twelve earthquakes, irrespective of soil type, have been used to derive elastic design spectra for Tarbela, Pakistan. The PGA scaling factors for each component, based on the risk analysis studies carried out for the region, are also given. The suggested design spectra will be very useful for new construction in the region and its surroundings, and the digitized time histories will be useful for seismic response analysis of structures and seismic risk analysis of the region. (author)

  7. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  8. An algorithm of local earthquake detection from digital records

    Directory of Open Access Journals (Sweden)

    A. PROZOROV

    1978-06-01

    The problem of automatic detection of earthquake signals in seismograms and definition of the first arrivals of P and S waves is considered. The algorithm is based on the analysis of the t(A) function, which represents the time of first appearance of a number of successive swings of amplitude greater than A in the seismic signal. It exploits such common features of earthquake seismograms as a sudden first P-arrival of amplitude greater than the general noise amplitude and, after a definite interval of time, an S-arrival whose amplitude exceeds that of the P-arrival. The method was applied to 3-channel records of Friuli aftershocks. S-arrivals were defined correctly in all cases; P-arrivals were defined in most cases using strict detection criteria, and no false signals were detected. All P-arrivals were defined using soft detection criteria, but with less reliability, and two false events were obtained.
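The t(A) function described above can be approximated by a simple scan for the first run of k successive swings exceeding amplitude A. The sketch below is a loose interpretation of the abstract, not the original algorithm; names and thresholds are hypothetical.

```python
def t_of_A(signal, A, k, dt):
    """Return the time of the first of k successive samples whose
    amplitude exceeds A, or None if no such run exists. (Simplified:
    the original counts oscillation swings, not raw samples.)"""
    count = 0
    for i, x in enumerate(signal):
        if abs(x) > A:
            count += 1
            if count == k:
                return (i - k + 1) * dt
        else:
            count = 0
    return None

# Hypothetical trace: low noise, then a sustained arrival at sample 50.
trace = [0.1, -0.1] * 25 + [1.0, -1.2, 0.9, -1.1] * 10
onset = t_of_A(trace, A=0.5, k=3, dt=0.01)
```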

  9. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes; not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding of the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times; field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities; and numerical modelling of earthquake sources and the resultant wave propagation to produce accelerograms consistent with the above historical records and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  10. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some still-existing buildings that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is done in the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended to use in the future more suitable instrumental parameters, such as the destructiveness potential factor, to describe earthquake demand.

  11. Did we really #prayfornepal? Instagram posts as a massive digital funeral in Nepal earthquake aftermath

    Science.gov (United States)

    Kamil, P. I.; Pratama, A. J.; Hidayatulloh, A.

    2016-05-01

    Social media has been part of our daily life for years, and it has become a treasure trove of data for social scientists to mine. Using our own data mining engine we downloaded 1500 Instagram posts related to the Nepal earthquake in April 2015, a disaster which caused tremendous losses in human lives and infrastructure. We predicted that social media would be a place where people respond and express themselves emotionally in light of a disaster of such massive scale, a "megadeath" event. We ended up with data on 1017 posts tracked with the hashtag #prayfornepal, consisting of each post's date, time, geolocation, image, post ID, username and user ID, caption, and hashtags. We categorized the posts into 7 categories and found that most of the photos (30.29%) are related to Nepal but not directly related to the disaster, which reflects death imprint, one of the psychosocial responses after a megadeath event. Other analyses were done to compare the photo categories, including geolocation, hashtag network and caption network analyses, visualized with ArcGIS, NodeXL, Gephi, and our own word cloud engine, to examine other digital reactions to the Nepal earthquake on Instagram. This study gives an overview of how a community reacts to a disaster in the digital world, which can be utilized for disaster response and awareness.

  12. VOLUNTARY ACTIVITIES AND ONLINE EDUCATION FOR DIGITAL HERITAGE INVENTORY DEVELOPMENT AFTER THE GREAT EAST JAPAN EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    Y. Kondo

    2013-07-01

    Full Text Available Consortium for Earthquake-Damaged Cultural Heritage (CEDACH is a voluntary initiative launched just after the Great East Japan Earthquake on 11 March 2011. The consortium is developing a social network between local cultural resource managers restoring disaster-damaged cultural heritage on one side and remote researchers including historians, archaeologists and specialists of cultural information studies on the other side, in order to facilitate collaborative projects. This paper presents three projects in which CEDACH contributed to the development of a digital inventory for disaster-damaged heritage management through web-based collaborations by self-motivated workers. The first project, CEDACH GIS, developed an online archaeological site inventory for the disaster area. Although a number of individuals voluntarily participated in the project at the beginning, it gradually stagnated due to limited need for local rescue archaeology. However, the experience of online-based collaborations worked well for the second project proposed by local specialists, in which CEDACH restored the book catalogue of a tsunami-devastated research library. This experience highlighted the need for online education to improve information and communication technologies (ICT skills of data builders. Therefore, in the third project called CEDACHeLi, an e-Learning management system was developed to facilitate learning the fundamental knowledge and techniques required for information processing in rescue operations of disaster-damaged cultural heritage. This system will contribute to improved skills and motivation of potential workers for further developments in digital heritage inventory.

  13. Voluntary Activities and Online Education for Digital Heritage Inventory Development after the Great East Japan Earthquake

    Science.gov (United States)

    Kondo, Y.; Uozu, T.; Seino, Y.; Ako, T.; Goda, Y.; Fujimoto, Y.; Yamaguchi, H.

    2013-07-01

    Consortium for Earthquake-Damaged Cultural Heritage (CEDACH) is a voluntary initiative launched just after the Great East Japan Earthquake on 11 March 2011. The consortium is developing a social network between local cultural resource managers restoring disaster-damaged cultural heritage on one side and remote researchers including historians, archaeologists and specialists of cultural information studies on the other side, in order to facilitate collaborative projects. This paper presents three projects in which CEDACH contributed to the development of a digital inventory for disaster-damaged heritage management through web-based collaborations by self-motivated workers. The first project, CEDACH GIS, developed an online archaeological site inventory for the disaster area. Although a number of individuals voluntarily participated in the project at the beginning, it gradually stagnated due to limited need for local rescue archaeology. However, the experience of online-based collaborations worked well for the second project proposed by local specialists, in which CEDACH restored the book catalogue of a tsunami-devastated research library. This experience highlighted the need for online education to improve information and communication technologies (ICT) skills of data builders. Therefore, in the third project called CEDACHeLi, an e-Learning management system was developed to facilitate learning the fundamental knowledge and techniques required for information processing in rescue operations of disaster-damaged cultural heritage. This system will contribute to improved skills and motivation of potential workers for further developments in digital heritage inventory.

  14. Prediction of site specific ground motion for large earthquake

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro; Irikura, Kojiro; Fukuchi, Yasunaga.

    1990-01-01

    In this paper, we apply the semi-empirical synthesis method of IRIKURA (1983, 1986) to the estimation of site-specific ground motion using accelerograms observed at Kumatori in Osaka prefecture. The target earthquakes used here are a comparatively distant earthquake (Δ=95 km, M=5.6) caused by the YAMASAKI fault and a near earthquake (Δ=27 km, M=5.6). The results obtained are as follows. 1) The accelerograms from the distant earthquake (M=5.6) are synthesized using aftershock records (M=4.3) of the 1983 YAMASAKI fault earthquake, whose source parameters have been obtained by other authors from the hypocentral distribution of the aftershocks. The resultant synthetic motions show good agreement with the observed ones. 2) The synthesis for a near earthquake (M=5.6, which we call the target earthquake) is made using a small earthquake which occurred in the neighborhood of the target earthquake. Here, we apply two methods for setting the synthesis parameters. One is to use the parameters of the YAMASAKI fault earthquake, which has the same magnitude as the target earthquake; the other is to use parameters obtained from several existing empirical formulas. The resultant synthetic motion with the former parameters shows good agreement with the observed one, but that with the latter does not. 3) We estimate the source parameters from the source spectra of several earthquakes observed at this site. We find that small earthquakes (M<4) used as Green's functions should be treated carefully because their stress drops are not constant. 4) We propose designating not only the magnitudes but also the seismic moments of the target earthquake and the small earthquake. (J.P.N.)

  15. On the Regional Dependence of Earthquake Response Spectra

    OpenAIRE

    Douglas , John

    2007-01-01

    International audience; It is common practice to use ground-motion models, often developed by regression on recorded accelerograms, to predict the expected earthquake response spectra at sites of interest. An important consideration when selecting these models is the possible dependence of ground motions on geographical region, i.e., are median ground motions in the (target) region of interest for a given magnitude and distance the same as those in the (host) region where a ground-motion mode...

  16. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are the amplitudes and phase angles of the contributing sinusoids, as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to vary in time or to be constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
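    The superposition model described above can be sketched in a few lines; the frequency band, amplitude shape, and trapezoidal envelope below are illustrative assumptions, not the actual SIMQKE algorithm (which iteratively adjusts amplitudes toward a target response spectrum):

```python
import numpy as np

rng = np.random.default_rng(42)

dt, n = 0.01, 2048                       # 20.48 s record at 100 samples/s
t = np.arange(n) * dt
freqs = np.linspace(0.5, 20.0, 80)       # contributing frequencies (Hz)
amps = 1.0 / np.sqrt(freqs)              # assumed spectral amplitude shape
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)  # random phase angles

# superpose sinusoidal components with random phase angles
motion = sum(a * np.sin(2.0 * np.pi * f * t + p)
             for a, f, p in zip(amps, freqs, phases))

# trapezoidal intensity envelope: 2 s build-up, flat middle, 5 s decay
envelope = np.clip(np.minimum(t / 2.0, (t[-1] - t) / 5.0), 0.0, 1.0)
accel = envelope * motion
```

    A spectrum-compatible generator would then compute the response spectrum of `accel`, compare it with the target, rescale the amplitudes, and repeat until the two spectra agree.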

  17. Community Digital Library Requirements for the Southern California Earthquake Center Community Modeling Environment (SCEC/CME)

    Science.gov (United States)

    Moore, R.; Faerman, M.; Minster, J.; Day, S. M.; Ely, G.

    2003-12-01

    A community digital library provides support for ingestion, organization, description, preservation, and access of digital entities. The technologies that traditionally provide these capabilities are digital libraries (ingestion, organization, description), persistent archives (preservation) and data grids (access). We present a design for the SCEC community digital library that incorporates aspects of all three systems. Multiple groups have created integrated environments that sustain large-scale scientific data collections. By examining these projects, the following stages of implementation can be identified: (1) definition of semantic terms to associate with relevant information, including definition of uniform content descriptors to describe physical quantities relevant to the scientific discipline and creation of concept spaces to define how the uniform content descriptors are logically related; (2) organization of digital entities into logical collections that make it simple to browse and manage related material; (3) definition of services that are used to access and manipulate material in the collection; (4) creation of a preservation environment for the long-term management of the collection. Each community is faced with heterogeneity that is introduced when data is distributed across multiple sites, when multiple sets of collection semantics are used, or when multiple scientific sub-disciplines are federated. We will present the relevant standards that simplify the implementation of the SCEC community library, the resource requirements for different types of data sets that drive the implementation, and the digital library processes that the SCEC community library will support. The SCEC community library can be viewed as the set of processing steps that are required to build the appropriate SCEC reference data sets (SCEC-approved encoding format, SCEC-approved descriptive metadata, SCEC-approved collection organization, and SCEC-managed storage).

  18. Development of uniform hazard response spectra from accelerograms recorded on rock sites

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2000-05-01

    Traditionally, the seismic design basis ground motion has been specified by response spectral shapes and the peak ground acceleration (PGA). The mean recurrence interval (MRI) is evaluated for the PGA only. The present work has developed response spectra having the same MRI at all frequencies. This report extends the work of Cornell (on PGA) to consider an aerial source model and a general form of the spectral acceleration at various frequencies. The latter has been derived from a number of strong motion earthquakes recorded on rock sites. Sensitivity of the results to changes in various parameters has also been presented. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)
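    The relation between the mean recurrence interval and the probability of exceedance during a design life, which underlies uniform hazard spectra of this kind, can be illustrated under the usual Poisson assumption; the numbers below are conventional examples, not values from the report:

```python
import math

def mri_from_rate(annual_rate):
    """Mean recurrence interval (years) from the annual exceedance rate."""
    return 1.0 / annual_rate

def exceedance_probability(mri_years, exposure_years):
    """Poisson probability of at least one exceedance of the spectral
    level with the given MRI during the exposure time."""
    return 1.0 - math.exp(-exposure_years / mri_years)

# the conventional 10%-in-50-years hazard level corresponds to an MRI near 475 years
p = exceedance_probability(475.0, 50.0)
```

    A uniform hazard spectrum fixes the same MRI at every frequency, so this probability is the same whichever spectral ordinate is read off.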

  19. Design of the Digital Satellite Link Interface for a System That Detects the Precursory Electromagnetic Emissions Associated with Earthquakes

    Science.gov (United States)

    1986-12-01

    earthquake that is likely to occur in a given locality [Ref. 8: p. 1082]. The accumulation law of seismotectonic movement relates the amount of... - earthquake mechanism - fault creep anomaly - seismic wave velocity - geomagnetic field - telluric (earth) currents - electromagnetic emissions - resistivity of ...

  20. Rapid estimation of earthquake magnitude from the arrival time of the peak high‐frequency amplitude

    Science.gov (United States)

    Noda, Shunta; Yamamoto, Shunroku; Ellsworth, William L.

    2016-01-01

    We propose a simple approach to measure earthquake magnitude M using the time difference (Top) between the body-wave onset and the arrival time of the peak high-frequency amplitude in an accelerogram. Measured in this manner, we find that Mw is proportional to 2logTop for earthquakes 5≤Mw≤7, which is the theoretical proportionality if Top is proportional to source dimension and stress drop is scale invariant. Using high-frequency (>2 Hz) data, the root mean square (rms) residual between Mw and MTop (M estimated from Top) is approximately 0.5 magnitude units. The rms residuals of the high-frequency data in passbands between 2 and 16 Hz are uniformly smaller than those obtained from the lower-frequency data. Top depends only weakly on epicentral distance. For an extremely large event, the method produces a final magnitude estimate of M 9.0 at 120 s after the origin time. We conclude that Top of high-frequency (>2 Hz) accelerograms has value in the context of earthquake early warning for extremely large events.
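    The reported proportionality Mw ∝ 2logTop implies a one-line estimator; the calibration constant below is a placeholder assumption, not a value from the paper:

```python
import math

def magnitude_from_top(top_seconds, c=5.0):
    """Magnitude estimate from the onset-to-peak time difference Top via
    Mw ~ 2*log10(Top), as reported in the abstract.  The calibration
    constant c is hypothetical, not taken from the paper."""
    return 2.0 * math.log10(top_seconds) + c

m1 = magnitude_from_top(10.0)   # with c = 5.0 this gives 7.0
m2 = magnitude_from_top(20.0)
```

    Doubling Top raises the estimate by 2log10(2), about 0.6 magnitude units, so timing errors translate only logarithmically into magnitude errors.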

  1. Digitization

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2014-01-01

    what a concept of digital media might add to the understanding of processes of mediatization and what the concept of mediatization might add to the understanding of digital media. It is argued that digital media open an array of new trajectories in human communication, trajectories which were...

  2. Fundamental principles of earthquake resistance calculation to be reflected in the next generation regulations

    Directory of Open Access Journals (Sweden)

    Mkrtychev Oleg

    2016-01-01

    Full Text Available The article scrutinizes pressing issues of regulation in the domain of seismic construction. The existing code of rules SNiP II-7-81* "Construction in seismic areas" provides that earthquake resistance calculations be performed for two levels of impact: the basic safety earthquake (BSE) and the maximum considered earthquake (MCE). However, the very nature of such calculation cannot be deemed well-founded and contradicts the fundamental standards of foreign countries. The authors identify the main problems of the conceptual foundation underlying the current regulation. The first and foremost step intended to overcome the discrepancy in question is renunciation of the K1 damage tolerance factor when calculating the BSE. The second measure is implementation of the response spectrum method, but the β spectral curve of the dynamic response factor must be replaced by a spectrum of worst-case accelerograms for the particular structure or a spectrum of simulated accelerograms obtained for the specific construction site. Application of the response spectrum method when calculating the MCE impact level makes it possible to proceed into the frequency domain and eventually obtain spectra of the accelerograms. As a result, the response of the building becomes known to some extent, i.e. the forces and the required reinforcement, and it can be checked whether the ultimate limit state conditions are satisfied. Then the elements under the most intense load are excluded from the design model, the way it is done in progressive collapse calculations, because these elements are assumed to be destroyed locally by the seismic load. This procedure is based on already existing design practice for progressive collapse calculation.

  3. Short presentation on some research activities about near-field earthquakes

    International Nuclear Information System (INIS)

    Donald, John

    2002-01-01

    The major hazard posed by earthquakes is often thought to be due to moderate to large magnitude events. However, there have been many cases where earthquakes of moderate and even small magnitude have caused very significant destruction when they have coincided with population centres. Even though the area of intense ground shaking caused by such events is generally small, the epicentral motions can be severe enough to cause damage even in well-engineered structures. Two issues are addressed here, the first being the identification of the minimum earthquake magnitude likely to cause damage to engineered structures and the limits of the near field for small-to-moderate magnitude earthquakes. The second issue addressed is whether features of near-field ground motions such as directivity, which can significantly enhance the destructive potential, occur in small-to-moderate magnitude events. The accelerograms from the 1986 San Salvador (El Salvador) earthquake indicate that it may be non-conservative to assume that near-field directivity effects need to be considered only for earthquakes of moment magnitude M 6.5 and greater. (author)

  4. Application of τc*Pd for identifying damaging earthquakes for earthquake early warning

    Science.gov (United States)

    Huang, P. L.; Lin, T. L.; Wu, Y. M.

    2014-12-01

    Earthquake Early Warning Systems (EEWS) are an effective approach to mitigating earthquake damage. In this study, we used seismic records from the Kiban Kyoshin network (KiK-net), because it has dense station coverage and co-located borehole strong-motion seismometers along with the free-surface strong-motion seismometers. We used inland earthquakes with moment magnitude (Mw) from 5.0 to 7.3 between 1998 and 2012, choosing 135 events and 10,950 strong ground accelerograms recorded by 696 strong ground accelerographs. Both the free-surface and the borehole data are used to calculate τc and Pd. The results show that τc*Pd correlates well with PGV and is a robust parameter for assessing the damage potential of an earthquake. We propose that the value of τc*Pd determined seconds after the P-wave arrival could serve as a threshold for the on-site type of EEW.
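    The two P-wave parameters combined in the study can be computed from the first few seconds of displacement and velocity after the P arrival; the sketch below uses the standard definitions of τc and Pd, and the window length and synthetic input are assumptions for illustration:

```python
import numpy as np

def tau_c_pd(disp, vel):
    """tau_c and Pd from displacement and velocity windows starting at the
    P arrival (standard on-site early-warning definitions; the window
    length is chosen by the user)."""
    r = np.sum(vel ** 2) / np.sum(disp ** 2)   # ratio of mean squares
    tau_c = 2.0 * np.pi / np.sqrt(r)           # characteristic period (s)
    pd = np.max(np.abs(disp))                  # peak displacement
    return tau_c, pd

# sanity check on a pure sinusoid of period T: tau_c should recover T
T, dt = 1.0, 0.005
t = np.arange(0.0, 3.0, dt)                    # 3 s window
disp = 0.01 * np.sin(2.0 * np.pi * t / T)
vel = np.gradient(disp, dt)
tau_c, pd = tau_c_pd(disp, vel)
```

    For a pure sinusoid of period T, τc recovers T, which is why τc is read as a period (hence size) proxy while Pd carries the amplitude information; their product combines the two.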

  5. DIGITAL

    Data.gov (United States)

    Federal Emergency Management Agency, Department of Homeland Security — The Digital Flood Insurance Rate Map (DFIRM) Database depicts flood risk information and supporting data used to develop the risk data. The primary risk...

  6. Artificial earthquake record generation using cascade neural network

    Directory of Open Access Journals (Sweden)

    Bani-Hani Khaldoon A.

    2017-01-01

    Full Text Available This paper presents the results of using artificial neural networks (ANNs) in an inverse mapping problem for earthquake accelerogram generation. The study comprises two parts. In the first, a 1-D site response analysis is performed for the Emirate of Dubai, UAE: eight earthquake records are selected and spectral matching is performed to match Dubai's response spectrum using the SeismoMatch software. Dubai soil is considered under two site classes, C and D, based on the shear wave velocity of soil profiles, and amplification factors are estimated to quantify the Dubai soil effect. Dubai's design response spectra are developed for site classes C and D according to the International Building Code (IBC-2012). In the second part, ANNs are employed to solve the inverse mapping problem of generating time history earthquake records. Thirty earthquake records and their design response spectra with 5% damping are used to train two cascade forward-backward neural networks (ANN1, ANN2). ANN1 is trained to map the design response spectrum to a time history, and ANN2 is trained to map time history records to the design response spectrum. Generalized time history earthquake records are generated using ANN1 for Dubai's site classes C and D, and ANN2 is used to evaluate the performance of ANN1.

  7. Estimation of earthquake accelerations for the design of structures with different degrees of responsibility

    International Nuclear Information System (INIS)

    Dolgaya, A.A.; Uzdin, A.M.; Indeykin, A.V.

    1993-01-01

    The object of investigation is the design amplitude of accelerograms used in evaluating the seismic stability of responsible structures, first and foremost NPS. The amplitude level is established depending on the degree of responsibility of the structure and on the prevailing period of earthquake action at the construction site. The procedure is based on a statistical analysis of 310 earthquakes. At the first stage of data processing we established the correlation of both the mathematical expectation and the root-mean-square deviation of the peak earthquake acceleration with its prevailing period. At the second stage the most suitable law for the distribution of acceleration about the mean was chosen. To determine the parameters of this distribution, we specified the maximum conceivable acceleration, which must not be exceeded; the other distribution parameters are determined from the statistical data. At the third stage the dependence of the design amplitude on the prevailing period of the seismic effect was established for different structures and equipment. The data obtained make it possible to recommend levels for the safe-shutdown earthquake (SSE) and operating basis earthquake (OBE) for objects of various responsibility categories when designing NPS. (author)
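    The procedure ends with a design amplitude that grows with the structure's responsibility category but is truncated at the maximum conceivable acceleration; a schematic sketch of that last step (the factor k and all numerical values are hypothetical, not from the paper):

```python
def design_amplitude(mean_pga, sigma_pga, k, a_max):
    """Design peak acceleration: mean plus k standard deviations,
    truncated at the maximum conceivable acceleration a_max.
    k would grow with the responsibility category (hypothetical here)."""
    return min(mean_pga + k * sigma_pga, a_max)

ordinary = design_amplitude(0.2, 0.1, 2.0, 0.5)   # below the cap
critical = design_amplitude(0.2, 0.1, 4.0, 0.5)   # truncated at a_max
```

    The truncation reflects the abstract's requirement that the specified maximum conceivable acceleration must never be exceeded, whatever the distribution tail would otherwise imply.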

  8. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  9. Development of uniform hazard response spectra for rock sites considering line and point sources of earthquakes

    International Nuclear Information System (INIS)

    Ghosh, A.K.; Kushwaha, H.S.

    2001-12-01

    Traditionally, the seismic design basis ground motion has been specified by normalised response spectral shapes and peak ground acceleration (PGA). The mean recurrence interval (MRI) used to be computed for the PGA only. It is shown that the MRIs associated with such response spectra are not the same at all frequencies. The present work develops uniform hazard response spectra, i.e. spectra having the same MRI at all frequencies, for line and point sources of earthquakes, using a large number of strong motion accelerograms recorded on rock sites. Sensitivity of the results to changes in various parameters has also been presented. This work is an extension of an earlier work on aerial sources of earthquakes. These results will help to determine the seismic hazard at a given site and the associated uncertainties. (author)

  10. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  11. Determine Earthquake Rupture Directivity Using Taiwan TSMIP Strong Motion Waveforms

    Science.gov (United States)

    Chang, Kaiwen; Chi, Wu-Cheng; Lai, Ying-Ju; Gung, YuanCheng

    2013-04-01

    Inverting seismic waveforms for finite fault source parameters is important for studying the physics of earthquake rupture processes. It is also significant for imaging seismogenic structures in urban areas. Here we analyze the finite-source process and test for the causative fault plane using accelerograms recorded by the Taiwan Strong-Motion Instrumentation Program (TSMIP) stations. The point source parameters for the mainshock and aftershocks were first obtained by complete waveform moment tensor inversions. We then use the seismograms generated by the aftershocks as empirical Green's functions (EGFs) to retrieve the apparent source time functions (ASTFs) at near-field stations using the projected Landweber deconvolution approach. The method for identifying the fault plane relies on the spatial patterns of the apparent source time function durations, which depend on the angle between the rupture direction and the take-off angle and azimuth of the ray. These derived duration patterns are then compared with the theoretical patterns, which are functions of the following parameters: focal depth, epicentral distance, average crustal 1D velocity, fault plane attitude, and rupture direction on the fault plane. As a result, the ASTFs derived from EGFs can be used to infer the ruptured fault plane and the rupture direction. Finally, we used part of the catalogs to study important seismogenic structures in the area near Chiayi, Taiwan, where a damaging earthquake occurred about a century ago. The preliminary results show that a strike-slip earthquake on 22 October 1999 (Mw 5.6) ruptured unilaterally toward the SSW on a sub-vertical fault. The procedure developed in this study can be applied to strong motion waveforms recorded from other earthquakes to better understand their kinematic source parameters.
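    The duration patterns exploited by the method follow the classic unilateral-rupture directivity relation; the sketch below uses that textbook relation with hypothetical numbers, not the study's inversion code:

```python
import numpy as np

def apparent_duration(L, v_r, c, theta):
    """Apparent source duration seen by a ray leaving at angle `theta`
    from the rupture direction, for a unilateral rupture of length L,
    rupture speed v_r and wave speed c (classic directivity relation)."""
    return L / v_r - (L / c) * np.cos(theta)

# hypothetical values: 6 km rupture, 2.5 km/s rupture speed, 3.5 km/s S waves
L, v_r, c = 6.0e3, 2.5e3, 3.5e3
forward = apparent_duration(L, v_r, c, 0.0)      # shortest, toward the rupture
backward = apparent_duration(L, v_r, c, np.pi)   # longest, away from it
```

    Stations in the rupture direction see compressed apparent source time functions; fitting this azimuthal pattern to the measured ASTF durations is what discriminates the causative fault plane from its auxiliary plane.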

  12. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white-noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report.
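    Digital filtering of white noise, as described, can be sketched with a frequency-domain band-pass and an intensity envelope; the passband and envelope shape are illustrative assumptions, not values from the report:

```python
import numpy as np

rng = np.random.default_rng(1)
dt, n = 0.01, 4096
t = np.arange(n) * dt
white = rng.standard_normal(n)

# band-pass the white noise in the frequency domain (2-10 Hz passband assumed)
spec = np.fft.rfft(white)
f = np.fft.rfftfreq(n, dt)
spec[(f < 2.0) | (f > 10.0)] = 0.0
banded = np.fft.irfft(spec, n)

# shape with an intensity envelope so the motion builds up and decays
envelope = (t / 4.0) * np.exp(1.0 - t / 4.0)   # peaks at t = 4 s
accel = envelope * banded
```

    The response spectrum of `accel` could then be computed and compared with the target spectra, which is the prediction problem the report analyses statistically.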

  13. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  14. Digital Recording and Non-Destructive Techniques for the Understanding of Structural Performance for Rehabilitating Historic Structures at the Kathmandu Valley after Gorkha Earthquake 2015

    Science.gov (United States)

    Shrestha, S.; Reina Ortiz, M.; Gutland, M.; Napolitano, R.; Morris, I. M.; Santana Quintero, M.; Erochko, J.; Kawan, S.; Shrestha, R. G.; Awal, P.; Suwal, S.; Duwal, S.; Maharjan, D. K.

    2017-08-01

    On 25 April 2015, the Gorkha earthquake of magnitude 7.8 severely damaged the cultural heritage sites of Nepal. In particular, the seven monument zones of the Kathmandu Valley World Heritage Site suffered extensive damage. Out of 195 surveyed monuments, 38 completely collapsed and 157 were partially damaged (DoA, 2015). The world historic city of Bhaktapur was especially heavily affected by the earthquake. There is, in general, a lack of knowledge regarding the traditional construction technology used in many of the most important temple monuments in Bhaktapur. To address this limitation and to assist in reconstruction and rehabilitation of the area, this study documents the existing condition of different historic structures in the Kathmandu Valley. In particular, the Nyatapola Temple is studied in detail. To record and document the condition of this temple, a combination of laser scanning and terrestrial and aerial photogrammetry is used. By also including evaluation of the temple and its supporting plinth structure using non-destructive techniques such as geo-radar and micro-tremor dynamic analysis, this study will form the basis of a structural analysis to assess the anticipated future seismic performance of the Nyatapola Temple.

  15. Earthquake Facts

    Science.gov (United States)

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  16. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  17. Effects of earthquake rupture shallowness and local soil conditions on simulated ground motions

    International Nuclear Information System (INIS)

    Apsel, Randy J.; Hadley, David M.; Hart, Robert S.

    1983-03-01

    The paucity of strong ground motion data in the Eastern U.S. (EUS), combined with well recognized differences in earthquake source depths and wave propagation characteristics between Eastern and Western U.S. (WUS) suggests that simulation studies will play a key role in assessing earthquake hazard in the East. This report summarizes an extensive simulation study of 5460 components of ground motion representing a model parameter study for magnitude, distance, source orientation, source depth and near-surface site conditions for a generic EUS crustal model. The simulation methodology represents a hybrid approach to modeling strong ground motion. Wave propagation is modeled with an efficient frequency-wavenumber integration algorithm. The source time function used for each grid element of a modeled fault is empirical, scaled from near-field accelerograms. This study finds that each model parameter has a significant influence on both the shape and amplitude of the simulated response spectra. The combined effect of all parameters predicts a dispersion of response spectral values that is consistent with strong ground motion observations. This study provides guidelines for scaling WUS data from shallow earthquakes to the source depth conditions more typical in the EUS. The modeled site conditions range from very soft soil to hard rock. To the extent that these general site conditions model a specific site, the simulated response spectral information can be used to either correct spectra to a site-specific environment or used to compare expected ground motions at different sites. (author)
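    The response spectra central to this parameter study can be computed directly from any accelerogram by solving the damped single-degree-of-freedom equation of motion over a range of natural periods. A minimal sketch (not the authors' frequency-wavenumber code), using the Newmark average-acceleration scheme on a hypothetical synthetic record:

```python
import numpy as np

def response_spectrum(acc, dt, periods, zeta=0.05):
    """5%-damped pseudo-acceleration spectrum of a ground accelerogram.

    Solves u'' + 2*zeta*wn*u' + wn^2*u = -a_g(t) per unit mass for each
    period with the Newmark average-acceleration scheme (unconditionally
    stable), then returns PSA = wn^2 * max|u|."""
    psa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        k, c = wn ** 2, 2.0 * zeta * wn          # stiffness, damping per unit mass
        khat = k + 2.0 * c / dt + 4.0 / dt ** 2  # effective stiffness
        u = v = 0.0
        a = -acc[0]                              # initial relative acceleration
        umax = 0.0
        for i in range(len(acc) - 1):
            dp = -(acc[i + 1] - acc[i])          # load increment
            dphat = dp + (4.0 / dt + 2.0 * c) * v + 2.0 * a
            du = dphat / khat
            dv = 2.0 * du / dt - 2.0 * v
            da = 4.0 * du / dt ** 2 - 4.0 * v / dt - 2.0 * a
            u, v, a = u + du, v + dv, a + da
            umax = max(umax, abs(u))
        psa.append(wn ** 2 * umax)
    return np.array(psa)

# Hypothetical decaying 1 Hz sine-burst standing in for an accelerogram
dt = 0.01
t = np.arange(0, 10, dt)
acc = 0.2 * 9.81 * np.sin(2 * np.pi * 1.0 * t) * np.exp(-0.3 * t)
periods = np.array([0.05, 0.2, 1.0, 3.0])
psa = response_spectrum(acc, dt, periods)
```

    As expected, the spectrum peaks near the record's dominant 1 s period, while at very short periods the PSA approaches the peak ground acceleration.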

  18. CALCULATION OF THE UNIQUE HIGH-RISE BUILDING FOR EARTHQUAKES IN NONLINEAR DYNAMIC FORMULATION

    Directory of Open Access Journals (Sweden)

    Mkrtychev Oleg Vartanovich

    2016-06-01

    Full Text Available The article presents the calculation of an 80-storey high-rise building subjected to three-component accelerograms with different dominant frequencies. The Akhmat Tower belongs to the Grozny-City 2 complex and, with a height of 400 m, is classified as a unique structure. When constructing unique high-rise buildings in seismic areas, additional computational studies are required that account for the nonlinear behavior of the structure; for calculations with instrumental and synthesized accelerograms, nonlinear dynamic methods must be applied. The studies were conducted using the LS-DYNA software, which implements direct integration of the equations of motion by an explicit scheme. The structural scheme of the building frame is braced: spatial stability is ensured by load-bearing interior walls, columns, rigid floor disks, and metal roof framing. The type and dimensions of the finite elements and the integration step were chosen to balance reasonable computation time against the required accuracy. To this end, convergence of the solutions was investigated on a series of models with refined finite element meshes of 0.5 m, 1 m, 2 m and 3 m. The study showed that a mesh with a characteristic element size of 2 m is optimal. The building is analyzed on a rigid foundation. The authors used accelerograms normalized for earthquakes of intensity 8 and 9 on the MSK-64 scale. The destruction of elements during loading and the interaction of elements on contact were taken into account, i.e. the calculation allowed for physical, geometric and structural nonlinearities. The article analyzes the results of the calculation, and the authors evaluate the seismic stability of the building. Possible ways to improve the seismic
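    The explicit direct-integration approach mentioned above (as implemented in LS-DYNA) can be illustrated on a single degree of freedom. This is a minimal sketch, not the authors' model: the central-difference update below is the textbook explicit scheme, stable only for time steps below T/π, which is why element size and integration step must be chosen together.

```python
import numpy as np

def central_difference(acc_g, dt, wn, zeta=0.02):
    """Explicit central-difference integration of u'' + 2*zeta*wn*u' + wn^2*u = -a_g.

    One-DOF sketch of the explicit philosophy used by LS-DYNA-style solvers;
    damping is handled with a backward difference for simplicity. Conditionally
    stable: requires dt < T_n / pi."""
    n = len(acc_g)
    u = np.zeros(n)
    c, k = 2.0 * zeta * wn, wn ** 2
    u_prev = -acc_g[0] * dt ** 2 / 2.0   # start-up value u(-dt) for an at-rest system
    for i in range(n - 1):
        a_rel = -acc_g[i] - k * u[i] - c * (u[i] - u_prev) / dt
        u_next = 2.0 * u[i] - u_prev + a_rel * dt ** 2
        u_prev, u[i + 1] = u[i], u_next
    return u

# Quasi-static check: slow 0.1 Hz ground motion on a 1 s oscillator
dt = 0.002
t = np.arange(0, 20, dt)
acc_g = np.sin(2 * np.pi * 0.1 * t)
u = central_difference(acc_g, dt, wn=2 * np.pi)
# for forcing well below resonance, u ~ -a_g / wn^2
```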

  19. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  20. The 13 January 2001 El Salvador earthquake: A multidata analysis

    Science.gov (United States)

    Vallée, Martin; Bouchon, Michel; Schwartz, Susan Y.

    2003-04-01

    On 13 January 2001, a large normal-faulting intermediate-depth event (Mw = 7.7) occurred 40 km off the coast of El Salvador (Central America). We analyze this earthquake using teleseismic, regional, and local data. We first build a kinematic source model by simultaneously inverting P and SH displacement waveforms and source time functions derived from surface waves using an empirical Green's function analysis. In an attempt to discriminate between the two nodal planes (30° trenchward dipping and 60° landward dipping), we perform identical inversions using both possible fault planes. After relocating the hypocentral depth at 54 km, we retrieve the kinematic features of the rupture using a combination of the Neighborhood algorithm of [1999] and the Simplex method, allowing for variable rupture velocity and slip. We find updip rupture propagation yielding a centroid depth around 47 km for both assumed fault planes, with a larger variance reduction obtained using the 60° landward dipping nodal plane. We test the two possible fault models using regional broadband data and near-field accelerograms provided by [2001]. Near-field data confirm that the steeper, landward dipping nodal plane is preferred. Rupture propagated mostly updip and to the northwest, resulting in a main moment release zone of approximately 25 km × 50 km with an average slip of ˜3.5 m. The large slip occurs near the interplate interface at a location where the slab dip steepens significantly. The occurrence of this event is well explained by bending of the subducting plate.

  1. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth, and this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model (GEM), launched in 2009. At the very least, everyone should be able to learn what his or her risk is; at the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake-risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  2. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently there. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  3. An application of earthquake prediction algorithm M8 in eastern ...

    Indian Academy of Sciences (India)

    2Institute of Earthquake Prediction Theory and Mathematical Geophysics, ... located about 70 km from a preceding M7.3 earthquake that occurred in ... local extremes of the seismic density distribution, and in the third approach, CI centers were distributed ...... Bird P 2003 An updated digital model of plate boundaries;.

  4. It's Our Fault: better defining earthquake risk in Wellington, New Zealand

    Science.gov (United States)

    Van Dissen, R.; Brackley, H. L.; Francois-Holden, C.

    2012-12-01

    increasing the region's resilience to earthquakes. We present the latest results on ground motion simulations for large plate-interface earthquakes under Wellington in terms of response spectra and acceleration time histories. We derive realistic broadband accelerograms based on a stochastic modelling technique. First we characterise the potential interface rupture area based on previous geodetically derived estimates of interface slip deficit. Then we explore a suitable range of source parameters, including various rupture areas, moment magnitudes, stress drops, slip distributions and rupture propagation directions. The resulting rupture scenarios all produce long-duration shaking, and peak ground accelerations that typically range between 0.2-0.7 g in Wellington city. Many of these scenarios also produce long-period motions that are not captured by the current NZ design spectra.
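    A stochastic broadband simulation of the kind referred to above can be sketched, at its simplest, as envelope-windowed Gaussian noise shaped in the frequency domain to an omega-squared spectrum. All parameter values below are illustrative assumptions, not those of the Wellington study.

```python
import numpy as np

def stochastic_accelerogram(dur=20.0, dt=0.01, f_c=0.5, f_max=10.0, seed=0):
    """Basic stochastic ground-motion synthesis: windowed white noise shaped
    to an omega-squared acceleration spectrum with a high-frequency cut."""
    rng = np.random.default_rng(seed)
    n = int(dur / dt)
    t = np.arange(n) * dt
    env = (t / 2.0) * np.exp(1.0 - t / 2.0)            # Saragoni-Hart-type envelope
    spec = np.fft.rfft(rng.standard_normal(n) * env)
    f = np.fft.rfftfreq(n, dt)
    shape = (f / f_c) ** 2 / (1.0 + (f / f_c) ** 2)    # omega-squared source shape
    shape /= np.sqrt(1.0 + (f / f_max) ** 8)           # high-cut filter
    return t, np.fft.irfft(spec * shape, n)

t, acc = stochastic_accelerogram()
```

    The resulting record has the build-up-and-decay envelope and high-frequency-rich spectrum characteristic of real accelerograms, and would still need amplitude calibration against a target spectrum.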

  5. Pattern of ground deformation in Kathmandu valley during 2015 Gorkha Earthquake, central Nepal

    Science.gov (United States)

    Ghimire, S.; Dwivedi, S. K.; Acharya, K. K.

    2016-12-01

    The 25 April 2015 Gorkha Earthquake (Mw = 7.8), epicentered at Barpak, together with thousands of aftershocks released seismic moment nearly equivalent to a magnitude 8.0 earthquake, rupturing a 150 km long fault segment. Although Kathmandu valley was expected to be severely devastated by such a major earthquake, the post-earthquake scenario is quite different: the observed destruction is far less than anticipated, and its spatial pattern differs from what was expected. This work focuses on the behavior of Kathmandu valley sediments during the strong shaking of the 2015 Gorkha Earthquake. For this purpose, the spatial pattern of destruction was analyzed at heavily damaged sites. To characterize the subsurface soil, a 2D MASW survey was carried out using a 24-channel seismograph system, and an accelerogram recorded by the Nepal Seismological Center was analyzed to characterize the strong ground motion. The Kathmandu valley comprises fluvio-lacustrine deposits of gravel, sand, silt and clay, along with a few exposures of basement rocks within the sediments. The observations show systematic repetition of destruction at an average interval of 2.5 km, mostly in sand-, silt- and clay-dominated formations. Results of the 2D MASW show that the sites of destruction are characterized by static deformation of soil (liquefaction and southerly dipping cracks). Spectral analysis of the accelerogram indicates maximum power associated with a frequency of 1.0 Hz. These results explain the observed spatial pattern of destruction in Kathmandu valley: seismic energy concentrated around 1 Hz, with an average S-wave velocity of 2.5 km/s, generates an average wavelength of 2.5 km. The cumulative effect of the dominant frequency and associated wavelength resulted in static deformation of surface soil layers at an average interval of 2.5 km, which explains why the observed scenario differed from what was anticipated in Kathmandu valley.
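    The wavelength argument above can be reproduced in a few lines: estimate the dominant frequency from the accelerogram's power spectrum, then convert to wavelength via λ = v / f. The record below is a hypothetical stand-in; only the 1.0 Hz dominant frequency and 2.5 km/s velocity are taken from the abstract.

```python
import numpy as np

# Hypothetical 60 s record: a dominant 1.0 Hz component plus weaker noise
dt = 0.01
t = np.arange(0, 60, dt)
acc = np.sin(2 * np.pi * 1.0 * t) + 0.2 * np.random.default_rng(1).standard_normal(t.size)

# Dominant frequency from the power spectrum (skip the DC bin)
freqs = np.fft.rfftfreq(t.size, dt)
power = np.abs(np.fft.rfft(acc)) ** 2
f_dom = freqs[np.argmax(power[1:]) + 1]

v_s = 2500.0                          # m/s, average S-wave velocity from the study
wavelength_km = v_s / f_dom / 1000.0  # lambda = v / f, about 2.5 km
```

    The 2.5 km wavelength matches the average spacing of the damage zones reported in the abstract.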

  6. Accessing northern California earthquake data via Internet

    Science.gov (United States)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, and is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  7. Deterministic modeling for microzonation of Bucharest: Case study for August 30, 1986, and May 30-31, 1990. Vrancea earthquakes

    International Nuclear Information System (INIS)

    Cioflan, C.O.; Apostol, B.F.; Moldoveanu, C.L.; Marmureanu, G.; Panza, G.F.

    2002-03-01

    The mapping of seismic ground motion in Bucharest due to strong Vrancea earthquakes is carried out using a complex hybrid waveform modeling method that combines the modal summation technique, valid for laterally homogeneous anelastic media, with the finite-difference technique, optimizing the advantages of both methods. For recent earthquakes, it is possible to validate the modeling by comparing synthetic seismograms with the records. As controlling records we consider the accelerograms of the Magurele station, low-pass filtered with a cut-off frequency of 1.0 Hz, for the last three major strong (Mw > 6) Vrancea earthquakes. Using the hybrid method with a double-couple seismic source approximation, scaled for the source dimensions, and relatively simple regional (bedrock) and local structure models, we succeeded in reproducing the recorded ground motion in Bucharest at a level satisfactory for seismic engineering. Extending the modeling to the whole territory of the Bucharest area, we construct a new seismic microzonation map in which five different zones are identified by their characteristic response spectra. (author)
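    The low-pass filtering step described above (records filtered to 1.0 Hz before comparison with the synthetics) can be sketched with a simple zero-phase frequency-domain filter. The study's actual filter type is not specified, so this brick-wall version is only a stand-in.

```python
import numpy as np

def lowpass(acc, dt, f_cut):
    """Zero-phase frequency-domain low-pass: zero all Fourier components
    above f_cut. A crude but transparent stand-in for an engineering filter."""
    spec = np.fft.rfft(acc)
    f = np.fft.rfftfreq(len(acc), dt)
    spec[f > f_cut] = 0.0
    return np.fft.irfft(spec, len(acc))

# Hypothetical record: 0.5 Hz "engineering" signal plus 5 Hz high-frequency content
dt = 0.01
t = np.arange(0, 20, dt)
acc = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.sin(2 * np.pi * 5.0 * t)
acc_lp = lowpass(acc, dt, f_cut=1.0)   # only the 0.5 Hz component survives
```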

  8. Instrumental shaking thresholds for seismically induced landslides and preliminary report on landslides triggered by the October 17, 1989, Loma Prieta, California earthquake

    Science.gov (United States)

    Harp, E.L.

    1993-01-01

    The generation of seismically induced landslides depends on the characteristics of shaking as well as the mechanical properties of geologic materials. A very important parameter in the study of seismically induced landslides is an intensity measure based on the strong-motion accelerogram: Arias intensity, which grows with both the duration and the amplitude of the shaking record. Given a theoretical relationship between Arias intensity, magnitude and distance, it is possible to predict how far away from the seismic source landslides are likely to occur for a given magnitude earthquake. Field investigations have established that the threshold level of Arias intensity also depends on site effects, particularly the fracture characteristics of the outcrops present. -from Author
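    Arias intensity, the threshold parameter discussed above, is defined as Ia = (π / 2g) ∫ a(t)² dt, which is exactly why it grows with both amplitude and duration. A minimal sketch on a hypothetical harmonic record:

```python
import numpy as np

G = 9.81  # m/s^2

def arias_intensity(acc, dt):
    """Arias intensity Ia = (pi / 2g) * integral of a(t)^2 dt, in m/s.
    Rectangle-rule integration of the squared accelerogram."""
    return np.pi / (2.0 * G) * np.sum(acc ** 2) * dt

# Hypothetical 0.1 g harmonic record, 10 s long
dt = 0.01
t = np.arange(0, 10, dt)
a = 0.1 * G * np.sin(2 * np.pi * 2.0 * t)
ia = arias_intensity(a, dt)                         # ~0.77 m/s
ia_double_dur = arias_intensity(np.tile(a, 2), dt)  # doubling duration doubles Ia
ia_double_amp = arias_intensity(2 * a, dt)          # doubling amplitude quadruples Ia
```

    The duration sensitivity is what distinguishes Arias intensity from peak ground acceleration as a landslide-triggering measure.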

  9. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  10. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  11. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental study on ionospheric earthquake precursors, program development on processes in the earthquake focus and physical mechanisms of formation of various type precursors are considered. Composition of experimental cosmic system for earthquake precursors monitoring is determined. 36 refs., 5 figs

  12. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training regarding earthquakes received in primary schools is considered…

  13. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  14. Features of tuned mass damper behavior under strong earthquakes

    Science.gov (United States)

    Nesterova, Olga; Uzdin, Alexander; Fedorova, Maria

    2018-05-01

    Plastic deformations, cracks and destruction of structural members appear in constructions under strong earthquakes, so such constructions are characterized by a nonlinear deformation diagram. Two types of structural nonlinearity are considered in the paper. The first type is elastoplastic: plastic deformations occur in the structural elements, and when an element is unloaded its properties are restored. Diagrams of this kind include the Prandtl diagram, the Prandtl diagram with hardening, the Ramberg-Osgood diagram and others. Systems with such nonlinearity have an amplitude-frequency characteristic and resonance oscillation frequencies, so the accelerograms most dangerous for the construction can be selected. The second type is nonlinearity with degrading rigidity, where behavior depends on the overall loading history. The Kirikov-Amankulov model is one such model; its behavior depends on the maximum displacement in the stress history. Such systems have no amplitude-frequency characteristic or resonance frequency: the oscillation period increases during loading, and the system eigenfrequency decreases to zero at the moment of collapse. For the cases under consideration, when investigating systems with a mass damper (MD), the authors proposed new efficiency criteria. For the first type of nonlinearity, the criterion is the work of plastic deformation forces, which determines the possibility of progressive collapse or low-cycle fatigue of the structural members; for systems with degrading rigidity, the criteria are the period of system oscillations and the time to collapse of the structural support members. Under nonlinear behavior the efficiency of the MD decreases, because the fundamental period of the structure changes due to structural damage and the MD falls out of tune. However, the use of the MD can significantly reduce
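    The Prandtl (elastic-perfectly-plastic) diagram mentioned above can be sketched as a return-mapping update for the restoring force; a toy load-unload cycle shows the elastic unloading that the abstract describes.

```python
import numpy as np

def prandtl_force(disp_history, k=1.0, fy=1.0):
    """Restoring force for the Prandtl (elastic-perfectly-plastic) diagram:
    elastic with stiffness k until |f| reaches the yield force fy, then
    plastic flow; unloading is elastic again, so the element's elastic
    properties are restored after yield."""
    f, x_prev, forces = 0.0, 0.0, []
    for x in disp_history:
        f += k * (x - x_prev)        # elastic trial increment
        f = max(-fy, min(fy, f))     # cap at the yield surface
        forces.append(f)
        x_prev = x
    return np.array(forces)

# Load past yield (0 -> 3), then unload (3 -> 1): units are arbitrary
x = np.concatenate([np.linspace(0, 3, 31), np.linspace(3, 1, 21)])
f = prandtl_force(x, k=1.0, fy=1.0)
```

    During loading the force saturates at fy; on unloading it drops elastically through zero and reaches the opposite yield limit, tracing the familiar hysteresis loop.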

  15. Computing broadband accelerograms using kinematic rupture modeling; Generation d'accelerogrammes synthetiques large-bande par modelisation cinematique de la rupture sismique

    Energy Technology Data Exchange (ETDEWEB)

    Ruiz Paredes, J.A

    2007-05-15

    In order to make broadband kinematic rupture modeling more realistic with respect to dynamic modeling, physical constraints are added to the rupture parameters. To improve the modeling of the slip velocity function (SVF), an evolution of the k⁻² source model is proposed, which consists of decomposing the slip as a sum of sub-events by bands of k. This model yields SVFs close to the solution proposed by Kostrov for a crack, while preserving the spectral characteristics of the radiated wave field, i.e. an ω² model with high-frequency spectral amplitudes scaled to the directivity coefficient Cd. To better control directivity effects, a composite source description is combined with a scaling law defining the extent of the nucleation area for each sub-event. The resulting model reduces the apparent directivity coefficient to a fraction of Cd and reproduces the standard deviation of the new empirical attenuation relationships proposed for Japan. To make source models more realistic, a variable rupture velocity consistent with the physics of rupture must be considered. The approach followed, based on an analytical relation between fracture energy, slip and rupture velocity, leads to higher values of peak ground acceleration in the vicinity of the fault. Finally, to better account for the interaction of the wave field with the geological medium, a semi-empirical methodology combining a composite source model with empirical Green functions is developed and applied to the Yamaguchi Mw 5.9 earthquake. The modeled synthetics satisfactorily reproduce the main observed characteristics of the ground motions. (author)
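    The defining property of the k⁻² source model discussed above, a slip spectrum flat below the corner wavenumber and decaying as k⁻² above it, can be sketched in one dimension with random phases. The values are illustrative only; this is not the thesis' sub-event decomposition.

```python
import numpy as np

def k2_slip(n=256, L=50.0, seed=0):
    """Random 1-D slip distribution whose amplitude spectrum is flat below
    the corner wavenumber 1/L (the fault dimension) and falls off as k^-2
    above it, the signature of the k^-2 source model."""
    rng = np.random.default_rng(seed)
    k = np.fft.rfftfreq(n, d=L / n)                   # wavenumber, cycles/km
    kc = 1.0 / L                                      # corner at the fault size
    amp = 1.0 / (1.0 + (k / kc) ** 2)                 # ~k^-2 beyond kc
    phase = np.exp(2j * np.pi * rng.random(k.size))   # random phases
    slip = np.fft.irfft(amp * phase, n)
    return slip - slip.min()                          # shift so slip >= 0

slip = k2_slip()
```

    Summing such distributions band by band in k, each band as a sub-event, is the decomposition the abstract refers to.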

  16. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  17. Topographic changes and their driving factors after 2008 Wenchuan Earthquake

    Science.gov (United States)

    Li, C.; Wang, M.; Xie, J.; Liu, K.

    2017-12-01

    The Wenchuan Ms 8.0 earthquake caused topographic change in the stricken areas through the formation of numerous coseismic landslides. The emergence of new landslides and debris flows and the movement of loose material under the driving force of heavy rainfall could further shape the local topography. Dynamic topographic changes in mountainous areas stricken by major earthquakes have a strong linkage to the development and occurrence of secondary disasters. However, little attention has been paid to continuously monitoring mountain environment change after such earthquakes. A digital elevation model (DEM) is the principal representation of the terrain surface. In our research, we extracted DEMs for 2013 and 2015 of a typical mountainous area severely impacted by the 2008 Wenchuan earthquake from ZY-3 stereo-pair images, with validation by field measurement. Combined with elevation datasets from 2002 and 2010, we quantitatively assessed elevation changes in different years and qualitatively analyzed the spatiotemporal variation of the terrain and mass movement across the study area. The results show that the stricken area experienced substantial elevation changes caused by seismic forces and subsequent rainfall. Post-earthquake deposits have accumulated mainly in river channels, on mountain ridges and in deep gullies, which increases the risk of further geo-hazards, and the heavy rainfall after the earthquake has become the biggest driver of elevation reduction, overwhelming the elevation increase during the main shock. Our study provides a better understanding of the subsequent hazards and risks faced by residents and communities stricken by major earthquakes.
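    Once the multi-year DEMs are co-registered, the elevation-change assessment described above reduces to grid differencing. A toy sketch with hypothetical 2 × 2 grids (real grids come from the ZY-3 stereo imagery):

```python
import numpy as np

# Hypothetical co-registered DEM grids, elevations in metres
dem_2010 = np.array([[1200.0, 1210.0],
                     [1195.0, 1230.0]])
dem_2015 = np.array([[1198.0, 1212.0],
                     [1190.0, 1228.0]])

dh = dem_2015 - dem_2010     # negative values = elevation loss (erosion)
net_change = dh.mean()       # mean elevation change over the area
loss = dh[dh < 0].sum()      # total loss summed over eroding cells
```

    In practice the differencing is preceded by co-registration and vertical-bias correction, and the positive/negative cells are interpreted as deposition and erosion respectively.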

  18. Expanding Horizons in Mitigating Earthquake Related Disasters in Urban Areas: Global Development of Real-Time Seismology

    OpenAIRE

    Utkucu, Murat; Küyük, Hüseyin Serdar; Demir, İsmail Hakkı

    2016-01-01

    Real-time seismology is a newly developing alternative approach in seismology for mitigating earthquake hazard. It exploits up-to-date advances in seismic instrument technology, data acquisition, digital communications and computer systems to quickly transform data into earthquake information in real time, in order to reduce earthquake losses and their impact on social and economic life in earthquake-prone, densely populated urban and industrial areas. Real-time seismology systems are not o...

  19. A Mw 6.3 earthquake scenario in the city of Nice (southeast France): ground motion simulations

    Science.gov (United States)

    Salichon, Jérome; Kohrs-Sansorny, Carine; Bertrand, Etienne; Courboulex, Françoise

    2010-07-01

    The southern Alps-Ligurian basin junction is one of the most seismically active zones of western Europe, showing constant microseismicity and moderate-size events (Mw > 3.5). We consider the case of an offshore Mw 6.3 earthquake located at the place where two moderate-size events (Mw 4.5) occurred recently and where a morphotectonic feature has been detected by a bathymetric survey. We used a stochastic empirical Green's function (EGF) summation method to produce a population of realistic accelerograms at rock and soil sites in the city of Nice. The ground motion simulations are calibrated at a rock site against a set of ground motion prediction equations (GMPEs) in order to estimate a reasonable stress-drop ratio between the 25 February 2001 Mw 4.5 event, taken as an EGF, and the target earthquake. Our results show that the combination of the GMPE and EGF techniques is an interesting tool for site-specific strong ground motion estimation.

  20. Investigating landslides caused by earthquakes - A historical review

    Science.gov (United States)

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness; for example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  1. Earthquakes and faults in southern California (1970-2010)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.

    2012-01-01

    The map depicts both active and inactive faults and earthquakes magnitude 1.5 to 7.3 in southern California (1970–2010). The bathymetry was generated from digital files from the California Department of Fish And Game, Marine Region, Coastal Bathymetry Project. Elevation data are from the U.S. Geological Survey National Elevation Database. Landsat satellite image is from fourteen Landsat 5 Thematic Mapper scenes collected between 2009 and 2010. Fault data are reproduced with permission from 2006 California Geological Survey and U.S. Geological Survey data. The earthquake data are from the U.S. Geological Survey National Earthquake Information Center.

  2. Discrimination between earthquakes and chemical explosions using artificial neural networks

    International Nuclear Information System (INIS)

    Kundu, Ajit; Bhadauria, Y.S.; Roy, Falguni

    2012-05-01

    An Artificial Neural Network (ANN) for discriminating between earthquakes and chemical explosions located at epicentral distances, Δ <5 deg from Gauribidanur Array (GBA) has been developed using the short period digital seismograms recorded at GBA. For training the ANN, spectral amplitude ratios between P and Lg phases computed at 13 different frequencies in the frequency range of 2-8 Hz, corresponding to 20 earthquakes and 23 chemical explosions, were used along with other parameters like magnitude, epicentral distance and amplitude ratios Rg/P and Rg/Lg. After training and development, the ANN correctly identified a set of 21 test events, comprising 6 earthquakes and 15 chemical explosions. (author)
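    A minimal sketch of the idea, with entirely synthetic features standing in for the GBA spectral-ratio measurements (the feature shifts, network size, and training settings below are illustrative assumptions, not the study's configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the GBA feature set (the real study used 13 P/Lg
# spectral ratios plus magnitude, distance, Rg/P and Rg/Lg); four Gaussian
# features with a mean shift between classes are enough to illustrate.
n = 200
y = rng.integers(0, 2, n)                      # 0 = earthquake, 1 = explosion
X = rng.normal(0.0, 1.0, (n, 4))
X[y == 1] += np.array([2.0, 1.5, 1.0, 0.8])    # shift the explosion class

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden tanh layer trained by batch gradient descent on log-loss.
W1 = rng.normal(0, 0.5, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, 8);      b2 = 0.0
lr = 0.5
for _ in range(3000):
    H = np.tanh(X @ W1 + b1)                   # hidden activations
    p = sigmoid(H @ W2 + b2)                   # P(explosion | features)
    g = (p - y) / n                            # dLoss/dlogit
    gW2 = H.T @ g
    gH = np.outer(g, W2) * (1.0 - H ** 2)      # backprop through tanh
    W2 -= lr * gW2; b2 -= lr * g.sum()
    W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(0)

acc = float(((p > 0.5) == y).mean())
print(f"training accuracy: {acc:.2f}")
```

    With well-separated classes the little network converges quickly; the real discriminant's value lies in the choice of physically meaningful spectral-ratio features, not in the network itself.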

  3. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia provides technically inclined and informed readers with an account of the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges over buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  4. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  5. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it is investigated how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer harder in terms of economic growth than richer areas. In addition,...

  6. Digital broadcasting

    International Nuclear Information System (INIS)

    Park, Ji Hyeong

    1999-06-01

    This book contains twelve chapters, which deals with digitization of broadcast signal such as digital open, digitization of video signal and sound signal digitization of broadcasting equipment like DTPP and digital VTR, digitization of equipment to transmit such as digital STL, digital FPU and digital SNG, digitization of transmit about digital TV transmit and radio transmit, digital broadcasting system on necessity and advantage, digital broadcasting system abroad and Korea, digital broadcasting of outline, advantage of digital TV, ripple effect of digital broadcasting and consideration of digital broadcasting, ground wave digital broadcasting of DVB-T in Europe DTV in U.S.A and ISDB-T in Japan, HDTV broadcasting, satellite broadcasting, digital TV broadcasting in Korea, digital radio broadcasting and new broadcasting service.

  7. Investigating Landslides Caused by Earthquakes A Historical Review

    Science.gov (United States)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  8. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take from 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
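    The detection step, a keyword-tweet rate rising far above a low background rate, can be sketched as follows; the threshold and background values are illustrative assumptions, not USGS settings:

```python
from datetime import datetime, timedelta

def detect_spike(timestamps, background_per_hour=1.0, threshold=60.0):
    """Return (minute, count) for the first minute whose keyword-tweet
    count exceeds `threshold` times the expected per-minute background,
    or None. A toy stand-in for the rate test described in the abstract."""
    counts = {}
    for t in timestamps:
        minute = t.replace(second=0, microsecond=0)   # bin by minute
        counts[minute] = counts.get(minute, 0) + 1
    cutoff = threshold * background_per_hour / 60.0
    for minute in sorted(counts):                     # chronological scan
        if counts[minute] > cutoff:
            return minute, counts[minute]
    return None

# Usage: a quiet hour of scattered tweets, then 150 tweets in one minute
# (mimicking the Morgan Hill burst of ~150 per minute reported above).
t0 = datetime(2009, 3, 30, 10, 40)
quiet = [t0 + timedelta(minutes=7 * i) for i in range(5)]
burst = [t0 + timedelta(hours=1, seconds=s % 60) for s in range(150)]
hit = detect_spike(quiet + burst)
print(hit)
```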

  9. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  10. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  11. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  12. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The Turkish Catastrophe Insurance Pool (TCIP) system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model the losses will not be indemnified, but will instead be calculated directly on the basis of indexed ground motion levels and damages.
The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  13. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500 people.

  14. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  15. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed
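    The sensitivity to the lower-bound magnitude can be illustrated with a truncated Gutenberg-Richter recurrence law; the a and b values below are hypothetical, not the LLNL or EPRI inputs:

```python
def gr_rate(m_min, m_max=8.0, a=4.0, b=1.0):
    """Annual rate of earthquakes with magnitude in [m_min, m_max] from a
    truncated Gutenberg-Richter law, log10 N(>=m) = a - b*m. The a and b
    values are illustrative assumptions for a hypothetical source zone."""
    return 10 ** (a - b * m_min) - 10 ** (a - b * m_max)

# Event rates entering a hazard integral under the two lower bounds
# mentioned in the abstract (3.75 for LLNL, 5.0 for EPRI).
rate_llnl = gr_rate(3.75)
rate_epri = gr_rate(5.0)
print(f"rate above M3.75: {rate_llnl:.3f}/yr")
print(f"rate above M5.0:  {rate_epri:.3f}/yr")
print(f"ratio: {rate_llnl / rate_epri:.1f}x")
```

    With b = 1, lowering the cutoff from 5.0 to 3.75 multiplies the number of events feeding the hazard calculation by more than an order of magnitude, which is why the choice matters even though each small event contributes little.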

  16. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000km X1000km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's foot print on earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degree, and plotting these two variables for earthquakes from different small regions reveals a simple 45 degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences for magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]. The left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat.+3.30, Long+95.980, Mb 9.0, EQ count 376). The right-hand figure provides an earthquake plot for (EMD+SEM) vs GMT timings for the same data. All the 376 events including the main event faithfully follow the straight-line curve.

  17. Earthquake responses of a beam supported by a mechanical snubber

    International Nuclear Information System (INIS)

    Ohmata, Kenichiro; Ishizu, Seiji.

    1989-01-01

    The mechanical snubber is an earthquakeproof device for piping systems under particular circumstances such as high temperature and radioactivity. It has nonlinearities in both load and frequency response. In this report, the resisting force characteristics of the snubber and earthquake responses of piping (a simply supported beam) which is supported by the snubber are simulated using Continuous System Simulation Language (CSSL). Digital simulations are carried out for various kinds of physical properties of the snubber. The restraint effect and the maximum resisting force of the snubber during earthquakes are discussed and compared with the case of an oil damper. The earthquake waves used here are El Centro N-S and Akita Harbour N-S (Nihonkai-Chubu earthquake). (author)

  18. Preparing Haitian youth for digital jobs | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Unemployment, at 40%, is the highest in the region, and the 2010 earthquake ... For countries like Haiti, the digital economy could offer new employment prospects ... Wages and labour regulation, especially for online outsourcing, are often ...

  19. Implications of the Mw9.0 Tohoku-Oki earthquake for ground motion scaling with source, path, and site parameters

    Science.gov (United States)

    Stewart, Jonathan P.; Midorikawa, Saburoh; Graves, Robert W.; Khodaverdi, Khatareh; Kishida, Tadahiro; Miura, Hiroyuki; Bozorgnia, Yousef; Campbell, Kenneth W.

    2013-01-01

    The Mw9.0 Tohoku-oki Japan earthquake produced approximately 2,000 ground motion recordings. We consider 1,238 three-component accelerograms corrected with component-specific low-cut filters. The recordings have rupture distances between 44 km and 1,000 km, time-averaged shear wave velocities of VS30 = 90 m/s to 1,900 m/s, and usable response spectral periods of 0.01 sec to >10 sec. The data support the notion that the increase of ground motions with magnitude saturates at large magnitudes. High-frequency ground motions demonstrate faster attenuation with distance in backarc than in forearc regions, which is only captured by one of the four considered ground motion prediction equations for subduction earthquakes. Recordings within 100 km of the fault are used to estimate event terms, which are generally positive (indicating model underprediction) at short periods and zero or negative (overprediction) at long periods. We find site amplification to scale minimally with VS30 at high frequencies, in contrast with other active tectonic regions, but to scale strongly with VS30 at low frequencies.

  20. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  1. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  2. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  3. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  4. Seismic ground motion modelling and damage earthquake scenarios: A bridge between seismologists and seismic engineers

    International Nuclear Information System (INIS)

    Panza, G.F.; Romanelli, F.; Vaccari, F. (e-mails: Luis.Decanini@uniroma1.it; Fabrizio.Mollaioli@uniroma1.it)

    2002-07-01

    The input for the seismic risk analysis can be expressed with a description of 'ground shaking scenarios', or with probabilistic maps of the relevant parameters. The probabilistic approach, unavoidably based upon rough assumptions and models (e.g. recurrence and attenuation laws), can be misleading, as it cannot take into account, with satisfactory accuracy, some of the most important aspects like rupture process, directivity and site effects. This is evidenced by the comparison of recent recordings with the values predicted by the probabilistic methods. We prefer a scenario-based, deterministic approach in view of the limited seismological data, of the local irregularity of the occurrence of strong earthquakes, and of the multiscale seismicity model, which is capable of reconciling two apparently conflicting ideas: the Characteristic Earthquake concept and the Self Organized Criticality paradigm. Where the numerical modeling is successfully compared with records, the synthetic seismograms permit the microzoning, based upon a set of possible scenario earthquakes. Where no recordings are available the synthetic signals can be used to estimate the ground motion without having to wait for a strong earthquake to occur (pre-disaster microzonation). In both cases the use of modeling is necessary since the so-called local site effects can be strongly dependent upon the properties of the seismic source and can be properly defined only by means of envelopes. The joint use of reliable synthetic signals and observations permits the computation of advanced hazard indicators (e.g. damaging potential) that take into account local soil properties. The envelope of synthetic elastic energy spectra reproduces the distribution of the energy demand in the most relevant frequency range for seismic engineering. The synthetic accelerograms can be fruitfully used for design and strengthening of structures, also when innovative techniques, like seismic isolation, are employed. For these

  5. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of the great earthquake during and immediately after. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science, where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘ …the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the earth evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, the plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite the Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with results of the last publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  6. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
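    The counting step described above can be sketched as follows; the magnitude cutoffs and the synthetic catalog are illustrative, and the function is a simplified reading of the published method, not its implementation:

```python
def nowcast(mags, m_small=4.0, m_large=6.0):
    """Earthquake potential score for the current cycle: the fraction of
    completed large-event cycles that contained no more small events than
    the current, still-open cycle. `mags` is a catalog in time order."""
    counts, n = [], 0
    for m in mags:
        if m >= m_large:
            counts.append(n)      # a large event closes the current cycle
            n = 0
        elif m >= m_small:
            n += 1                # count "small" events in the open cycle
    if not counts:
        return None               # no completed cycle: nothing to rank against
    return sum(c <= n for c in counts) / len(counts)

# Usage: three completed cycles with 10, 20 and 30 small events, and a
# current open cycle with 25, placing it two-thirds through the history.
mags = [4.5] * 10 + [6.5] + [4.5] * 20 + [6.5] + [4.5] * 30 + [6.5] + [4.5] * 25
score = nowcast(mags)
print(score)
```

    Because the score is just a rank within catalog counts, no model parameters are fitted, which matches the abstract's point that nowcasting is data analysis rather than modeling.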

  7. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time on the basis of the Spitak earthquake of December 1988 (Armenia, December 1988) experience it is found out that the earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open space of close-to-earth atmosphere, are contrastingly displayed in covered premises (dwellings, schools, kindergartens) even if they are at considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes includes the period from the first fore-shock to the last after-shock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured - 25 000 people, radiation-induced diseases in people - over 300 000). The influence of radiation directly correlates with the earthquake force. Such a conclusion is underpinned by indoor radon monitoring data for Yerevan (120 km from the epicenter) since 1987 (5450 measurements) and multivariate analysis with identification of cause-and-effect linkages between geodynamics of indoor radon under stable conditions of the Earth's crust, behavior of radon in different geological mediums during earthquakes, levels of room radon concentrations and effective equivalent dose of radiation, impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered as consequences of prolonged radiation influence on human organism: long-lasting state of apathy and indifference typical of the population of Armenia during the period of more than a year after the earthquake, prevalence of malignant cancer forms in disaster zones, dominating lung cancer and so on. 
All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  8. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. 
The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
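    The contrast between the two count distributions can be illustrated by simulation; the parameters below are illustrative, not values fitted to the CMT or PDE catalogues:

```python
import numpy as np

rng = np.random.default_rng(42)

# Negative-binomial counts (overdispersed) vs Poisson counts of equal mean.
# NumPy's parameterization: mean = r * (1 - p) / p, so p = r / (r + mean).
mean = 10.0
r = 2.0
p = r / (r + mean)
nbd = rng.negative_binomial(r, p, 100_000)
poi = rng.poisson(mean, 100_000)

def dispersion(x):
    """Variance-to-mean ratio: 1 for Poisson, >1 when overdispersed."""
    return x.var() / x.mean()

def skewness(x):
    """Third standardized moment, as used in the moment comparison above."""
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

print(f"Poisson dispersion {dispersion(poi):.2f}, NBD dispersion {dispersion(nbd):.2f}")
print(f"Poisson skewness  {skewness(poi):.2f}, NBD skewness  {skewness(nbd):.2f}")
```

    The second NBD parameter inflates the variance well above the mean, which is the clustering signature the abstract describes; the higher skewness of the NBD sample mirrors the moment comparison used to reject the Poisson assumption.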

  9. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered as a phenomenon of wave energy radiation by rupture (fracture) of solid Earth. However, the physics of dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and generation of strong waves, has not been fully understood yet. Instead, much of former investigation in seismology evaluated earthquake characteristics in terms of kinematics that does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to be explained by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study on rupture and wave dynamics, namely, possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  10. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  11. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  12. Earthquakes; May-June 1982

    Science.gov (United States)

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  13. Principles for selecting earthquake motions in engineering design of large dams

    Science.gov (United States)

    Krinitzsky, E.L.; Marcuson, William F.

    1983-01-01

    This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable for other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analysis is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and its spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives best results. Each part of the site investigation requires a number of decisions. In some cases, the decision to use a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful.
For example, peak motions at

  14. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of scientific content by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim of converting earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and perception of movement (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  15. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their life styles accordingly, the damage they suffer will be reduced. In particular, good training regarding earthquakes received in primary schools is considered…

  16. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period. This was a magnitude 7.1 in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  17. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  18. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  19. [Earthquakes in El Salvador].

    Science.gov (United States)

    de Ville de Goyet, C

    2001-02-01

    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

  20. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over more than a century-time period. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  1. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquakes remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results upon rocks, however, shows a large discrepancy with measurement — a fact that has been dubbed "the heat flow paradox". For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquakes from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that have emerged in recent years. (1) Stress distribution in earth’s crust: Without taking the tectonic force into account, according to the rheological principle of “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time, thus no differential stress in the original crustal rocks is to be expected. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge are different from rocks as it consists of granular matter. The elastic moduli of the fault gouges are much less than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force increasing with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  2. The EM Earthquake Precursor

    Science.gov (United States)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion, due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  3. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  4. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has a moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average period of recurrence is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly, along with the most important studies; last but not least, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  5. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs
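The "return periods versus ground motion parameters" relation above rests on the standard assumption that exceedances follow a Poisson process in time. A minimal sketch of the resulting exceedance probability (illustrative, not part of this study, with made-up numbers):

```python
import math

def exceedance_probability(return_period_years: float,
                           exposure_years: float) -> float:
    """Probability of at least one exceedance during the exposure
    time, assuming occurrences follow a Poisson process whose mean
    recurrence interval is `return_period_years`."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

# A ground motion with a 475-year return period has roughly a 10%
# chance of being exceeded during a 50-year exposure.
p = exceedance_probability(475.0, 50.0)
```

This is why hazard maps are often quoted as "10% probability of exceedance in 50 years": under the Poisson model that phrasing and a 475-year return period are the same statement.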

  6. A new way of telling earthquake stories: MOBEE - the MOBile Earthquake Exhibition

    Science.gov (United States)

    Tataru, Dragos; Toma-Danila, Dragos; Nastase, Eduard

    2016-04-01

    In the last decades, the demand for and acknowledged importance of science outreach, in general and geophysics in particular, has grown, as demonstrated by many international and national projects and other activities performed by research institutes. The National Institute for Earth Physics (NIEP) from Romania is the leading national institution on earthquake monitoring and research, having at the same time a declared focus on informing and educating a wide audience about geosciences and especially seismology. This is more than welcome, since Romania is a very active country from a seismological point of view, but not too reactive when it comes to diminishing the possible effects of a major earthquake. Over the last few decades, the country has experienced several major earthquakes which have claimed thousands of lives and millions in property damage (the 1940, 1977, 1986 and 1990 Vrancea earthquakes). In this context, during a partnership started in 2014 together with the National Art University and the Siveco IT company, a group of researchers from NIEP initiated the MOBile Earthquake Exhibition (MOBEE) project. The main goal was to design a portable museum to bring on the road educational activities focused on seismology, seismic hazard and Earth science. The exhibition is mainly aimed at school students of all ages, as it explains the main topics of geophysics through a unique combination of posters, digital animations and apps, large markets and exciting hands-on experiments, and 3D printed models. This project is singular in Romania and aims to transmit properly reviewed, current information regarding the definition of earthquakes, the way natural hazards can affect people, buildings and the environment, and the measures to be taken to prevent an aftermath. Many of the presented concepts can be used by teachers as a complementary way of demonstrating physics facts and concepts and explaining processes that shape the dynamic Earth's features. It also involves

  7. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
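A consistency test of the kind described here (a number, or "N-", test on earthquake counts) can be sketched as follows. This is a simplified illustration assuming a Poisson forecast in a single bin, not the RELM project's actual implementation:

```python
from scipy import stats

def number_test(forecast_rate: float, observed_count: int) -> float:
    """One-sided quantile score delta = P(N <= observed | forecast),
    under the assumption that the number of earthquakes in a bin is
    Poisson-distributed with mean `forecast_rate`.  Values of delta
    very close to 0 or 1 flag a forecast that is inconsistent with
    the observed number of events."""
    return stats.poisson.cdf(observed_count, forecast_rate)

# A model forecasting 10 events when only 4 occurred is rejected at
# roughly the 3% level (delta ~ 0.03).
delta = number_test(10.0, 4)
```

In the full RELM framework, analogous likelihood-based scores are also computed over space and magnitude bins and compared between pairs of models; the quantile-score idea above is the simplest member of that family.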

  8. Digital radiography

    International Nuclear Information System (INIS)

    Brody, W.R.

    1984-01-01

    Digital Radiography begins with an orderly introduction to the fundamental concepts of digital imaging. The entire X-ray digital imaging system is described, from an overall characterization of image quality to specific components required for a digital radiographic system. Because subtraction is central to digital radiographic systems, the author details the use of various subtraction methods for image enhancement. Complex concepts are illustrated with numerous examples and presented in terms that can readily be understood by physicians without an advanced mathematics background. The second part of the book discusses implementations and applications of digital imaging systems based on area and scanned detector technologies. This section includes thorough coverage of digital fluoroscopy, scanned projection radiography, and film-based digital imaging systems, and features a state-of-the-art synopsis of the applications of digital subtraction angiography. The book concludes with a timely assessment of anticipated technological advances.

  9. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. 
The antennae have mobility and observations were noted for

  10. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  11. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Introduction: On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives: This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods: 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results: About a third of patients reported pain (prevalence 34.6%). More than half of pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions: This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  12. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  13. Housing Damage Following Earthquake

    Science.gov (United States)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  14. Digital Culture and Digital Library

    Directory of Open Access Journals (Sweden)

    Yalçın Yalçınkaya

    2016-12-01

    Full Text Available In this study, digital culture and digital library, which have a vital connection with each other, are examined together. The content of the research consists of the interaction of culture, information, digital culture, intellectual technologies, and digital library concepts. The study is an entry work to the integrity of digital culture and digital library theories and aims to expand the symmetry. The purpose of the study is to emphasize the relation between the digital culture and digital library theories acting at the intersection of the subjects that are examined. Also the perspective of the study is based on examining the literature and analytical evaluation in both fields (digital culture and digital library). Within this context, the methodology of the study is essentially descriptive and has an attribute for the transmission and synthesis of distributed findings produced in the field of the research. According to the findings of the study results, digital culture is an inclusive term that describes the effects of intellectual technologies in the field of information and communication. Information becomes energy and the spectrum of the information is expanding in the vertical rise through the digital culture. In this context, the digital library appears as a new living space of a new environment. In essence, the digital library is information-oriented; has intellectual technology support and digital platform; is in a digital format; combines information resources and tools in relationship/communication/cooperation by connectedness, and also it is the dynamic face of the digital culture in time and space independence. Resolved with the study is that the digital libraries are active and effective in the formation of global knowing and/or mass wisdom in the process of digital culture.

  15. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  16. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  17. An interactive program on digitizing historical seismograms

    Science.gov (United States)

    Xu, Yihe; Xu, Tao

    2014-02-01

    Retrieving information from analog seismograms is of great importance since they are considered as the unique sources that provide quantitative information of historical earthquakes. We present an algorithm for automatic digitization of the seismograms as an inversion problem that forms an interactive program using Matlab® GUI. The program integrates automatic digitization with manual digitization and users can easily switch between the two modalities and carry out different combinations for the optimal results. Several examples about applying the interactive program are given to illustrate the merits of the method.
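    The abstract frames digitization as an inversion problem solved in a Matlab® GUI. As a much simpler illustration of the core image-to-trace step, the sketch below recovers a single trace from a synthetic scanned page by taking the ink-weighted centroid of each pixel column; this is an assumed toy approach, not the authors' algorithm.

    ```python
    import numpy as np

    def extract_trace(image, background=255):
        """Recover one seismogram trace from a scanned image.

        image: 2-D uint8 array (rows = y, columns = time axis), dark trace
        on light paper. Returns one y-coordinate per column, estimated as
        the intensity-weighted centroid of the dark pixels in that column.
        """
        darkness = background - image.astype(float)  # 0 = paper, high = ink
        rows = np.arange(image.shape[0], dtype=float)
        weights = darkness.sum(axis=0)
        # Blank columns (no ink) fall back to mid-height.
        centroid = np.where(
            weights > 0,
            (darkness * rows[:, None]).sum(axis=0) / np.maximum(weights, 1e-9),
            image.shape[0] / 2.0,
        )
        return centroid

    # Synthetic example: a sine-shaped "trace" drawn onto a blank page.
    h, w = 100, 400
    page = np.full((h, w), 255, dtype=np.uint8)
    true_y = (50 + 30 * np.sin(np.linspace(0, 4 * np.pi, w))).astype(int)
    page[true_y, np.arange(w)] = 0  # ink pixels
    recovered = extract_trace(page)
    print(np.abs(recovered - true_y).max())
    ```

    Real scans add skew, crossing traces, and timing marks, which is why interactive manual correction, as in the paper, remains necessary.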

  18. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and earthquake damage analysis, the equivalent static analysis method and its application, dynamic analysis methods such as time history analysis by mode superposition and by direct integration, and design spectrum analysis for earthquake-resistant design in Korea, including the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.
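    The base-shear and story-load steps listed above can be sketched generically. The coefficient form V = Cs·W and the w·h^k story distribution are common to many seismic codes, but the coefficient and weights below are illustrative placeholders, not the Korean code's actual values.

    ```python
    def base_shear(seismic_coefficient, total_weight):
        """Equivalent static base shear V = Cs * W."""
        return seismic_coefficient * total_weight

    def story_forces(story_weights, story_heights, V, k=1.0):
        """Distribute V over the stories in proportion to w_x * h_x**k
        (k = 1 gives the common inverted-triangular pattern)."""
        wh = [w * h**k for w, h in zip(story_weights, story_heights)]
        total = sum(wh)
        return [V * x / total for x in wh]

    # Hypothetical three-story example: equal story weights, 3 m stories.
    weights = [2000.0, 2000.0, 2000.0]   # kN
    heights = [3.0, 6.0, 9.0]            # m above base
    V = base_shear(0.10, sum(weights))   # Cs = 0.10, illustrative only
    print(V)                             # 600.0 (kN)
    print(story_forces(weights, heights, V))  # [100.0, 200.0, 300.0]
    ```

    The forces grow linearly with height here because k = 1; codes raise k for longer-period buildings to push more force to upper stories.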

  19. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments
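    The amplitude-based definition of magnitude can be made concrete with a Richter-style formula: magnitude is the logarithm of the recorded amplitude plus a distance correction. The calibration used below is the Hutton and Boore (1987) southern California -log A0 relation, shown purely as an illustration; other regions use different correction terms.

    ```python
    import math

    def local_magnitude(amplitude_mm, distance_km):
        """Richter-style local magnitude from peak Wood-Anderson amplitude.

        Uses the Hutton & Boore (1987) southern California calibration as
        an example; amplitude in mm, hypocentral distance in km.
        """
        return (math.log10(amplitude_mm)
                + 1.11 * math.log10(distance_km / 100.0)
                + 0.00189 * (distance_km - 100.0)
                + 3.0)

    # Richter's anchor: a 1 mm amplitude at 100 km defines ML = 3.0.
    print(round(local_magnitude(1.0, 100.0), 2))  # 3.0
    ```

    A tenfold increase in amplitude at fixed distance raises ML by exactly one unit, which is the sense in which the scale is logarithmic.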

  20. Digital mammography; Mamografia digital

    Energy Technology Data Exchange (ETDEWEB)

    Chevalier, M.; Torres, R.

    2010-07-01

    Mammography represents one of the most demanding radiographic applications, simultaneously requiring excellent contrast sensitivity, high spatial resolution, and wide dynamic range. Film/screen is the most widely extended image receptor in mammography due to both its high spatial resolution and contrast. The film/screen limitations are related with its narrow latitude, structural noise and that is at the same time the medium for the image acquisition, storage and presentation. Several digital detector made with different technologies can overcome these difficulties. Here, these technologies as well as their main advantages and disadvantages are analyzed. Also it is discussed its impact on the mammography examinations, mainly on the breast screening programs. (Author).

  1. Digital Tectonics

    DEFF Research Database (Denmark)

    Christiansen, Karl; Borup, Ruben; Søndergaard, Asbjørn

    2014-01-01

    Digital Tectonics treats the architectonical possibilities in digital generation of form and production. The publication is the first volume of a series, in which aspects of the strategic focus areas of the Aarhus School of Architecture will be disseminated.

  2. Digital squares

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Kim, Chul E

    1988-01-01

    Digital squares are defined and their geometric properties characterized. A linear time algorithm is presented that considers a convex digital region and determines whether or not it is a digital square. The algorithm also determines the range of the values of the parameter set of its preimages. The analysis involves transforming the boundary of a digital region into parameter space of slope and y-intercept…

  3. Digital skrivedidaktik

    DEFF Research Database (Denmark)

    Digital skrivedidaktik (digital writing didactics) consists of two parts. The first part presents theory on writing competence and digital writing. Digital writing is characterized by texts being written on a computer and with digital tools, which changes the traditional practice, product, and processes of writing. What is digital… about the student's writing process) and blog writing (which strengthens students in using blogs in teaching)…

  4. Digital Citizenship

    Science.gov (United States)

    Isman, Aytekin; Canan Gungoren, Ozlem

    2014-01-01

    The era in which we live is known and referred to as the digital age. In this age technology is rapidly changing and developing. In light of these technological advances in the 21st century, schools have the responsibility of training "digital citizens" as well as good citizens. Digital citizens must have extensive skills, knowledge, Internet and …

  5. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
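    The reasoning in this abstract, that aftershocks temporarily inflate the rate of candidate events and thereby depress the foreshock probability, which then recovers as the sequence decays, can be sketched with a modified-Omori decay. All rates and parameters below are hypothetical placeholders, not the fitted values of the paper.

    ```python
    def omori_rate(t_days, K=50.0, c=0.05, p=1.0):
        """Modified-Omori aftershock rate K / (c + t)**p in events/day.
        K, c, p are illustrative, not fitted to any real sequence."""
        return K / (c + t_days) ** p

    def foreshock_probability(t_days, rate_foreshock=0.02, rate_background=0.2):
        """Probability that a new event near the fault is a foreshock:
        the foreshock rate divided by the total rate of candidate events,
        where aftershocks of a prior mainshock add to the background."""
        total = rate_foreshock + rate_background + omori_rate(t_days)
        return rate_foreshock / total

    # Depressed just after the mainshock, recovering as aftershocks decay.
    for t in (1.0, 30.0, 365.0):
        print(t, foreshock_probability(t))
    ```

    As t grows the Omori term vanishes and the probability approaches its aftershock-free ceiling, rate_foreshock / (rate_foreshock + rate_background).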

  6. Digital subtraktion

    DEFF Research Database (Denmark)

    Mussmann, Bo Redder

    2004-01-01

    Digital subtraction is a method for removing unwanted information from an X-ray image. The subtraction technique is used primarily in connection with angiography, where only the vessel itself is of interest. In everyday usage, digital subtraction is therefore synonymous with DSA or DVI (Digital Subtraction Angiography and Digital Vascular Imaging, respectively); the two designations are two X-ray vendors' names for the same technique. Digital subtraction requires special software, and the equipment must be able to expose in series.

  7. Digital preservation

    CERN Document Server

    Deegan, Marilyn

    2013-01-01

    Digital preservation is an issue of huge importance to the library and information profession right now. With the widescale adoption of the internet and the rise of the world wide web, the world has been overwhelmed by digital information. Digital data is being produced on a massive scale by individuals and institutions: some of it is born, lives and dies only in digital form, and it is the potential death of this data, with its impact on the preservation of culture, that is the concern of this book. So how can information professionals try to remedy this? Digital preservation is a complex iss

  8. Digital Natives or Digital Tribes?

    Science.gov (United States)

    Watson, Ian Robert

    2013-01-01

    This research builds upon the discourse surrounding digital natives. A literature review into the digital native phenomena was undertaken and found that researchers are beginning to identify the digital native as not one cohesive group but of individuals influenced by other factors. Primary research by means of questionnaire survey of technologies…

  9. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  10. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  11. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provides an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved the locations of the events and recalculated moment magnitudes in order to obtain unified magnitude
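    Unifying magnitudes across a catalogue typically relies on published conversion relations between magnitude scales. As an illustration (not the IES catalogue's actual procedure), the sketch below applies the global Ms-to-Mw regressions of Scordilis (2006):

    ```python
    def ms_to_mw(ms):
        """Proxy moment magnitude from surface-wave magnitude, using the
        global regressions of Scordilis (2006). Shown for illustration of
        magnitude unification, not as the catalogue's own method."""
        if 3.0 <= ms <= 6.1:
            return 0.67 * ms + 2.07
        if 6.2 <= ms <= 8.2:
            return 0.99 * ms + 0.08
        raise ValueError("outside the calibrated Ms ranges")

    # Example: an Ms 6.9 event (the size reported for Spitak 1988).
    print(round(ms_to_mw(6.9), 2))
    ```

    Applying one fixed relation to every event is what makes the resulting magnitude column internally comparable, at the cost of the scatter inherent in the regression.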

  12. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  13. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  14. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  15. Ground Motion Characteristics of Induced Earthquakes in Central North America

    Science.gov (United States)

    Atkinson, G. M.; Assatourians, K.; Novakovic, M.

    2017-12-01

    The ground motion characteristics of induced earthquakes in central North America are investigated based on empirical analysis of a compiled database of 4,000,000 digital ground-motion records from events in induced-seismicity regions (especially Oklahoma). Ground-motion amplitudes are characterized non-parametrically by computing median amplitudes and their variability in magnitude-distance bins. We also use inversion techniques to solve for regional source, attenuation and site response effects. Ground motion models are used to interpret the observations and compare the source and attenuation attributes of induced earthquakes to those of their natural counterparts. Significant conclusions are that the stress parameter that controls the strength of high-frequency radiation is similar for induced earthquakes (typical depth h ≈ 5 km) and shallow (h < 5 km) natural earthquakes. By contrast, deeper natural earthquakes (h > 10 km) have stronger high-frequency ground motions. At distances close to the epicenter, a greater focal depth (which increases distance from the hypocenter) counterbalances the effects of a larger stress parameter, resulting in motions of similar strength close to the epicenter, regardless of event depth. The felt effects of induced versus natural earthquakes are also investigated using USGS "Did You Feel It?" reports; 400,000 reports from natural events and 100,000 reports from induced events are considered. The felt reports confirm the trends that we expect based on ground-motion modeling, considering the offsetting effects of the stress parameter versus focal depth in controlling the strength of motions near the epicenter. Specifically, felt intensity for a given magnitude is similar near the epicenter, on average, for all event types and depths. At distances more than 10 km from the epicenter, deeper events are felt more strongly than shallow events.
These ground-motion attributes imply that the induced-seismicity hazard is most critical for facilities in
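    The non-parametric step described above, median amplitudes and their variability in magnitude-distance bins, can be sketched on synthetic records as follows; this is a generic illustration of the technique, not the authors' code.

    ```python
    import numpy as np

    def binned_medians(mags, dists_km, log_amps, mag_edges, dist_edges):
        """Median log ground-motion amplitude and its scatter in
        magnitude-distance bins. Returns {(mag_bin, dist_bin): (median, std)}."""
        stats = {}
        mi = np.digitize(mags, mag_edges) - 1
        di = np.digitize(dists_km, dist_edges) - 1
        for i in range(len(mag_edges) - 1):
            for j in range(len(dist_edges) - 1):
                sel = (mi == i) & (di == j)
                if sel.sum() >= 3:  # require a minimal sample per bin
                    vals = log_amps[sel]
                    stats[(i, j)] = (np.median(vals), vals.std())
        return stats

    # Synthetic records: log amplitude grows with M and decays with log R.
    rng = np.random.default_rng(0)
    M = rng.uniform(3.0, 5.0, 5000)
    R = rng.uniform(5.0, 200.0, 5000)
    logA = M - 2.0 * np.log10(R) + rng.normal(0.0, 0.3, 5000)
    table = binned_medians(M, R, logA,
                           mag_edges=[3.0, 4.0, 5.0],
                           dist_edges=[5.0, 50.0, 200.0])
    for key in sorted(table):
        print(key, table[key])
    ```

    The medians rise across magnitude bins and fall across distance bins, recovering the trends built into the synthetic data without assuming any parametric ground-motion model.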

  16. Digital mammography

    International Nuclear Information System (INIS)

    Bick, Ulrich; Diekmann, Felix

    2010-01-01

    This state-of-the-art reference book provides in-depth coverage of all aspects of digital mammography, including detector technology, image processing, computer-aided diagnosis, soft-copy reading, digital workflow, and PACS. Specific advantages and disadvantages of digital mammography in comparison to screen-film mammography are thoroughly discussed. By including authors from both North America and Europe, the book is able to outline variations in the use, acceptance, and quality assurance of digital mammography between the different countries and screening programs. Advanced imaging techniques and future developments such as contrast mammography and digital breast tomosynthesis are also covered in detail. All of the chapters are written by internationally recognized experts and contain numerous high-quality illustrations. This book will be of great interest both to clinicians who already use or are transitioning to digital mammography and to basic scientists working in the field. (orig.)

  17. Digital Insights

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark

    This dissertation forwards the theory of digital consumer-response as a perspective to examine how digital media practices influence consumers' response to advertising. Digital consumer-response is a development of advertising theory that encompasses how consumers employ their knowledge and practices with digital media when they meet and interpret advertising. Through studies of advertising response on YouTube and experiments with consumers' response to digitally manipulated images, the dissertation shows how digital media practices facilitate polysemic and socially embedded advertising. By incorporating media as both channel, frame, and apparatus for advertising response, the dissertation brings into attention that more aspects than the text-reader relationship influence ad response. Finally, the dissertation proposes the assemblage approach for exploring big data in consumer culture research.

  18. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  19. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix, the seismic safety review for NPPs designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  20. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)]

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including the earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix, the seismic safety review for NPPs designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  1. Digital Signage

    OpenAIRE

    Fischer, Karl Peter

    2011-01-01

    Digital Signage for in-store advertising at gas stations/retail stores in Germany: a field study. Digital Signage networks provide a novel means of advertising with the advantage of easily changeable and highly customizable animated content. Despite the potential and increasing use of these media, empirical research is scarce. In a field study at 8 gas stations (with integrated convenience stores) we studied the effect of digital signage advertising on sales for different products and servi...

  2. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  3. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes including the 1978 Izu-Oshima-kinkai (M7.0) earthquake as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  4. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of mankind's earliest desires, and scientists have long worked to predict earthquakes. The results of these efforts can generally be divided into two methods of prediction: 1) statistical methods, and 2) empirical methods. In the first, earthquakes are predicted using statistics and probabilities, while the second utilizes a variety of precursors; the latter is time-consuming and more costly. However, neither method has yet produced fully satisfactory results. In this paper a new method, entitled the 'Kiana Method', is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then the time and the magnitude of a future earthquake are calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurement of electrical resistance in an area establishes whether or not the area is capable of earthquake occurrence in the future. If the result is positive, the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  5. Sports Digitalization

    DEFF Research Database (Denmark)

    Xiao, Xiao; Hedman, Jonas; Tan, Felix Ter Chian

    2017-01-01

    evolution, as digital technologies are increasingly entrenched in a wide range of sporting activities and for applications beyond mere performance enhancement. Despite such trends, research on sports digitalization in the IS discipline is surprisingly still nascent. This paper aims at establishing...... a discourse on sports digitalization within the discipline. Toward this, we first provide an understanding of the institutional characteristics of the sports industry, establishing its theoretical importance and relevance in our discipline; second, we reveal the latest trends of digitalization in the sports...

  6. Digital printing

    Science.gov (United States)

    Sobotka, Werner K.

    1997-02-01

    Digital printing is described as a tool to replace conventional printing machines completely. This goal has not yet been reached by any of the digital printing technologies described in the paper. Productivity and costs are still the main parameters and are not yet really solved; quality in digital printing is no longer a problem. Digital printing is defined as the transfer of digital data directly onto the paper surface. This step can be carried out directly or with the use of an intermediate image carrier. Keywords in digital printing are: computer-to-press; erasable image carrier; image carrier with memory. Digital printing is also the logical development of the new digital era, as pointed out in Nicholas Negroponte's book 'Being Digital', and the answer to networking and Internet technologies. Creating images, text and color in one country and publishing the data in another country or continent is the main advantage; printing on demand is another big advantage; and last but not least comes personalization. The biggest disadvantage is the cost of coping with this new world of prepress technology. Therefore the very optimistic growth rates for the next few years are really unfounded: the development of completely new markets is too slow, and the replacement of old markets is too small.

  7. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 years. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes.

  8. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started about 15 years ago has already pushed global information storage capacity past 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth system. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task and implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), claims of a high potential of the method are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done before claiming to predict hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies is evident: in particular, the Error Diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, which permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. The information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
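The Error Diagram mentioned above plots the rate of failures-to-predict against the alerted fraction of space-time, with random guessing as the reference diagonal. A minimal sketch of one such point (the function name and the toy data are illustrative assumptions, not from the study):

```python
import numpy as np

def molchan_point(alarm_mask, event_cells, cell_weights=None):
    """One point of a Molchan Error Diagram: the alerted fraction tau of
    space-time versus the rate of failures-to-predict nu.

    alarm_mask  : boolean array over space-time cells, True where an alarm is on
    event_cells : indices of cells in which target earthquakes occurred
    cell_weights: optional per-cell measure (e.g. normalized seismic rate, the
                  "Seismic Roulette" metric); uniform weights if None
    """
    alarm_mask = np.asarray(alarm_mask, dtype=bool)
    weights = (np.ones(alarm_mask.size) if cell_weights is None
               else np.asarray(cell_weights, dtype=float))
    weights = weights / weights.sum()
    tau = weights[alarm_mask].sum()          # alerted space-time fraction
    hits = alarm_mask[event_cells].sum()     # target events inside the alarm
    nu = 1.0 - hits / len(event_cells)       # rate of failures-to-predict
    return tau, nu

# Toy example: alarm on 4 of 10 cells; events in cells 1, 2 (hits) and 9 (miss)
tau, nu = molchan_point([True] * 4 + [False] * 6, [1, 2, 9])
# Random guessing lies on the diagonal nu = 1 - tau; skill means nu < 1 - tau
skill = (1.0 - tau) - nu
```

Comparing `nu` against `1 - tau` is exactly the comparison to random guessing described in the abstract: a method has skill only when its points fall below the diagonal.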

  9. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle-latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00–06:00 LT) and of fbEs averaged over the nighttime hours (18:00–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency, which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes and does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es trace on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere operate for "deep" and "crust" events.

  10. Earthquakes and faults in the San Francisco Bay area (1970-2003)

    Science.gov (United States)

    Sleeter, Benjamin M.; Calzia, James P.; Walter, Stephen R.; Wong, Florence L.; Saucedo, George J.

    2004-01-01

    The map depicts both active and inactive faults and earthquakes of magnitude 1.5 to 7.0 in the greater San Francisco Bay area. Twenty-two earthquakes of magnitude 5.0 and greater are indicated on the map and listed chronologically in an accompanying table. The data are compiled from records from 1970-2003. The bathymetry was generated from a digital version of NOAA maps and hydrogeographic data for San Francisco Bay. Elevation data are from the USGS National Elevation Database. The Landsat satellite image is composed of seven Landsat 7 Enhanced Thematic Mapper Plus scenes. Fault data are reproduced with permission from the California Geological Survey. The earthquake data are from the Northern California Earthquake Catalog.

  11. Spatiotemporal evolution of the completeness magnitude of the Icelandic earthquake catalogue from 1991 to 2013

    Science.gov (United States)

    Panzera, Francesco; Mignan, Arnaud; Vogfjörð, Kristin S.

    2017-07-01

    In 1991, a digital seismic monitoring network was installed in Iceland with a digital seismic system and automatic operation. After 20 years of operation, we explore for the first time its nationwide performance by analysing the spatiotemporal variations of the completeness magnitude. We use the Bayesian magnitude of completeness (BMC) method that combines local completeness magnitude observations with prior information based on the density of seismic stations. Additionally, we test the impact of earthquake location uncertainties on the BMC results, by filtering the catalogue using a multivariate analysis that identifies outliers in the hypocentre error distribution. We find that the entire North-to-South active rift zone shows a relatively low magnitude of completeness Mc in the range 0.5-1.0, highlighting the ability of the Icelandic network to detect small earthquakes. This work also demonstrates the influence of earthquake location uncertainties on the spatiotemporal magnitude of completeness analysis.
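The BMC idea of merging a locally observed completeness magnitude with a station-density prior can be illustrated as a precision-weighted average. A minimal sketch, assuming a maximum-curvature local estimate and Gaussian prior/observation uncertainties (function names, the +0.2 correction, and the combination rule are illustrative assumptions, not the authors' code):

```python
import numpy as np

def mc_maxc(mags, bin_width=0.1, correction=0.2):
    """Maximum-curvature estimate of the completeness magnitude Mc:
    the magnitude bin with the highest event count, plus an empirical
    correction (commonly +0.2)."""
    mags = np.asarray(mags, dtype=float)
    edges = np.arange(mags.min(), mags.max() + bin_width, bin_width)
    counts, edges = np.histogram(mags, bins=edges)
    return edges[np.argmax(counts)] + correction

def bmc_combine(mc_prior, sigma_prior, mc_obs, sigma_obs):
    """Precision-weighted (Bayesian) merge of a prior Mc -- e.g. predicted
    from the distance to the nearest seismic stations -- with a locally
    observed Mc, in the spirit of the BMC approach."""
    w_prior, w_obs = 1.0 / sigma_prior**2, 1.0 / sigma_obs**2
    mc = (w_prior * mc_prior + w_obs * mc_obs) / (w_prior + w_obs)
    sigma = (w_prior + w_obs) ** -0.5
    return mc, sigma

# Equal uncertainties -> posterior halfway between prior and observation
mc_hat, sigma_hat = bmc_combine(1.0, 0.2, 0.6, 0.2)   # mc_hat = 0.8
```

The posterior uncertainty is always smaller than either input, which is why combining the two sources sharpens the Mc map in sparsely observed regions.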

  12. The Pocatello Valley, Idaho, earthquake

    Science.gov (United States)

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. mountain daylight time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, at a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  13. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  14. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  15. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Earthquake-induced deformation structures, called seismites, may help to clarify the paleoseismic history of a location and to estimate the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, introduced and fractured gravels, and pillow structures in shallow-lake sediments, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep-lake sediments. Drawing on previous studies, these earthquake-induced deformation structures were ordered according to their formation and the corresponding earthquake magnitudes. In this ordering, the lowest earthquake record is loop bedding and the highest is introduced and fractured gravels in lacustrine deposits.

  16. Experience database of Romanian facilities subjected to the last three Vrancea earthquakes

    International Nuclear Information System (INIS)

    1999-01-01

    The scope of this research project is to use past seismic experience of similar components from power and industrial facilities to establish the generic seismic resistance of nuclear power plant safe-shutdown equipment. The first part of the project provides information about the Vrancea earthquakes which affect the Romanian territory and also the Kozloduy NPP site, as a background for the investigations of the seismic performance of mechanical and electrical equipment in industrial facilities. The project has the following objectives: collect and process all available seismic information about Vrancea earthquakes; perform probabilistic hazard analysis of the Vrancea earthquakes; determine attenuation laws and correlations between focal depth, earthquake power, soil conditions and the frequency characteristics of the seismic ground motion; and investigate and collect information regarding the seismic behavior of mechanical and electrical components in industrial facilities during the 1977, 1986 and 1990 earthquakes. The seismic database used for the analysis of the Vrancea earthquakes includes digitized triaxial records as follows: March 4, 1977 - 1 station; August 30, 1986 - 42 stations; May 1990 - 54 stations. A catalogue of the Vrancea earthquakes that occurred during the period 1901-1994 is presented as well.

  17. Archiving, sharing, processing and publishing historical earthquakes data: the IT point of view

    Science.gov (United States)

    Locati, Mario; Rovida, Andrea; Albini, Paola

    2014-05-01

    Digital tools devised for seismological data are mostly designed for handling instrumentally recorded data. Researchers working on historical seismology are forced to perform their daily job using general-purpose tools and/or coding their own to address specific tasks. The lack of out-of-the-box tools expressly conceived to deal with historical data leads to a huge amount of time lost in performing tedious tasks: searching for the data and manually reformatting it in order to jump from one tool to the other, sometimes causing a loss of the original data. This reality is common to all activities related to the study of earthquakes of past centuries, from the interpretation of historical sources to the compilation of earthquake catalogues. A platform able to preserve historical earthquake data, trace back their source, and fulfil many common tasks was very much needed. In the framework of two European projects (NERIES and SHARE) and one global project (Global Earthquake History, GEM), two new data portals were designed and implemented. The European portal "Archive of Historical Earthquakes Data" (AHEAD) and the worldwide "Global Historical Earthquake Archive" (GHEA) are aimed at addressing at least some of the above-mentioned issues. The availability of these new portals and their well-defined standards makes the development of side tools for archiving, publishing and processing the available historical earthquake data easier than before. The AHEAD and GHEA portals, their underlying technologies and the developed side tools are presented.

  18. Mechanism of post-seismic floods after the Wenchuan earthquake in ...

    Indian Academy of Sciences (India)

    Ding Hairong

    2017-10-06

    Oct 6, 2017 ... development of devastating post-seismic floods. Thirdly, the ... The segment from. Dujiangyan city to the upstream source of the river is known .... trends downward in the region. ..... quake: A case study in the upper reaches of the Min River,. Sichuan .... the digital strong earthquake network in Sichuan and.

  19. Building damage assessment after the earthquake in Haiti using two postevent satellite stereo imagery and DSMs

    DEFF Research Database (Denmark)

    Tian, Jiaojiao; Nielsen, Allan Aasbjerg; Reinartz, Peter

    2015-01-01

    In this article, a novel after-disaster building damage monitoring method is presented. This method combines the multispectral imagery and digital surface models (DSMs) from stereo matching of two dates to obtain three kinds of changes: collapsed buildings, newly built buildings and temporary she...... changes after the 2010 Haiti earthquake, and the obtained results are further evaluated both visually and numerically....

  20. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average/long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
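The short-term-average/long-term-average trigger described above can be sketched directly on a tweet-count series. A minimal illustration (window lengths, threshold, and the toy data are assumptions for the example, not the tuned USGS values):

```python
import numpy as np

def sta_lta_detect(counts, sta_win=6, lta_win=60, threshold=5.0, eps=1e-9):
    """Flag time bins where the short-term average of a count series exceeds
    `threshold` times the long-term average.

    counts: tweets per time bin (e.g. per 10 s) containing the word "earthquake"
    Returns the indices of bins that trigger a detection."""
    counts = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_win, len(counts)):
        sta = counts[i - sta_win:i].mean()   # recent activity
        lta = counts[i - lta_win:i].mean()   # background activity
        if sta / (lta + eps) >= threshold:
            triggers.append(i)
    return triggers

# Quiet background of ~1 tweet per bin, then a burst of 50 tweets per bin
counts = [1] * 100 + [50] * 6 + [1] * 10
hits = sta_lta_detect(counts)
```

Because the long-term window adapts to the background rate, the same threshold works across quiet nights and chatty days, which is the property that makes STA/LTA a natural fit for tweet streams as well as seismograms.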

  1. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Caroline, Etivant

    2015-04-01

    Historical earthquakes are only known to us through written recollections, and so seismologists have a long experience of interpreting the reports of eyewitnesses, probably explaining why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered the digital nervous system of the planet, comprising digital veins and intertwined sensors that capture its pulse in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper presents the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We also describe how public expectations within tens of seconds of ground shaking are the basis of improved, diversified information tools which integrate this user-generated content. Special attention is given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter to the public: the felt and damaging ones. In conclusion we demonstrate that eyewitnesses are today real-time earthquake sensors and active actors of rapid earthquake information.

  2. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershocks sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks are fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions show that self-similar power laws are transformed into non scaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
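The Pareto-tail fitting of exceedances described above can be illustrated with the standard Hill estimator of the tail index (a generic sketch, not the authors' procedure). Note also that taking logarithms, the analogue of the energy-to-magnitude transformation, is exactly what turns a Fréchet-distributed maximum into a Gumbel one:

```python
import numpy as np

def hill_tail_index(sample, k):
    """Hill estimator of the Pareto tail index alpha, computed from the
    k largest order statistics of the sample."""
    x = np.sort(np.asarray(sample, dtype=float))[::-1]
    return 1.0 / np.mean(np.log(x[:k]) - np.log(x[k]))

# Synthetic check on Pareto(alpha=2) data via inverse-transform sampling:
# if U is uniform on (0,1), then U**(-1/alpha) has P(X > x) = x**(-alpha)
rng = np.random.default_rng(0)
sample = rng.uniform(size=20000) ** (-1.0 / 2.0)
alpha_hat = hill_tail_index(sample, k=2000)   # should be close to 2
```

The estimator averages log-excesses over the chosen tail fraction; the choice of k trades bias (too deep into the bulk) against variance (too few extremes), the same trade-off faced when fitting exceedance distributions of earthquake energies.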

  3. Digital Audiobooks

    DEFF Research Database (Denmark)

    Have, Iben; Pedersen, Birgitte Stougaard

    Audiobooks are rapidly gaining popularity with widely accessible digital downloading and streaming services. The paper is framing how the digital audiobook expands and changes the target groups for book publications and how it as an everyday activity is creating new reading experiences, places...

  4. Digital TMI

    Science.gov (United States)

    Rios, Joseph

    2012-01-01

    Presenting the current status of the Digital TMI project to visiting members of the FAA Command Center. Digital TMI is an effort to store national-level traffic management initiatives in a standards-compliant manner. Work is funded by the FAA.

  5. Digital displacements

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2014-01-01

    In recent years digital reforms have been introduced in the municipal landscape of Denmark. The reforms address the interaction between citizen and local authority. The aim is that by 2015 at least 80 per cent of all correspondence between citizens and public authority will be transmitted through...... digital interface. However, the transformation of citizen services from traditional face-to-face interaction to digital self-service gives rise to new practices; some citizens need support to be able to manage self-service through digital tools. A mixture of support and teaching, named co......-service, is a new task in public administration, where street level bureaucrats assist citizens in using the new digital solutions. The paper is based on a case study conducted primarily in a citizen service centre in Copenhagen, Denmark. Based on ethnography the paper gives an empirical account of the ongoing

  6. Digitized mammograms

    International Nuclear Information System (INIS)

    Bruneton, J.N.; Balu-Maestro, C.; Rogopoulos, A.; Chauvel, C.; Geoffray, A.

    1988-01-01

    Two observers conducted a blind evaluation of 100 mammography files, including 47 malignant cases. Films were read both before and after image digitization at 50 μm and 100 μm with the FilmDRSII. Digitization permitted better analysis of the normal anatomic structures and moderately improved diagnostic sensitivity. Searches for microcalcifications before and after digitization at 100 μm and 50 μm showed better analysis of anatomic structures after digitization (especially for solitary microcalcifications). The diagnostic benefit, with discovery of clustered microcalcifications, was more limited (one case at 100 μm, nine cases at 50 μm). Recognition of microcalcifications was clearly improved in dense breasts, which can benefit from reinterpretation after digitization at 50 μm rather than 100 μm.

  7. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, in which a geographical region is divided into square cells representing the nodes, with connections defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
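The catalogue-to-network mapping and the eigenvector-centrality computation can be sketched for a single layer (the grid size, the linking rule, and the identity shift in the power iteration are illustrative assumptions; the paper's multiplex construction stacks one such layer per time window):

```python
import numpy as np

def earthquake_network_centrality(lats, lons, n_cells=4, n_iter=200):
    """Divide the region into n_cells x n_cells square cells (the nodes),
    link the cells visited by consecutive earthquakes, and return the
    eigenvector centrality of every cell via power iteration."""
    lats, lons = np.asarray(lats, float), np.asarray(lons, float)
    lat_bins = np.linspace(lats.min(), lats.max(), n_cells + 1)
    lon_bins = np.linspace(lons.min(), lons.max(), n_cells + 1)
    row = np.clip(np.digitize(lats, lat_bins) - 1, 0, n_cells - 1)
    col = np.clip(np.digitize(lons, lon_bins) - 1, 0, n_cells - 1)
    cell = row * n_cells + col
    A = np.zeros((n_cells**2, n_cells**2))
    for a, b in zip(cell[:-1], cell[1:]):
        if a != b:                    # ignore repeats within a single cell
            A[a, b] = A[b, a] = 1.0
    # Power iteration on A + I: the identity shift preserves the eigenvectors
    # of A while preventing oscillation on bipartite graphs.
    v = np.ones(A.shape[0])
    for _ in range(n_iter):
        v = (A + np.eye(A.shape[0])) @ v
        v /= np.linalg.norm(v)
    return v
```

On a toy sequence that keeps returning to one cell, that cell accumulates all the links and emerges with the highest centrality, mirroring how the most seismically active regions stand out in the network analysis.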

  8. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  9. Segmented seismicity of the Mw 6.2 Baladeh earthquake sequence (Alborz mountains, Iran) revealed from regional moment tensors

    DEFF Research Database (Denmark)

    Donner, Stefanie; Rössler, Dirk; Krüger, Frank

    2013-01-01

    The M w 6.2 Baladeh earthquake occurred on 28 May 2004 in the Alborz Mountains, northern Iran. This earthquake was the first strong shock in this intracontinental orogen for which digital regional broadband data are available. The Baladeh event provides a rare opportunity to study fault geometry...... model, regional waveform data of the mainshock and larger aftershocks (M w  ≥3.3) were inverted for moment tensors. For the Baladeh mainshock, this included inversion for kinematic parameters. All analysed earthquakes show dominant thrust mechanisms at depths between 14 and 26 km, with NW–SE striking...

  10. Data quality of seismic records from the Tohoku, Japan earthquake as recorded across the Albuquerque Seismological Laboratory networks

    Science.gov (United States)

    Ringler, A.T.; Gee, L.S.; Marshall, B.; Hutt, C.R.; Storm, T.

    2012-01-01

    Great earthquakes recorded across modern digital seismographic networks, such as the recent Tohoku, Japan, earthquake on 11 March 2011 (Mw = 9.0), provide unique datasets that ultimately lead to a better understanding of the Earth's structure (e.g., Pesicek et al. 2008) and earthquake sources (e.g., Ammon et al. 2011). For network operators, such events provide the opportunity to look at the performance across their entire network using a single event, as the ground motion records from the event will be well above every station's noise floor.

  11. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake-intensity attenuation relationship, anomalous damage distributions of the earthquake are found, and their relationship with tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The implications of the anomalous damage distribution for seismic zonation, earthquake-resistant design, earthquake prediction and earthquake emergency response are discussed.
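Comparing observed damage against a standard intensity attenuation relationship amounts to computing residuals against a predicted intensity. A generic sketch (the functional form and all coefficients are placeholder assumptions, not those used in the study):

```python
import numpy as np

def intensity_residuals(obs_intensity, epi_dist_km, magnitude,
                        a=1.5, b=1.5, c=3.5, h=10.0):
    """Residuals of observed macroseismic intensities against a generic
    attenuation relation I = a + b*M - c*log10(R), with hypocentral
    distance R = sqrt(d^2 + h^2) for an assumed focal depth h (km).
    Positive residuals mark anomalously high damage."""
    r = np.hypot(np.asarray(epi_dist_km, dtype=float), h)
    predicted = a + b * magnitude - c * np.log10(r)
    return np.asarray(obs_intensity, dtype=float) - predicted

# Two hypothetical intensity observations for an M8 event, at the epicentre
# and at 90 km epicentral distance
res = intensity_residuals([11.0, 8.0], [0.0, 90.0], 8.0)
```

Mapping such residuals over the felt area is one simple way to isolate the tectonic, site-condition, and basin effects discussed in the abstract from the ordinary distance decay of intensity.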

  12. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete, up-to-date earthquake information in a user-friendly, rapidly accessible form. One main component of the database is the catalog of earthquakes that have occurred in Romania from 984 to the present. The catalog contains information on locations and other source parameters, when available, and links to waveforms of important earthquakes. The other major component is the strong-motion database, developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Parameters characterizing strong-motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and stored in this database. Information on the recording seismic stations is also included: maps giving their positions, photographs of the instruments, and site conditions (free-field or on buildings). By the huge volume and quality of the gathered data, as well as by its friendly user interface, the Romanian earthquake database provides a very useful tool for the geosciences and civil engineering in the effort to reduce seismic risk in Romania. (authors)

  13. Mapping Tectonic Stress Using Earthquakes

    International Nuclear Information System (INIS)

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available: earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the Earth's crust.

  14. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations at Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes are shallow; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we estimate the epicentral acceleration of this earthquake at 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  15. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in areas not traditionally considered earthquake-prone are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students in this project were challenged to construct a basic structure, using limited resources, that could withstand a simulated tremor on an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Because most buildings in New York City were not designed to withstand earthquake shaking, the students gained an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  16. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  17. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage caused by earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells, and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrances to these facilities. Data indicate that vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository.

  18. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
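    The population-proportional Poisson model above can be sketched numerically. The calibration below is a minimal illustration, not the authors' computation: the 20th- and 21st-century mean-population figures are assumed round numbers, and the model simply scales the observed 20th-century count by the population ratio.

    ```python
    def expected_catastrophic_quakes(obs_events, obs_mean_pop, future_mean_pop, years=100.0):
        """Expected event count for a nonstationary Poisson process whose
        rate is proportional to world population: lambda(t) = c * P(t).
        The constant c is calibrated from the observed century."""
        c = obs_events / (years * obs_mean_pop)
        return c * years * future_mean_pop

    # 4 events with >100,000 deaths in the 20th century (assumed mean
    # population ~3.0 billion); assumed 21st-century mean ~6.8 billion:
    print(round(expected_catastrophic_quakes(4, 3.0, 6.8), 1))  # -> 9.1
    ```

    The result is of the same order as the 8.7±3.3 quoted above; the difference reflects the crude assumed population averages.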

  19. Digital Ethics/Going Digital.

    Science.gov (United States)

    Wilson, Bradley

    1996-01-01

    Finds that the recent National Press Photographers Association code of ethics can serve as a model for any photography staff. Discusses how digital imaging is becoming commonplace in classrooms, due to decreasing costs and easier software. Explains digital terminology. Concludes that time saved in the darkroom and at the printer is now spent on…

  20. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
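    The short-term-average/long-term-average (STA/LTA) trigger mentioned above can be sketched on a per-minute tweet-count series. The window lengths and threshold below are illustrative assumptions, not the USGS tuning.

    ```python
    def sta_lta_triggers(counts, sta_len=3, lta_len=30, threshold=5.0):
        """Return indices where the STA/LTA ratio of a per-interval
        tweet-count series exceeds a trigger threshold."""
        triggers = []
        for i in range(lta_len, len(counts)):
            sta = sum(counts[i - sta_len:i]) / sta_len   # short-term average
            lta = sum(counts[i - lta_len:i]) / lta_len   # long-term average
            if lta > 0 and sta / lta >= threshold:
                triggers.append(i)
        return triggers

    # A flat background of ~2 "earthquake" tweets per minute, then a burst:
    series = [2] * 40 + [40, 80, 60] + [2] * 10
    print(sta_lta_triggers(series))  # -> [42, 43, 44]
    ```

    The trigger fires only once the short window is dominated by the burst, which is why detection lags the onset by a minute or two, consistent with the latencies reported above.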

  1. Digital radiography

    International Nuclear Information System (INIS)

    Coulomb, M.; Dal Soglio, S.; Pittet-Barbier, L.; Ranchoup, Y.; Thony, F.; Ferretti, G.; Robert, F.

    1992-01-01

    Digital projection radiography may some day replace conventional radiography, provided it can meet several requirements: diagnostic effectiveness equal to or better than that of screen-film systems; reasonable image cost; and real improvement in the productivity of departments of imaging. All digital radiographic systems include an X-ray source, an image acquisition and formatting sub-system, a display and manipulation sub-system, an archiving sub-system, and a laser editing system, preferably shared with other sources of digital images. Three digitization processes are available: digitization of the radiographic film, digital fluorography, and phospholuminescent detectors with memory. The advantages of digital fluorography are appealing: real-time image acquisition and suppression of cassettes; but its disadvantages are far from negligible: it cannot be applied to bedside radiography, the field of examination is limited, and the wide-field spatial resolution is poor. Phospholuminescent detectors with memory have great advantages: they can be used for bedside radiographs and on all common radiographic systems, and their spatial resolution is satisfactory; their current disadvantages, however, are considerable. These two systems have common properties making up the entire philosophy of digital radiology, and specific features that must guide our choice according to the application. Digital fluorography is best applied in pediatric radiology. However, evaluation studies have shown that it is applicable, with sufficient quality, to many indications of general radiology in which fluoroscopic control and fast acquisition of the images are essential; the time gained on the examination may be considerable, as well as the savings on film. Detectors with memory are required for bedside radiographs, in osteoarticular and thoracic radiology, in all cases of traumatic emergency, and in resuscitation and intensive care departments.

  2. Becoming digital

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2015-01-01

    . An ethnographic account of how digital reforms are implemented in practice shows how street-level bureaucrat’s classic tasks such as specialized casework are being reconfigured into educational tasks that promote the idea of “becoming digital”. In the paper, the author argues that the work of “becoming digital....... Originality/value: The study contributes to ethnographic research in public administration by combining two separate subfields, e-government and street-level bureaucracy, to discern recent transformations in public service delivery. In the digital era, tasks, control and equality are distributed in ways...

  3. Digital Humanities

    DEFF Research Database (Denmark)

    Brügger, Niels

    2016-01-01

    , and preserving material to study, as an object of study in its own right, as an analytical tool, or for collaborating, and for disseminating results. The term "digital humanities" was coined around 2001, and gained currency within academia in the following years. However, computers had been used within......Digital humanities is an umbrella term for theories, methodologies, and practices related to humanities scholarship that use the digital computer as an integrated and essential part of its research and teaching activities. The computer can be used for establishing, finding, collecting...

  4. Digital Snaps

    DEFF Research Database (Denmark)

    Sandbye, Mette; Larsen, Jonas

    . Distance as the New Punctum / Mikko Villi -- pt. II. FAMILY ALBUMS IN TRANSITION -- ch. 4. How Digital Technologies Do Family Snaps, Only Better / Gillian Rose -- ch. 5. Friendship Photography: Memory, Mobility and Social Networking / Joanne Garde-Hansen -- ch. 6. Play, Process and Materiality in Japanese...... -- ch. 9. Retouch Yourself: The Pleasures and Politics of Digital Cosmetic Surgery / Tanya Sheehan -- ch. 10. Virtual Selves: Art and Digital Autobiography / Louise Wolthers -- ch. 11. Mobile-Media Photography: New Modes of Engagement / Michael Shanks and Connie Svabo....

  5. Digital electronics

    CERN Document Server

    Morris, John

    2013-01-01

    An essential companion to John C Morris's 'Analogue Electronics', this clear and accessible text is designed for electronics students, teachers and enthusiasts who already have a basic understanding of electronics, and who wish to develop their knowledge of digital techniques and applications. Employing a discovery-based approach, the author covers fundamental theory before going on to develop an appreciation of logic networks, integrated circuit applications and analogue-digital conversion. A section on digital fault finding and useful ic data sheets completes th

  6. Digital Leadership

    DEFF Research Database (Denmark)

    Zupancic, Tadeja; Verbeke, Johan; Achten, Henri

    2016-01-01

    Leadership is an important quality in organisations. Leadership is needed to introduce change and innovation. In our opinion, in architectural and design practices, the role of leadership has not yet been sufficiently studied, especially when it comes to the role of digital tools and media....... With this paper we intend to initiate a discussion in the eCAADe community to reflect and develop ideas in order to develop digital leadership skills amongst the membership. This paper introduces some important aspects, which may be valuable to look into when developing digital leadership skills....

  7. Digital radiography

    International Nuclear Information System (INIS)

    Zani, M.L.

    2002-01-01

    X-ray radiography is a very common technique used to check the homogeneity of a material or the inside of a mechanical part. Generally, the radiation that passes through the material being checked produces an image on a sensitized film. This method takes time because the film needs to be developed; digital radiography no longer has this inconvenience. In digital radiography the film is replaced by digital data, which can be processed like any computer file. This new technique is promising, but its main drawback is that its resolution is not yet as good as that of film radiography. (A.C.)

  8. Digital radiography

    International Nuclear Information System (INIS)

    Kusano, Shoichi

    1993-01-01

    Firstly, from an historic point of view, fundamental concepts on digital imaging were reviewed to provide a foundation for discussion of digital radiography. Secondly, this review summarized the results of ongoing research in computed radiography that replaces the conventional film-screen system with a photo-stimulable phosphor plate; and thirdly, image quality, radiation protection, and image processing techniques were discussed with emphasis on picture archiving and communication system environment as our final goal. Finally, future expansion of digital radiography was described based on the present utilization of computed tomography at the National Defense Medical College Hospital. (author) 60 refs

  9. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation, and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999, when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb; fractures in walls, floors, basal platforms and tableros; toppling of columns; and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 % g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Maya Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage to the above sites. As a consequence, this earthquake may have accelerated the

  10. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  11. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is clearly changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction.
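    The rearrangement test described above can be sketched on a synthetic catalog. The burst model, the magnitude threshold, and the use of event index (rather than clock time) for return times are all illustrative assumptions, not the authors' procedure.

    ```python
    import random

    def first_return_times(mags, threshold):
        """Intervals, counted in events, between successive shocks with
        magnitude >= threshold (the analogue of P_M(T) above)."""
        idx = [i for i, m in enumerate(mags) if m >= threshold]
        return [b - a for a, b in zip(idx, idx[1:])]

    random.seed(0)
    # Synthetic catalog with temporal clustering: occasional bursts of
    # large events embedded in a background of small ones.
    catalog = []
    for _ in range(200):
        if random.random() < 0.1:
            catalog += [5.0 + random.random() for _ in range(5)]   # burst
        else:
            catalog.append(3.0 + random.random())                  # background

    shuffled = catalog[:]
    random.shuffle(shuffled)

    orig = first_return_times(catalog, 5.0)
    shuf = first_return_times(shuffled, 5.0)

    # Clustering produces an excess of unit return times in the original
    # ordering; shuffling destroys it, so the two distributions differ.
    print(sum(t == 1 for t in orig) / len(orig))
    print(sum(t == 1 for t in shuf) / len(shuf))
    ```

    A return-time distribution that changes under shuffling, as here, indicates temporal structure beyond what a memoryless SOC-style process would produce.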

  12. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public education about earthquakes. Producing the monographs, developed in ARC/INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); and 3) the socioeconomic consequences it caused at such an important seaside resort.

  13. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail, and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme-value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. A numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
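    The link asserted above between Pareto-tailed samples and the Fréchet distribution of their maxima can be checked with a small simulation; the tail exponent and sample sizes below are arbitrary illustrative choices.

    ```python
    import math
    import random

    random.seed(1)
    ALPHA = 2.0             # assumed Pareto tail exponent
    N, BLOCKS = 1000, 2000  # samples per block, number of block maxima

    def pareto_sample():
        # Inverse-CDF sampling: P(X > x) = x**-ALPHA for x >= 1
        return random.random() ** (-1.0 / ALPHA)

    # Block maxima scaled by N**(1/ALPHA) should approach the Frechet
    # distribution F(x) = exp(-x**-ALPHA).
    maxima = [max(pareto_sample() for _ in range(N)) / N ** (1.0 / ALPHA)
              for _ in range(BLOCKS)]

    x = 1.5
    empirical = sum(m <= x for m in maxima) / BLOCKS
    frechet_cdf = math.exp(-x ** -ALPHA)
    print(empirical, frechet_cdf)  # the two values agree to within a few percent
    ```

    This is the standard extreme-value result the abstract relies on: power-law (Pareto) exceedances put the maximum in the Fréchet domain of attraction.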

  14. Digital fabrication

    CERN Document Server

    2012-01-01

    The Winter 2012 (vol. 14 no. 3) issue of the Nexus Network Journal features seven original papers dedicated to the theme “Digital Fabrication”. Digital fabrication is changing architecture in fundamental ways in every phase, from concept to artifact. Projects growing out of research in digital fabrication are dependent on software that is entirely surface-oriented in its underlying mathematics. Decisions made during design, prototyping, fabrication and assembly rely on codes, scripts, parameters, operating systems and software, creating the need for teams with multidisciplinary expertise and different skills, from IT to architecture, design, material engineering, and mathematics, among others. The papers grew out of a Lisbon symposium hosted by the ISCTE-Instituto Universitario de Lisboa entitled “Digital Fabrication – A State of the Art”. The issue is completed with four other research papers which address different mathematical instruments applied to architecture, including geometric tracing system...

  15. Digital Relationships

    DEFF Research Database (Denmark)

    Ledborg Hansen, Richard

    -rich information and highly interesting communication are sky-high and rising. With a continuous increase in digitized communication follows a decrease in face-to-face encounters, and our ability to engage in inter-personal relationships is suffering for it (Davis, 2013). The behavior described in this paper......-Jones, 2011) for increases in effectiveness and efficiency we indiscriminately embrace digital communication and digitized information dissemination with enthusiasm – at the risk of ignoring the potentially dark side of technology. However, technology also holds a promise for better understanding precisely...... for the same reasons – that the growing amount of digitized communication “out there” represents data waiting to be sifted, analyzed and decoded. In this paper “Facebook behavior” refers to a particular behavior characterized by presenting your self and representations of selected self in the hope of getting...

  16. Digital Discretion

    DEFF Research Database (Denmark)

    Busch, Peter Andre; Zinner Henriksen, Helle

    2018-01-01

    discretion is suggested to reduce this footprint by influencing or replacing their discretionary practices using ICT. What is less researched is whether digital discretion can cause changes in public policy outcomes, and under what conditions such changes can occur. Using the concept of public service values......This study reviews 44 peer-reviewed articles on digital discretion published in the period from 1998 to January 2017. Street-level bureaucrats have traditionally had a wide ability to exercise discretion stirring debate since they can add their personal footprint on public policies. Digital......, we suggest that digital discretion can strengthen ethical and democratic values but weaken professional and relational values. Furthermore, we conclude that contextual factors such as considerations made by policy makers on the macro-level and the degree of professionalization of street...

  17. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.

  18. Protracted fluvial recovery from medieval earthquakes, Pokhara, Nepal

    Science.gov (United States)

    Stolle, Amelie; Bernhardt, Anne; Schwanghart, Wolfgang; Andermann, Christoff; Schönfeldt, Elisabeth; Seidemann, Jan; Adhikari, Basanta R.; Merchel, Silke; Rugel, Georg; Fort, Monique; Korup, Oliver

    2016-04-01

    River response to strong earthquake shaking in mountainous terrain often entails the flushing of sediments delivered by widespread co-seismic landsliding. Detailed mass-balance studies following major earthquakes in China, Taiwan, and New Zealand suggest fluvial recovery times ranging from several years to decades. We report a detailed chronology of earthquake-induced valley fills in the Pokhara region of western-central Nepal, and demonstrate that rivers continue to adjust to several large medieval earthquakes to the present day, thus challenging the notion of transient fluvial response to seismic disturbance. The Pokhara valley features one of the largest and most extensively dated sedimentary records of earthquake-triggered sedimentation in the Himalayas, and independently augments paleo-seismological archives obtained mainly from fault trenches and historic documents. New radiocarbon dates from the catastrophically deposited Pokhara Formation document multiple phases of extremely high geomorphic activity between ~700 and ~1700 AD, preserved in thick sequences of alternating fluvial conglomerates, massive mud and silt beds, and cohesive debris-flow deposits. These dated fan-marginal slackwater sediments indicate pronounced sediment pulses in the wake of at least three large medieval earthquakes in ~1100, 1255, and 1344 AD. We combine these dates with digital elevation models, geological maps, differential GPS data, and sediment logs to estimate the extent of these three pulses, which are characterized by sedimentation rates of ~200 mm yr⁻¹ and peak rates as high as 1,000 mm yr⁻¹. Some 5.5 to 9 km³ of material infilled the pre-existing topography, and is now prone to ongoing fluvial dissection along major canyons. Contemporary river incision into the Pokhara Formation is rapid (120-170 mm yr⁻¹), triggering widespread bank erosion, channel changes, and very high sediment yields of the order of 10³ to 10⁵ t km⁻² yr⁻¹, that by far outweigh bedrock denudation rates

  19. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  20. Digital Collections, Digital Libraries & the Digitization of Cultural Heritage Information.

    Science.gov (United States)

    Lynch, Clifford

    2002-01-01

    Discusses digital collections and digital libraries. Topics include broadband availability; digital rights protection; content, both non-profit and commercial; digitization of cultural content; sustainability; metadata harvesting protocol; infrastructure; authorship; linking multiple resources; data mining; digitization of reference works;…

  1. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  2. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  3. Overview of Historical Earthquake Document Database in Japan and Future Development

    Science.gov (United States)

    Nishiyama, A.; Satake, K.

    2014-12-01

    In Japan, damage and disasters from historical large earthquakes have been documented and preserved. Compilation of historical earthquake documents started in the early 20th century, and 33 volumes of historical document source books (about 27,000 pages) have been published. However, these source books are not used effectively by researchers because they are contaminated with low-reliability historical records and are difficult to search by keyword or date. To overcome these problems and to promote historical earthquake studies in Japan, construction of a text database started in the 21st century. For historical earthquakes from the beginning of the 7th century to the early 17th century, the "Online Database of Historical Documents in Japanese Earthquakes and Eruptions in the Ancient and Medieval Ages" (Ishibashi, 2009) has already been constructed. Its compilers investigated the source books or original texts of historical literature, emended the descriptions, and assigned a reliability to each historical document on the basis of its written age. Another database compiled the historical documents for seven damaging earthquakes that occurred along the Sea of Japan coast in Honshu, central Japan, in the Edo period (from the beginning of the 17th century to the middle of the 19th century), and constructed a text database and a seismic intensity database. These are now publicized on the web (written only in Japanese). However, only about 9% of the earthquake source books have been digitized so far. We therefore plan to digitize all of the remaining historical documents under a research program that started in 2014. The specification of the database will be similar to the previous ones. We also plan to combine this database with a liquefaction traces database, to be constructed by another research program, by adding the location information described in historical documents. The constructed database would be utilized to estimate the distributions of seismic intensities and tsunami

  4. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  5. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  6. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
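
    The construction described above can be sketched in a few lines. Assuming a doubly truncated Gutenberg–Richter law for individual magnitudes (the m0, mmax and beta values below are illustrative placeholders, not the parameters fitted in the paper), the distribution of the maximum of n independent events is simply the n-th power of the per-event CDF:

```python
import math

def gr_cdf(m, m0=5.0, mmax=9.5, beta=2.303):
    """CDF of magnitude under a doubly truncated Gutenberg-Richter law
    (beta = b * ln 10; b ~ 1 is a common illustrative choice)."""
    num = 1.0 - math.exp(-beta * (m - m0))
    den = 1.0 - math.exp(-beta * (mmax - m0))
    return num / den

def max_magnitude_cdf(m, n, **kw):
    """CDF of the maximum magnitude among n independent events:
    P(Mmax <= m) = F(m) ** n."""
    return gr_cdf(m, **kw) ** n

# Probability that the largest of 1000 events above M 5 exceeds M 8:
p_exceed = 1.0 - max_magnitude_cdf(8.0, 1000)
print(f"P(Mmax > 8) = {p_exceed:.4f}")
```
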

  7. Digital Natives and Digital Immigrants

    OpenAIRE

    Cardina, Bruno; Francisco, Jerónimo; Reis, Pedro; trad. Silva, Fátima

    2011-01-01

    This article focuses on the generational gaps in school learning. Initially, we have tried to provide the framework in relation to the term digital native in order to understand the key aspects of the generation born after the advent and the global use of the Internet. They were found to be “multitasking” people, linked to technology and connectivity, as opposed to digital immigrants, born in an earlier period and seeking to adapt to the technological world. We also present some r...

  8. Measuring co-seismic deformation of the Sichuan earthquake by satellite differential INSAR

    Science.gov (United States)

    Zhang, Yonghong; Gong, Wenyu; Zhang, Jixian

    2008-12-01

    The Sichuan earthquake, which occurred on May 12, 2008, is the strongest earthquake to hit China since the 1976 Tangshan earthquake. The earthquake had a magnitude of M 8.0 and caused surface deformation greater than 3 meters. This paper presents the research work of measuring the co-seismic deformation of the earthquake with the satellite differential interferometric SAR technique. Four L-band SAR images were used to form the interferogram, with two pre-event scenes imaged on February 17, 2008 and two post-event scenes on May 19, 2008. The Digital Elevation Models extracted from the 1:50,000-scale national geo-spatial database were used to remove the topographic contribution and form a differential interferogram. The interferogram presents very high coherence in most areas, although the pre- and post-event images were acquired with a time interval of 92 days. This indicates that the L-band PALSAR sensor is very powerful for interferometry applications. The baseline error is regarded as the main phase error source in the differential interferogram. Due to the difficulty of doing field work immediately after the earthquake, only one deformation measurement recorded by a permanent GPS station was obtained for this research. An approximation method is proposed to eliminate the orbital phase error with one control point. The derived deformation map shows a similar spatial pattern and deformation magnitude compared with the deformation field generated by the seismic inversion method.
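
    For readers unfamiliar with DInSAR, the conversion from unwrapped differential phase to line-of-sight displacement is the standard λφ/(4π) relation. The sketch below uses the nominal ALOS PALSAR L-band wavelength (~23.6 cm) and is purely illustrative, not part of the paper's processing chain:

```python
import math

# Nominal PALSAR L-band wavelength in metres; sign conventions for the
# phase-to-displacement mapping vary between processing packages.
WAVELENGTH = 0.236

def phase_to_los(unwrapped_phase_rad, wavelength=WAVELENGTH):
    """Line-of-sight displacement from unwrapped differential phase.
    One interferometric fringe (2*pi) corresponds to wavelength / 2."""
    return wavelength * unwrapped_phase_rad / (4.0 * math.pi)

# A 3 m line-of-sight offset corresponds to roughly 25 L-band fringes:
fringes = 3.0 / (WAVELENGTH / 2.0)
print(f"{fringes:.1f} fringes; one fringe = {phase_to_los(2 * math.pi) * 100:.1f} cm")
```

    One L-band fringe thus represents about 11.8 cm of line-of-sight motion, so metre-scale co-seismic offsets wrap through a few dozen fringes; this large per-fringe measure is one reason L-band interferograms stay interpretable over strong deformation.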

  9. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available A lot of people die because of earthquakes every year. Therefore, it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted to guide new research aimed at finding novel prediction methods.

  10. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Energy Technology Data Exchange (ETDEWEB)

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  11. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the

  12. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    Full Text Available The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  13. Digital evidence

    Directory of Open Access Journals (Sweden)

    Lukić Tatjana

    2012-01-01

    Full Text Available Although the computer makes human activities faster and easier, innovating and creating new forms of work and other kinds of activities, it has also influenced criminal activity. The development of information technology directly affects the development of computer forensics, without which the discovery and proving of computer offences and the apprehension of the perpetrator could not even be imagined. Computer forensics is a type of forensics which can be defined as a process of collecting, preserving, analyzing and presenting digital evidence in court proceedings. Bearing in mind that combat against crime, in which computers appear as an asset or object of the offence, requires knowledge of digital evidence as well as specific rules and procedures, the author in this article specifically addresses the issues of digital evidence, forensic (computer) investigation, specific rules and procedures for detecting, fixing and collecting digital evidence, and the use of this type of evidence in criminal proceedings. The author also deals with international standards regarding digital evidence and cyber-space investigation.

  14. Digital watermark

    Directory of Open Access Journals (Sweden)

    Jasna Maver

    2000-01-01

    Full Text Available The huge amount of multimedia contents available on the World-Wide-Web is beginning to raise the question of their protection. Digital watermarking is a technique which can serve various purposes, including intellectual property protection, authentication and integrity verification, as well as visible or invisible content labelling of multimedia content. Due to the diversity of digital watermarking applicability, there are many different techniques, which can be categorised according to different criteria. A digital watermark can be categorised as visible or invisible and as robust or fragile. In contrast to the visible watermark where a visible pattern or image is embedded into the original image, the invisible watermark does not change the visual appearance of the image. The existence of such a watermark can be determined only through a watermark extraction or detection algorithm. The robust watermark is used for copyright protection, while the fragile watermark is designed for authentication and integrity verification of multimedia content. A watermark must be detectable or extractable to be useful. In some watermarking schemes, a watermark can be extracted in its exact form, in other cases, we can detect only whether a specific given watermarking signal is present in an image. Digital libraries, through which cultural institutions will make multimedia contents available, should support a wide range of service models for intellectual property protection, where digital watermarking may play an important role.

  15. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10⁻⁸), with borehole dilatometers (resolution 10⁻¹⁰) and a 3-component borehole strainmeter (resolution 10⁻⁹). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  16. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  17. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  18. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues in contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  19. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  20. Radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Planinic, J.; Radolic, V.; Vukovic, B.

    2004-01-01

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined
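
    The multiple-regression step mentioned above can be sketched as an ordinary least-squares fit of radon concentration on the three meteorological drivers. The data and coefficients below are synthetic placeholders; the paper's actual regression constants are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily records (hypothetical values, not the paper's data):
n = 365
pressure = rng.normal(1013.0, 8.0, n)    # barometric pressure, hPa
rainfall = rng.exponential(2.0, n)       # rainfall, mm
temperature = rng.normal(12.0, 9.0, n)   # air temperature, deg C

# Assume radon responds linearly to the three drivers plus noise:
radon = 20.0 - 0.8 * (pressure - 1013.0) + 1.5 * rainfall \
        - 0.4 * temperature + rng.normal(0.0, 1.0, n)

# Ordinary least squares: radon = c0 + c1*(p - 1013) + c2*r + c3*T
X = np.column_stack([np.ones(n), pressure - 1013.0, rainfall, temperature])
coef, *_ = np.linalg.lstsq(X, radon, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```

    With a year of daily data the fitted coefficients recover the assumed values closely; in practice, the fitted relation is used to remove the meteorological signal so that residual radon anomalies can be compared against seismicity.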

  1. Radon as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Planinic, J. E-mail: planinic@pedos.hr; Radolic, V.; Vukovic, B

    2004-09-11

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined.

  2. Earthquake location in island arcs

    Science.gov (United States)

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high
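
    The single-event location problem that these slab biases act upon is classically solved by Geiger's linearized least squares. The sketch below uses a deliberately simple uniform-velocity model with synthetic arrivals (the station layout, velocity, and source are assumptions), i.e. exactly the kind of laterally homogeneous simplification whose biases the abstract discusses:

```python
import numpy as np

V = 6.0  # assumed uniform P-wave speed, km/s (deliberately simple model)

stations = np.array([[0, 0, 0], [50, 5, 0], [10, 60, 0],
                     [-40, 30, 0], [25, -45, 0]], dtype=float)  # km

def travel_times(hypo, t0):
    d = np.linalg.norm(stations - hypo, axis=1)
    return t0 + d / V

# Synthetic "observed" arrivals from a known source:
true_hypo, true_t0 = np.array([12.0, 20.0, 30.0]), 1.5
obs = travel_times(true_hypo, true_t0)

# Geiger's method: iterate linearized least squares on (x, y, z, t0).
hypo, t0 = np.array([0.0, 0.0, 10.0]), 0.0
for _ in range(50):
    d = np.linalg.norm(stations - hypo, axis=1)
    resid = obs - (t0 + d / V)
    # Partial derivatives of travel time w.r.t. x, y, z and origin time.
    G = np.column_stack([-(stations - hypo) / (d[:, None] * V),
                         np.ones(len(stations))])
    dm, *_ = np.linalg.lstsq(G, resid, rcond=None)
    hypo, t0 = hypo + dm[:3], t0 + dm[3]

print("recovered hypocentre:", np.round(hypo, 3), "origin time:", round(t0, 3))
```

    The iteration recovers the source (up to the depth-sign ambiguity inherent with surface-only stations); in real arcs, replacing the uniform model with 3-D ray tracing or absorbing path effects into JHD station corrections is precisely what distinguishes the methods compared in the abstract.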

  3. Influence of earthquake strong motion duration on nonlinear structural response

    International Nuclear Information System (INIS)

    Meskouris, K.

    1983-01-01

    The effects of motion duration on nonlinear structural response of high-rise, moment resisting frames are studied by subjecting shear beam models of a 10- and a 5-story frame to a series of synthetic accelerograms, all matching the same NEWMARK/HALL design spectrum. Two different hysteretic laws are used for the story springs, and calculations are carried out for target ductility values of 2 and 4. Maximum ductilities reached and energy-based damage indicators (maximum seismically input energy, hysteretically dissipated energy) are evaluated and correlated with the motion characteristics. A reasonable extrapolative determination of structural response characteristics based on these indicators seems possible. (orig.)
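
    The energy-based damage indicators mentioned above can be illustrated with an elastic-perfectly-plastic story spring. The sketch below (unit stiffness and yield force, an imposed closed displacement loop; all values illustrative rather than from the study) integrates force over displacement to obtain the hysteretically dissipated energy:

```python
import numpy as np

K, FY = 1.0, 1.0  # elastic stiffness and yield force, so uy = FY / K = 1

def epp_forces(u):
    """Force history of an elastic-perfectly-plastic spring for a given
    displacement history (incremental formulation with force clipping)."""
    f = np.zeros_like(u)
    for i in range(1, len(u)):
        f[i] = np.clip(f[i - 1] + K * (u[i] - u[i - 1]), -FY, FY)
    return f

# Closed displacement loop of amplitude 2*uy: 0 -> 2 -> -2 -> 2
u = np.concatenate([np.linspace(0, 2, 201),
                    np.linspace(2, -2, 401)[1:],
                    np.linspace(-2, 2, 401)[1:]])
f = epp_forces(u)

# Work input via the trapezoidal rule; subtract the elastic energy
# still stored at the end to get the hysteretically dissipated energy.
work = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(u))
dissipated = work - 0.5 * f[-1] ** 2 / K
print(f"dissipated energy = {dissipated:.3f}")
```

    For an amplitude of twice the yield displacement, one closed loop dissipates 4·FY·(um − uy) = 4 on top of the first-loading plastic work of 1, and the trapezoidal integration reproduces this total of 5 exactly; under earthquake input the same integral, accumulated over the response history, is the dissipated-energy indicator correlated with motion duration.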

  4. Digital Creativity

    DEFF Research Database (Denmark)

    Petersson Brooks, Eva; Brooks, Anthony Lewis

    2014-01-01

This paper reports on a study exploring the outcomes of children's play with technology in early childhood learning practices. The paper addresses questions related to how digital technology can foster creativity in early childhood learning environments. It consists of an analysis of children's interaction with the KidSmart furniture, focusing on the digital creativity potentials and play values suggested by the technology. The study applied a qualitative approach and included 125 children (aged three to five), 10 pedagogues, and two librarians. The results suggest that educators should consider sensitively intervening when children are interacting with technology, and rather put emphasis on integrating the technology into the environment and the curriculum in order to shape playful structures for children's digital creativity.

  5. Digital radiography

    International Nuclear Information System (INIS)

    Rath, M.; Lissner, J.; Rienmueller, R.; Haendle, J.; Siemens A.G., Erlangen

    1984-01-01

Using a prototype of an electronic, universal examination unit equipped with a special X-ray TV installation, spot-film exposures and digital angiographies with high spatial resolution and wide-range contrast could be made in the clinic for the first time. With transvenous contrast medium injection, the clinical results of digital angiography show excellent image quality in the region of the carotid and renal arteries as well as the arteries of the extremities. The electronic series exposures have an image quality almost comparable to that obtained with cut-film changers in conventional angiography. There are certain limitations due to the 25 cm input field of the X-ray image intensifier used. With respect to the digital angiography imaging technique, the electronic universal unit is fully suitable for clinical application. (orig.) [de

  6. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Science.gov (United States)

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  7. A Method for Estimation of Death Tolls in Disastrous Earthquake

    Science.gov (United States)

    Pai, C.; Tien, Y.; Teng, T.

    2004-12-01

Fatality tolls are among the most important items of the damage and losses caused by a disastrous earthquake. If we can precisely estimate the potential tolls and the distribution of fatalities across individual districts as soon as an earthquake occurs, we not only make emergency programs and disaster management more effective but also supply critical information for planning and managing the disaster and for allotting rescue manpower and medical resources in a timely manner. In this study, we estimate the death tolls caused by the Chi-Chi earthquake in individual districts based on the Attributive Database of Victims, population data, digital maps and Geographic Information Systems. In general, many factors are involved, including the characteristics of ground motions, geological conditions, types and usage habits of buildings, distribution of population and socio-economic situations, all of which are related to the damage and losses induced by a disastrous earthquake. The density of seismic stations in Taiwan is at present the greatest in the world. In the meantime, complete seismic data are easy to obtain from the Central Weather Bureau's earthquake rapid-reporting systems, mostly within about a minute or less after an earthquake has happened. Therefore, it becomes possible to estimate death tolls caused by an earthquake in Taiwan based on this preliminary information. Firstly, we form the arithmetic mean of the three components of the Peak Ground Acceleration (PGA) to give the PGA Index for each individual seismic station, according to the mainshock data of the Chi-Chi earthquake. To map iso-seismic intensity contours over the districts, and to resolve cases where a district contains no seismic station, we apply the Kriging interpolation method and GIS software to the PGA Index and the geographical coordinates of the individual seismic stations. The population density depends on
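The PGA Index step described above, the arithmetic mean of the three components' peak ground accelerations, can be sketched directly. This is a minimal reading of the abstract; the function name and sample values are invented:

```python
import numpy as np

def pga_index(acc_ns, acc_ew, acc_ud):
    # Arithmetic mean of the peak absolute accelerations of the
    # three components (NS, EW, UD), as described in the abstract.
    peaks = [float(np.max(np.abs(np.asarray(a)))) for a in (acc_ns, acc_ew, acc_ud)]
    return sum(peaks) / 3.0

# toy component records with peaks 0.3, 0.6 and 0.9 (arbitrary units)
idx = pga_index([0.1, -0.3, 0.2], [0.6, -0.2, 0.1], [-0.9, 0.4, 0.0])
```

The resulting per-station indices are then the inputs to the kriging step that fills in districts without stations.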

  8. Peak ground motion distribution in Romania due to Vrancea earthquakes

    International Nuclear Information System (INIS)

    Grecu, B.; Rizescu, M.; Radulian, M.; Mandrescu, N.; Moldovan, I.-A.; Bonjer, K.-P

    2002-01-01

Vrancea is a particular seismic region situated at the SE-Carpathians bend (Romania). It is characterized by persistent seismicity in a concentrated focal volume, at depths of 60-200 km, with 2 to 3 major earthquakes per century (M W >7). The purpose of our study is to investigate in detail the ground motion patterns of small and moderate Vrancea events (M W = 3.5 to 5.3) that occurred during 1999, taking advantage of the unique data set offered by the Calixto'99 Project and the permanent Vrancea-K2 network (150 stations). The observed patterns are compared with available macroseismic maps of large Vrancea earthquakes, showing similar general patterns elongated in the NE-SW direction which mimic the S-wave source radiation, but patches with pronounced maxima are also evident rather far from the epicenter, at the NE and SW edges of the Focsani sedimentary basin, as first shown by Atanasiu (1961). This feature is also visible in instrumental data of strong events (Mandrescu and Radulian, 1999) as well as for moderate events recently recorded by the digital K2 network (Bonjer et al., 2001), and correlates with the distribution of predominant response frequencies of shallow sedimentary layers. The influence of the local structure and/or focusing effects, caused by deeper lithospheric structure, on the observed site effects, and the implications for seismic hazard assessment of Vrancea earthquakes, are discussed. (authors)

  9. Technical features of a low-cost earthquake alert system

    International Nuclear Information System (INIS)

    Harben, P.

    1991-01-01

    The concept and features of an Earthquake Alert System (EAS) involving a distributed network of strong motion sensors is discussed. The EAS analyzes real-time data telemetered to a central facility and issues an areawide warning of a large earthquake in advance of the spreading elastic wave energy. A low-cost solution to high-cost estimates for installation and maintenance of a dedicated EAS is presented that makes use of existing microseismic stations. Using the San Francisco Bay area as an example, we show that existing US Geological Survey microseismic monitoring stations are of sufficient density to form the elements of a prototype EAS. By installing strong motion instrumentation and a specially developed switching device, strong ground motion can be telemetered in real-time to the central microseismic station on the existing communication channels. When a large earthquake occurs, a dedicated real-time central processing unit at the central microseismic station digitizes and analyzes the incoming data and issues a warning containing location and magnitude estimations. A 50-station EAS of this type in the San Francisco Bay area should cost under $70,000 to install and less than $5,000 annually to maintain
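An EAS of this kind needs a rapid onset detector at each strong-motion element. A classic short-term/long-term average (STA/LTA) trigger is a common choice and serves as a hedged sketch here; the abstract does not specify the detection algorithm actually used, and the parameters below are illustrative:

```python
import numpy as np

def sta_lta_trigger(x, fs, sta=0.5, lta=5.0, threshold=4.0):
    # Ratio of short-term to long-term average of squared amplitude.
    # Returns the first trigger time in seconds, or None if no trigger.
    ns, nl = int(sta * fs), int(lta * fs)
    c = np.concatenate(([0.0], np.cumsum(np.asarray(x, float) ** 2)))
    sta_v = (c[nl:] - c[nl - ns:-ns]) / ns      # STA ending at sample i
    lta_v = (c[nl:] - c[:-nl]) / nl             # LTA ending at sample i
    ratio = sta_v / np.maximum(lta_v, 1e-12)
    hits = np.flatnonzero(ratio > threshold)
    return None if hits.size == 0 else (hits[0] + nl) / fs

# synthetic record: 10 s of weak noise, then a strong 2 Hz arrival
fs = 100
rng = np.random.default_rng(1)
trace = np.concatenate([0.01 * rng.standard_normal(10 * fs),
                        np.sin(2 * np.pi * 2.0 * np.arange(5 * fs) / fs)])
t_trig = sta_lta_trigger(trace, fs)
```

In a deployed system the trigger would gate the telemetry switch-over, after which the central processor estimates location and magnitude from the incoming strong-motion data.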

  10. Digital photogrammetry

    CERN Document Server

    Egels, Yves

    2003-01-01

Photogrammetry, primarily the use of photography for surveying, is used for the production of maps from aerial photographs. Along with remote sensing, it represents the primary means of generating data for Geographic Information Systems (GIS). As technology develops, it is becoming easier to gain access to it. The cost of digital photogrammetric workstations is falling quickly, and these new tools are therefore becoming accessible to more and more users. Digital Photogrammetry is particularly useful as a text for graduate students in geomatics, and is also suitable for people with a good basic scientific knowledge who need to understand photogrammetry and who wish to use the book as a reference.

  11. Digital Marketing

    OpenAIRE

    Jerry Wind; Vijay Mahajan

    2002-01-01

The digital revolution has shaken marketing to its core, with consumers being offered greater price transparency and often even the chance to dictate the price. What does pricing mean in a world in which customers propose their own prices (as at priceline.com) or buyers and sellers haggle independently in auctions (as at eBay)? The most significant changes in digital marketing include the emergence of 'cyber consumers', the cyber business-to-business world and the changing reality of an incr...

  12. Digital "X"

    DEFF Research Database (Denmark)

    Baiyere, Abayomi; Grover, Varun; Gupta, Alok

    2017-01-01

Interest in using digital before existing research concepts seems to be on the rise in the IS field. This panel is positioned to explore what value lies in labelling our research as digital “x” as opposed to the well-established IT “x” (where “x” can be strategy, infrastructure, innovation, artifa...... between this stream of research and existing research. Central among the expected outputs of the panel is the advancement of suggestions for future research and the critical pitfalls to avoid in doing so....

  13. Digital Radiography

    Science.gov (United States)

    1986-01-01

    System One, a digital radiography system, incorporates a reusable image medium (RIM) which retains an image. No film is needed; the RIM is read with a laser scanner, and the information is used to produce a digital image on an image processor. The image is stored on an optical disc. System allows the radiologist to "dial away" unwanted images to compare views on three screens. It is compatible with existing equipment and cost efficient. It was commercialized by a Stanford researcher from energy selective technology developed under a NASA grant.

  14. Digital filters

    CERN Document Server

    Hamming, Richard W

    1997-01-01

    Digital signals occur in an increasing number of applications: in telephone communications; in radio, television, and stereo sound systems; and in spacecraft transmissions, to name just a few. This introductory text examines digital filtering, the processes of smoothing, predicting, differentiating, integrating, and separating signals, as well as the removal of noise from a signal. The processes bear particular relevance to computer applications, one of the focuses of this book.Readers will find Hamming's analysis accessible and engaging, in recognition of the fact that many people with the s

  15. Digital voltmeter

    International Nuclear Information System (INIS)

    Yohannes Kamadi; Soekarno.

    1976-01-01

Electrical voltage-measuring equipment with a digital display has been built. The equipment uses a four-digit display with single-polarity measurement and an integrating system. Pulses from the oscillator are counted and converted to a staircase voltage, which is compared to the voltage being measured. When balance is achieved, a pulse appears at the comparator circuit. This pulse is used to trigger a univibrator circuit, whose output serves as the signal to stop the counting; when the reading time T has elapsed, the counting system is reset. (authors)

  16. Digital communication

    CERN Document Server

    Das, Apurba

    2010-01-01

""Digital Communications"" presents the theory and application of digital communication systems in a unique but lucid form. The book gives equal weight to the theoretical and applied aspects of the subject, for which the authors have selected a wide class of problems. The salient features of the book are: the foundations of Fourier series, transforms and wavelets are introduced in a unique way but in lucid language; the application areas are rich and reflect present trends in research, as the authors are professionally attached to those areas; a CD is included which contains code

  17. Digital literacies

    CERN Document Server

    Hockly, Nicky; Pegrum, Mark

    2014-01-01

    Dramatic shifts in our communication landscape have made it crucial for language teaching to go beyond print literacy and encompass the digital literacies which are increasingly central to learners' personal, social, educational and professional lives. By situating these digital literacies within a clear theoretical framework, this book provides educators and students alike with not just the background for a deeper understanding of these key 21st-century skills, but also the rationale for integrating these skills into classroom practice. This is the first methodology book to address not jus

  18. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Science.gov (United States)

    Wurman, G.; Price, M.

    2014-12-01

In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuardTM earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.

  19. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions in reducing human and economic losses, and the value of long-range earthquake prediction to planning, are obvious. Not as clear are the long-range economic and social impacts of an earthquake prediction on a specific area. The general consensus among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  20. Measuring the size of an earthquake

    Science.gov (United States)

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Earthquakes range broadly in size. A rock-burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Island earthquake in the Aleutian arc involved a 650-kilometer length of the Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths and are either simply not felt or their felt pattern does not really indicate their true size.

  1. Earthquakes-Rattling the Earth's Plumbing System

    Science.gov (United States)

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  2. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  3. Earthquake design for controlled structures

    Directory of Open Access Journals (Sweden)

    Nikos G. Pnevmatikos

    2017-04-01

Full Text Available An alternative design philosophy for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range, is described. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to resist elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and design forces. The maximum capacity of the control devices is then compared to the required control force. If the capacity of the control devices is larger than the required control force, then the control devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, then a scale factor, α, reducing the elastic forces to new design forces is calculated. The structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost, excluding that of buying and installing the control devices.
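The capacity check and the fallback scale factor α can be sketched as follows. This is one plausible reading of the abstract (α chosen so the remaining force gap exactly matches the device capacity); the exact redesign rule in the paper may differ:

```python
def control_design(f_elastic, f_design, device_capacity):
    # Devices are asked to carry the gap between elastic and design
    # forces. If they can, accept them; otherwise scale the elastic
    # forces down by alpha so that the remaining gap fits the capacity.
    f_required = f_elastic - f_design
    if device_capacity >= f_required:
        return 1.0, f_design            # devices accepted, design unchanged
    alpha = 1.0 - device_capacity / f_elastic
    return alpha, alpha * f_elastic     # redesign to the scaled forces

# illustrative numbers: elastic demand 100, code design force 40
alpha_ok, f_ok = control_design(100.0, 40.0, 80.0)   # capacity 80 >= gap 60
alpha_lo, f_lo = control_design(100.0, 40.0, 30.0)   # capacity 30 < gap 60
```

In the second case the structure is redesigned for the larger force α·f_elastic = 70 while the devices carry the remaining 30, so the total still reaches the elastic demand.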

  4. Using Smartphones to Detect Earthquakes

    Science.gov (United States)

    Kong, Q.; Allen, R. M.

    2012-12-01

We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may work as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record the acceleration in real time. These records can be saved on the local phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer while located on controlled shake tables for a variety of tests. The results show that the accelerometer in a smartphone can reproduce the characteristics of the shaking very well, even when the phone is left freely on the shake table. The nature of these datasets is also quite different from traditional networks due to the fact that smartphones move around with their owners. Therefore, we must distinguish earthquake signals from those of other daily activities. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, driving etc. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now at the stage of being ready to develop the basic infrastructure for a smartphone seismic network.
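Distinguishing earthquake shaking from everyday phone motion starts from features computed on short acceleration windows. The two toy features below (peak amplitude and dominant frequency) are stand-ins for whatever inputs the paper's neural network actually used, which the abstract does not specify:

```python
import numpy as np

def window_features(window, fs):
    # Two simple features of an acceleration window: peak absolute
    # amplitude after demeaning, and the dominant frequency from the
    # amplitude spectrum. Real classifiers would use many more.
    w = np.asarray(window, float)
    w = w - w.mean()
    spec = np.abs(np.fft.rfft(w))
    freqs = np.fft.rfftfreq(w.size, d=1.0 / fs)
    return float(np.max(np.abs(w))), float(freqs[np.argmax(spec)])

# a 10 s window of 2 Hz shaking sampled at 50 Hz (synthetic)
fs = 50
t = np.arange(0, 10, 1.0 / fs)
peak, f_dom = window_features(0.5 * np.sin(2 * np.pi * 2.0 * t), fs)
```

Feature vectors like this, computed for both earthquake records and logged human activities, would then be fed to a classifier such as the neural network described above.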

  5. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.

  6. Digital delicacies

    OpenAIRE

    Holley, Rose

    2001-01-01

    This presentation outlines the purpose and work of the newly appointed Digital Projects Librarian at the University of Auckland. It gives a brief overview of what digitisation is, the benefits, the stages of a digitisation project and also samples of interesting international digitisation projects and new University of Auckland Library Digitisation projects.

  7. Digital Forensics

    Science.gov (United States)

    Harron, Jason; Langdon, John; Gonzalez, Jennifer; Cater, Scott

    2017-01-01

    The term forensic science may evoke thoughts of blood-spatter analysis, DNA testing, and identifying molds, spores, and larvae. A growing part of this field, however, is that of digital forensics, involving techniques with clear connections to math and physics. This article describes a five-part project involving smartphones and the investigation…

  8. Digital Disruption

    DEFF Research Database (Denmark)

    Rosenstand, Claus Andreas Foss

the digital domain beyond the level that characterizes the current debate, new knowledge about digital disruption is presented. As something new, Clayton Christensen's theory of disruptive innovation is laid out with a particular focus on small organizations' potential for exponential growth. In particular … the relationship between disruption and the ever-accelerating digital development is unfolded in the contours of a new theory of digital disruption. The book's subtitle, 'threatening and fascinating changes', points to the need for a nuanced debate about digital disruption, in contrast to the tone struck in … further calls a 'disruption council'. In fact, the council is written into the 2016 government platform of the VLK government. Disruption of organizations is not a new phenomenon, but the speed at which it happens is ever accelerating. The reason is the global mega-trend: digitalization. And that is why digital in particular…

  9. Digital books.

    Science.gov (United States)

    Wink, Diane M

    2011-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes digital books.

  10. Digital forvaltning

    DEFF Research Database (Denmark)

    Remmen, Arne; Larsen, Torben; Mosgaard, Mette

    2004-01-01

Greater efficiency, better service and more democracy are just some of the expectations attached to the introduction of digital government in the municipalities. Among other things, the chapter answers the questions: How do the municipalities live up to this in everyday practice? Which instruments do they use? Which barriers have there been…

  11. Digital Methods

    NARCIS (Netherlands)

    Rogers, R.

    2013-01-01

    In Digital Methods, Richard Rogers proposes a methodological outlook for social and cultural scholarly research on the Web that seeks to move Internet research beyond the study of online culture. It is not a toolkit for Internet research, or operating instructions for a software package; it deals

  12. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

For the first time a new natural phenomenon was established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes. Most researchers consider this a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity carried out during the last decade in different countries are based on the analysis of statistical data ΣΕ (t) and W (t). As established, the overall seismicity of the Earth and of its separate regions depends on the 11-year cycle of solar activity. The data provided in the paper, based on experimental studies, serve as a first step towards revealing cause-and-effect solar-terrestrial bonds in the series 'solar eruption - lithosphere radon - earthquakes'; further collection of experimental data is needed. For the first time, through the radon constituent of terrestrial radiation, the elementary lattice of the Hartmann network, contoured by the biolocation method, has been objectified. As was found, radon concentration variations in the Hartmann network nodes determine the dynamics of solar-terrestrial relationships. Of the three types of rapidly running processes conditioned by solar-terrestrial bonds, earthquakes are attributed to rapidly running destructive processes that occur most intensely at the junctures of tectonic massifs, along transform and deep faults. The basic factors provoking the earthquakes are both magnetic-structural effects and a long-term (over 5 months) bombardment of the lithosphere's surface by highly energetic particles of corpuscular solar flows, as confirmed by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon, an earthquake indicator, was established on the territory of Yerevan City. A month and a half later, earthquakes occurred in San Francisco, Iran, Turkey

  13. A new reference global instrumental earthquake catalogue (1900-2009)

    Science.gov (United States)

    Di Giacomo, D.; Engdahl, B.; Bondar, I.; Storchak, D. A.; Villasenor, A.; Bormann, P.; Lee, W.; Dando, B.; Harris, J.

    2011-12-01

    For seismic hazard studies on a global and/or regional scale, accurate knowledge of the spatial distribution of seismicity, the magnitude-frequency relation and the maximum magnitudes is of fundamental importance. However, such information is normally not homogeneous (or not available) for the various seismically active regions of the Earth. To achieve the GEM objectives (www.globalquakemodel.org) of calculating and communicating earthquake risk worldwide, an improved reference global instrumental catalogue for large earthquakes spanning the entire 100+ years period of instrumental seismology is an absolute necessity. To accomplish this task, we apply the most up-to-date techniques and standard observatory practices for computing the earthquake location and magnitude. In particular, the re-location procedure benefits both from the depth determination according to Engdahl and Villaseñor (2002), and the advanced technique recently implemented at the ISC (Bondár and Storchak, 2011) to account for correlated error structure. With regard to magnitude, starting from the re-located hypocenters, the classical surface and body-wave magnitudes are determined following the new IASPEI standards and by using amplitude-period data of phases collected from historical station bulletins (up to 1970), which were not available in digital format before the beginning of this work. Finally, the catalogue will provide moment magnitude values (including uncertainty) for each seismic event via seismic moment, via surface wave magnitude or via other magnitude types using empirical relationships. References Engdahl, E.R., and A. Villaseñor (2002). Global seismicity: 1900-1999. In: International Handbook of Earthquake and Engineering Seismology, eds. W.H.K. Lee, H. Kanamori, J.C. Jennings, and C. Kisslinger, Part A, 665-690, Academic Press, San Diego. Bondár, I., and D. Storchak (2011). Improved location procedures at the International Seismological Centre, Geophys. J. Int., doi:10.1111/j
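Converting a surface-wave magnitude to moment magnitude via an empirical relationship, as the last sentence describes, can look like the following. The piecewise-linear coefficients are illustrative assumptions in the spirit of published Ms-Mw regressions, not the ones adopted for this catalogue:

```python
def mw_from_ms(ms):
    # Illustrative piecewise-linear Ms -> Mw conversion (coefficients
    # are assumptions for demonstration, not the catalogue's values);
    # published regressions typically use separate fits for small and
    # large Ms because the Ms scale saturates differently in each range.
    if ms <= 6.1:
        return 0.67 * ms + 2.07
    return 0.99 * ms + 0.08

mw_small = mw_from_ms(5.0)
mw_large = mw_from_ms(7.0)
```

A production catalogue would also propagate the regression scatter into an uncertainty on each converted Mw, as the abstract notes.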

  14. Source processes of strong earthquakes in the North Tien-Shan region

    Science.gov (United States)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts the attention of scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the 19th and 20th centuries. Such large intraplate earthquakes are rare, which increases the interest in the Tien-Shan region. The presented study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and those analog records have not survived to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the 20th century, it is not always possible to obtain high-quality analog seismograms. Digitizing seismograms is a very important step in the work with analog seismic records. When working with historical seismic records, one has to take into account all the aspects and uncertainties of manual digitizing and the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program, on the basis of already existing software, which speeds up the digitizing process and accounts for all the recording-system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used time differences between P and S phases to relocate the earthquakes in North Tien-Shan, and body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of the instrument response) undesirable.
To avoid restitution, we simulated historic seismograph recordings with given values for the damping and free period of the respective instrument and
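    The S-P relocation step described above can be sketched numerically: when absolute timing is lost, the S-P arrival-time difference at a station still constrains the epicentral distance. This is a minimal illustration, not the authors' code; the velocity values are assumed crustal averages, not values from the paper.

```python
# Illustrative sketch: epicentral distance from the S-P arrival-time
# difference, usable when absolute timing is unavailable.
# VP and VS are assumed average crustal velocities (not from the paper).
VP = 6.0  # km/s, assumed average crustal P-wave velocity
VS = 3.5  # km/s, assumed average crustal S-wave velocity

def distance_from_sp(t_sp):
    """Epicentral distance (km) for an S-P time difference t_sp (s)."""
    return t_sp * (VP * VS) / (VP - VS)

print(distance_from_sp(30.0))  # 252.0 km for a 30 s S-P interval
```

    Distances from several stations can then be intersected to relocate the epicenter, as described for the North Tien-Shan events.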

  15. ARMA models for earthquake ground motions. Seismic Safety Margins Research Program

    International Nuclear Information System (INIS)

    Chang, Mark K.; Kwiatkowski, Jan W.; Nau, Robert F.; Oliver, Robert M.; Pister, Karl S.

    1981-02-01

    This report contains an analysis of four major California earthquake records using a class of discrete linear time-domain processes commonly referred to as ARMA (Autoregressive/Moving-Average) models. It has been possible to analyze these different earthquakes, identify the order of the appropriate ARMA model(s), estimate parameters and test the residuals generated by these models. It has also been possible to show the connections, similarities and differences between the traditional continuous models (with parameter estimates based on spectral analyses) and the discrete models with parameters estimated by various maximum likelihood techniques applied to digitized acceleration data in the time domain. The methodology proposed in this report is suitable for simulating earthquake ground motions in the time domain and appears to be easily adapted to serve as inputs for nonlinear discrete time models of structural motions. (author)
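    As a hedged illustration of the model class (the coefficients below are hypothetical, not the report's estimates for the California records), a discrete ARMA(2,1) acceleration-like series can be simulated directly from its time-domain difference equation:

```python
import numpy as np

# Sketch of an ARMA(2,1) simulation in the time domain. The AR/MA
# coefficients and noise level are illustrative assumptions only.
phi1, phi2 = 1.2, -0.5   # autoregressive coefficients (stationary choice)
theta1 = 0.4             # moving-average coefficient
sigma = 0.05             # innovation standard deviation

rng = np.random.default_rng(0)
n = 2000
e = rng.normal(0.0, sigma, n)   # white-noise innovations
a = np.zeros(n)                 # simulated acceleration-like series
for t in range(2, n):
    a[t] = phi1 * a[t - 1] + phi2 * a[t - 2] + e[t] + theta1 * e[t - 1]

print(a.shape)  # (2000,)
```

    Such a simulated series is the kind of time-domain input the report suggests feeding into nonlinear discrete-time models of structural motion.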

  16. Surface rupture and vertical deformation associated with 20 May 2016 M6 Petermann Ranges earthquake, Northern Territory, Australia

    Science.gov (United States)

    Gold, Ryan; Clark, Dan; King, Tamarah; Quigley, Mark

    2017-04-01

    Surface-rupturing earthquakes in stable continental regions (SCRs) occur infrequently, but when they strike heavily populated regions the damage and loss of life can be severe (e.g., the 2001 Bhuj earthquake). Quantifying the surface-rupture characteristics of these low-probability events is therefore important, both to improve understanding of the on- and off-fault deformation field near the rupture trace and to provide additional constraints on the scaling of earthquake magnitude with rupture length and displacement, which are critical inputs for seismic hazard calculations. This investigation focuses on the 20 May 2016 M6.0 Petermann Ranges earthquake, Northern Territory, Australia. We use 0.3-0.5 m high-resolution optical WorldView satellite imagery to map the trace of the surface rupture associated with the earthquake. From our mapping, we are able to trace the rupture over a length of 20 km, trending NW and exhibiting apparent north-side-up motion. To quantify the magnitude of vertical surface deformation, we use stereo WorldView images, processed with the NASA Ames Stereo Pipeline software, to generate pre- and post-earthquake digital terrain models with a spatial resolution of 1.5 to 2 m. The surface scarp is apparent in much of the post-event digital terrain model. Initial efforts to difference the pre- and post-event digital terrain models yield noisy results, though we detect vertical deformation of 0.2 to 0.6 m over length scales of 100 m to 1 km from the mapped trace of the rupture. Ongoing efforts to remove ramps and perform spatial smoothing will improve our understanding of the extent and pattern of vertical deformation. Additionally, we will compare our results with InSAR and field measurements obtained following the earthquake.
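    The differencing step can be illustrated with a toy example (the tiny grids below stand in for the 1.5-2 m terrain models; none of this is the authors' code): subtract the pre-event model from the post-event model and read the scarp offset as the difference between the uplifted and stable sides.

```python
import numpy as np

# Toy sketch of DEM differencing. A flat 4x4 "pre-event" grid is
# uplifted by 0.4 m on one side of a hypothetical scarp; the vertical
# offset is recovered from the post-minus-pre difference map.
pre = np.full((4, 4), 10.0)     # pre-event elevations (m)
post = pre.copy()
post[:, 2:] += 0.4              # north-side-up style step of 0.4 m

diff = post - pre               # vertical deformation map
offset = diff[:, 2:].mean() - diff[:, :2].mean()
print(round(float(offset), 2))  # 0.4
```

    Real difference maps also carry long-wavelength registration ramps and noise, which is why the abstract describes ramp removal and spatial smoothing as follow-up steps.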

  17. Digital Humanities and networked digital media

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2014-01-01

    This article discusses digital humanities and the growing diversity of digital media, digital materials and digital methods. The first section describes the humanities computing tradition formed around the interpretation of computation as a rule-based process, connected to a concept of digital materials centred on the digitisation of non-digital, finite works, corpora and oeuvres. The second section discusses "the big tent" of contemporary digital humanities. It is argued that there can be no unifying interpretation of digital humanities above the level of studying digital materials with the help of software-supported methods. This is so, in part, because of the complexity of the world and, in part, because digital media remain open to the projection of new epistemologies onto the functional architecture of these media. The third section discusses the heterogeneous character of digital materials...

  18. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley, which makes it a good candidate for studying what social networks, wearable devices and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the proportion of people publishing tweets, and the proportion of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake, with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all these methods provide a reliable image of the relative proportion of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves, and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together, these results will tell us whether the reaction of eyewitnesses to ground shaking, as observed through Twitter and the EMSC website analysis, is tool specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.
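    The flashsourcing premise, that website hit times lag the origin time by roughly the P-wave travel time to each visitor, can be sketched with an assumed propagation speed (the value below is an illustrative regional average, not a number from the study):

```python
# Hedged sketch of the flashsourcing timing argument. The P-wave speed
# is an assumed regional-distance average, not a value from the study.
VP = 8.0  # km/s, assumed P-wave speed at regional distances

def expected_hit_time(origin_time_s, distance_km, vp=VP):
    """Earliest plausible website hit time (s after origin) for an
    eyewitness at distance_km from the epicenter."""
    return origin_time_s + distance_km / vp

print(expected_hit_time(0.0, 400.0))  # 50.0 s after the origin time
```

    Fitting this moveout to the observed hit times is, in essence, how a couple of minutes of traffic can constrain an epicentral location.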

  19. Digital Materia

    OpenAIRE

    Lindgren, Marcus; Richey, Emma

    2014-01-01

    Drawing on ideas from the educator Montessori and the philosophers Plato and Baudrillard, this work has addressed questions about the computer and its significance for a graphic artist. The research question was formulated along the way and finally read: "How can matter manifest itself in digital form?" The research resulted in a hypothesis for how digital matter could be born inside the computer: by mixing two sets of data, just as two sets of genes together create a new organism. During production, a method was accordingly developed for...

  20. Becoming digital

    DEFF Research Database (Denmark)

    Pors, Anja Svejgaard

    2015-01-01

    government, and draws on empirical material generated through observations, field notes, interviews and policy documents. The material documents how service is performed by frontline agents in the 'bureaucratic encounter' with citizens who need assistance to use digital self-service in order to apply online for a public benefit. Findings: The paper shows that e-government technology changes the mode of professionalism in citizen service from service to support. The paper gives an empirical account of recent Danish digital reforms and shows how the reforms both enable and constrain the work of 'becoming digital' by frontline agents. Overall, the street-level bureaucrat's classical tasks, such as specialized casework, are being displaced into promotional and educational tasks. An implication of this is blurred distinctions between the professional skills and personal competences of the frontline agent...

  1. Digital resources

    Directory of Open Access Journals (Sweden)

    Redazione Reti Medievali (a cura di)

    2005-12-01

    Full Text Available Bibliotheca Latinitatis Mediaevalis (circa VII sec. - XIV sec.) IntraText Digital Library [01/06] Corpus Scriptorum Latinorum. A digital library of Latin literature by David Camden [01/06] Fonti disponibili online concernenti la vita religiosa medievale Rete Vitae Religiosae Mediaevalis Studia Conectens [01/06] Fuentes del Medievo Hispanico Instituto de Historia, Consejo Superior de Investigaciones Científicas [01/06] Latin Literature Forum Romanum [01/06] Ludovico Antonio Muratori, Dissertazioni sopra le antichità italiane, 1751 Biblioteca dei Classici Italiani di Giuseppe Bonghi [01/06] Medieval Latin The Latin Library [01/06] Médiévales Presses Universitaires de Vincennes - Revues.org [01/06] Regesta imperii Deutsche Kommission für die Bearbeitung der Regesta Imperii e.V. [01/06] Suda On Line Byzantine Lexicography [01/06]

  2. Digital teenagers

    Directory of Open Access Journals (Sweden)

    Rafael Conde Melguizo

    2011-12-01

    Full Text Available Espín, Manuel (Coord.) (2011): Adolescentes Digitales. Revista Estudios de Juventud, Nº 92, March 2011. INJUVE, Madrid. "Adolescentes digitales" presents itself as a work that aims to explore, from different points of view, the reality of the generation known as "digital natives" (Prensky): the current generation of adolescents who have grown up with the internet and the digital world as their normal environment of socialization.

  3. Digital Flora

    OpenAIRE

    Bunnell, Katie

    2007-01-01

    This research is concerned with developing a new business model for flexible small scale ceramic production that exploits the customisation capabilities of digital manufacturing technologies and the market potential and global connectivity of the world wide web. It is arguably of particular relevance to regional economic development in remote areas (Amin, Tomaney, Sabel) such as Cornwall where there is a culture of high quality small scale production, limited market and manufacturing opportun...

  4. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity demands special care. Therefore, in the design of nuclear power plants safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures to earthquakes have naturally been incorporated in the examination of safety. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the disaster due to an earthquake can be remarkably large. In nuclear plants, the prevention of damage to the facilities and the maintenance of their functions are required at the time of earthquakes. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground and the position of the plant should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads and the allowable stress are explained. (Kako, I.)

  5. An open repository of earthquake-triggered ground-failure inventories

    Science.gov (United States)

    Schmitt, Robert G.; Tanyas, Hakan; Nowicki Jessee, M. Anna; Zhu, Jing; Biegel, Katherine M.; Allstadt, Kate E.; Jibson, Randall W.; Thompson, Eric M.; van Westen, Cees J.; Sato, Hiroshi P.; Wald, David J.; Godt, Jonathan W.; Gorum, Tolga; Xu, Chong; Rathje, Ellen M.; Knudsen, Keith L.

    2017-12-20

    Earthquake-triggered ground failure, such as landsliding and liquefaction, can contribute significantly to losses, but our current ability to accurately include them in earthquake-hazard analyses is limited. The development of robust and widely applicable models requires access to numerous inventories of ground failures triggered by earthquakes that span a broad range of terrains, shaking characteristics, and climates. We present an openly accessible, centralized earthquake-triggered ground-failure inventory repository in the form of a ScienceBase Community to provide open access to these data with the goal of accelerating research progress. The ScienceBase Community hosts digital inventories created by both U.S. Geological Survey (USGS) and non-USGS authors. We present the original digital inventory files (when available) as well as an integrated database with uniform attributes. We also summarize the mapping methodology and level of completeness as reported by the original author(s) for each inventory. This document describes the steps taken to collect, process, and compile the inventories and the process for adding additional ground-failure inventories to the ScienceBase Community in the future.

  6. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) remains, even after 25 years, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead; 10,000 injured; 3000 families left homeless). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the earthquake risk assessment clearly indicates that the losses and damage could be two or three times greater in Cairo than in the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management.
Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  7. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes: 1) evaluation of earthquake vibration by methods using fault models, 2) investigation and assessment of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) residual risk. The guiding principle of the revision required a new method of evaluating earthquake vibration using fault models, and evaluation of the probability of earthquake vibration. Residual risk means that facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, this scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt of the 1995 Hyogo-Nanbu earthquake and its strong vibration pulse, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  8. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid delivery of earthquake information to the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever people are, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, only increasing anxiety among the population with each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are of societal importance even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre), with the financial support of the Fondation MAIF, aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre), while potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC. This rapidly assesses earthquake impact by comparing the population exposed to each expected

  9. Renormalization group theory of earthquakes

    Directory of Open Access Journals (Sweden)

    H. Saleur

    1996-01-01

    Full Text Available We study theoretically the physical origin of the proposed discrete scale invariance of earthquake processes, at the origin of the universal log-periodic corrections to scaling recently discovered in regional seismic activity (Sornette and Sammis, 1995). The discrete scaling symmetries which may be present at smaller scales are shown to be robust on a global scale with respect to disorder. Furthermore, a single complex exponent is sufficient in practice to capture the essential properties of the leading correction to scaling, whose real part may be renormalized by disorder and thus be specific to the system. We then propose a new mechanism for discrete scale invariance, based on the interplay between dynamics and disorder. The existence of non-linear corrections to the renormalization group flow implies that an earthquake is not an isolated 'critical point', but is accompanied by an embedded set of 'critical points': its foreshocks and any subsequent shocks for which it may be a foreshock.
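    The "single complex exponent" statement has a concrete form: a complex exponent m + i·omega turns a pure power law into a power law decorated by log-periodic oscillations. A minimal numerical sketch follows, with all parameter values hypothetical (none are taken from the paper):

```python
import math

# Log-periodic correction to a power law near a critical time tc,
# corresponding to a complex scaling exponent m + i*omega.
# All parameter values here are illustrative assumptions.
def log_periodic(t, tc=10.0, A=1.0, B=-0.5, m=0.33, C=0.1,
                 omega=6.0, phi=0.0):
    """A + B*(tc-t)^m * [1 + C*cos(omega*ln(tc-t) + phi)], for t < tc."""
    dt = tc - t
    return A + B * dt ** m * (1.0 + C * math.cos(omega * math.log(dt) + phi))

print(round(log_periodic(9.0), 2))  # 0.45 (at dt = 1 the cosine term is C)
```

    The real part m sets the leading power-law trend; the imaginary part omega sets the period of the oscillations in ln(tc - t), which is the signature of discrete scale invariance.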

  10. The 2016 Kumamoto earthquake sequence.

    Science.gov (United States)

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  11. Earthquake lights and rupture processes

    Directory of Open Access Journals (Sweden)

    T. V. Losseva

    2005-01-01

    Full Text Available A physical model of earthquake lights is proposed. It is suggested that the magnetic diffusion from the electric and magnetic fields source region is a dominant process, explaining rather high localization of the light flashes. A 3D numerical code allowing to take into account the arbitrary distribution of currents caused by ground motion, conductivity in the ground and at its surface, including the existence of sea water above the epicenter or (and near the ruptured segments of the fault have been developed. Simulations for the 1995 Kobe earthquake were conducted taking into account the existence of sea water with realistic geometry of shores. The results do not contradict the eyewitness reports and scarce measurements of the electric and magnetic fields at large distances from the epicenter.

  12. The 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  13. Digital citizens Digital nations: the next agenda

    NARCIS (Netherlands)

    A.W. (Bert) Mulder; M.W. (Martijn) Hartog

    2015-01-01

    DIGITAL CITIZENS CREATE A DIGITAL NATION Citizens will play the lead role as they – in the next phase of the information society – collectively create a digital nation. Personal adoption of information and communication technology will create a digital infrastructure that supports individual and

  14. Dim prospects for earthquake prediction

    Science.gov (United States)

    Geller, Robert J.

    I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: “I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a ‘proven nonscience’ [Geller, 1997a] is a paradigm for others to copy.” Readers are invited to verify for themselves that neither “proven nonscience” nor any similar phrase was used by Geller [1997a].

  15. On the plant operators performance during earthquake

    International Nuclear Information System (INIS)

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There are few data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness of switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if the chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required. Although there were differences between the simulated earthquake motions used and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers

  16. Earthquake evaluation of a substation network

    International Nuclear Information System (INIS)

    Matsuda, E.N.; Savage, W.U.; Williams, K.K.; Laguens, G.C.

    1991-01-01

    The impact of the occurrence of a large, damaging earthquake on a regional electric power system is a function of the geographical distribution of strong shaking, the vulnerability of various types of electric equipment located within the affected region, and operational resources available to maintain or restore electric system functionality. Experience from numerous worldwide earthquake occurrences has shown that seismic damage to high-voltage substation equipment is typically the reason for post-earthquake loss of electric service. In this paper, the authors develop and apply a methodology to analyze earthquake impacts on Pacific Gas and Electric Company's (PG and E's) high-voltage electric substation network in central and northern California. The authors' objectives are to identify and prioritize ways to reduce the potential impact of future earthquakes on our electric system, refine PG and E's earthquake preparedness and response plans to be more realistic, and optimize seismic criteria for future equipment purchases for the electric system

  17. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a “prediction” refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a “forecast” describes the probability of an earthquake occurring, which may range from >0 to <100 percent. A forewarning would rest on observations of processes or conditions, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region

  18. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  19. St. Louis Area Earthquake Hazards Mapping Project - A Progress Report-November 2008

    Science.gov (United States)

    Karadeniz, D.; Rogers, J.D.; Williams, R.A.; Cramer, C.H.; Bauer, R.A.; Hoffman, D.; Chung, J.; Hempen, G.L.; Steckel, P.H.; Boyd, O.L.; Watkins, C.M.; McCallister, N.S.; Schweig, E.

    2009-01-01

    St. Louis has experienced minor earthquake damage at least 12 times in the past 200 years. Because of this history and its proximity to known active earthquake zones, the St. Louis Area Earthquake Hazards Mapping Project (SLAEHMP) is producing digital maps that show variability of earthquake hazards, including liquefaction and ground shaking, in the St. Louis area. The maps will be available free via the internet. Although not site specific enough to indicate the hazard at a house-by-house resolution, they can be customized by the user to show specific areas of interest, such as neighborhoods or transportation routes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as the result of an earthquake. Earthquake hazard maps provide one way of conveying such estimates. The U.S. Geological Survey (USGS), which produces earthquake hazard maps for the Nation, is working with local partners to develop detailed maps for urban areas vulnerable to strong ground shaking. These partners, which along with the USGS comprise the SLAEHMP, include the Missouri University of Science and Technology-Rolla (Missouri S&T), Missouri Department of Natural Resources (MDNR), Illinois State Geological Survey (ISGS), Saint Louis University, Missouri State Emergency Management Agency, and URS Corporation. Preliminary hazard maps covering a test portion of the 29-quadrangle St. Louis study area have been produced and are currently being evaluated by the SLAEHMP. A USGS Fact Sheet summarizing this project was produced and almost 1000 copies have been distributed at several public outreach meetings and field trips that have featured the SLAEHMP (Williams and others, 2007). In addition, a USGS website focusing on the SLAEHMP, which provides links to project results and relevant earthquake hazard information, can be found at: http://earthquake.usgs.gov/regional/ceus/urban_map/st_louis/index.php. This progress report summarizes the

  20. Understanding Great Earthquakes in Japan's Kanto Region

    Science.gov (United States)

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  1. Earthquake-triggered landslides in southwest China

    OpenAIRE

    X. L. Chen; Q. Zhou; H. Ran; R. Dong

    2012-01-01

    Southwest China is located on the southeastern margin of the Tibetan Plateau and is a region of high seismic activity. Historically, strong earthquakes occurring here have usually generated numerous landslides and caused destructive damage. This paper introduces several earthquake-triggered landslide events in this region and describes their characteristics. Also, the historical data of earthquakes with a magnitude of 7.0 or greater, having occurred in this region, is col...

  2. Digital fluorimeter

    International Nuclear Information System (INIS)

    Mello, H.A. de.

    1980-11-01

    The specifications of a digital fluorimeter which, with adequate analytical techniques, permits the determination of trace amounts of fluorescent materials in samples, are described. The fluorimeter is of the reflection type; it uses fluorescent lamps for excitation and an optical system, connected to a photomultiplier, which permits measurement of the light intensity. At the IEN (Instituto de Engenharia Nuclear) the equipment is used to determine the uranium content in sample materials to be exported. The precision of the instrument is about ±1% on the 0.1 scale, which is the one normally used in current research. (E.G.) [pt

  3. Digital entrepreneurship

    DEFF Research Database (Denmark)

    Brem, Alexander; Richter, Chris; Kraus, Sascha

    2017-01-01

    comprising guided interviews with 14 companies from Germany, Austria and Switzerland provides detailed insights into different aspects of the sharing economy phenomenon. Our results make a direct contribution to sharing economy research, especially regarding the new business models of start-ups. Here, we...... find a clear difference between the relevance of economic and social orientation. The latter appears to be in higher demand among customers than entrepreneurs. The increasingly digitalized environment has led to a changed living situation characterized by urbanity, openness to new solutions, changed...

  4. Focus: Digital

    DEFF Research Database (Denmark)

    Technology has been an all-important and defining element within the arts throughout the 20th century, and it has fundamentally changed the ways in which we produce and consume music. With this Focus we investigate the latest developments in the digital domain – and their pervasiveness and rapid...... production and reception of contemporary music and sound art. With ‘Digital’ we present four composers' very different answers to how technology impact their work. To Juliana Hodkinson it has become an integral part of her sonic writing. Rudiger Meyer analyses the relationships between art and design and how...

  5. Retrospective analysis of the Spitak earthquake

    Directory of Open Access Journals (Sweden)

    A. K. Tovmassian

    1995-06-01

    Full Text Available Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in geosphere, atmosphere and biosphere during the preparation of the Spitak earthquake; foreshocks, main shock, aftershocks, focal mechanisms, historical seismicity; seismotectonic position of the source, strong motion records, site effects; the macroseismic effect, collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  6. Smoking prevalence increases following Canterbury earthquakes.

    Science.gov (United States)

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch, semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased cited the earthquake and subsequent lifestyle changes as a reason for smoking more. In summary, 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence, and tobacco consumption levels increased in around one-third of current smokers.

  7. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that significant thermal infrared anomalies occur before and after strong earthquakes; these have been interpreted as preseismic precursors in earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitudes of Ms7.0 or greater, using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behaviour of the anomalies comprises two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies with the "time-frequency relative power spectrum" method. (2) There exist evident and distinct characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) Thermal radiation anomalies are closely related to the geological structure. (4) Thermal radiation anomalies have distinctive characteristics in duration, range, and morphology. In summary, earthquake thermal infrared anomalies can serve as a useful precursor in earthquake prediction and forecasting.

  8. Real Time Earthquake Information System in Japan

    Science.gov (United States)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that concerned governmental organizations could decide whether they needed to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs. 1) Prompt report of the occurrence of a large earthquake and the major seismic intensities caused by it, in about two minutes after the earthquake occurrence. 2) Tsunami forecast in around three minutes. 3) Information on expected arrival times and maximum heights of tsunami waves in around five minutes. 4) Information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  9. Impact- and earthquake- proof roof structure

    International Nuclear Information System (INIS)

    Shohara, Ryoichi.

    1990-01-01

    Building roofs are constituted of roof slabs, an earthquake-proof layer on their upper surface, and an impact-proof layer of iron-reinforced concrete disposed above that. Since the roofs form an earthquake-proof structure, with building dampers loaded on the upper surface of the slabs by the concrete layer, the seismic input of earthquakes to the buildings can be moderated; the impact-proof layer, in turn, ensures safety against external events such as earthquakes or airplane crashes at important facilities such as reactor buildings. (T.M.)

  10. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. This model, which has no parameter, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.

  11. Global Significant Earthquake Database, 2150 BC to present

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  12. Social Media as Seismic Networks for the Earthquake Damage Assessment

    Science.gov (United States)

    Meletti, C.; Cresci, S.; La Polla, M. N.; Marchetti, A.; Tesconi, M.

    2014-12-01

    The growing popularity of online platforms based on user-generated content is gradually creating a digital world that mirrors the physical world. In the paradigm of crowdsensing, the crowd becomes a distributed network of sensors that allows us to understand real-life events at a quasi-real-time rate. The SoS-Social Sensing project [http://socialsensing.it/] exploits opportunistic crowdsensing, involving users in the sensing process in a minimal way, for social media emergency management purposes, in order to obtain a very fast but still reliable assessment of the dimension of the emergency to be faced. First of all, we designed and implemented a decision support system for the detection and damage assessment of earthquakes. Our system exploits the messages shared in real time on Twitter. In the detection phase, data mining and natural language processing techniques are first adopted to select meaningful and comprehensive sets of tweets. We then applied a burst detection algorithm in order to promptly identify outbreaking seismic events. Using georeferenced tweets and reported locality names, a rough epicentral determination is also possible. The results, compared to official Italian INGV reports, show that the system is able to detect, within seconds, events of a magnitude in the region of 3.5 with a precision of 75% and a recall of 81.82%. We then focused our attention on the damage assessment phase. We investigated the possibility of exploiting social media data to estimate earthquake intensity. We designed a set of predictive linear models and evaluated their ability to map the intensity of worldwide earthquakes. The models build on a dataset of almost 5 million tweets, exploited to compute our earthquake features, and more than 7,000 globally distributed earthquakes, acquired in a semi-automatic way from USGS, serving as ground truth. We extracted 45 distinct features falling into four categories: profile, tweet, time and linguistic. We run diagnostic tests and

  13. Digital Humanities

    DEFF Research Database (Denmark)

    Nielsen, Hans Jørn

    2015-01-01

    the transition from print culture to digital culture. First, the issue of digitizing literary cultural heritage, with a focus on coding and tagging the text and organizing it in hypertext structures. Second, the reorganization of the digital document into data elements and databases. Third,......The article first presents some general issues concerning Digital Humanities (DH), with the aim of examining them more closely in relation to concrete examples of different modes of digitization and changes in document production. Narrowing the scope, the article selects the tendency within DH......that regards DH as connected with the "making" and "building" of digital objects and forms. This can also be characterized as DH's practical-productive turn. The article considers three types of digitization, chosen because they represent different ways of handling digitization...

  14. The rupture process of the Manjil, Iran earthquake of 20 June 1990 and implications for intraplate strike-slip earthquakes

    Science.gov (United States)

    Choy, G.L.; Zednik, J.

    1997-01-01

    In terms of seismically radiated energy or moment release, the earthquake of 20 June 1990 in the Manjil Basin-Alborz Mountain region of Iran is the second largest strike-slip earthquake to have occurred in an intracontinental setting in the past decade. It caused enormous loss of life and the virtual destruction of several cities. Despite a very large meizoseismal area, the identification of the causative faults has been hampered by the lack of reliable earthquake locations and conflicting field reports of surface displacement. Using broadband data from global networks of digitally recording seismographs, we analyse broadband seismic waveforms to derive characteristics of the rupture process. Complexities in waveforms generated by the earthquake indicate that the main shock consisted of a tiny precursory subevent followed in the next 20 seconds by a series of four major subevents with depths ranging from 10 to 15 km. The focal mechanisms of the major subevents, which are predominantly strike-slip, have a common nodal plane striking about 285°-295°. Based on the coincidence of this strike with the dominant tectonic fabric of the region, we presume that the EW-striking planes are the fault planes. The first major subevent nucleated slightly south of the initial precursor. The second subevent occurred northwest of the initial precursor. The last two subevents moved progressively southeastward of the first subevent in a direction collinear with the predominant strike of the fault planes. The offsets in the relative locations and the temporal delays of the rupture subevents indicate a heterogeneous distribution of fracture strength and the involvement of multiple faults. The spatial distribution of teleseismic aftershocks, which at first appears uncorrelated with meizoseismal contours, can be decomposed into stages. The initial activity, being within and on the periphery of the rupture zone, correlates in shape and length with meizoseismal lines. 
In the second stage

  15. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools, for use within Google Maps, for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript and MySQL) and its new tools using a database of earthquake data. The platform allows us to carry out statistical and deterministic analyses of earthquake data within Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, and multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, such as plotting selected magnitude ranges or time periods. The plotting facility supports statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools, we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
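The 'b' calculation mentioned above usually refers to the b-value of the Gutenberg-Richter relation log10 N = a - bM. A minimal sketch using the classic Aki (1965) maximum-likelihood estimator, assuming a catalogue complete above magnitude mc (the sample magnitudes below are invented):

```python
import math

def b_value(magnitudes, mc):
    """Aki (1965) maximum-likelihood b-value estimate:
    b = log10(e) / (mean(M) - mc), over events with M >= mc."""
    above = [m for m in magnitudes if m >= mc]
    mean_mag = sum(above) / len(above)
    return math.log10(math.e) / (mean_mag - mc)

mags = [3.1, 3.4, 3.2, 4.0, 3.6, 3.3, 3.8, 3.5, 4.4, 3.2]
print(round(b_value(mags, mc=3.0), 2))  # 0.79
```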

  16. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast model verification methods, we address the challenges of applying spatial forecast verification to simulators: namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
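The receiver operating characteristic comparison described above can be sketched as follows: sweep a threshold over the forecast rate map and, for each threshold, record the fraction of observed-event cells alarmed (hit rate) against the fraction of empty cells alarmed (false-alarm rate). The cell names and rates below are invented for illustration:

```python
def roc_points(rate_map, event_cells):
    """Return (false_alarm_rate, hit_rate) pairs, one per distinct
    forecast-rate threshold, swept from highest rate to lowest."""
    non_event = set(rate_map) - event_cells
    points = []
    for thresh in sorted(set(rate_map.values()), reverse=True):
        alarmed = {c for c, rate in rate_map.items() if rate >= thresh}
        hit_rate = len(alarmed & event_cells) / len(event_cells)
        false_alarm = len(alarmed & non_event) / len(non_event)
        points.append((false_alarm, hit_rate))
    return points

rates = {"A": 0.9, "B": 0.5, "C": 0.1, "D": 0.7}
print(roc_points(rates, event_cells={"A", "C"}))
```

A forecast is better the more its curve hugs the top-left corner (high hit rate at low false-alarm rate).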

  17. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  18. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    Science.gov (United States)

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.

  19. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rate increasing at an exponential rate with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on the fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is reasonable, considering the well-known relationship between stress and the b-value. It suggests that the probability of a tiny rock failure expanding into a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.

  20. Stress triggering of the Lushan M7.0 earthquake by the Wenchuan Ms8.0 earthquake

    Directory of Open Access Journals (Sweden)

    Wu Jianchao

    2013-08-01

    Full Text Available The Wenchuan Ms8.0 earthquake and the Lushan M7.0 earthquake occurred in the north and south segments of the Longmenshan nappe tectonic belt, respectively. Based on the focal mechanism and finite fault model of the Wenchuan Ms8.0 earthquake, we calculated the Coulomb failure stress change. The inverted Coulomb stress changes based on the Nishimura and Chenji models both show that the Lushan M7.0 earthquake occurred in an area of increased Coulomb failure stress induced by the Wenchuan Ms8.0 earthquake. The Coulomb failure stress increased by approximately 0.135-0.152 bar at the source of the Lushan M7.0 earthquake, which is far more than the stress triggering threshold. Therefore, the Lushan M7.0 earthquake was most likely triggered by the Coulomb failure stress change.
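The quantity computed in studies like this is the Coulomb failure stress change, commonly written ΔCFS = Δτ + μ′Δσn, where Δτ is the shear-stress change resolved in the slip direction, Δσn is the normal-stress change (unclamping positive), and μ′ is an effective friction coefficient. A minimal sketch with invented stress values (the choice μ′ = 0.4 is a common assumption, not a value from this abstract):

```python
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """Coulomb failure stress change: dCFS = d_tau + mu_eff * d_sigma_n.
    d_tau: shear-stress change in the slip direction (bar);
    d_sigma_n: normal-stress change, unclamping positive (bar)."""
    return d_tau + mu_eff * d_sigma_n

# Invented example values in bar; ~0.1 bar is the commonly cited
# triggering threshold that the abstract refers to.
print(round(coulomb_stress_change(0.10, 0.12), 3))  # 0.148
```

A positive ΔCFS on the receiver fault moves it closer to failure, which is the sense in which the Lushan source region was "loaded" by the Wenchuan rupture.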

  1. Foreshock occurrence before large earthquakes

    Science.gov (United States)

    Reasenberg, P.A.

    1999-01-01

    Rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured in two worldwide catalogs over ~20-year intervals. The overall rates observed are similar to ones measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering based on patterns of small and moderate aftershocks in California. The aftershock model was extended to the case of moderate foreshocks preceding large mainshocks. Overall, the observed worldwide foreshock rates exceed the extended California generic model by a factor of ~2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, had a high foreshock rate, while a minority, located in continental thrust belts, had a low rate. These differences may explain why previous surveys have found low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical for continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich. If this is so, then the California generic model may significantly underestimate the conditional probability for a very large (M ≥ 8) earthquake following a potential (M ≥ 7) foreshock in Cascadia. 
The magnitude differences among the identified foreshock-mainshock pairs in the Harvard catalog are consistent with a uniform

  2. Earthquakes, detecting and understanding them

    International Nuclear Information System (INIS)

    2008-05-01

    The signature at the surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements manifest as earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can also generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of the ocean waves on the coasts, the wind in the trees and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events, which are then discriminated, analyzed and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates'. The study of the propagation of seismic waves has allowed scientists to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates, which are colliding, diverging or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960s. It explains that convection inside the Earth is the source of the forces required for these movements. As well as explaining these great movements, this science has provided a coherent, unifying and quantitative framework which unites the explanations for all geophysical phenomena under one mechanism. (authors)

  3. Creating a tsunami disaster archive of the Great Northeastern Japan earthquake using images uploaded to the internet

    International Nuclear Information System (INIS)

    Endo, N; Takehara, A

    2014-01-01

    We think that the experiences from the disaster caused by the Great Northeastern Earthquake in Japan must be of great interest to people not only in the stricken areas but in the whole of Japan and the whole world. Accordingly, we tried to create a method to preserve the digital images of the Great Northeastern Earthquake for the next generation. The Creative Commons License may be one of the most useful solutions for avoiding complicated processes when a person other than the author would like to build a disaster archive using images uploaded to the Internet

  4. Digital radiography

    DEFF Research Database (Denmark)

    Precht, H; Gerke, O; Rosendahl, K

    2012-01-01

    BACKGROUND: New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children as they are believed to be more sensitive to ionizing radiation than adults....... OBJECTIVE: To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. MATERIALS AND METHODS: A total of 110 images of an anthropomorphic phantom were imaged on a DR system (Canon DR...... with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provide an objective image-quality assessment. RESULTS: Optimal image-quality was maintained at a dose...

  5. Fokus: Digital

    DEFF Research Database (Denmark)

    2014-01-01

    in the digital domain: developments that are proceeding rapidly and are far-reaching, and which therefore call for a closer look at the relationship between art and technology. The composer's understanding of the métier is being challenged, while established ideas about the artwork meet resistance from new media contexts...... and from changed forms of distribution. This means changed conditions for both the production and reception of art music and sound art. With Digital we take as our starting point four composers' very different takes on how technology plays a role in their work. Juliana Hodkinson describes how......

  6. Statistical properties of earthquakes clustering

    Directory of Open Access Journals (Sweden)

    A. Vecchio

    2008-04-01

    Full Text Available Often in nature the temporal distribution of inhomogeneous stochastic point processes can be modeled as a realization of renewal Poisson processes with a variable rate. Here we investigate one of the classical examples, namely, the temporal distribution of earthquakes. We show that this process strongly departs from Poisson statistics for both catalogue and sequence data sets. This indicates the presence of correlations in the system, probably related to the stressing perturbation characterizing the seismicity in the area under analysis. As shown by this analysis, the catalogues, at variance with the sequences, show common statistical properties.
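    The departure-from-Poisson test described in this abstract can be illustrated with a minimal sketch: for a homogeneous Poisson process, inter-event (waiting) times are exponentially distributed, so a Kolmogorov-Smirnov test of the observed waiting times against an exponential law can flag clustering. The catalog below is synthetic and the test setup is an assumption for illustration, not the authors' actual procedure.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical catalog: 500 event times (days) drawn uniformly, which
# mimics a homogeneous Poisson process over the observation window.
event_times = np.sort(rng.uniform(0.0, 1000.0, size=500))
waiting = np.diff(event_times)

# Kolmogorov-Smirnov test of the waiting times against an exponential
# distribution with the observed mean inter-event time as scale.
ks_stat, p_value = stats.kstest(waiting, "expon", args=(0.0, waiting.mean()))

print(f"KS statistic = {ks_stat:.3f}, p = {p_value:.3f}")
# A small p-value would indicate departure from Poisson behaviour,
# i.e. correlations (clustering) in the catalog, as the abstract reports
# for real earthquake data.
```

Real catalogs, unlike this synthetic one, typically fail the test because aftershock sequences introduce strong short-term clustering.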

  7. Refresher Course on Physics of Earthquakes -98 ...

    Indian Academy of Sciences (India)

    The objective of this course is to help teachers gain an understanding of the earthquake phenomenon and the physical processes involved in its genesis, as well as of the earthquake waves which propagate the energy released by the earthquake rupture outward from the source. The Course will begin with mathematical ...

  8. Tutorial on earthquake rotational effects: historical examples

    Czech Academy of Sciences Publication Activity Database

    Kozák, Jan

    2009-01-01

    Roč. 99, 2B (2009), s. 998-1010 ISSN 0037-1106 Institutional research plan: CEZ:AV0Z30120515 Keywords : rotational seismic models * earthquake rotational effects * historical earthquakes Subject RIV: DC - Seismology, Volcanology, Earth Structure Impact factor: 1.860, year: 2009

  9. Wood-framed houses for earthquake zones

    DEFF Research Database (Denmark)

    Hansen, Klavs Feilberg

    Wood-framed houses with a sheathing are suitable for use in earthquake zones. The Direction describes a method of determining the earthquake forces in a house and shows how these forces can be resisted by diaphragm action in the walls, floors, and roof, of the house. An appendix explains how...

  10. Earthquake effect on the geological environment

    International Nuclear Information System (INIS)

    Kawamura, Makoto

    1999-01-01

    Acceleration caused by earthquakes, changes in the water pressure, and the rock-mass strain were monitored for a series of 344 earthquakes from 1990 to 1998 at the Kamaishi In Situ Test Site. The largest acceleration registered was 57.14 gal, for the 'North Coast of Iwate Earthquake' (M4.4) that occurred in June 1996. Changes of the water pressure were recorded for 27 earthquakes; the largest change was -0.35 kgf/cm^2. The water-pressure change caused by an earthquake was, however, usually smaller than that caused by rainfall in this area. No change in the electric conductivity or pH of ground water was detected before and after earthquakes throughout the entire period of monitoring. The rock-mass strain was measured with an extensometer whose detection limit was of the order of 10^-8 to 10^-9, and a remaining strain of about 2.5x10^-9 was detected following the 'Offshore Miyagi Earthquake' (M5.1) in October 1997. (H. Baba)

  11. Designing an Earthquake-Resistant Building

    Science.gov (United States)

    English, Lyn D.; King, Donna T.

    2016-01-01

    How do cross-bracing, geometry, and base isolation help buildings withstand earthquakes? These important structural design features involve fundamental geometry that elementary school students can readily model and understand. The problem activity, Designing an Earthquake-Resistant Building, was undertaken by several classes of sixth-grade…

  12. Passive containment system in high earthquake motion

    International Nuclear Information System (INIS)

    Kleimola, F.W.; Falls, O.B. Jr.

    1977-01-01

    High earthquake motion necessitates major design modifications in the complex of plant structures, systems and components in a nuclear power plant. Distinctive features imposed by seismic category, safety class and quality classification requirements for high seismic ground acceleration loadings are significantly reflected in plant costs. The design features in the Passive Containment System (PCS) responding to high earthquake ground motion are described.

  13. Napa Earthquake impact on water systems

    Science.gov (United States)

    Wang, J.

    2014-12-01

    The South Napa earthquake occurred in Napa, California, on August 24 at 3 a.m. local time, with a magnitude of 6.0. The earthquake was the largest in the SF Bay Area since the 1989 Loma Prieta earthquake. Economic loss topped $1 billion. Winemakers were cleaning up and estimating the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes potentially raise water pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so guidance on how to prevent underground and surface water pollution from earthquakes could be valuable. This research gives a clear view of the drinking water system in California and pollution of river systems, as well as an estimation of earthquake impact on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water export, and salt water intrusion has reduced fresh water outflows. Strong shaking from a nearby earthquake can cause liquefaction of saturated, loose, sandy soils and could potentially damage major delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  14. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid restarting of a nuclear reactor after an earthquake by informing operators of various properties of the encountered earthquake and properly displaying the state of damage in comparison with the design standard values of the facilities. Constitution: Even in a case where the maximum accelerations due to the encountered earthquake motion exceed the design standard values, the equipment may still remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the encountered earthquake and the scram setting values, a system for comparing the floor response spectrum of the encountered earthquake's waveforms with the design floor response spectrum used in the design of the equipment, and a system for indicating the equipment requiring inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant undergoing an earthquake and to improve power saving and safety by clearly defining the portions to be inspected after the earthquake. (Kawakami, Y.)
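    The core comparison the abstract describes, checking an encountered motion's response spectrum against design values, can be sketched as a single-degree-of-freedom response-spectrum computation. The accelerogram, periods, and design values below are all assumed for illustration; this is not the patented system's actual algorithm.

```python
import numpy as np

def response_spectrum(acc, dt, periods, damping=0.05):
    """Peak pseudo-acceleration of damped SDOF oscillators driven by a
    ground acceleration record (base-excitation equation of motion),
    integrated with a simple semi-implicit Euler scheme."""
    spectrum = []
    for T in periods:
        wn = 2.0 * np.pi / T
        u, v, peak = 0.0, 0.0, 0.0
        for ag in acc:
            a = -ag - 2.0 * damping * wn * v - wn**2 * u
            v += a * dt
            u += v * dt
            peak = max(peak, abs(wn**2 * u))  # pseudo-acceleration
        spectrum.append(peak)
    return np.array(spectrum)

# Hypothetical recorded motion: a short synthetic decaying sinusoid.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
acc = 0.3 * 9.81 * np.sin(2.0 * np.pi * 2.0 * t) * np.exp(-0.3 * t)

periods = np.array([0.1, 0.2, 0.5, 1.0, 2.0])          # s
design_values = np.array([8.0, 9.0, 7.0, 4.0, 2.0])    # assumed design spectrum, m/s^2

# Flag the periods at which the encountered motion exceeds design values,
# mirroring the comparison the instruction system would display.
exceeded = response_spectrum(acc, dt, periods) > design_values
print(dict(zip(periods.tolist(), exceeded.tolist())))
```

In a real plant system the design floor response spectra would come from the seismic qualification documents rather than an assumed array.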

  15. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    One of the natural disasters with a significant impact in terms of risk and damage is the earthquake. Countries such as China, Japan, and Indonesia are located on the active boundaries of continental plates, with more frequent earthquake occurrence than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for the right decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing such as GeoEye-1 provide added value and excellence in the use of remote sensing as one of the methods for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  16. How fault geometry controls earthquake magnitude

    Science.gov (United States)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.
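    The curvature calculation the abstract refers to can be sketched for a two-dimensional along-dip profile with centered finite differences. The slab geometry below is invented for illustration, and the authors' actual computation over real subduction-zone geometry is certainly more elaborate.

```python
import numpy as np

# Hypothetical along-dip profile of a megathrust interface: depth z (km)
# as a function of horizontal distance x from the trench (km).
x = np.linspace(0.0, 200.0, 201)
z = 5.0 + 0.08 * x + 2.0e-4 * x**2          # gently steepening slab

# Curvature of the curve z(x): kappa = z'' / (1 + z'^2)^(3/2),
# evaluated with centered finite differences via np.gradient.
dz = np.gradient(z, x)
d2z = np.gradient(dz, x)
kappa = d2z / (1.0 + dz**2) ** 1.5

print(f"mean |curvature| = {np.abs(kappa).mean():.2e} 1/km")
# Per the abstract's argument, flat (low-curvature) interfaces load
# stress more homogeneously and can release it over larger areas.
```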

  17. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    Energy Technology Data Exchange (ETDEWEB)

    na

    2001-02-08

    It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog with widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniformly distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one having a higher magnitude within a short time frame and within a close distance are at least as high as those found with regional catalogs. These catalogs have completeness thresholds two to three units higher in magnitude than the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also

  18. EARTHQUAKE TRIGGERING AND SPATIAL-TEMPORAL RELATIONS IN THE VICINITY OF YUCCA MOUNTAIN, NEVADA

    International Nuclear Information System (INIS)

    2001-01-01

    It is well accepted that the 1992 M 5.6 Little Skull Mountain earthquake, the largest historical event to have occurred within 25 km of Yucca Mountain, Nevada, was triggered by the M 7.2 Landers earthquake that occurred the day before. On the premise that earthquakes can be triggered by applied stresses, we have examined the earthquake catalog from the Southern Great Basin Digital Seismic Network (SGBDSN) for other evidence of triggering by external and internal stresses. This catalog now comprises over 12,000 events, encompassing five years of consistent monitoring, and has a low threshold of completeness, varying from M 0 in the center of the network to M 1 at the fringes. We examined the SGBDSN catalog response to external stresses such as large signals propagating from teleseismic and regional earthquakes, microseismic storms, and earth tides. Results are generally negative. We also examined the interplay of earthquakes within the SGBDSN. The number of "foreshocks", as judged by most criteria, is significantly higher than the background seismicity rate. In order to establish this, we first removed aftershocks from the catalog with widely used methodology. The existence of SGBDSN foreshocks is supported by comparing actual statistics to those of a simulated catalog with uniformly distributed locations and Poisson-distributed times of occurrence. The probabilities of a given SGBDSN earthquake being followed by one having a higher magnitude within a short time frame and within a close distance are at least as high as those found with regional catalogs. These catalogs have completeness thresholds two to three units higher in magnitude than the SGBDSN catalog used here. The largest earthquake in the SGBDSN catalog, the M 4.7 event in Frenchman Flat on 01/27/1999, was preceded by a definite foreshock sequence. The largest event within 75 km of Yucca Mountain in historical time, the M 5.7 Scotty's Junction event of 08/01/1999, was also preceded by foreshocks. The

  19. Digital work in a digitally challenged organization

    NARCIS (Netherlands)

    Davison, R.M.; Ou, Carol

    Digitally literate employees are accustomed to having free access to digital media technologies. However, some organizations enact information technology (IT) governance structures that explicitly proscribe access to these technologies, resulting in considerable tension between employees and the

  20. The October 1992 Parkfield, California, earthquake prediction

    Science.gov (United States)

    Langbein, J.

    1992-01-01

    A magnitude 4.7 earthquake occurred near Parkfield, California, on October 20, 1992, at 05:28 UTC (October 19 at 10:28 p.m. local or Pacific Daylight Time). This moderate shock, interpreted as a potential foreshock of a damaging earthquake on the San Andreas fault, triggered long-standing federal, state and local government plans to issue a public warning of an imminent magnitude 6 earthquake near Parkfield. Although the predicted earthquake did not take place, sophisticated suites of instruments deployed as part of the Parkfield Earthquake Prediction Experiment recorded valuable data associated with an unusual series of events. This article describes the geological aspects of these events, which occurred near Parkfield in October 1992. The accompanying article, an edited version of a press conference by Richard Andrews, the Director of the California Office of Emergency Services (OES), describes the governmental response to the prediction.

  1. Parallelization of the Coupled Earthquake Model

    Science.gov (United States)

    Block, Gary; Li, P. Peggy; Song, Yuhe T.

    2007-01-01

    This Web-based tsunami simulation system allows users to remotely run a model on JPL's supercomputers for a given undersea earthquake. At the time of this reporting, predicting tsunamis on the Internet had never been done before. This new code directly couples the earthquake model and the ocean model on parallel computers and improves simulation speed. Seismometers can only detect information from earthquakes; they cannot detect whether or not a tsunami may occur as a result of the earthquake. When earthquake-tsunami models are coupled with the improved computational speed of modern, high-performance computers and constrained by remotely sensed data, they are able to provide early warnings for those coastal regions at risk. The software is capable of testing NASA's satellite observations of tsunamis. It has been successfully tested for several historical tsunamis, has passed all alpha and beta testing, and is well documented for users.

  2. Low cost earthquake resistant ferrocement small house

    International Nuclear Information System (INIS)

    Saleem, M.A.; Ashraf, M.; Ashraf, M.

    2008-01-01

    The greatest humanitarian challenge faced even today, one year after the Kashmir Hazara earthquake, is that of providing shelter. Currently, one in seven people on the globe lives in a slum or refugee camp. The earthquake of October 2005 resulted in a great loss of life and property. This research work is mainly focused on developing a design for a small, low-cost and earthquake-resistant house. Ferrocement panels are recommended as the main structural elements, with a lightweight truss roofing system. Earthquake resistance is ensured by analyzing the structure in ETABS for seismic activity of zone 4. The behavior of the structure is found to be satisfactory under earthquake loading. An estimate of cost is also presented, which shows that it is an economical solution. (author)

  3. Antioptimization of earthquake excitation and response

    Directory of Open Access Journals (Sweden)

    G. Zuccaro

    1998-01-01

    Full Text Available The paper presents a novel approach to predict the response of earthquake-excited structures. The earthquake excitation is expanded in terms of a series of deterministic functions. The coefficients of the series are represented as a point in N-dimensional space. Each available accelerogram at a certain site is then represented as a point in the above space, modeling the available fragmentary historical data. The minimum volume ellipsoid containing all points is constructed. Ellipsoidal models of uncertainty, pertinent to earthquake excitation, are developed. The maximum response of a structure subjected to the earthquake excitation, within the ellipsoidal model of the latter, is determined. This procedure of determining the least favorable response was termed in the literature (Elishakoff, 1991) an antioptimization. It appears that under the inherent uncertainty of earthquake excitation, antioptimization analysis is a viable alternative to the stochastic approach.
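    The antioptimization idea can be sketched concretely. Below, historical accelerogram expansion coefficients are invented as random points, bounded by a simplified axis-aligned ellipsoid (a stand-in for the true minimum-volume ellipsoid), and a linear response functional is maximized over that ellipsoid in closed form. All data and the linear-response assumption are illustrative, not the paper's actual model.

```python
import numpy as np

# Hypothetical data: each row holds the expansion coefficients of one
# historical accelerogram in a deterministic basis (N = 4 here).
rng = np.random.default_rng(1)
points = rng.normal(0.0, 1.0, size=(20, 4))

# Simplified axis-aligned bounding ellipsoid: center at the mean, axes
# scaled so every point satisfies sum((c_i - c0_i)^2 / a_i^2) <= 1.
center = points.mean(axis=0)
deviations = np.abs(points - center).max(axis=0)
ratios2 = (((points - center) / deviations) ** 2).sum(axis=1)
semi_axes = deviations * np.sqrt(ratios2.max())

# For a response linear in the coefficients, R(c) = r @ c, the worst case
# over the ellipsoid has the closed form (by Cauchy-Schwarz):
#   max R = r @ center + sqrt(sum((r_i * a_i)^2))
r = np.array([0.5, -1.0, 0.25, 2.0])          # assumed response weights
worst_case = r @ center + np.sqrt(np.sum((r * semi_axes) ** 2))
print(f"antioptimized (least favorable) response: {worst_case:.3f}")
```

Because every observed point lies inside the ellipsoid, the antioptimized value bounds the response of every historical record from above.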

  4. Ionospheric Anomaly before the Kyushu, Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    Full Text Available GIM data released by IGS are used in this article, and a new method combining the sliding time window method and ionospheric TEC correlation analysis of adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and earthquakes. By analyzing the abnormal changes of TEC in the 5 grid points around the seismic region, abnormal changes of ionospheric TEC are found before the earthquake, and the correlation between the TEC sequences of grid points is significantly affected by the earthquake. Based on analysis of the spatial distribution of the TEC anomaly, anomalies of 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to perform a tomographic inversion of the electron density, and the distribution of the electron density in the ionospheric anomaly is further analyzed.

  5. Deep long-period earthquakes beneath Washington and Oregon volcanoes

    Science.gov (United States)

    Nichols, M.L.; Malone, S.D.; Moran, S.C.; Thelen, W.A.; Vidale, J.E.

    2011-01-01

    Deep long-period (DLP) earthquakes are an enigmatic type of seismicity occurring near or beneath volcanoes. They are commonly associated with the presence of magma, and found in some cases to correlate with eruptive activity. To more thoroughly understand and characterize DLP occurrence near volcanoes in Washington and Oregon, we systematically searched the Pacific Northwest Seismic Network (PNSN) triggered earthquake catalog for DLPs occurring between 1980 (when PNSN began collecting digital data) and October 2009. Through our analysis we identified 60 DLPs beneath six Cascade volcanic centers. No DLPs were associated with volcanic activity, including the 1980-1986 and 2004-2008 eruptions at Mount St. Helens. More than half of the events occurred near Mount Baker, where the background flux of magmatic gases is greatest among Washington and Oregon volcanoes. The six volcanoes with DLPs (counts in parentheses) are Mount Baker (31), Glacier Peak (9), Mount Rainier (9), Mount St. Helens (9), Three Sisters (1), and Crater Lake (1). No DLPs were identified beneath Mount Adams, Mount Hood, Mount Jefferson, or Newberry Volcano, although (except at Hood) that may be due in part to poorer network coverage. In cases where the DLPs do not occur directly beneath the volcanic edifice, the locations coincide with large structural faults that extend into the deep crust. Our observations suggest the occurrence of DLPs in these areas could represent fluid and/or magma transport along pre-existing tectonic structures in the middle crust. © 2010 Elsevier B.V.

  6. What Can Sounds Tell Us About Earthquake Interactions?

    Science.gov (United States)

    Aiken, C.; Peng, Z.

    2012-12-01

    It is important not only for seismologists but also for educators to effectively convey information about earthquakes and the influences earthquakes can have on each other. Recent studies using auditory display [e.g. Kilb et al., 2012; Peng et al. 2012] have depicted catastrophic earthquakes and the effects large earthquakes can have on other parts of the world. Auditory display of earthquakes, which combines static images with time-compressed sound of recorded seismic data, is a new approach to disseminating information to a general audience about earthquakes and earthquake interactions. Earthquake interactions are influential to understanding the underlying physics of earthquakes and other seismic phenomena such as tremors in addition to their source characteristics (e.g. frequency contents, amplitudes). Earthquake interactions can include, for example, a large, shallow earthquake followed by increased seismicity around the mainshock rupture (i.e. aftershocks) or even a large earthquake triggering earthquakes or tremors several hundreds to thousands of kilometers away [Hill and Prejean, 2007; Peng and Gomberg, 2010]. We use standard tools like MATLAB, QuickTime Pro, and Python to produce animations that illustrate earthquake interactions. Our efforts are focused on producing animations that depict cross-section (side) views of tremors triggered along the San Andreas Fault by distant earthquakes, as well as map (bird's eye) views of mainshock-aftershock sequences such as the 2011/08/23 Mw5.8 Virginia earthquake sequence. These examples of earthquake interactions include sonifying earthquake and tremor catalogs as musical notes (e.g. piano keys) as well as audifying seismic data using time-compression. Our overall goal is to use auditory display to invigorate a general interest in earthquake seismology that leads to the understanding of how earthquakes occur, how earthquakes influence one another as well as tremors, and what the musical properties of these

  7. Earthquake activity along the Himalayan orogenic belt

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2017-12-01

    The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxis. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-deep earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalaya orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depth less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  8. USNA DIGITAL FORENSICS LAB

    Data.gov (United States)

    Federal Laboratory Consortium — To enable Digital Forensics and Computer Security research and educational opportunities across majors and departments. Lab MissionEstablish and maintain a Digital...

  9. Can We Teach Digital Natives Digital Literacy?

    Science.gov (United States)

    Ng, Wan

    2012-01-01

    In recent years, there has been much debate about the concept of digital natives, in particular the differences between the digital natives' knowledge and adoption of digital technologies in informal versus formal educational contexts. This paper investigates the knowledge about educational technologies of a group of undergraduate students…

  10. Earthquake Source Depths in the Zagros Mountains: A "Jelly Sandwich" or "Creme Brulee" Lithosphere?

    Science.gov (United States)

    Adams, A. N.; Nyblade, A.; Brazier, R.; Rodgers, A.; Al-Amri, A.

    2006-12-01

    The Zagros Mountain Belt of southwestern Iran is one of the most seismically active mountain belts in the world. Previous studies of the depth distribution of earthquakes in this region have shown conflicting results. Early seismic studies of teleseismically recorded events found that earthquakes in the Zagros Mountains nucleated within both the upper crust and upper mantle, indicating that the lithosphere underlying the Zagros Mountains has a strong upper crust and a strong lithospheric mantle, separated by a weak lower crust. Such a model of lithospheric structure is called the "Jelly Sandwich" model. More recent teleseismic studies, however, found that earthquakes in the Zagros Mountains occur only within the upper crust, thus indicating that the strength of the Zagros Mountains' lithosphere is primarily isolated to the upper crust. This model of lithospheric structure is called the "crème brûlée" model. Analysis of regionally recorded earthquakes nucleating within the Zagros Mountains is presented here. Data primarily come from the Saudi Arabian National Digital Seismic Network, although data sources include many regional open and closed networks. The use of regionally recorded earthquakes facilitates the analysis of a larger dataset than has been used in previous teleseismic studies. Regional waveforms have been inverted for source parameters using a range of potential source depths to determine the best fitting source parameters and depths. Results indicate that earthquakes nucleate in two distinct zones. One seismogenic zone lies at shallow, upper crustal depths. The second seismogenic zone lies near the Moho. Due to uncertainty in the source and Moho depths, further study is needed to determine whether these deeper events are nucleating within the lower crust or the upper mantle.

  11. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Stephenson, D.E.; Zandt, G.; Bouchon, M.; Hustrulid, W.A.

    1980-01-01

    In order to assess the seismic risk for an underground facility, a data base was established and analyzed to evaluate the potential for seismic disturbance. Substantial damage to underground facilities is usually the result of displacements primarily along pre-existing faults and fractures, or at the surface entrance to these facilities. Evidence of this comes from both earthquakes and large explosions. Therefore, the displacement due to earthquakes as a function of depth is important in the evaluation of the hazard to underground facilities. To evaluate potential displacements due to seismic effects of block motions along pre-existing or induced fractures, the displacement fields surrounding two types of faults were investigated. Analytical models were used to determine relative displacements of shafts and near-surface displacement of large rock masses. Numerical methods were used to determine the displacement fields associated with pure strike-slip and vertical normal faults. Results are presented as displacements for various fault lengths as a function of depth and distance. This provides input to determine potential displacements in terms of depth and distance for underground facilities, important for assessing potential sites and design parameters

  12. Use of earthquake experience data

    International Nuclear Information System (INIS)

    Eder, S.J.; Eli, M.W.

    1991-01-01

    At many of the older existing US Department of Energy (DOE) facilities, the need has arisen for evaluation guidelines for natural phenomena hazard assessment. The effect of a design basis earthquake at most of these facilities is one of the main concerns. Earthquake experience data can provide a basis for the needed seismic evaluation guidelines, resulting in an efficient screening evaluation methodology for several of the items that are in the scope of the DOE facility reviews. The experience-based screening evaluation methodology, when properly established and implemented by trained engineers, has proven to result in sufficient safety margins and focuses on real concerns via facility walkdowns, usually at costs much less than the alternative options of analysis and testing. This paper summarizes a program that is being put into place to establish uniform seismic evaluation guidelines and criteria for evaluation of existing DOE facilities. The intent of the program is to maximize use of past experience, in conjunction with a walkdown screening evaluation process

  13. Contributions to the European workshop on investigation of strong motion processing procedures

    International Nuclear Information System (INIS)

    Mohammadioun, B.; Goula, X.; Hamaide, D.

    1985-11-01

The first paper is one contribution to a joint study program on the numerical processing of accelerograms from strong earthquakes. A method is proposed for generating an analytic signal having characteristics similar to those of an actual ground displacement. From this signal, a simulated accelerogram is obtained analytically. Various numerical processing techniques are to be tested using this signal: the ground displacements they yield will be compared with the original analytic signal. The second contribution deals with a high-performance digitization complex, custom-designed to stringent technical criteria by CISI Petrole Services, which has recently been put into service at the Bureau d'Evaluation des Risques Sismiques pour la Surete des Installations Nucleaires. Specially tailored to cope with the problems raised by the sampling of strong-motion photographic recordings, it offers considerable flexibility, owing to its self-teaching design, constant monitoring of ongoing work, and numerous preprocessing options. In the third contribution, a critical examination of several processing techniques applicable to photographic recordings from SMA-1 type accelerometers is conducted. The basis for comparison was a set of two accelerograms derived from synthetic signals whose characteristics were already well known.

  14. Probabilistic approach to earthquake prediction.

    Directory of Open Access Journals (Sweden)

    G. D'Addezio

    2002-06-01

The evaluation of any earthquake forecast hypothesis requires the application of rigorous statistical methods. It implies a univocal definition of the model characterising the anomaly or precursor concerned, so that it can be objectively recognised in any circumstance and by any observer. A valid forecast hypothesis is expected to maximise successes and minimise false alarms. The probability gain associated with a precursor is also a popular way to estimate the quality of predictions based on that precursor. Some scientists make use of a statistical approach based on the computation of the likelihood of an observed realisation of seismic events, and on the comparison of the likelihoods obtained under different hypotheses. This method can be extended to algorithms that allow the computation of the density distribution of the conditional probability of earthquake occurrence in space, time and magnitude. Whatever method is chosen for building up a new hypothesis, the final assessment of its validity should be carried out by a test on a new and independent set of observations. The implementation of this test could, however, be problematic for seismicity characterised by long-term recurrence intervals. Even using the historical record, which may span time windows ranging from a few centuries to a few millennia, we have a low probability of catching more than one or two events on the same fault. By extending the record of past earthquakes back in time up to several millennia, paleoseismology represents a great opportunity to study how earthquakes recur through time and thus to provide innovative contributions to time-dependent seismic hazard assessment. Sets of paleoseismologically dated earthquakes have been established for some faults in the Mediterranean area: the Irpinia fault in Southern Italy, the Fucino fault in Central Italy, the El Asnam fault in Algeria and the Skinos fault in Central Greece. By using the age of the
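
The probability gain mentioned in the abstract has a simple operational form: it compares the earthquake rate inside precursor-based alarm windows with the unconditional rate, and the same counts feed a likelihood comparison between the competing hypotheses. A minimal sketch, with made-up counts that are purely illustrative (not from the paper):

```python
import math

# Hypothetical counts for a monitored region (illustrative numbers only):
# precursor-based alarm windows cover part of the observation time, and we
# count how many earthquakes fall inside versus outside those windows.
total_time = 1000.0      # days of observation
alarm_time = 100.0       # days covered by alarms
quakes_total = 20        # earthquakes over the whole period
quakes_in_alarm = 8      # of which this many occurred inside alarm windows

rate0 = quakes_total / total_time        # unconditional daily rate
rate1 = quakes_in_alarm / alarm_time     # rate conditioned on an alarm

# Probability gain: how much the precursor raises the occurrence rate.
gain = rate1 / rate0
print(gain)  # ~4: earthquakes are about 4x more frequent inside alarms

# Likelihood comparison: Poisson log-likelihood of the in-alarm count under
# the "no precursor" (rate0) and "precursor" (rate1) hypotheses.
def poisson_loglik(n, expected):
    return n * math.log(expected) - expected - math.lgamma(n + 1)

ll_null = poisson_loglik(quakes_in_alarm, rate0 * alarm_time)
ll_alt = poisson_loglik(quakes_in_alarm, rate1 * alarm_time)
print(ll_alt - ll_null)  # positive: the precursor hypothesis fits better
```

As the abstract notes, such comparisons are only meaningful when validated on a new, independent set of observations rather than the data used to build the hypothesis.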

  15. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    Science.gov (United States)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software package that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  16. Digital produktion

    DEFF Research Database (Denmark)

The book focuses on digital production, a powerful form of learning that facilitates pupils' learning processes and strengthens their subject-related learning outcomes. This happens when teachers develop a didactic frame design, pupils work independently within that frame, and goals and … process evaluation scaffold the pupils' academic process. In digital production processes pupils work iteratively; they develop ownership of the production and sustain their own learning processes. IT's multimodality, the pupils' collaborative approaches, knowledge sharing among pupils, and the pupils' informal play and … the pupils' digital production are the teachers' didactic framing and scaffolding approaches. Here the teachers invite pupils, as didactic designers within that frame, to organise and plan their learning processes and to take part in goal setting, evaluation and the choice of digital resources …

  17. Digital dannelse

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe

    2012-01-01

In all our eagerness to press a few more digital gadgets into schools, we risk forgetting what we actually want these gadgets for. There are high expectations that they can make it easier (and thus cheaper) to be a teacher, and perhaps they can. But there is also a question of "dannelse" (general education and character formation) tied to IT. What is digital dannelse, really? And what does it mean for the subject of Danish?

  18. Digital forensics digital evidence in criminal investigations

    CERN Document Server

    Marshall, Angus McKenzie

    2009-01-01

    The vast majority of modern criminal investigations involve some element of digital evidence, from mobile phones, computers, CCTV and other devices. Digital Forensics: Digital Evidence in Criminal Investigations provides the reader with a better understanding of how digital evidence complements "traditional" scientific evidence and examines how it can be used more effectively and efficiently in a range of investigations. Taking a new approach to the topic, this book presents digital evidence as an adjunct to other types of evidence and discusses how it can be deployed effectively in s

  19. 33 CFR 222.4 - Reporting earthquake effects.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters 3 2010-07-01 2010-07-01 false Reporting earthquake effects. 222..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... significant earthquakes. It primarily concerns damage surveys following the occurrences of earthquakes. (b...

  20. Earthquakes

    Science.gov (United States)


  1. Digital platforms as enablers for digital transformation

    DEFF Research Database (Denmark)

    Hossain, Mokter; Lassen, Astrid Heidemann

Digital platforms offer new ways for organizations to collaborate with the external environment for ideas, technologies, and knowledge. They provide new possibilities and competences, but they also bring new challenges for organizations. Understanding the role of these platforms in digital transformation is crucial. This study aims at exploring how organizations are driven towards transformation in various ways to embrace digital platforms for ideas, technologies, and knowledge. It shows the opportunities and challenges digital platforms bring to organizations, and it highlights the underlying mechanisms and potential outcomes of various digital platforms. The contribution is valuable for scholars seeking to understand and further explore this area, and it provides insight for practitioners to capture value through digital platforms and accelerate the pace of organizations' digital …

  2. Ionospheric precursors for crustal earthquakes in Italy

    Directory of Open Access Journals (Sweden)

    L. Perrone

    2010-04-01

Crustal earthquakes with magnitude 6.0>M≥5.5 observed in Italy for the period 1979–2009, including the most recent at L'Aquila on 6 April 2009, were considered to check whether the relationships obtained earlier for ionospheric precursors of strong Japanese earthquakes are valid for moderate Italian earthquakes. The ionospheric precursors are based on observed variations of the sporadic E-layer parameters (h'Es, fbEs) and foF2 at the ionospheric station of Rome. Empirical dependencies for the seismo-ionospheric disturbances relating the earthquake magnitude and the epicenter distance are obtained, and they are shown to be similar to those obtained earlier for Japanese earthquakes. The dependencies indicate that the disturbance spreads from the epicenter towards the periphery during the earthquake preparation process. Large lead times for precursor occurrence (up to 34 days for M=5.8–5.9) point to a prolonged preparation period. The possibility of using the obtained relationships for earthquake prediction is discussed.

  3. Smartphone MEMS accelerometers and earthquake early warning

    Science.gov (United States)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

The low-cost MEMS accelerometers in smartphones are attracting more and more attention from the science community due to their vast numbers and potential applications in various areas. We are using the accelerometers inside smartphones to detect earthquakes. We performed shake-table tests showing that these accelerometers are also suitable for recording the large shaking caused by earthquakes. We developed an Android app, MyShake, which can distinguish earthquake motion from everyday human activity in the recordings made by the accelerometers in personal smartphones, and upload trigger information and waveforms to our server for further analysis. The data from these smartphones form a unique dataset for seismological applications, such as earthquake early warning. In this talk I will lay out the method we used to recognize earthquake-like movement on a single smartphone, and give an overview of the whole system that harnesses the information from a network of smartphones for rapid earthquake detection. This type of system can be easily deployed and scaled up around the globe and provides additional insight into earthquake hazards.
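
As a rough illustration of on-device triggering, the classic STA/LTA (short-term average over long-term average) detector is sketched below on a synthetic trace. MyShake itself uses a trained classifier to separate earthquakes from human activity; this textbook baseline, with hypothetical window lengths and threshold, only shows the general idea of flagging a sudden rise in signal energy:

```python
import numpy as np

# Textbook STA/LTA trigger on a synthetic acceleration trace.
# Window lengths and the threshold of 5.0 are hypothetical choices.
def sta_lta(acc, dt, sta_win=0.5, lta_win=10.0):
    """Ratio of trailing short-term to long-term mean squared amplitude."""
    a2 = np.asarray(acc, float) ** 2
    c = np.concatenate([[0.0], np.cumsum(a2)])   # prefix sums for fast windows
    n_sta, n_lta = int(sta_win / dt), int(lta_win / dt)
    ratio = np.zeros(a2.size)
    for i in range(n_lta, a2.size):
        sta = (c[i + 1] - c[i + 1 - n_sta]) / n_sta
        lta = (c[i + 1] - c[i + 1 - n_lta]) / n_lta
        ratio[i] = sta / max(lta, 1e-20)
    return ratio

# Synthetic trace: low background noise plus a 2 Hz burst from t = 30-40 s.
dt = 0.02
t = np.arange(0, 60, dt)
rng = np.random.default_rng(2)
trace = 0.01 * rng.standard_normal(t.size)
trace += np.where((t > 30) & (t < 40), 0.2 * np.sin(2 * np.pi * 2 * t), 0.0)

ratio = sta_lta(trace, dt)
trigger_index = int(np.argmax(ratio > 5.0))   # first sample above threshold
print(round(t[trigger_index], 2))             # triggers shortly after 30 s
```

A real phone-based detector must also reject footsteps, drops, and vibration, which is why MyShake relies on learned features rather than a single energy ratio.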

  4. Stress triggering and the Canterbury earthquake sequence

    Science.gov (United States)

    Steacy, Sandy; Jiménez, Abigail; Holden, Caroline

    2014-01-01

The Canterbury earthquake sequence, which includes the devastating Christchurch event of 2011 February, has to date led to losses of around 40 billion NZ dollars. The location and severity of the earthquakes were a surprise to most inhabitants, as the seismic hazard model was dominated by an expected Mw > 8 earthquake on the Alpine fault and an Mw 7.5 earthquake on the Porters Pass fault, 150 and 80 km to the west of Christchurch. The sequence to date has included an Mw = 7.1 earthquake and 3 Mw ≥ 5.9 events which migrated from west to east. Here we investigate whether the later events are consistent with stress triggering and whether a simple stress map produced shortly after the first earthquake would have accurately indicated the regions where the subsequent activity occurred. We find that 100 per cent of M > 5.5 earthquakes occurred in positive stress areas computed using a slip model for the first event that was available within 10 d of its occurrence. We further find that the stress changes at the starting points of major slip patches of post-Darfield main events are consistent with triggering, although this is not always true at the hypocentral locations. Our results suggest that Coulomb stress changes contributed to the evolution of the Canterbury sequence, and we note additional areas of increased stress in the Christchurch region and on the Porters Pass fault.
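
Stress maps of the kind described here are built from the Coulomb failure stress change, conventionally written ΔCFS = Δτ + μ′Δσn. The sketch below states that standard definition in code; the parameter values are illustrative, not the paper's computations:

```python
# Coulomb failure stress change on a receiver fault, in the standard form
#   dCFS = d_tau + mu_eff * d_sigma_n
# where d_tau is the shear-stress change resolved in the slip direction,
# d_sigma_n is the normal-stress change (positive = unclamping), and mu_eff
# is an effective friction coefficient (often taken around 0.4).
# Positive dCFS moves the receiver fault closer to failure.
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    return d_tau + mu_eff * d_sigma_n

# Example values in MPa (hypothetical):
promoting = coulomb_stress_change(0.05, 0.10)    # 0.05 + 0.4*0.10 = 0.09 MPa
inhibiting = coulomb_stress_change(-0.10, 0.05)  # -0.10 + 0.4*0.05 = -0.08 MPa
print(promoting, inhibiting)
```

Mapping ΔCFS over a grid of receiver faults, using a slip model for the mainshock, is what produces the positive-stress areas the abstract refers to.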

  5. Strong motion duration and earthquake magnitude relationships

    International Nuclear Information System (INIS)

    Salmon, M.W.; Short, S.A.; Kennedy, R.P.

    1992-06-01

Earthquake duration is the total time of ground shaking from the arrival of seismic waves until the return to ambient conditions. Much of this time is at relatively low shaking levels which have little effect on seismic structural response and on earthquake damage potential. As a result, a parameter termed "strong motion duration" has been defined by a number of investigators to be used for the purpose of evaluating seismic response and assessing the potential for structural damage due to earthquakes. This report presents methods for determining strong motion duration and a time history envelope function appropriate for various evaluation purposes, for earthquake magnitude and distance, and for site soil properties. There are numerous definitions of strong motion duration. For most of these definitions, empirical studies have been completed which relate duration to earthquake magnitude and distance and to site soil properties. Each of these definitions recognizes that only the portion of an earthquake record which has sufficiently high acceleration amplitude, energy content, or some other parameter significantly affects seismic response. Studies have been performed which indicate that the portion of an earthquake record in which the power (average rate of energy input) is maximum correlates most closely with potential damage to stiff nuclear power plant structures. Hence, this report concentrates on energy-based strong motion duration definitions.
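
One widely used energy-based definition is the 5–95% significant duration: the interval over which the normalized cumulative squared acceleration (the Husid curve) rises from 5% to 95% of its final value. A minimal sketch on a synthetic accelerogram, with function names of our own choosing:

```python
import numpy as np

def significant_duration(acc, dt, lo=0.05, hi=0.95):
    """5-95% significant duration: the time for the normalized cumulative
    squared acceleration (Husid curve) to rise from `lo` to `hi`.
    Constant factors such as pi/(2g) in the Arias intensity cancel in
    the normalization, so any consistent acceleration unit works."""
    energy = np.cumsum(np.asarray(acc, float) ** 2) * dt
    husid = energy / energy[-1]                 # normalized to [0, 1]
    t = np.arange(len(husid)) * dt
    return t[np.searchsorted(husid, hi)] - t[np.searchsorted(husid, lo)]

# Synthetic accelerogram: Gaussian-enveloped noise sampled at 50 Hz.
rng = np.random.default_rng(0)
dt = 0.02
t = np.arange(0, 40, dt)
acc = np.exp(-((t - 10.0) / 6.0) ** 2) * rng.standard_normal(t.size)
print(round(significant_duration(acc, dt), 2))
```

The 5% and 95% bounds deliberately discard the weak beginning and tail of the record, matching the report's observation that low-amplitude shaking contributes little to structural response.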

  6. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas,Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake.What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  7. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine--the cycle of stress build-up and release

  8. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span, 461 B.C. to 1990 (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive in the whole world, and hence that our catalogue could be of interest to a wider international readership, Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release, therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  9. POST Earthquake Debris Management — AN Overview

    Science.gov (United States)

    Sarkar, Raju

Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as of the waste generated during reconstruction, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of its volume. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as construction material, or recycled into useful commodities. Therefore, the debris clearance operation should adopt a geotechnical engineering approach, as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, which takes into account the different criteria related to the operation's execution, is proposed by highlighting the key issues concerning the handling of the construction

  11. Automatic Earthquake Detection by Active Learning

    Science.gov (United States)

    Bergen, K.; Beroza, G. C.

    2017-12-01

    In recent years, advances in machine learning have transformed fields such as image recognition, natural language processing and recommender systems. Many of these performance gains have relied on the availability of large, labeled data sets to train high-accuracy models; labeled data sets are those for which each sample includes a target class label, such as waveforms tagged as either earthquakes or noise. Earthquake seismologists are increasingly leveraging machine learning and data mining techniques to detect and analyze weak earthquake signals in large seismic data sets. One of the challenges in applying machine learning to seismic data sets is the limited labeled data problem; learning algorithms need to be given examples of earthquake waveforms, but the number of known events, taken from earthquake catalogs, may be insufficient to build an accurate detector. Furthermore, earthquake catalogs are known to be incomplete, resulting in training data that may be biased towards larger events and contain inaccurate labels. This challenge is compounded by the class imbalance problem; the events of interest, earthquakes, are infrequent relative to noise in continuous data sets, and many learning algorithms perform poorly on rare classes. In this work, we investigate the use of active learning for automatic earthquake detection. Active learning is a type of semi-supervised machine learning that uses a human-in-the-loop approach to strategically supplement a small initial training set. The learning algorithm incorporates domain expertise through interaction between a human expert and the algorithm, with the algorithm actively posing queries to the user to improve detection performance. We demonstrate the potential of active machine learning to improve earthquake detection performance with limited available training data.
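
The active-learning loop described above can be illustrated with uncertainty sampling: repeatedly train on the labeled pool, then ask the expert to label the sample the current model is least sure about. The toy below is our own sketch, not the authors' algorithm; the human oracle is simulated by the known labels of synthetic two-class ("noise" vs "event") feature vectors:

```python
import numpy as np

# Synthetic two-class data standing in for waveform feature vectors.
rng = np.random.default_rng(1)
n = 400
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, 2)),
               rng.normal(1.0, 1.0, (n // 2, 2))])
y = np.array([0] * (n // 2) + [1] * (n // 2))

def fit_logistic(X, y, steps=500, lr=0.1):
    """Plain gradient-descent logistic regression with a bias term."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-np.clip(Xb @ w, -30, 30)))

labeled = list(rng.choice(n, size=5, replace=False))  # tiny initial training set
for _ in range(20):                                   # 20 queries to the "expert"
    w = fit_logistic(X[labeled], y[labeled])
    p = predict_proba(w, X)
    p[labeled] = 0.0                         # already labeled: never re-queried
    query = int(np.argmin(np.abs(p - 0.5)))  # most uncertain unlabeled sample
    labeled.append(query)                    # oracle supplies its true label

w = fit_logistic(X[labeled], y[labeled])
accuracy = float(np.mean((predict_proba(w, X) > 0.5) == y))
print(round(accuracy, 3))  # strong accuracy from only 25 labels
```

The design choice that matters is the query rule: labeling the most ambiguous samples tends to move the decision boundary faster than labeling random ones, which is the appeal for catalogs with few confirmed events.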

  12. A long source area of the 1906 Colombia-Ecuador earthquake estimated from observed tsunami waveforms

    Science.gov (United States)

    Yamanaka, Yusuke; Tanioka, Yuichiro; Shiina, Takahiro

    2017-12-01

The 1906 Colombia-Ecuador earthquake induced both strong seismic motions and a tsunami, and was the most destructive earthquake in the history of the Colombia-Ecuador subduction zone. The tsunami propagated across the Pacific Ocean, and its waveforms were observed at tide gauge stations in countries including Panama, Japan, and the USA. This study conducted slip inverse analysis for the 1906 earthquake using these waveforms. A digital dataset of observed tsunami waveforms at the Naos Island (Panama) and Honolulu (USA) tide gauge stations, where the tsunami was clearly observed, was first produced by consulting documents. Next, the two waveforms were applied in an inverse analysis as the target waveforms. The results of this analysis indicated that the moment magnitude of the 1906 earthquake ranged from 8.3 to 8.6. Moreover, the dominant slip occurred in the northern part of the assumed source region near the coast of Colombia, where little significant seismicity has occurred, rather than in the southern part. The results also indicated that the source area, with significant slip, covered a long distance, including the southern, central, and northern parts of the region.
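
Slip inversion of this kind is, at its core, a linear least-squares problem: the observed waveform is modeled as a weighted sum of precomputed unit-slip waveforms (Green's functions) from candidate subfaults, and the weights are the slips. A toy sketch with synthetic pulses standing in for Green's functions (everything here is illustrative; a real study would add smoothing and nonnegativity constraints):

```python
import numpy as np

# Toy linear slip inversion: model an observed tide-gauge waveform as a
# weighted sum of unit-slip waveforms from three candidate subfaults and
# recover the weights (slips) by least squares.
rng = np.random.default_rng(3)
t = np.linspace(0.0, 10.0, 200)

# Hypothetical Green's functions: pulses arriving at different delays.
G = np.stack([np.exp(-(t - d) ** 2) for d in (2.0, 4.0, 6.0)], axis=1)

true_slip = np.array([1.5, 0.0, 0.8])                     # slip per subfault
obs = G @ true_slip + 0.01 * rng.standard_normal(t.size)  # noisy "observation"

slip, *_ = np.linalg.lstsq(G, obs, rcond=None)
print(np.round(slip, 2))  # close to the true slips [1.5, 0.0, 0.8]
```

With only two distant tide-gauge records, as in the 1906 study, the system is far more poorly conditioned than this toy, which is why the paper reports a magnitude range rather than a single value.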

  13. Economic consequences of earthquakes: bridging research and practice with HayWired

    Science.gov (United States)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  14. The Trembling Earth Before Wenchuan Earthquake: Recognition of Precursory Anomalies through High Frequency Sampling Data of Groundwater

    Science.gov (United States)

    Huang, F.

    2017-12-01

With a magnitude of MS 8.0, the 2008 Wenchuan earthquake is classified as one of the "great earthquakes", which are potentially the most destructive, since it occurred at shallow depth close to a highly populated area. It struck without prediction, because no confirmed precursors had been detected in the large volume of newly collected digital observation data. Scientists engaged in routine prediction work were condemned, and condemned themselves, for a long time afterwards. After the pain of that defeat passed, they began to reanalyze the old observation data from new perspectives: over longer temporal processes, across multiple disciplines, and in different frequency bands. This presentation will show preliminary results from groundwater level and temperature observed in 3 wells distributed along the boundaries of tectonic blocks both near and far from the Wenchuan earthquake rupture.

  15. H. Sapiens Digital: From Digital Immigrants and Digital Natives to Digital Wisdom

    Science.gov (United States)

    Prensky, Marc

    2009-01-01

    As we move further into the 21st century, the digital native/digital immigrant paradigm created by Marc Prensky in 2001 is becoming less relevant. In this article, Prensky suggests that we should focus instead on the development of what he calls "digital wisdom." Arguing that digital technology can make us not just smarter but truly wiser, Prensky…

  16. Dual beam vidicon digitizer

    International Nuclear Information System (INIS)

    Evans, T.L.

    1976-01-01

    A vidicon waveform digitizer which can simultaneously digitize two independent signals has been developed. Either transient or repetitive waveforms can be digitized with this system. A dual beam oscilloscope is used as the signal input device. The light from the oscilloscope traces is optically coupled to a television camera, where the signals are temporarily stored prior to digitizing

  17. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    Science.gov (United States)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted in the Longmenshan thrust fault region three years after the Lushan earthquake, using the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects included 3,944 student survivors from eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in it, or had relatives who died in it. The study concludes that researchers need to pay more attention to these children and adolescents, and that the government should provide them with more economic support.

  18. Sense of Community and Depressive Symptoms among Older Earthquake Survivors Following the 2008 Earthquake in Chengdu China

    Science.gov (United States)

    Li, Yawen; Sun, Fei; He, Xusong; Chan, Kin Sun

    2011-01-01

    This study examined the impact of an earthquake as well as the role of sense of community as a protective factor against depressive symptoms among older Chinese adults who survived an 8.0 magnitude earthquake in 2008. A household survey of a random sample was conducted 3 months after the earthquake and 298 older earthquake survivors participated…

  19. Sismosima: A pioneer project for earthquake detection

    International Nuclear Information System (INIS)

    Echague, C. de

    2015-01-01

    At present, earthquakes can only be studied after the fact to minimize their consequences, but the Sismosima project studies them with the aim of issuing a pre-alert where possible. The Geological and Mining Institute of Spain (IGME) launched the project, and tests in instrumented caves have already recorded an increase in carbon dioxide (CO2) coinciding with earthquakes. It remains to be checked whether the gas emission occurs simultaneously with, before, or after the shaking; if before, a couple of minutes would be enough to give an early warning that could save lives and protect facilities. (Author)

  20. ASSESSMENT OF EARTHQUAKE HAZARDS ON WASTE LANDFILLS

    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Yiannis; Psarropoulos, Prodromos

    Earthquake hazards may arise as a result of: (a) transient ground deformation, which is induced by seismic wave propagation, and (b) permanent ground deformation, which is caused by abrupt fault dislocation. Since the adequate performance of waste landfills after an earthquake is of utmost importance, the current study examines the impact of both types of earthquake hazards by performing efficient finite-element analyses. These also took into account the potential slip displacement development along the geosynthetic interfaces of the composite base liner. At first, the development of permanent...

  1. Earthquake free design of pipe lines

    International Nuclear Information System (INIS)

    Kurihara, Chizuko; Sakurai, Akio

    1974-01-01

    Long structures such as the cooling-seawater pipelines of nuclear power plants extend over a wide area of the ground surface and are subjected not only to inertia forces but also to forces arising from ground deformation and seismic wave propagation during earthquakes. Since previous reports addressed the earthquake-resistant design of underground pipelines, this report discusses the behavior of pipelines on the ground during earthquakes and proposes an aseismic design that considers the effects of both inertia forces and ground deformations. (author)

  2. Earthquake response observation of isolated buildings

    International Nuclear Information System (INIS)

    Harada, O.; Kawai, N.; Ishii, T.; Sawada, Y.; Shiojiri, H.; Mazda, T.

    1989-01-01

    Base isolation system is expected to be a technology for a rational design of FBR plant. In order to apply this system to important structures, accumulation of verification data is necessary. From this point of view, the vibration test and the earthquake response observation of the actual isolated building using laminated rubber bearings and elasto-plastic steel dampers were conducted for the purpose of investigating its dynamic behavior and of proving the reliability of the base isolation system. Since September in 1986, more than thirty earthquakes have been observed. This paper presents the results of the earthquake response observation

  3. Enabling Digital Literacy

    DEFF Research Database (Denmark)

    Ryberg, Thomas; Georgsen, Marianne

    2010-01-01

    There are some tensions between high-level policy definitions of “digital literacy” and actual teaching practice. We need to find workable definitions of digital literacy; obtain a better understanding of what digital literacy might look like in practice; and identify pedagogical approaches which support teachers in designing digital literacy learning. We suggest that frameworks such as Problem Based Learning (PBL) are approaches that enable digital literacy learning because they provide good settings for engaging with digital literacy. We illustrate this through analysis of a case. Furthermore, these operate on a meso-level mediating between high-level concepts of digital literacy and classroom practice.

  4. Progress in digital radiography

    International Nuclear Information System (INIS)

    Cappelle, A.

    2016-01-01

    Because of its practical advantages, digital radiography is increasingly used in the industrial sector. There are two kinds of digital radiography. The first, 'computed radiography', uses a photon-stimulated screen; after radiation exposure this screen must be read by an analyser to obtain a digital image. The second, 'direct radiography', yields a digital radiograph of the object directly. Digital radiography uses the same radioactive nuclides as silver-film radiography: cobalt, iridium or selenium. Its spatial resolution is not as good as that of classical silver-film radiography, but it offers better visual contrast. (A.C.)

  5. Earthquake prediction with electromagnetic phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Hayakawa, Masashi, E-mail: hayakawa@hi-seismo-em.jp [Hayakawa Institute of Seismo Electomagnetics, Co. Ltd., University of Electro-Communications (UEC) Incubation Center, 1-5-1 Chofugaoka, Chofu Tokyo, 182-8585 (Japan); Advanced Wireless & Communications Research Center, UEC, Chofu Tokyo (Japan); Earthquake Analysis Laboratory, Information Systems Inc., 4-8-15, Minami-aoyama, Minato-ku, Tokyo, 107-0062 (Japan); Fuji Security Systems. Co. Ltd., Iwato-cho 1, Shinjyuku-ku, Tokyo (Japan)

    2016-02-01

    Short-term earthquake (EQ) prediction, defined as prospective prediction on a time scale of about one week, is considered one of the most important and urgent topics for humanity. If short-term prediction were realized, casualties would be drastically reduced. Unlike conventional seismic measurement, we have proposed the use of electromagnetic phenomena as precursors to EQs, and extensive progress has been achieved in the field of seismo-electromagnetics during the last two decades. This paper reviews short-term EQ prediction, including the impossibility myth of EQ prediction by seismometers, the reason we are interested in electromagnetics, the history of seismo-electromagnetics, ionospheric perturbation as the most promising candidate for EQ prediction, the future of EQ predictology from the two standpoints of a practical science and a pure science, and finally a brief summary.

  6. Modified Mercalli intensities for some recent California earthquakes and historic San Francisco Bay Region earthquakes

    Science.gov (United States)

    Bakun, William H.

    1998-01-01

    Modified Mercalli Intensity (MMI) data for recent California earthquakes were used by Bakun and Wentworth (1997) to develop a strategy for bounding the location and moment magnitude M of earthquakes from MMI observations only. Bakun (Bull. Seismol. Soc. Amer., submitted) used the Bakun and Wentworth (1997) strategy to analyze 19th century and early 20th century San Francisco Bay Region earthquakes. The MMI data and site corrections used in these studies are listed in this Open-file Report. 

  7. Digital Sensor Technology

    Energy Technology Data Exchange (ETDEWEB)

    Ted Quinn; Jerry Mauck; Richard Bockhorst; Ken Thomas

    2013-07-01

    The nuclear industry has been slow to incorporate digital sensor technology into nuclear plant designs due to concerns with digital qualification issues. However, the benefits of digital sensor technology for nuclear plant instrumentation are substantial in terms of accuracy, reliability, availability, and maintainability. This report demonstrates these benefits in direct comparisons of digital and analog sensor applications. It also addresses the qualification issues that must be addressed in the application of digital sensor technology.

  8. Digital preservation for heritages

    CERN Document Server

    Lu, Dongming

    2011-01-01

    ""Digital Preservation for Heritages: Technologies and Applications"" provides a comprehensive and up-to-date coverage of digital technologies in the area of cultural heritage preservation, including digitalization, research aiding, conservation aiding, digital exhibition, and digital utilization. Processes, technical frameworks, key technologies, as well as typical systems and applications are discussed in the book. It is intended for researchers and students in the fields of computer science and technology, museology, and archaeology. Dr. Dongming Lu is a professor at College of Computer Sci

  9. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  10. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    Science.gov (United States)

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n= 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on and intervention program with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  11. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    Science.gov (United States)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake induced surface effects of liquefaction against a traditional pixel based change technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damages from high-resolution aerial/satellite imagery.

  12. Increased earthquake safety through optimised mounting concept

    International Nuclear Information System (INIS)

    Kollmann, Dieter; Senechal, Holger

    2013-01-01

    Since Fukushima, there has been intensive work on earthquake safety in all nuclear power plants. A large part of these efforts aims at the earthquake safety of safety-relevant pipeline systems. The problem here is not the pipeline system itself but rather its mountings and connections to components. This is precisely the topic that the KAE dealt with in years of research and development work. It has developed an algorithm that determines the optimal mounting concept in a few iteration steps for arbitrary combinations of loading conditions, while maintaining compliance with the relevant regulations, for any pipeline system. With this tool at hand, we are now in a position to plan and realise remedial measures accurately with minimum time and hardware expenditure, and so distinctly improve the earthquake safety of safety-relevant systems. (orig.)

  13. Coping with earthquakes induced by fluid injection

    Science.gov (United States)

    McGarr, Arthur F.; Bekins, Barbara; Burkardt, Nina; Dewey, James W.; Earle, Paul S.; Ellsworth, William L.; Ge, Shemin; Hickman, Stephen H.; Holland, Austin F.; Majer, Ernest; Rubinstein, Justin L.; Sheehan, Anne

    2015-01-01

    Large areas of the United States long considered geologically stable with little or no detected seismicity have recently become seismically active. The increase in earthquake activity began in the mid-continent starting in 2001 (1) and has continued to rise. In 2014, the rate of occurrence of earthquakes with magnitudes (M) of 3 and greater in Oklahoma exceeded that in California (see the figure). This elevated activity includes larger earthquakes, several with M > 5, that have caused significant damage (2, 3). To a large extent, the increasing rate of earthquakes in the mid-continent is due to fluid-injection activities used in modern energy production (1, 4, 5). We explore potential avenues for mitigating effects of induced seismicity. Although the United States is our focus here, Canada, China, the UK, and others confront similar problems associated with oil and gas production, whereas quakes induced by geothermal activities affect Switzerland, Germany, and others.

  14. Strong ground motion prediction using virtual earthquakes.

    Science.gov (United States)

    Denolle, M A; Dunham, E M; Prieto, G A; Beroza, G C

    2014-01-24

    Sedimentary basins increase the damaging effects of earthquakes by trapping and amplifying seismic waves. Simulations of seismic wave propagation in sedimentary basins capture this effect; however, there exists no method to validate these results for earthquakes that have not yet occurred. We present a new approach for ground motion prediction that uses the ambient seismic field. We apply our method to a suite of magnitude 7 scenario earthquakes on the southern San Andreas fault and compare our ground motion predictions with simulations. Both methods find strong amplification and coupling of source and structure effects, but they predict substantially different shaking patterns across the Los Angeles Basin. The virtual earthquake approach provides a new approach for predicting long-period strong ground motion.

  15. Associating an ionospheric parameter with major earthquake ...

    Indian Academy of Sciences (India)

    ionospheric disturbance (SID) and 'td' is the duration of the ... dayside of the earth, ionizing atmospheric parti- ... the increased emanation of excited radon molecules from the ground ..... tration following strong earthquake; Int. J. Remote Sens.

  16. Electrostatically actuated resonant switches for earthquake detection

    KAUST Repository

    Ramini, Abdallah H.; Masri, Karim M.; Younis, Mohammad I.

    2013-01-01

    action can be functionalized for useful purposes, such as shutting off gas pipelines in the case of earthquakes, or can be used to activate a network of sensors for seismic-activity recording in health monitoring applications. By placing a

  17. Earthquakes as Expressions of Tectonic Activity

    Indian Academy of Sciences (India)

    Sources, Types and Examples. Kusala Rajendran ... Science, Bangalore. Her research interests are mostly ... ogy, and some highlights on Indian earthquakes studies, and ..... jects, I did Applied Geophysics from the University of Roorkee.

  18. Earthquake Damping Device for Steel Frame

    Science.gov (United States)

    Zamri Ramli, Mohd; Delfy, Dezoura; Adnan, Azlan; Torman, Zaida

    2018-04-01

    Structures such as buildings, bridges and towers are prone to collapse when natural phenomena like earthquakes occur. Therefore, many design codes have been reviewed and new technologies introduced to resist earthquake energy, especially in buildings, to avoid collapse. The tuned mass damper is one of the devices introduced on structures to minimise earthquake effects. This study analyses the effectiveness of a tuned mass damper through experimental work and finite element modelling, comparing the two models under harmonic excitation. The results show that installing a tuned mass damper reduces the dynamic response of the frame, but only at certain input frequencies; at the highest input frequency applied, the tuned mass damper failed to reduce the response. In conclusion, a proper damper design requires detailed analysis based on the location of the structure and its specific ground accelerations.
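
    The frequency-dependent behavior described above (reduction near the primary resonance, loss of effectiveness at other input frequencies) can be illustrated with a generic two-degree-of-freedom model. The masses, damping ratios and Den Hartog tuning below are illustrative assumptions, not the paper's experimental values:

```python
import math

def steady_state_amplitude(omega, m1=1.0, k1=1.0, zeta1=0.02, mu=0.05, tmd=True):
    """|X1| of the main mass under a unit harmonic force F*e^{i*omega*t}."""
    c1 = 2.0 * zeta1 * math.sqrt(k1 * m1)
    if not tmd:
        # Single-degree-of-freedom frame without an absorber
        return abs(1.0 / complex(k1 - m1 * omega**2, omega * c1))
    # Absorber sized by the classical Den Hartog rules (illustrative choice)
    m2 = mu * m1
    w1 = math.sqrt(k1 / m1)
    k2 = m2 * (w1 / (1.0 + mu)) ** 2
    zeta2 = math.sqrt(3.0 * mu / (8.0 * (1.0 + mu) ** 3))
    c2 = 2.0 * zeta2 * math.sqrt(k2 * m2)
    # Complex impedance matrix of the coupled 2-DOF system
    a11 = complex(k1 + k2 - m1 * omega**2, omega * (c1 + c2))
    a12 = -complex(k2, omega * c2)
    a22 = complex(k2 - m2 * omega**2, omega * c2)
    det = a11 * a22 - a12 * a12
    return abs(a22 / det)  # Cramer's rule, force applied to mass 1

# Near the primary resonance the damper helps substantially.
w1 = 1.0
print(steady_state_amplitude(w1, tmd=False))  # ~25 (lightly damped resonance)
print(steady_state_amplitude(w1, tmd=True))   # ~4.5, a large reduction
```

    Sweeping `omega` above and below `w1` reproduces the paper's qualitative finding: the absorber splits the single resonance into two smaller peaks and gives little or no benefit far from the tuned band.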

  19. Drinking Water Earthquake Resilience Paper Data

    Data.gov (United States)

    U.S. Environmental Protection Agency — Data for the 9 figures contained in the paper, A SOFTWARE FRAMEWORK FOR ASSESSING THE RESILIENCE OF DRINKING WATER SYSTEMS TO DISASTERS WITH AN EXAMPLE EARTHQUAKE...

  20. Can Dams and Reservoirs Cause Earthquakes?

    Indian Academy of Sciences (India)

    induced earthquakes in that region. Figure 1. A cartoon to illustrate the spatial relationships between dam, reser- ... learning experience for us graduate students. Thus, on that ... infallibility and persuasiveness as in Euclidean geometry. The.

  1. DYFI data for Induced Earthquake Studies

    Data.gov (United States)

    Department of the Interior — The significant rise in seismicity rates in Oklahoma and Kansas (OK–KS) in the last decade has led to an increased interest in studying induced earthquakes. Although...

  2. Safety and survival in an earthquake

    Science.gov (United States)

    ,

    1969-01-01

    Many earth scientists in this country and abroad are focusing their studies on the search for means of predicting impending earthquakes, but, as yet, an accurate prediction of the time and place of such an event cannot be made. From past experience, however, one can assume that earthquakes will continue to harass mankind and that they will occur most frequently in the areas where they have been relatively common in the past. In the United States, earthquakes can be expected to occur most frequently in the western states, particularly in Alaska, California, Washington, Oregon, Nevada, Utah, and Montana. The danger, however, is not confined to any one part of the country; major earthquakes have occurred at widely scattered locations.

  3. Disturbances in equilibrium function after major earthquake.

    Science.gov (United States)

    Honma, Motoyasu; Endo, Nobutaka; Osada, Yoshihisa; Kim, Yoshiharu; Kuriyama, Kenichi

    2012-01-01

    Major earthquakes were followed by a large number of aftershocks and significant outbreaks of dizziness occurred over a large area. However it is unclear why major earthquake causes dizziness. We conducted an intergroup trial on equilibrium dysfunction and psychological states associated with equilibrium dysfunction in individuals exposed to repetitive aftershocks versus those who were rarely exposed. Greater equilibrium dysfunction was observed in the aftershock-exposed group under conditions without visual compensation. Equilibrium dysfunction in the aftershock-exposed group appears to have arisen from disturbance of the inner ear, as well as individual vulnerability to state anxiety enhanced by repetitive exposure to aftershocks. We indicate potential effects of autonomic stress on equilibrium function after major earthquake. Our findings may contribute to risk management of psychological and physical health after major earthquakes with aftershocks, and allow development of a new empirical approach to disaster care after such events.

  4. Shallow moonquakes - How they compare with earthquakes

    Science.gov (United States)

    Nakamura, Y.

    1980-01-01

    Of three types of moonquakes strong enough to be detectable at large distances - deep moonquakes, meteoroid impacts and shallow moonquakes - only shallow moonquakes are similar in nature to earthquakes. A comparison of various characteristics of moonquakes with those of earthquakes indeed shows a remarkable similarity between shallow moonquakes and intraplate earthquakes: (1) their occurrences are not controlled by tides; (2) they appear to occur in locations where there is evidence of structural weaknesses; (3) the relative abundances of small and large quakes (b-values) are similar, suggesting similar mechanisms; and (4) even the levels of activity may be close. The shallow moonquakes may be quite comparable in nature to intraplate earthquakes, and they may be of similar origin.
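
    The b-value comparison in point (3) refers to the slope of the Gutenberg-Richter magnitude-frequency relation, log10 N = a - b*M. A minimal sketch of the standard Aki (1965) maximum-likelihood estimator, applied to made-up magnitudes (not the moonquake or earthquake catalogs of the paper):

```python
import math

def b_value_mle(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc),
    using only events at or above the completeness magnitude Mc."""
    m = [x for x in magnitudes if x >= m_c]
    mean_excess = sum(m) / len(m) - m_c
    return math.log10(math.e) / mean_excess

# Synthetic magnitudes whose mean excess over Mc = 5.0 equals log10(e),
# so the estimator returns b ~= 1.0, a value typical of tectonic settings.
print(round(b_value_mle([5.2, 5.6686], 5.0), 3))  # -> 1.0
```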

  5. Normal Faulting in the 1923 Berdún Earthquake and Postorogenic Extension in the Pyrenees

    Science.gov (United States)

    Stich, Daniel; Martín, Rosa; Batlló, Josep; Macià, Ramón; Mancilla, Flor de Lis; Morales, Jose

    2018-04-01

    The 10 July 1923 earthquake near Berdún (Spain) is the largest instrumentally recorded event in the Pyrenees. We recover old analog seismograms and use 20 hand-digitized waveforms for regional moment tensor inversion. We estimate moment magnitude Mw 5.4, centroid depth of 8 km, and a pure normal faulting source with strike parallel to the mountain chain (N292°E), dip of 66° and rake of -88°. The new mechanism fits into the general predominance of normal faulting in the Pyrenees and extension inferred from Global Positioning System data. The unique location of the 1923 earthquake, near the south Pyrenean thrust front, shows that the extensional regime is not confined to the axial zone where high topography and the crustal root are located. Together with seismicity near the northern mountain front, this indicates that gravitational potential energy in the western Pyrenees is not extracted locally but induces a wide distribution of postorogenic deformation.
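
    The strike/dip/rake solution quoted above fully determines a double-couple moment tensor. The conversion below is the standard Aki & Richards formulation (x = north, y = east, z = down), sketched generically; it is not the authors' inversion code, and the Mw-to-M0 relation used is the usual 10^(1.5*Mw + 9.1) N*m:

```python
import math

def dc_moment_tensor(strike_deg, dip_deg, rake_deg, mw):
    """Double-couple moment tensor (N*m) from strike/dip/rake and Mw,
    in the Aki & Richards x=north, y=east, z=down convention."""
    m0 = 10.0 ** (1.5 * mw + 9.1)  # scalar seismic moment in N*m
    s, d, r = (math.radians(a) for a in (strike_deg, dip_deg, rake_deg))
    mxx = -m0 * (math.sin(d) * math.cos(r) * math.sin(2 * s)
                 + math.sin(2 * d) * math.sin(r) * math.sin(s) ** 2)
    mxy = m0 * (math.sin(d) * math.cos(r) * math.cos(2 * s)
                + 0.5 * math.sin(2 * d) * math.sin(r) * math.sin(2 * s))
    mxz = -m0 * (math.cos(d) * math.cos(r) * math.cos(s)
                 + math.cos(2 * d) * math.sin(r) * math.sin(s))
    myy = m0 * (math.sin(d) * math.cos(r) * math.sin(2 * s)
                - math.sin(2 * d) * math.sin(r) * math.cos(s) ** 2)
    myz = -m0 * (math.cos(d) * math.cos(r) * math.sin(s)
                 - math.cos(2 * d) * math.sin(r) * math.cos(s))
    mzz = m0 * math.sin(2 * d) * math.sin(r)
    return {"xx": mxx, "xy": mxy, "xz": mxz, "yy": myy, "yz": myz, "zz": mzz}

# The 1923 Berdun solution: strike N292E, dip 66, rake -88, Mw 5.4.
mt = dc_moment_tensor(292.0, 66.0, -88.0, 5.4)
# A pure double couple is trace-free (no volume change):
print(abs(mt["xx"] + mt["yy"] + mt["zz"]))  # ~0 relative to M0 ~ 1.6e17 N*m
```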

  6. A detailed analysis of some local earthquakes at Somma-Vesuvius

    Directory of Open Access Journals (Sweden)

    C. Troise

    1999-06-01

    In this paper, we analyze local earthquakes which occurred at Somma-Vesuvius during two episodes of intense seismic swarms, in 1989 and 1995 respectively. For the selected earthquakes we have computed accurate hypocentral locations, focal mechanisms and spectral parameters. We have also studied the ground acceleration produced by the largest events of the sequences (ML 3.0) at various digital stations installed in the area during the periods of higher seismic activity. The main result is that seismicity during the two swarm episodes presents similar features in both locations and focal mechanisms. Strong site-dependent effects are evident in the seismic radiation: strong amplification in the 10-15 Hz frequency band is observed at stations located on the younger Vesuvius structure with respect to one located on the ancient Somma structure. Furthermore, seismic stations show peak accelerations for the same events that differ by more than one order of magnitude.

  7. Radon, gas geochemistry, groundwater, and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    King, Chi-Yu [Power Reactor and Nuclear Fuel Development Corp., Tono Geoscience Center, Toki, Gifu (Japan)

    1998-12-31

    Radon monitoring in groundwater, soil air, and the atmosphere has continued in many seismic areas of the world for earthquake-prediction and active-fault studies. Some recent measurements of radon and other geochemical and hydrological parameters have been made for sufficiently long periods, with reliable instruments, and together with measurements of meteorological variables and solid-earth tides. The resultant data are useful for better distinguishing earthquake-related changes from various background noises. Some measurements have been carried out in areas where other geophysical measurements are also being made. Comparative studies of various kinds of geophysical data are helpful in ascertaining the reality of earthquake-related and fault-related anomalies and in understanding the underlying mechanisms. Spatial anomalies of radon and other terrestrial gases have been observed for many active faults. Such observations indicate that gas concentrations are very much site dependent, particularly on fault zones where terrestrial fluids may move vertically. Temporal anomalies have been reliably observed before and after some recent earthquakes, including the 1995 Kobe earthquake, and the general pattern of anomaly occurrence remains the same as observed before: they are recorded at only relatively few sensitive sites, which can be at much larger distances than expected from existing earthquake-source models. The sensitivity of a sensitive site is also found to change with time. These results clearly show the inadequacy of the existing dilatancy-fluid diffusion and elastic-dislocation models for earthquake sources to explain earthquake-related geochemical and geophysical changes recorded at large distances. (J.P.N.)

  8. Exploring Earthquakes in Real-Time

    Science.gov (United States)

    Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.

    2013-12-01

    Earthquakes capture the attention of students and inspire them to explore the Earth. Adding the ability to view and explore recordings of significant and newsworthy earthquakes in real-time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed ';jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student centered investigations of seismological concepts using either a low-cost educational seismograph or streamed data from other educational seismographs or from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics of earthquakes such as time of occurrence, distance from the epicenter to the station, magnitude, and location. The software has been designed to provide graphical clues to guide students in the analysis and assist in their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student driven investigations. For example, students can explore differences in the seismograms from different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic, and offers students countless opportunities to explore.
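
    The epicentral-distance exercise described above typically uses the S-minus-P arrival-time difference read from the seismogram. A minimal sketch with assumed average crustal velocities (real classroom exercises would use a travel-time table or curve, and the velocities below are illustrative):

```python
def sp_distance_km(s_minus_p_sec, vp=6.0, vs=3.5):
    """Epicentral distance from the S-P time, assuming straight-line
    travel at constant average P and S velocities (km/s)."""
    return s_minus_p_sec * (vp * vs) / (vp - vs)

# A 10 s S-P interval with these velocities puts the source ~84 km away.
print(sp_distance_km(10.0))  # -> 84.0
```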

  9. Catalog of Hawaiian earthquakes, 1823-1959

    Science.gov (United States)

    Klein, Fred W.; Wright, Thomas L.

    2000-01-01

    This catalog of more than 17,000 Hawaiian earthquakes (of magnitude greater than or equal to 5), principally located on the Island of Hawaii, from 1823 through the third quarter of 1959 is designed to expand our ability to evaluate seismic hazard in Hawaii, as well as our knowledge of Hawaiian seismic rhythms as they relate to eruption cycles at Kilauea and Mauna Loa volcanoes and to subcrustal earthquake patterns related to the tectonic evolution of the Hawaiian chain.

  10. Reliability of Soil Sublayers Under Earthquake Excitation

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Mørk, Kim Jørgensen

    A hysteretic model is formulated for a multi-layer subsoil subjected to horizontal earthquake shear waves (SH-waves). For each layer a modified Bouc-Wen model is used, relating the increments of the hysteretic shear stress to increments of the shear strain of the layer. Liquefaction is considered for each layer. The horizontal earthquake acceleration process at bedrock level is modelled as a non-stationary white noise, filtered through a time-invariant linear second order filter.
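
    A minimal sketch of the kind of incremental Bouc-Wen update the abstract describes, relating shear-strain increments to hysteretic shear stress. The parameter values and the explicit-Euler integration are illustrative assumptions, not the authors' modified formulation:

```python
import math

def bouc_wen_stress(strain_series, k=1.0, alpha=0.1, A=1.0, beta=0.5, gamma=0.5, n=1.0):
    """Integrate the Bouc-Wen hysteretic variable z over a strain history and
    return the shear stress tau = alpha*k*strain + (1 - alpha)*k*z."""
    z = 0.0
    stresses = []
    prev = strain_series[0]
    for g in strain_series:
        dg = g - prev
        # dz = A*dg - beta*|dg|*|z|^(n-1)*z - gamma*dg*|z|^n  (explicit Euler)
        dz = A * dg - beta * abs(dg) * abs(z) ** (n - 1.0) * z - gamma * dg * abs(z) ** n
        z += dz
        prev = g
        stresses.append(alpha * k * g + (1.0 - alpha) * k * z)
    return stresses

# Two cycles of large-amplitude sinusoidal strain: z saturates near the
# theoretical bound (A / (beta + gamma))**(1/n) = 1, tracing a hysteresis loop.
strains = [3.0 * math.sin(0.001 * i) for i in range(12567)]  # ~2 cycles
tau = bouc_wen_stress(strains)
```

    Plotting `tau` against `strains` shows the characteristic pinched loop; the post-yield slope is controlled by `alpha` and the yield level by `A`, `beta`, `gamma` and `n`.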

  11. Earthquake geology of the Bulnay Fault (Mongolia)

    Science.gov (United States)

    Rizza, Magali; Ritz, Jean-Franciois; Prentice, Carol S.; Vassallo, Ricardo; Braucher, Regis; Larroque, Christophe; Arzhannikova, A.; Arzhanikov, S.; Mahan, Shannon; Massault, M.; Michelot, J-L.; Todbileg, M.

    2015-01-01

    The Bulnay earthquake of July 23, 1905 (Mw 8.3-8.5), in north-central Mongolia, is one of the world's largest recorded intracontinental earthquakes and one of four great earthquakes that occurred in the region during the 20th century. The 375-km-long surface rupture of the left-lateral, strike-slip, N095°E-trending Bulnay Fault associated with this earthquake is remarkable for its pronounced expression across the landscape and for the size of features produced by previous earthquakes. Our field observations suggest that in many areas the width and geometry of the rupture zone are the result of repeated earthquakes; however, in those areas where it is possible to determine that the geomorphic features are the result of the 1905 surface rupture alone, the size of the features produced by this single earthquake is singular in comparison to most other historical strike-slip surface ruptures worldwide. Along the 80-km stretch between 97.18°E and 98.33°E, the fault zone is several meters wide, and the mean left-lateral 1905 offset is 8.9 ± 0.6 m, with two measured cumulative offsets that are twice the 1905 slip. These observations suggest that the displacement produced during the penultimate event was similar to the 1905 slip. Morphotectonic analyses carried out at three sites along the eastern part of the Bulnay fault allow us to estimate a mean horizontal slip rate of 3.1 ± 1.7 mm/yr over the Late Pleistocene-Holocene period. In parallel, paleoseismological investigations show evidence for two earthquakes prior to the 1905 event, with recurrence intervals of ~2700-4000 years.
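
    As a quick consistency check on the numbers quoted above, dividing the characteristic slip by the slip-rate range should bracket the paleoseismic recurrence interval (illustrative arithmetic only):

```python
# Characteristic slip / slip rate should bracket the recurrence interval.
slip_m = 8.9                               # mean 1905 left-lateral offset, m
rate_mm_yr = (3.1 - 1.7, 3.1 + 1.7)        # slip-rate bounds, mm/yr
recurrence_yr = tuple(slip_m * 1000.0 / r for r in reversed(rate_mm_yr))
print(recurrence_yr)  # ~ (1854, 6357) yr, bracketing the ~2700-4000 yr estimate
```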

  12. The Christchurch earthquake stroke incidence study.

    Science.gov (United States)

    Wu, Teddy Y; Cheung, Jeanette; Cole, David; Fink, John N

    2014-03-01

    We examined the impact of major earthquakes on acute stroke admissions by a retrospective review of stroke admissions in the 6 weeks following the 4 September 2010 and 22 February 2011 earthquakes. The control period was the corresponding 6 weeks in the previous year. In the 6 weeks following the September 2010 earthquake there were 97 acute stroke admissions, with 79 (81.4%) ischaemic infarctions. This was similar to the 2009 control period, which had 104 acute stroke admissions, of whom 80 (76.9%) had ischaemic infarction. In the 6 weeks following the February 2011 earthquake, there were 71 stroke admissions, and 61 (79.2%) were ischaemic infarction. This was less than the 96 strokes (72 [75%] ischaemic infarction) in the corresponding control period. None of the comparisons were statistically significant. There was also no difference in the rate of cardioembolic infarction from atrial fibrillation between the study periods. Patients admitted during the February 2011 earthquake period were less likely to be discharged directly home when compared to the control period (31.2% versus 46.9%, p=0.036). There was no observable trend in the number of weekly stroke admissions between the 2 weeks leading to and 6 weeks following the earthquakes. Our results suggest that severe psychological stress from earthquakes did not influence the subsequent short-term risk of acute stroke, but the severity of the earthquake in February 2011 and associated civil structural damage may have influenced the pattern of discharge for stroke patients.

  13. Collaboratory for the Study of Earthquake Predictability

    Science.gov (United States)

    Schorlemmer, D.; Jordan, T. H.; Zechar, J. D.; Gerstenberger, M. C.; Wiemer, S.; Maechling, P. J.

    2006-12-01

    Earthquake prediction is one of the most difficult problems in physical science and, owing to its societal implications, one of the most controversial. The study of earthquake predictability has been impeded by the lack of an adequate experimental infrastructure---the capability to conduct scientific prediction experiments under rigorous, controlled conditions and evaluate them using accepted criteria specified in advance. To remedy this deficiency, the Southern California Earthquake Center (SCEC) is working with its international partners, which include the European Union (through the Swiss Seismological Service) and New Zealand (through GNS Science), to develop a virtual, distributed laboratory with a cyberinfrastructure adequate to support a global program of research on earthquake predictability. This Collaboratory for the Study of Earthquake Predictability (CSEP) will extend the testing activities of SCEC's Working Group on Regional Earthquake Likelihood Models, from which we will present first results. CSEP will support rigorous procedures for registering prediction experiments on regional and global scales, community-endorsed standards for assessing probability-based and alarm-based predictions, access to authorized data sets and monitoring products from designated natural laboratories, and software to allow researchers to participate in prediction experiments. CSEP will encourage research on earthquake predictability by supporting an environment for scientific prediction experiments that allows the predictive skill of proposed algorithms to be rigorously compared with standardized reference methods and data sets. It will thereby reduce the controversies surrounding earthquake prediction, and it will allow the results of prediction experiments to be communicated to the scientific community, governmental agencies, and the general public in an appropriate research context.

  14. Mexican Earthquakes and Tsunamis Catalog Reviewed

    Science.gov (United States)

    Ramirez-Herrera, M. T.; Castillo-Aja, R.

    2015-12-01

    Today the availability of information on the internet makes online catalogs easy to access by both scholars and the general public. The catalog in the "Significant Earthquake Database", managed by the National Center for Environmental Information (NCEI, formerly NCDC), NOAA, deploys tabular and cartographic data related to the earthquakes and tsunamis contained in the database. The NCEI catalog is the product of compiling previously existing catalogs, historical sources, newspapers, and scientific articles. Because the NCEI catalog has global coverage, the information is not homogeneous. The existence of historical information depends on the presence of people in the places where a disaster occurred and on the description being preserved in documents and oral tradition. In the case of instrumental data, availability depends on the distribution and quality of seismic stations. Therefore, the availability of information for the first half of the 20th century can be improved by careful analysis of the available information and by searching for and resolving inconsistencies. This study shows the advances we made in upgrading and refining data for the earthquake and tsunami catalog of Mexico from 1500 CE until today, presented as a table and map. Data analysis allowed us to identify the following sources of error in the location of epicenters in existing catalogs:
    • Incorrect coordinate entry
    • Erroneous or mistaken place names
    • Data too general to locate the epicenter, mainly for older earthquakes
    • Inconsistency between earthquake and tsunami occurrence: an earthquake epicenter located too far inland reported as tsunamigenic.
    The process of completing the catalogs depends directly on the availability of information; as new archives are opened for inspection, there are more opportunities to complete the history of large earthquakes and tsunamis in Mexico. Here, we also present new earthquake and
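
    The inconsistency between a reported tsunami and an inland epicenter lends itself to an automated check. A sketch, with hypothetical record fields (`tsunamigenic`, `distance_to_coast_km`), hypothetical event ids, and a hypothetical 50 km threshold:

```python
# Catalog QC sketch: flag events reported as tsunamigenic whose
# epicenter lies implausibly far inland. Fields, ids, and the 50 km
# threshold are illustrative assumptions, not the study's actual data.
def flag_inland_tsunamigenic(events, max_inland_km=50.0):
    """Return events whose tsunami flag conflicts with an inland epicenter."""
    return [e for e in events
            if e["tsunamigenic"] and e["distance_to_coast_km"] > max_inland_km]

catalog = [
    {"id": "event-A", "tsunamigenic": True,  "distance_to_coast_km": 12.0},
    {"id": "event-B", "tsunamigenic": True,  "distance_to_coast_km": 180.0},
    {"id": "event-C", "tsunamigenic": False, "distance_to_coast_km": 300.0},
]
print([e["id"] for e in flag_inland_tsunamigenic(catalog)])  # ['event-B']
```

    Flagged records would then be re-checked against the historical sources rather than corrected automatically.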

  15. Simultaneous estimation of earthquake source parameters and ...

    Indian Academy of Sciences (India)

    moderate-size aftershocks (Mw 2.1–5.1) of the Mw 7.7 2001 Bhuj earthquake. The horizontal- ... claimed a death toll of 20,000 people. This earthquake occurred west of Kachchh, with an epicenter at 24°N, 68° ... for dominance of body waves for R ≤ 100 km ... Bhuj earthquake sequence; J. Asian Earth Sci. 40.

  16. Natural Gas Extraction, Earthquakes and House Prices

    OpenAIRE

    Hans R.A. Koster; Jos N. van Ommeren

    2015-01-01

    The production of natural gas is strongly increasing around the world. Long-run negative external effects of extraction are understudied and often ignored in (social) cost-benefit analyses. One important example is that natural gas extraction leads to soil subsidence and subsequent induced earthquakes that may occur only after a couple of decades. We show that induced earthquakes that are noticeable to residents generate substantial non-monetary economic effects, as measured by their effects o...

  17. Earthquake risk assessment of Alexandria, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes, the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in much the same manner, although the number of casualties in the first scenario (inland dislocation) is twice that of the second (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 Census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 Census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at low seismic risk. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, where 73 % of the expected damage is concentrated. The undertaken analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risk (building damage) is concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.
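
    The absolute numbers implied by the quoted percentages follow from simple arithmetic (an illustrative back-of-envelope, using only figures stated above):

```python
# Totals implied by the first-scenario percentages quoted in the abstract.
constructions = 12.9e6   # total constructions, 2006 Census
population = 4.1e6       # total population, 2006 Census

affected_buildings = 0.0227 * constructions   # 2.27 % of constructions
injuries = 0.0019 * population                # 0.19 % of population
deaths = 0.0001 * population                  # 0.01 % of population
print(round(affected_buildings), round(injuries), round(deaths))
```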

  18. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    Science.gov (United States)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for investigating these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in the northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore the effects of topographic loading and of the viscosity of the middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on neighboring faults or segments, accelerating future earthquakes in the region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults that are far apart may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model's synthetic seismic catalog and paleoseismic data, we analyze the probability of earthquake migration between major faults in the northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in the northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.
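
    The fault-interaction mechanism invoked above is conventionally quantified by the Coulomb failure stress change, ΔCFS = Δτ + μ′·Δσn (tension positive), with positive values loading the receiver fault. A sketch with illustrative numbers and an assumed effective friction, not values from this study:

```python
# Coulomb failure stress change on a receiver fault, in MPa.
# mu_eff and the sample stress changes are illustrative assumptions.
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
    """Positive values bring the receiver fault closer to failure."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# e.g. +0.15 MPa shear change with 0.10 MPa of clamping (compression):
print(coulomb_stress_change(0.15, -0.10))  # ~0.11 MPa, i.e. net loading
```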

  19. Cognitive Hacking and Digital Government: Digital Identity

    OpenAIRE

    Paul Thompson

    2004-01-01

    Recently the National Center for Digital Government held a workshop on "The Virtual Citizen: Identity, Autonomy, and Accountability: A Civic Scenario Exploration of the Role of Identity in On-Line...". Discussions at the workshop focused on five scenarios for future authentication policies with respect to digital identity. The underlying technologies considered for authentication were: biometrics; cryptography, with a focus on digital signatures; secure processing/computation; and reputation syst...

  20. Digital Materialisms: Frameworks for Digital Media Studies

    OpenAIRE

    Casemajor, Nathalie

    2015-01-01

    Since the 1980s, digital materialism has received increasing interest in the field of media studies. Materialism as a theoretical paradigm assumes that all things in the world are tied to physical processes and matter. Yet within digital media studies, the understanding of what should be the core object of a materialist analysis is debated. This paper proposes to untangle some of the principal theoretical propositions that compose the field of digital materialism. It outlines six frameworks t...

  1. Field Imaging Spectroscopy. Applications in Earthquake Geology

    Science.gov (United States)

    Ragona, D.; Minster, B.; Rockwell, T. K.; Fialko, Y.; Jussila, J.; Blom, R.

    2005-12-01

    Field Imaging Spectroscopy in the visible and infrared sections of the spectrum can be used as a technique to assist paleoseismological studies. Submeter-range hyperspectral images of paleoseismic excavations can assist the analysis and interpretation of the earthquake history of a site. They also provide an excellent platform for storage of the stratigraphic and structural information collected from such a site. At present, most field data are collected descriptively; the descriptions are documented on hand-drawn field logs and/or photomosaics constructed from individual photographs. Recently developed portable hyperspectral sensors acquire high-quality spectroscopic information at high spatial resolution (pixel size ~0.5 mm at 50 cm) over frequencies ranging from the visible band to short-wave infrared. The new data collection and interpretation methodology that we are developing (Field Imaging Spectroscopy) greatly enhances the range of information that can be recorded in the field and makes available, for the first time, a tool to quantitatively analyze paleoseismic and stratigraphic information. The reflectance spectra of each sub-millimeter portion of the material are stored in a 3-D matrix (hyperspectral cube) that can be analyzed by visual inspection or by using a large variety of algorithms. The reflectance spectrum is related to the chemical composition and physical properties of the surface; therefore hyperspectral images are capable of revealing subtle changes in texture, composition, and weathering. For paleoseismic studies, we are primarily interested in distinguishing changes between layers at a given site (spectral stratigraphy) rather than the precise composition of the layers, although this is an added benefit. We have experimented with push-broom (panoramic) portable scanners, and acquired data from portions of fault exposures and cores. These images were processed using well-known image-processing algorithms, and the results have been
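
    The hyperspectral cube described above can be sketched as a 3-D array holding one reflectance spectrum per pixel. Shapes, band count, and the correlation-based layer comparison below are illustrative assumptions, not the authors' processing chain:

```python
import numpy as np

# Hyperspectral cube sketch: rows x cols x bands, one reflectance
# spectrum per sub-millimetre pixel. Dimensions are illustrative.
rows, cols, bands = 200, 300, 128
cube = np.random.default_rng(0).random((rows, cols, bands))

spectrum = cube[50, 120, :]     # reflectance spectrum of one pixel
band_image = cube[:, :, 64]     # image at a single wavelength band

# One simple "spectral stratigraphy" operation: correlate every pixel's
# spectrum with a reference spectrum from a known stratigraphic layer.
ref = cube[10, 10, :]
num = ((cube - cube.mean(-1, keepdims=True)) * (ref - ref.mean())).sum(-1)
den = cube.std(-1) * ref.std() * bands
similarity = num / den          # per-pixel Pearson correlation map
print(similarity.shape)         # (200, 300)
```

    High-similarity regions map out pixels spectrally consistent with the reference layer, which is the kind of layer-to-layer contrast the abstract emphasizes.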

  2. Investigating Earthquake-induced Landslides - a Historical Review

    Science.gov (United States)

    Keefer, D. K.; Geological Survey, Us; Park, Menlo; Usa, Ca

    ... extensive to relatively complete inventories of landslides have been prepared for a relatively small number of earthquakes. Through the 1960s and 1970s the best landslide inventories typically were complete only for a central affected area, although the first virtually complete inventory for a large earthquake was prepared for the M 7.6 Guatemala earthquake of 1976. Beginning in 1980, virtually complete landslide inventories have been prepared for several additional earthquakes in California, El Salvador, Japan, Italy, and Taiwan. Most of these used aerial photography in combination with ground field studies, although the studies of the most recent of these events, in Taiwan, have also used satellite imagery, and three of the others (including the two smallest) were compiled largely from ground-based field studies without aerial photography. Since 1989, digital mapping and GIS techniques have come into common use for mapping earthquake-induced landslides, and the use of these techniques has greatly enhanced the level of analysis that can be applied to earthquake-induced landslide occurrence. The first synthesis of data on earthquake-induced landslides, completed in 1984, defined the general characteristics of these landslides, derived relations between landslide occurrence on the one hand and geologic and seismic parameters on the other, and identified the types of hazards associated with them. Since then, additional syntheses of worldwide data (1999) and national data from New Zealand (1997), Greece (2000), and Italy (2000) have provided additional data on landslide characteristics and hazards and have extended, revised, and refined these relations. Recently completed studies have also identified areas with anomalous landslide distributions, have provided data for correlating the occurrence of landslides with a measure of local ground motion, have verified the occasional delayed triggering of landslides as a consequence of seismic shaking, and have identified

  3. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness-deterioration mechanism which simulates the cracking and other damage suffered by the structure

  4. Studies of the subsurface effects of earthquakes

    International Nuclear Information System (INIS)

    Marine, I.W.

    1980-01-01

    As part of the National Terminal Waste Storage Program, the Savannah River Laboratory is conducting a series of studies on the subsurface effects of earthquakes. This report summarizes three subcontracted studies. (1) Earthquake damage to underground facilities: the purpose of this study was to document damage and nondamage caused by earthquakes to tunnels and shallow underground openings; to mines and other deep openings; and to wells, shafts, and other vertical facilities. (2) Earthquake related displacement fields near underground facilities: the study included an analysis of block motion, an analysis of the dependence of displacement on the orientation and distance of joints from the earthquake source, and displacement related to distance and depth near a causative fault as a result of various shapes, depths, and senses of movement on the causative fault. (3) Numerical simulation of earthquake effects on tunnels for generic nuclear waste repositories: the objective of this study was to use numerical modeling to determine under what conditions seismic waves might cause instability of an underground opening or create fracturing that would increase the permeability of the rock mass

  5. Relationship of heat and cold to earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y.

    1980-06-26

    An analysis of 54 earthquakes of magnitude 7 and above, including 13 of magnitude 8 and above, between 780 BC and the present, shows that the vast majority of them fell in the four major cool periods during this time span, or on the boundaries of these periods. Between 1800 and 1876, four periods of earthquake activity in China can be recognized, and these tend to correspond to relatively cold periods over that time span. An analysis of earthquakes of magnitude 6 or above over the period 1951 to 1965 gives the following results: earthquakes in north and southwest China tended to occur when the preceding year had an above-average annual temperature and winter temperature; in the northeast they tended to occur in a year after a year with an above-average winter temperature; in the northwest there was also a connection with a preceding warm winter, but to a less pronounced degree. The few earthquakes in South China seemed to follow cold winters. Both the Tangshan and Yongshan Pass earthquakes were preceded by unusually warm years and relatively high winter temperatures.

  6. Cyclic characteristics of earthquake time histories

    International Nuclear Information System (INIS)

    Hall, J.R. Jr; Shukla, D.K.; Kissenpfennig, J.F.

    1977-01-01

    From an engineering standpoint, an earthquake record may be characterized by a number of parameters, one of which is its 'cyclic characteristics'. The cyclic characteristics are most significant in fatigue analysis of structures and liquefaction analysis of soils where, in addition to the peak motion, cyclic buildup is significant. Whereas the duration, peak amplitude, and response spectra of earthquakes have been studied extensively, the cyclic characteristics of earthquake records have not received equivalent attention. Present procedures to define the cyclic characteristics are generally based upon counting the number of peaks in various amplitude ranges on a record. This paper presents a computer approach which describes a time history by an amplitude envelope and a phase curve. Using Fast Fourier Transform techniques, an earthquake time history is represented as the projection along the x-axis of a rotating vector: the length of the vector is given by the amplitude spectrum, and the angle between the vector and the x-axis by the phase curve. Thus one cycle is completed when the vector makes a full rotation. Based upon Miner's cumulative-damage concept, the computer code automatically combines the cycles of various amplitudes to obtain the equivalent number of cycles of a given amplitude. To illustrate the overall results, the cyclic characteristics of several real and synthetic earthquake time histories have been studied and are presented in the paper, with the conclusion that this procedure provides a physical interpretation of the cyclic characteristics of earthquakes. (Auth.)

  7. Critical behavior in earthquake energy dissipation

    Science.gov (United States)

    Wanliss, James; Muñoz, Víctor; Pastén, Denisse; Toledo, Benjamín; Valdivia, Juan Alejandro

    2017-09-01

    We explore bursty multiscale energy dissipation from earthquakes flanked by latitudes 29° S and 35.5° S, and longitudes 69.501° W and 73.944° W (in the Chilean central zone). Our work compares the predictions of a theory of nonequilibrium phase transitions with nonstandard statistical signatures of earthquake complex scaling behaviors. For temporal scales less than 84 hours, time development of earthquake radiated energy activity follows an algebraic arrangement consistent with estimates from the theory of nonequilibrium phase transitions. There are no characteristic scales for probability distributions of sizes and lifetimes of the activity bursts in the scaling region. The power-law exponents describing the probability distributions suggest that the main energy dissipation takes place due to largest bursts of activity, such as major earthquakes, as opposed to smaller activations which contribute less significantly though they have greater relative occurrence. The results obtained provide statistical evidence that earthquake energy dissipation mechanisms are essentially "scale-free", displaying statistical and dynamical self-similarity. Our results provide some evidence that earthquake radiated energy and directed percolation belong to a similar universality class.
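
    Power-law exponents of the kind described are commonly estimated by maximum likelihood. A sketch using the continuous-case estimator, alpha = 1 + n / Σ ln(s_i/s_min), on synthetic Pareto-distributed burst sizes (not the paper's data or method):

```python
import numpy as np

# Draw synthetic power-law ("burst size") samples by inverse-CDF
# sampling, then recover the exponent with the continuous MLE.
rng = np.random.default_rng(1)
s_min, alpha_true = 1.0, 2.5
u = rng.random(50000)
s = s_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))  # Pareto samples

alpha_hat = 1.0 + len(s) / np.log(s / s_min).sum()
print(alpha_hat)   # close to the true exponent 2.5
```

    For alpha < 2 the mean of the distribution diverges, which is the formal statement of the abstract's point that the largest bursts dominate the dissipation.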

  8. Earthquakes trigger the loss of groundwater biodiversity

    Science.gov (United States)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  9. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    Science.gov (United States)

    Madariaga, R.

    2013-05-01

    The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left over from the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751, but several events in the magnitude 8 range occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north is undergoing a very large increase in seismicity, with numerous clusters that migrate along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730; this is the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions with M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the region has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  10. The 5th July 1930 earthquake at Montilla (S Spain). Use of regionally recorded smoked paper seismograms

    Science.gov (United States)

    Batlló, J.; Stich, D.; Macià, R.; Morales, J.

    2009-04-01

    On the night of 5th July 1930 a damaging earthquake struck the town of Montilla (near Córdoba, S Spain) and its surroundings. The magnitude estimate for this earthquake is M=5, and its epicentral intensity has been evaluated as VIII (MSK). Even though it is an earthquake of moderate size, it is the largest one instrumentally recorded in this region, which makes the event of interest for a better definition of the regional seismicity. For this reason we decided to study its source anew from the analysis of the available contemporary seismograms and related documents. A total of 25 seismograms from 11 seismic stations have been collected and digitized. Processing of some of the records has been difficult because they were obtained from microfilm or contemporary reproductions in journals. Most of them are on smoked paper and recorded at regional distances. This provides a good opportunity to test the limits of using such low-frequency, low-dynamic-range records for the study of regional events. The results are promising: using such regional seismograms, the event has been relocated, its magnitude recalculated (Mw 5.1), and its waveforms inverted to determine the focal mechanism. We present the results of this research and its consequences for the regional seismicity, and we compare them with present-day smaller earthquakes occurring in the same place and with the results obtained for earthquakes of similar size that occurred farther east in 1951.

  11. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an amount of stress appropriate to the relevant magnitude must accumulate before each event. Iceland lies on an extension of the Mid-Atlantic Ridge, where, uniquely, two transform zones run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes and are the only places worldwide where SWS can be routinely monitored. Elsewhere, SWS must be monitored above occasional, temporally active swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect ratios, allowing stress accumulation and stress relaxation to be identified. While monitoring SWS in SW Iceland in 1988, stress accumulation before an impending earthquake was recognised, and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault plane. We suggest this is a successful earthquake stress-forecast; we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  12. Digital multilayer tomography

    International Nuclear Information System (INIS)

    Dueber, C.; Klose, K.J.; Thelen, M.

    1991-01-01

    With digital multilayer tomography, a sequence of projection images is recorded by an image-intensifier television system and stored as digital data during a single linear tomographic run. From this data record, tomograms of the examined body region can be computed later at a digital workstation for any layer thickness, by shifting and superimposing the individual projections. The quality of digital and conventional tomograms is basically comparable. A drawback of digital tomography is its lower spatial resolution (512 x 512 image matrix); its advantages are a lower radiation exposure, a shorter patient examination time, and the facilities of digital image processing (post-processing, archiving, transmission). (orig.) [de

  13. Experience with digital mammography

    Directory of Open Access Journals (Sweden)

    G. P. Korzhenkova

    2011-01-01

    Full Text Available The use of digital techniques in mammography is the final step in completing the digitization of diagnostic imaging. It is assumed that digital mammography will require the same spatial resolution as the high-resolution intensifying screen-film systems used in conventional mammography, and that digital techniques will be limited by the digitizer pixel size in detecting small structures such as microcalcifications. The introduction of digital technologies in mammography entails tight control over the image and assures its high quality.

  14. Logic of the digital

    CERN Document Server

    Evens, Aden

    2015-01-01

    Building a foundational understanding of the digital, Logic of the Digital reveals a unique digital ontology. Beginning from formal and technical characteristics, especially the binary code at the core of all digital technologies, Aden Evens traces the pathways along which the digital domain of abstract logic encounters the material, human world. How does a code using only 0s and 1s give rise to the vast range of applications and information that constitutes a great and growing portion of our world? Evens' analysis shows how any encounter between the actual and the digital must cross an ontolo

  15. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).

  16. Web Services and Other Enhancements at the Northern California Earthquake Data Center

    Science.gov (United States)

    Neuhauser, D. S.; Zuzlewski, S.; Allen, R. M.

    2012-12-01

    The Northern California Earthquake Data Center (NCEDC) provides data archive and distribution services for seismological and geophysical data sets that encompass northern California. The NCEDC is enhancing its ability to deliver rapid information through Web Services. NCEDC Web Services use well-established web server and client protocols and REST software architecture to allow users to easily make queries using web browsers or simple program interfaces and to receive the requested data in real-time rather than through batch or email-based requests. Data are returned to the user in the appropriate format such as XML, RESP, or MiniSEED depending on the service, and are compatible with the equivalent IRIS DMC web services. The NCEDC is currently providing the following Web Services: (1) Station inventory and channel response information delivered in StationXML format, (2) Channel response information delivered in RESP format, (3) Time series availability delivered in text and XML formats, (4) Single channel and bulk data request delivered in MiniSEED format. The NCEDC is also developing a rich Earthquake Catalog Web Service to allow users to query earthquake catalogs based on selection parameters such as time, location or geographic region, magnitude, depth, azimuthal gap, and rms. It will return (in QuakeML format) user-specified results that can include simple earthquake parameters, as well as observations such as phase arrivals, codas, amplitudes, and computed parameters such as first motion mechanisms, moment tensors, and rupture length. The NCEDC will work with both IRIS and the International Federation of Digital Seismograph Networks (FDSN) to define a uniform set of web service specifications that can be implemented by multiple data centers to provide users with a common data interface across data centers. The NCEDC now hosts earthquake catalogs and waveforms from the US Department of Energy (DOE) Enhanced Geothermal Systems (EGS) monitoring networks. These
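
    The REST-style services described above are queried with plain HTTP GET requests whose parameters follow the FDSN web-service conventions. A minimal sketch of building such a station-metadata query in Python; the host, path, and example network/station codes below follow the usual FDSN layout but are assumptions for illustration, not a documented NCEDC endpoint:

    ```python
    # Sketch: building an FDSN-style station-metadata query URL.
    # Endpoint path and parameter names follow the fdsnws-station convention;
    # the exact host/path and the codes "BK"/"CMB" are illustrative assumptions.
    from urllib.parse import urlencode

    def build_station_query(base_url, network, station="*", channel="BH?",
                            starttime=None, endtime=None, fmt="xml"):
        """Return a GET URL requesting station metadata (StationXML or text)."""
        params = {"net": network, "sta": station, "cha": channel, "format": fmt}
        if starttime:
            params["starttime"] = starttime
        if endtime:
            params["endtime"] = endtime
        return base_url + "?" + urlencode(params)

    url = build_station_query("https://service.ncedc.org/fdsnws/station/1/query",
                              network="BK", station="CMB",
                              starttime="2012-01-01", fmt="xml")
    print(url)
    ```

    The same parameter scheme carries over to the other services listed (dataselect for MiniSEED, event for catalogs), which is what makes the cross-data-center uniformity mentioned above practical.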

  17. Automatic Blocked Roads Assessment after Earthquake Using High Resolution Satellite Imagery

    Science.gov (United States)

    Rastiveis, H.; Hosseini-Zirdoo, E.; Eslamizade, F.

    2015-12-01

    In 2010, an earthquake struck the city of Port-au-Prince, Haiti, killing over 300000 people. According to historical data, no comparable earthquake had previously occurred in the area. The unpredictability of earthquakes necessitates comprehensive mitigation efforts to minimize deaths and injuries. Blocked roads, caused by the debris of destroyed buildings, may hamper rescue activities. In this case, a damage map specifying blocked and unblocked roads can be very helpful for a rescue team. In this paper, a novel method is presented for producing a destruction map from a pre-event vector map and post-event high-resolution WorldView-2 satellite imagery. For this purpose, in a pre-processing step, image quality improvement and co-registration of image and map are performed. Then, after extraction of texture descriptors from the post-event image and SVM classification, different terrain types are detected in the image. Finally, based on the classification results, specifically the objects belonging to the "debris" class, damage analysis is performed to estimate the damage percentage; in addition to the area of the objects in the "debris" class, their shape is also considered. This process is applied to every road in the road layer. In this research, a pre-event digital vector map and a post-event high-resolution WorldView-2 image of Port-au-Prince, Haiti's capital, were used to evaluate the proposed method. The algorithm was executed on a 1200×800 m2 subset of the data set, including 60 roads, and all the roads were labelled correctly. Visual examination confirmed the ability of this method to assess damage to urban road networks after an earthquake.
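
    The classification step described above (texture features in, terrain class out, then a blockage percentage per road) can be sketched with a generic SVM. The two-dimensional features and class layout below are synthetic stand-ins, not the paper's actual texture descriptors:

    ```python
    # Sketch of the classification step: texture features -> SVM -> labels ->
    # per-road blockage percentage.  Features here are synthetic stand-ins.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    # Two well-separated synthetic classes: "intact road" vs "debris".
    intact = rng.normal(loc=[0.2, 0.1], scale=0.05, size=(50, 2))
    debris = rng.normal(loc=[0.8, 0.7], scale=0.05, size=(50, 2))
    X = np.vstack([intact, debris])
    y = np.array([0] * 50 + [1] * 50)  # 0 = intact, 1 = debris

    clf = SVC(kernel="rbf").fit(X, y)

    # Classify new feature vectors from one road, then estimate its blockage
    # as the fraction of objects labelled "debris".
    samples = np.array([[0.25, 0.12], [0.78, 0.69], [0.81, 0.72], [0.22, 0.08]])
    labels = clf.predict(samples)
    blocked_pct = 100.0 * labels.mean()
    print(labels, blocked_pct)
    ```

    In the actual method, shape measures of the "debris" objects would also enter the damage score; the sketch keeps only the area fraction for brevity.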

  18. Monitoring Geologic Hazards and Vegetation Recovery in the Wenchuan Earthquake Region Using Aerial Photography

    Directory of Open Access Journals (Sweden)

    Zhenwang Li

    2014-03-01

    Full Text Available On 12 May 2008, the 8.0-magnitude Wenchuan earthquake occurred in Sichuan Province, China, triggering thousands of landslides, debris flows, and barrier lakes and leading to a substantial loss of life and damage to the local environment and infrastructure. This study aimed to monitor the status of geologic hazards and vegetation recovery in the post-earthquake disaster area using high-resolution aerial photography from 2008 to 2011, acquired from the Center for Earth Observation and Digital Earth (CEODE), Chinese Academy of Sciences. The distribution and range of hazards were identified in 15 large, representative geologic hazard areas triggered by the Wenchuan earthquake. After an overlay analysis, the variations of these hazards between successive years were analyzed to characterize geologic hazard development and vegetation recovery. The results showed that in the first year after the Wenchuan earthquake, debris flows occurred frequently and with high intensity. Subsequently, as the source material became less available and the slope structure stabilized, the intensity and frequency of debris flows gradually decreased with time. The development rate of debris flows between 2008 and 2011 was 3% per year. Lithology played a dominant role in the formation of debris flows, and the topography and hazard size in the earthquake-affected area also influenced the debris flow development process. Meanwhile, the overall geologic hazard area decreased by 12% per year, and vegetation recovery on the landslide masses was 15% to 20% per year between 2008 and 2011. The outcomes of this study provide supporting data for ecological recovery as well as for debris flow control and prevention projects in hazard-prone areas.

  19. Radon anomalies prior to earthquakes (2). Atmospheric radon anomaly observed before the Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    Ishikawa, Tetsuo; Tokonami, Shinji; Yasuoka, Yumi; Shinogi, Masaki; Nagahama, Hiroyuki; Omori, Yasutaka; Kawada, Yusuke

    2008-01-01

    Before the 1995 Hyogoken-Nanbu earthquake, various geochemical precursors were observed in the aftershock area: chloride ion concentration, groundwater discharge rate, groundwater radon concentration and so on. Kobe Pharmaceutical University (KPU) is located about 25 km northeast from the epicenter and within the aftershock area. Atmospheric radon concentration had been continuously measured from 1984 at KPU, using a flow-type ionization chamber. The radon concentration data were analyzed using the smoothed residual values which represent the daily minimum of radon concentration with the exclusion of normalized seasonal variation. The radon concentration (smoothed residual values) demonstrated an upward trend about two months before the Hyogoken-Nanbu earthquake. The trend can be well fitted to a log-periodic model related to earthquake fault dynamics. As a result of model fitting, a critical point was calculated to be between 13 and 27 January 1995, which was in good agreement with the occurrence date of earthquake (17 January 1995). The mechanism of radon anomaly before earthquakes is not fully understood. However, it might be possible to detect atmospheric radon anomaly as a precursor before a large earthquake, if (1) the measurement is conducted near the earthquake fault, (2) the monitoring station is located on granite (radon-rich) areas, and (3) the measurement is conducted for more than several years before the earthquake to obtain background data. (author)
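
    The model-fitting step can be illustrated as follows. This sketch fits a simplified critical-point model, a power law in (t_c - t) with the oscillatory log-periodic term omitted for robustness; all parameter values are invented for illustration and are not the paper's data:

    ```python
    # Sketch: fitting residual radon concentrations that accelerate toward a
    # critical time t_c.  The full log-periodic model adds an oscillatory term
    # cos(omega*ln(t_c - t) + phi); it is omitted here to keep the fit robust.
    import numpy as np
    from scipy.optimize import curve_fit

    def critical_model(t, a, b, tc, m):
        """Residual concentration rising as t approaches the critical time tc."""
        return a + b * (tc - t) ** m

    # Synthetic noiseless "observations" with a true critical time at day 60.
    t = np.linspace(0.0, 50.0, 200)
    y = critical_model(t, 10.0, -0.5, 60.0, 0.7)

    popt, _ = curve_fit(
        critical_model, t, y,
        p0=(8.0, -1.0, 65.0, 0.5),
        bounds=([0.0, -5.0, 51.0, 0.1], [20.0, 0.0, 100.0, 1.5]),
    )
    tc_est = popt[2]
    print(round(tc_est, 1))  # estimated critical time (days)
    ```

    In the study, the fitted critical point (13-27 January 1995) bracketed the actual occurrence date of 17 January 1995.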

  20. Protecting your family from earthquakes: The seven steps to earthquake safety

    Science.gov (United States)

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  1. Slope instabilities triggered by the 2011 Lorca earthquake (M{sub w} 5.1): a comparison and revision of hazard assessments of earthquake-triggered landslides in Murcia; Inestabilidades de ladera provocadas por el terremoto de Lorca de 2011 (Mw 5,1): comparacion y revision de estudios de peligrosidad de movimientos de ladera por efecto sismico en Murcia

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez-Peces, M. J.; Garcia-Mayordomo, J.; Martinez-Diaz, J. J.; Tsige, M.

    2012-11-01

    The Lorca basin has been the object of recent research aimed at studying the phenomenon of earthquake induced landslides and their assessment within the context of different seismic scenarios, bearing in mind the influence of soil and topographical amplification effects. Nevertheless, it was not until the Lorca earthquakes of 11 May 2011 that it became possible to adopt a systematic approach to the problem. We provide here an inventory of slope instabilities triggered by the Lorca earthquakes comprising 100 cases, mainly small rock and soil falls (1 to 100 m{sup 3}). The distribution of these instabilities is compared to two different earthquake-triggered landslide hazard maps: one considering the occurrence of the most probable earthquake for a 475-yr return period in the Lorca basin (M{sub w} = 5.0), which was previously published on the basis of a low-resolution digital elevation model (DEM), and a second one matching the occurrence of the M{sub w} = 5.1 2011 Lorca earthquake, which was undertaken using a higher resolution DEM. The most frequent Newmark displacement values related to the slope failures triggered by the 2011 Lorca earthquakes are smaller than 2 cm in both hazard scenarios and coincide with areas where significant soil and topographical seismic amplification effects have occurred.

  2. Developing Dynamic Digital Image Techniques with Continuous Parameters to Detect Structural Damage

    Directory of Open Access Journals (Sweden)

    Ming-Hsiang Shih

    2013-01-01

    Full Text Available Several strong earthquakes have occurred at various locations worldwide, most notably the devastating tsunami disasters caused by the earthquakes in Indonesia and Japan. If the characteristics of structures are well understood and new technology is implemented accordingly, the damage caused by most natural disasters can be significantly alleviated. In this research, a dynamic digital image correlation method with continuous parameters is applied to develop a low-cost digital image correlation coefficient method using advanced digital cameras and high-speed computers. An experimental study on a cantilever test specimen with controlled defects confirms that the vibration modes calculated using the proposed method clearly reveal the defect locations. Combined with the sensitivity of the Inter-Story Drift Mode Shape (IDMS), the proposed method can also indicate the degree of damage in a damaged structure. The test and analysis results indicate that the proposed method is accurate enough for real-time online monitoring of structures.
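
    At the core of any digital image correlation method is a similarity measure between an image subset before and after deformation. A minimal sketch using the zero-normalized cross-correlation (ZNCC) coefficient to recover an integer pixel displacement; the subset size and the wrap-around shift are artificial test conditions, not the paper's setup:

    ```python
    # Sketch of the core of digital image correlation: the zero-normalized
    # cross-correlation (ZNCC) between a reference subset and candidate
    # subsets in the deformed image.  ZNCC = 1.0 means a perfect match.
    import numpy as np

    def zncc(f, g):
        """Zero-normalized cross-correlation between two equal-size patches."""
        f = f - f.mean()
        g = g - g.mean()
        denom = np.sqrt((f ** 2).sum() * (g ** 2).sum())
        return float((f * g).sum() / denom)

    rng = np.random.default_rng(1)
    ref = rng.random((21, 21))               # reference subset
    shifted = np.roll(ref, 3, axis=1)        # same pattern displaced by 3 px

    # Search integer displacements; keep the one with the highest ZNCC.
    best = max(range(-5, 6), key=lambda d: zncc(ref, np.roll(shifted, -d, axis=1)))
    print(best)  # recovered displacement
    ```

    Practical DIC refines such an integer match to sub-pixel accuracy with interpolation, which is how full-field displacement and mode shapes are obtained.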

  3. Soil structure interactions of eastern U.S. type earthquakes

    International Nuclear Information System (INIS)

    Chang Chen; Serhan, S.

    1991-01-01

    Two types of earthquakes have occurred in the eastern US in the past. One is the infrequent major event, such as the 1811-1812 New Madrid earthquakes or the 1886 Charleston earthquake. The other is the frequent shallow earthquake with high frequency content, short duration and high accelerations. Two eastern US nuclear power plants, V.C. Summer and Perry, went through extensive licensing efforts to obtain fuel-load licenses after earthquakes of this type were recorded on site and exceeded the design bases in the region beyond 10 hertz. This paper discusses the soil-structure interactions of the latter type of earthquake

  4. Earthquakes - a danger to deep-lying repositories?

    International Nuclear Information System (INIS)

    2012-03-01

    This booklet issued by the Swiss National Cooperative for the Disposal of Radioactive Waste NAGRA takes a look at geological factors concerning earthquakes and the safety of deep-lying repositories for nuclear waste. The geological processes involved in the occurrence of earthquakes are briefly looked at and the definitions for magnitude and intensity of earthquakes are discussed. Examples of damage caused by earthquakes are given. The earthquake situation in Switzerland is looked at and the effects of earthquakes on sub-surface structures and deep-lying repositories are discussed. Finally, the ideas proposed for deep-lying geological repositories for nuclear wastes are discussed

  5. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic setting of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to arrive at a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined in the context of the seismic design of nuclear power plants vis-à-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines to a great extent the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  6. Consideration for standard earthquake vibration (1). The Niigataken Chuetsu-oki Earthquake in 2007

    International Nuclear Information System (INIS)

    Ishibashi, Katsuhiko

    2007-01-01

    The new guideline for the seismic design standard of nuclear power plants and its standard earthquake vibration are outlined. The improvements in the new guideline are discussed in the light of the Kashiwazaki-Kariwa Nuclear Power Plant incident, and its fundamental limitations are pointed out. The positioning of the seismic design standard for nuclear power plants, JEAG4601 of the Japan Electric Association, the new guideline and its standard earthquake vibration, the Niigataken Chuetsu-oki Earthquake in 2007, and the damage to the Kashiwazaki-Kariwa Nuclear Power Plant are discussed. The safety review system, organization, standards and guidelines should be improved on the basis of this earthquake and the nuclear plant accident. The general principle that a nuclear power plant should not be constructed in an area where a large earthquake is expected has to be observed, and it should be a precondition that no nuclear power plant causes damage of any kind. (S.Y.)

  7. Earthquake response of inelastic structures

    International Nuclear Information System (INIS)

    Parulekar, Y.M.; Vaity, K.N.; Reddy, .R.; Vaze, K.K.; Kushwaha, H.S.

    2004-01-01

    The most commonly used method in the seismic analysis of structures is the response spectrum method. For the seismic re-evaluation of existing facilities, the elastic response spectrum method cannot be used directly, as large deformation beyond yield may be observed under the Safe Shutdown Earthquake (SSE). The plastic deformation, i.e. the hysteretic behaviour of the various elements of the structure, causes dissipation of energy. Hence the damping values given by the code, which do not account for hysteretic energy dissipation, cannot be used directly. In this paper, appropriate damping values are evaluated for 5-storey, 10-storey and 15-storey shear-beam structures that deform beyond their yield limit. Linear elastic analysis is performed for the same structures using these damping values, and the storey forces are compared with those obtained using inelastic time-history analysis. A damping model, which relates the ductility of the structure to damping, is developed. Using this damping model, a practical structure is analysed; the results are compared with inelastic time-history analysis and the comparison is found to be good
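
    The idea of tying damping to ductility can be illustrated with the classical equivalent-viscous-damping estimate for an elastic-perfectly-plastic hysteresis loop. This textbook (Jacobsen-type) relation is a stand-in for illustration, not the specific damping model developed in the paper:

    ```python
    # Sketch: classical equivalent viscous damping for an elastic-perfectly-
    # plastic hysteresis loop, zeta_eq = zeta_el + (2/pi)*(1 - 1/mu).
    # This is the textbook relation, NOT the paper's own damping model.
    import math

    def equivalent_damping(mu, zeta_elastic=0.05):
        """Equivalent viscous damping ratio for ductility demand mu >= 1."""
        if mu < 1.0:
            raise ValueError("ductility demand must be >= 1")
        hysteretic = (2.0 / math.pi) * (1.0 - 1.0 / mu)
        return zeta_elastic + hysteretic

    for mu in (1.0, 2.0, 4.0):
        print(mu, round(equivalent_damping(mu), 3))
    ```

    At mu = 1 (no yielding) the expression reduces to the elastic damping alone, and it grows with ductility, which is the qualitative behaviour the paper's model captures and calibrates against inelastic time-history results.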

  8. Presentation and analysis of a worldwide database of earthquake-induced landslide inventories

    Science.gov (United States)

    Tanyas, Hakan; van Westen, Cees J.; Allstadt, Kate E.; Nowicki Jessee, M. Anna; Gorum, Tolga; Jibson, Randall W.; Godt, Jonathan W.; Sato, Hiroshi P.; Schmitt, Robert G.; Marc, Odin; Hovius, Niels

    2017-01-01

    Earthquake-induced landslide (EQIL) inventories are essential tools to extend our knowledge of the relationship between earthquakes and the landslides they can trigger. Regrettably, such inventories are difficult to generate and therefore scarce, and the available ones differ in terms of their quality and level of completeness. Moreover, access to existing EQIL inventories is currently difficult because there is no centralized database. To address these issues, we compiled EQIL inventories from around the globe based on an extensive literature study. The database contains information on 363 landslide-triggering earthquakes and includes 66 digital landslide inventories. To make these data openly available, we created a repository to host the digital inventories that we have permission to redistribute through the U.S. Geological Survey ScienceBase platform. It can grow over time as more authors contribute their inventories. We analyze the distribution of EQIL events by time period and location, more specifically breaking down the distribution by continent, country, and mountain region. Additionally, we analyze frequency distributions of EQIL characteristics, such as the approximate area affected by landslides, total number of landslides, maximum distance from fault rupture zone, and distance from epicenter when the fault plane location is unknown. For the available digital EQIL inventories, we examine the underlying characteristics of landslide size, topographic slope, roughness, local relief, distance to streams, peak ground acceleration, peak ground velocity, and Modified Mercalli Intensity. Also, we present an evaluation system to help users assess the suitability of the available inventories for different types of EQIL studies and model development.

  9. Crowd-Sourced Global Earthquake Early Warning

    Science.gov (United States)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
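
    The Kalman-filter fusion of coarse GNSS positions with low-cost accelerometer data can be sketched in one dimension. The noise levels and motion below are illustrative, not the study's actual sensor data:

    ```python
    # Sketch: 1-D Kalman filter propagating state [position, velocity] with a
    # noisy accelerometer and correcting with noisier GPS positions.
    # Noise levels are illustrative stand-ins for consumer-grade sensors.
    import numpy as np

    rng = np.random.default_rng(7)
    dt, n = 0.1, 300
    true_acc = np.full(n, 1.0)                  # constant 1 m/s^2 ground motion
    true_vel = np.cumsum(true_acc) * dt
    true_pos = np.cumsum(true_vel) * dt

    gps = true_pos + rng.normal(0.0, 0.5, n)    # coarse GPS position noise
    acc = true_acc + rng.normal(0.0, 0.05, n)   # accelerometer noise

    F = np.array([[1.0, dt], [0.0, 1.0]])       # state transition
    B = np.array([0.5 * dt**2, dt])             # acceleration (control) input
    H = np.array([[1.0, 0.0]])                  # we observe position only
    Q = 0.05**2 * np.outer(B, B)                # process noise from accel noise
    R = np.array([[0.5**2]])                    # GPS measurement noise

    x, P = np.zeros(2), np.eye(2)
    est = np.empty(n)
    for k in range(n):
        x = F @ x + B * acc[k]                  # predict with accelerometer
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
        x = x + (K @ (gps[k] - H @ x)).ravel()  # correct with GPS
        P = (np.eye(2) - K @ H) @ P
        est[k] = x[0]

    gps_rms = np.sqrt(np.mean((gps - true_pos) ** 2))
    kf_rms = np.sqrt(np.mean((est - true_pos) ** 2))
    print(round(gps_rms, 2), round(kf_rms, 2))
    ```

    The point of the fusion is visible in the two RMS errors: the filtered position is substantially less noisy than the raw GPS track, mirroring the drop in detection threshold described above.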

  10. Using remote sensing to predict earthquake impacts

    Science.gov (United States)

    Fylaktos, Asimakis; Yfantidou, Anastasia

    2017-09-01

    Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of multi-temporal monitoring, a technique suited to observing any surface deformation. The database clusters information on the consequences of the earthquakes into groups such as property and infrastructure damage, regional rifts, cultivation loss, landslides and surface deformation, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and widen the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.

  11. Fractals and Forecasting in Earthquakes and Finance

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Turcotte, D. L.

    2011-12-01

    It is now recognized that Benoit Mandelbrot's fractals play a critical role in describing a vast range of physical and social phenomena. Here we focus on two systems, earthquakes and finance. Since 1942, earthquakes have been characterized by the Gutenberg-Richter magnitude-frequency relation, which in more recent times is often written as a moment-frequency power law. A similar relation can be shown to hold for financial markets. Moreover, a recent New York Times article, titled "A Richter Scale for the Markets" [1] summarized the emerging viewpoint that stock market crashes can be described with similar ideas as large and great earthquakes. The idea that stock market crashes can be related in any way to earthquake phenomena has its roots in Mandelbrot's 1963 work on speculative prices in commodities markets such as cotton [2]. He pointed out that Gaussian statistics did not account for the excessive number of booms and busts that characterize such markets. Here we show that both earthquakes and financial crashes can both be described by a common Landau-Ginzburg-type free energy model, involving the presence of a classical limit of stability, or spinodal. These metastable systems are characterized by fractal statistics near the spinodal. For earthquakes, the independent ("order") parameter is the slip deficit along a fault, whereas for the financial markets, it is financial leverage in place. For financial markets, asset values play the role of a free energy. In both systems, a common set of techniques can be used to compute the probabilities of future earthquakes or crashes. In the case of financial models, the probabilities are closely related to implied volatility, an important component of Black-Scholes models for stock valuations. [2] B. Mandelbrot, The variation of certain speculative prices, J. Business, 36, 294 (1963)
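
    The Gutenberg-Richter magnitude-frequency relation mentioned above, log10 N(>=M) = a - b*M, is commonly fitted with Aki's maximum-likelihood estimator for the b-value. A sketch on synthetic magnitudes; the completeness magnitude and sample size are arbitrary choices for illustration:

    ```python
    # Sketch: maximum-likelihood b-value estimate (Aki, 1965) for the
    # Gutenberg-Richter relation, applied to synthetic magnitudes.
    import numpy as np

    def b_value(mags, m_c):
        """Aki's estimator: b = log10(e) / (mean(M) - m_c), for M >= m_c."""
        m = np.asarray(mags)
        m = m[m >= m_c]
        return np.log10(np.e) / (m.mean() - m_c)

    rng = np.random.default_rng(42)
    b_true, m_c = 1.0, 2.0
    # G-R implies magnitudes above m_c are exponential with rate b*ln(10).
    mags = m_c + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)
    print(round(b_value(mags, m_c), 2))  # close to the true b = 1.0
    ```

    The same power-law fitting machinery is what carries over to the heavy-tailed return distributions of financial markets discussed in the abstract.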

  12. Coastal California Digital Imagery

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This digital ortho-imagery dataset is a survey of coastal California. The project area consists of approximately 3774 square miles. The project design of the digital...

  13. Digital rectal exam

    Science.gov (United States)

A digital rectal exam is an examination of the lower ...

  14. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (M G-R = 7.2; M w = 7.5) produced 1,570 casualties and more than 11,300 injured (90% of the victims in Bucharest), and seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant episode of the 20th century in the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 revealed, for the first time, the spectral content of the long-period seismic motions of Vrancea earthquakes, their duration, number of cycles and actual acceleration values, with important overloading effects upon flexible structures. The seismic coefficients k s , the spectral curve (the dynamic coefficient β r ), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the recurrence of Vrancea earthquakes was reconsidered based on hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismological and structural lessons learned since 1977; - recent aspects of the implementation of the Earthquake Code P.100/1992 and its harmonization with the Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  15. Links Between Earthquake Characteristics and Subducting Plate Heterogeneity in the 2016 Pedernales Ecuador Earthquake Rupture Zone

    Science.gov (United States)

    Bai, L.; Mori, J. J.

    2016-12-01

The collision between the Indian and Eurasian plates formed the Himalayas, the largest orogenic belt on the Earth. The entire region accommodates shallow earthquakes, while intermediate-depth earthquakes are concentrated at the eastern and western Himalayan syntaxis. Here we investigate the focal depths, fault plane solutions, and source rupture process for three earthquake sequences, which are located at the western, central and eastern regions of the Himalayan orogenic belt. The Pamir-Hindu Kush region is located at the western Himalayan syntaxis and is characterized by extreme shortening of the upper crust and strong interaction of various layers of the lithosphere. Many shallow earthquakes occur on the Main Pamir Thrust at focal depths shallower than 20 km, while intermediate-depth earthquakes are mostly located below 75 km. Large intermediate-depth earthquakes occur frequently at the western Himalayan syntaxis, about every 10 years on average. The 2015 Nepal earthquake is located in the central Himalayas. It is a typical megathrust earthquake that occurred on the shallow portion of the Main Himalayan Thrust (MHT). Many of the aftershocks are located above the MHT and illuminate faulting structures in the hanging wall with dip angles that are steeper than the MHT. These observations provide new constraints on the collision and uplift processes for the Himalayan orogenic belt. The Indo-Burma region is located south of the eastern Himalayan syntaxis, where the strike of the plate boundary suddenly changes from nearly east-west at the Himalayas to nearly north-south at the Burma Arc. The Burma arc subduction zone is a typical oblique plate convergence zone. The eastern boundary is the north-south striking dextral Sagaing fault, which hosts many shallow earthquakes with focal depths less than 25 km. In contrast, intermediate-depth earthquakes along the subduction zone reflect east-west trending reverse faulting.

  16. Earthquake damage to underground facilities and earthquake related displacement fields

    International Nuclear Information System (INIS)

    Pratt, H.R.; Stephenson, D.E.; Zandt, G.; Bouchon, M.; Hustrulid, W.A.

    1982-01-01

The potential seismic risk for an underground facility is considered in the evaluation of its location and design. The possible damage resulting from either large-scale displacements or high accelerations should be considered in evaluating potential sites of underground facilities. Scattered through the available literature are statements to the effect that below a few hundred meters shaking and damage in mines are less than at the surface; however, data for decreased damage underground have not been completely reported or explained. In order to assess the seismic risk for an underground facility, a data base was established and analyzed to evaluate the potential for seismic disturbance. Substantial damage to underground facilities is usually the result of displacements, primarily along pre-existing faults and fractures, or at the surface entrance to these facilities. Evidence of this comes from observations of both earthquakes and underground explosions; accordingly, the variation of damage as a function of depth is important in the evaluation of the hazard to underground facilities. To evaluate potential displacements due to seismic effects of block motions along pre-existing or induced fractures, the displacement fields surrounding two types of faults were investigated. Analytical models were used to determine relative displacements of shafts and near-surface displacement of large rock masses. Numerical methods were used to determine the displacement fields associated with pure strike-slip and vertical normal faults. Results are presented as displacements for various fault lengths as a function of depth and distance. This provides input to determine potential displacements in terms of depth and distance for underground facilities, important for assessing potential sites and design parameters.

  17. Digital dannelse til gymnasieeleverne

    DEFF Research Database (Denmark)

    Kaarsted, Thomas; Holch Andersen, Knud

    2012-01-01

The launch of a new think tank is to set out new digital guidelines for the upper secondary schools. The background is a recognition that IT infrastructure and digital technology alone are not enough.

  18. Basic digital signal processing

    CERN Document Server

    Lockhart, Gordon B

    1985-01-01

Basic Digital Signal Processing describes the principles of digital signal processing and experiments with BASIC programs involving the fast Fourier transform (FFT). The book reviews the fundamentals of the BASIC program, continuous and discrete time signals including analog signals, Fourier analysis, the discrete Fourier transform, signal energy, and power. The text also explains digital signal processing involving digital filters, linear time-invariant systems, the discrete-time unit impulse, discrete-time convolution, and the alternative structure for second order infinite impulse response (IIR) sections.
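The discrete Fourier transform the book builds its experiments around can be sketched directly from its definition, X[k] = Σₙ x[n]·e^(−2πi·kn/N). The naive O(N²) version below (in Python rather than the book's BASIC) is for illustration only, not a substitute for the FFT:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform: X[k] = sum_n x[n] * exp(-2j*pi*k*n/N)."""
    n = len(x)
    return [sum(x[m] * cmath.exp(-2j * cmath.pi * k * m / n) for m in range(n))
            for k in range(n)]

# A cosine at exactly one cycle per 8-sample frame concentrates its energy
# in bins 1 and N-1 (the positive- and negative-frequency pair), each with
# magnitude N/2 = 4; the DC bin (k = 0) is zero.
frame = [math.cos(2 * math.pi * m / 8) for m in range(8)]
spectrum = dft(frame)
```

The FFT computes exactly these coefficients, only in O(N log N) operations by recursively splitting the sum into even- and odd-indexed halves.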

  19. Digital asset management.

    Science.gov (United States)

    Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David

    2010-05-01

    Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.
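A DAM workflow of the kind described, a file-naming algorithm plus metadata assignment, might be sketched as below. The naming scheme (patient ID, date, view, sequence) and the field names are illustrative assumptions, not the authors' protocol:

```python
from datetime import date

def image_filename(patient_id, capture_date, view, sequence):
    """Build a sortable, self-describing filename.
    Hypothetical scheme: <patientID>_<YYYYMMDD>_<view>_<seq>.jpg"""
    return f"{patient_id}_{capture_date:%Y%m%d}_{view}_{sequence:03d}.jpg"

def image_record(filename, keywords):
    """Assign searchable metadata (deduplicated keywords) to an image file."""
    return {"file": filename, "keywords": sorted(set(keywords))}

name = image_filename("P0042", date(2010, 5, 1), "frontal", 1)
record = image_record(name, ["rhinoplasty", "preop"])
```

Because the date is zero-padded and placed early in the name, a plain alphabetical sort of the directory doubles as a chronological sort per patient, which is one reason consistent naming algorithms pay off.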

  20. Digital Inkjet Textile Printing

    OpenAIRE

    Wang, Meichun

    2017-01-01

Digital inkjet textile printing is an emerging technology developed with the rise of the digital world. It offers the possibility to print high-resolution images with unlimited color selection on fabrics. Digital inkjet printing brings a revolutionary chance for the textile printing industry. The history of textile printing shows how new technologies replace traditional ways of printing, which indicates that the future of digital inkjet textile printing is relatively positive. Differen...

1. Clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake.

    Science.gov (United States)

    Inatomi, Yuichiro; Nakajima, Makoto; Yonehara, Toshiro; Ando, Yukio

    2017-06-01

To investigate the clinical characteristics of patients with seizure following the 2016 Kumamoto earthquake, we retrospectively studied patients with seizure admitted to our hospital for 12 weeks following the earthquake. We compared the clinical backgrounds and characteristics of the patients before the earthquake (the same period from the previous 3 years) and after it, and between the early (first 2 weeks) and late (subsequent 10 weeks) phases. A total of 60 patients with seizure were admitted to the emergency room after the earthquake, versus 175 (58.3/year) before the earthquake. Of them, 35 patients with seizure were hospitalized in the Department of Neurology after the earthquake, versus 96 (32/year) before the earthquake. After the earthquake, males and non-cerebrovascular diseases as an epileptogenic disease were seen more frequently than before. During the early phase after the earthquake, female, first-attack, and non-focal-type patients were seen more frequently than during the late phase. These characteristics of patients with seizure during the early phase after the earthquake suggest that many patients had non-epileptic seizures. To prevent seizures following earthquakes, the mental stress and physical status of evacuees must be assessed. Copyright © 2017. Published by Elsevier Ltd.

  2. Aftershocks of the India Republic Day Earthquake: the MAEC/ISTAR Temporary Seismograph Network

    Science.gov (United States)

    Bodin, P.; Horton, S.; Johnston, A.; Patterson, G.; Bollwerk, J.; Rydelek, P.; Steiner, G.; McGoldrick, C.; Budhbhatti, K. P.; Shah, R.; Macwan, N.

    2001-05-01

The MW=7.7 Republic Day (26 January, 2001) earthquake in the Kachchh region of western India initiated a strong sequence of small aftershocks. Seventeen days following the mainshock, we deployed a network of portable digital event recorders as a cooperative project of the Mid America Earthquake Center in the US and the Institute for Scientific and Technological Advanced Research. Our network consisted of 8 event-triggered Kinemetrics K2 seismographs with 6 data channels (3 accelerometer, 3 Mark L-28/3d seismometer) sampled at 200 Hz, and one continuously-recording Guralp CMG40TD broad-band seismometer sampled at 220 Hz. This network was in place for 18 days. Underlying our network deployment was the notion that because of its tectonic and geologic setting the Republic Day earthquake and its aftershocks might have source and/or propagation characteristics common to earthquakes in stable continental plate-interiors rather than those on plate boundaries or within continental mobile belts. Thus, our goals were to provide data that could be used to compare the Republic Day earthquake with other earthquakes. In particular, the objectives of our network deployment were: (1) to characterize the spatial distribution and occurrence rates of aftershocks, (2) to examine source characteristics of the aftershocks (stress-drops, focal mechanisms), (3) to study the effect of deep unconsolidated sediment on wave propagation, and (4) to determine if other faults (notably the Allah Bundh) were simultaneously active. Most of our sites were on Jurassic bedrock, and all were either free-field, or on the floor of light structures built on rock or with a thin soil cover. However, one of our stations was on a section of unconsolidated sediments hundreds of meters thick adjacent to a site that was subjected to shaking-induced sediment liquefaction during the mainshock. The largest aftershock reported by global networks was an MW=5.9 event on January 28, prior to our deployment. The largest

  3. Digital Language Death

    Science.gov (United States)

    Kornai, András

    2013-01-01

    Of the approximately 7,000 languages spoken today, some 2,500 are generally considered endangered. Here we argue that this consensus figure vastly underestimates the danger of digital language death, in that less than 5% of all languages can still ascend to the digital realm. We present evidence of a massive die-off caused by the digital divide. PMID:24167559

  4. Digital language death.

    Directory of Open Access Journals (Sweden)

    András Kornai

    Full Text Available Of the approximately 7,000 languages spoken today, some 2,500 are generally considered endangered. Here we argue that this consensus figure vastly underestimates the danger of digital language death, in that less than 5% of all languages can still ascend to the digital realm. We present evidence of a massive die-off caused by the digital divide.

  5. Digital Imaging. Chapter 16

    Energy Technology Data Exchange (ETDEWEB)

    Clunie, D. [CoreLab Partners, Princeton (United States)

    2014-09-15

    The original means of recording X ray images was a photographic plate. Nowadays, all medical imaging modalities provide for digital acquisition, though globally, the use of radiographic film is still widespread. Many modalities are fundamentally digital in that they require image reconstruction from quantified digital signals, such as computed tomography (CT) and magnetic resonance imaging (MRI)

  6. Preparing collections for digitization

    CERN Document Server

    Bulow, Anna E

    2010-01-01

    Most libraries, archives and museums are confronting the challenges of providing digital access to their collections. This guide offers guidance covering the end-to-end process of digitizing collections, from selecting records for digitization to choosing suppliers and equipment and dealing with documents that present individual problems.

  7. Digital voltage discriminator

    International Nuclear Information System (INIS)

    Zhou Zhicheng

    1992-01-01

    A digital voltage discriminator is described, which is synthesized by digital comparator and ADC. The threshold is program controllable with high stability. Digital region of confusion is approximately equal to 1.5 LSB. This discriminator has a single channel analyzer function model with channel width of 1.5 LSB
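The behavior described, a programmable threshold plus a single-channel-analyzer mode whose window width matches the ~1.5 LSB region of confusion, can be modeled in a few lines. The following is a behavioral sketch in Python, not the hardware design:

```python
def discriminates(code, threshold):
    """Threshold mode: fire when the ADC code reaches the programmable
    threshold (both expressed in LSB units)."""
    return code >= threshold

def in_sca_window(code, threshold, width_lsb=1.5):
    """Single-channel-analyzer mode: accept only codes that fall inside
    a window of ~1.5 LSB starting at the threshold."""
    return threshold <= code < threshold + width_lsb

# Pulses digitized at codes 98..150, against a threshold of 100 LSB.
codes = [98, 100, 101, 102, 150]
fired = [c for c in codes if discriminates(c, 100)]      # 100, 101, 102, 150
accepted = [c for c in codes if in_sca_window(c, 100)]   # 100, 101
```

The two modes differ only in the upper bound: the plain discriminator accepts everything above threshold, while the SCA variant counts only pulses within its narrow channel.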

  8. Playtesting the Digital Playground

    DEFF Research Database (Denmark)

    Majgaard, G.; Jessen, Carsten

    2009-01-01

Being able to be absorbed in play in the digital playground is motivating for children who are used to digital computer games. The children can play and exercise outdoors while using the same literacy as in indoor digital games. This paper presents a new playground product where an outdoor playgroun...

  9. Prevention of strong earthquakes: Goal or utopia?

    Science.gov (United States)

    Mukhamediev, Sh. A.

    2010-11-01

    In the present paper, we consider ideas suggesting various kinds of industrial impact on the close-to-failure block of the Earth’s crust in order to break a pending strong earthquake (PSE) into a number of smaller quakes or aseismic slips. Among the published proposals on the prevention of a forthcoming strong earthquake, methods based on water injection and vibro influence merit greater attention as they are based on field observations and the results of laboratory tests. In spite of this, the cited proofs are, for various reasons, insufficient to acknowledge the proposed techniques as highly substantiated; in addition, the physical essence of these methods has still not been fully understood. First, the key concept of the methods, namely, the release of the accumulated stresses (or excessive elastic energy) in the source region of a forthcoming strong earthquake, is open to objection. If we treat an earthquake as a phenomenon of a loss in stability, then, the heterogeneities of the physicomechanical properties and stresses along the existing fault or its future trajectory, rather than the absolute values of stresses, play the most important role. In the present paper, this statement is illustrated by the classical examples of stable and unstable fractures and by the examples of the calculated stress fields, which were realized in the source regions of the tsunamigenic earthquakes of December 26, 2004 near the Sumatra Island and of September 29, 2009 near the Samoa Island. Here, just before the earthquakes, there were no excessive stresses in the source regions. Quite the opposite, the maximum shear stresses τmax were close to their minimum value, compared to τmax in the adjacent territory. In the present paper, we provide quantitative examples that falsify the theory of the prevention of PSE in its current form. It is shown that the measures for the prevention of PSE, even when successful for an already existing fault, can trigger or accelerate a catastrophic

  10. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    Science.gov (United States)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  11. Where was the 1898 Mare Island Earthquake? Insights from the 2014 South Napa Earthquake

    Science.gov (United States)

    Hough, S. E.

    2014-12-01

The 2014 South Napa earthquake provides an opportunity to reconsider the Mare Island earthquake of 31 March 1898, which caused severe damage to buildings at a Navy yard on the island. Reviewing archival accounts of the 1898 earthquake, I estimate a lower intensity magnitude, 5.8, than the value in the current Uniform California Earthquake Rupture Forecast (UCERF) catalog (6.4). However, I note that intensity magnitude can differ from Mw by upwards of half a unit depending on stress drop, which for a historical earthquake is unknowable. In the aftermath of the 2014 earthquake, there has been speculation that apparently severe effects on Mare Island in 1898 were due to the vulnerability of local structures. No surface rupture has ever been identified from the 1898 event, which is commonly associated with the Hayward-Rodgers Creek fault system, some 10 km west of Mare Island (e.g., Parsons et al., 2003). Reconsideration of detailed archival accounts of the 1898 earthquake, together with a comparison of the intensity distributions for the two earthquakes, points to genuinely severe, likely near-field ground motions on Mare Island. The 2014 earthquake did cause significant damage to older brick buildings on Mare Island, but the level of damage does not match the severity of documented damage in 1898. The high-intensity fields for the two earthquakes are, moreover, spatially shifted, with the centroid of the 2014 distribution near the town of Napa and that of the 1898 distribution near Mare Island, east of the Hayward-Rodgers Creek system. I conclude that the 1898 Mare Island earthquake was centered on or near Mare Island, possibly involving rupture of one or both strands of the Franklin fault, a low-slip-rate fault sub-parallel to the Rodgers Creek fault to the west and the West Napa fault to the east. I estimate Mw5.8 assuming an average stress drop; data are also consistent with Mw6.4 if stress drop was a factor of ≈3 lower than average for California earthquakes. I

  12. An algorithm developed in Matlab for the automatic selection of cut-off frequencies, in the correction of strong motion data

    Science.gov (United States)

    Sakkas, Georgios; Sakellariou, Nikolaos

    2018-05-01

Strong motion recordings are the key input in many earthquake engineering applications and are also fundamental for seismic design. The present study focuses on the automated correction of accelerograms, analog and digital. The main feature of the proposed algorithm is the automatic selection of the cut-off frequencies based on a minimum spectral value in a predefined frequency bandwidth, instead of the typical signal-to-noise approach. The algorithm follows the basic steps of the correction procedure (instrument correction, baseline correction and appropriate filtering). Besides the corrected time histories, Peak Ground Acceleration, Peak Ground Velocity and Peak Ground Displacement values are calculated, together with the corrected Fourier spectra and the response spectra. The algorithm is written in the Matlab environment, is fast, and can be used for batch processing or in real-time applications. In addition, it can optionally apply a signal-to-noise ratio criterion and perform causal or acausal filtering. The algorithm has been tested on significant earthquakes of the Greek territory (Kozani-Grevena 1995, Aigio 1995, Athens 1999, Lefkada 2003 and Kefalonia 2014) with analog and digital accelerograms.

  13. Tokai earthquakes and Hamaoka Nuclear Power Station

    International Nuclear Information System (INIS)

    Komura, Hiroo

    1981-01-01

The Kanto district and Shizuoka Prefecture are designated as ''observation strengthening districts'', where the possibility of earthquake occurrence is high. The Hamaoka Nuclear Power Station of Chubu Electric Power Co., Inc. is at the center of this district. Nuclear power stations are vulnerable to earthquakes, and if a nuclear power plant is damaged by an earthquake, the most dreadful accidents may occur. The Chubu Electric Power Co. underestimates the possibility and scale of earthquakes and the estimate of damages, and has maintained that the rock bed of the power station site is strong and that there is no fear of accidents. However, the actual situation is totally different. The description of earthquakes and the rock bed in the application for the installation of the No. 3 plant was totally rewritten after two years of safety examination, and the Ministry of International Trade and Industry approved the application in less than two weeks thereafter. The rock bed is geologically evaluated in this paper, and many doubtful points in the application are pointed out. In addition, there are eight active faults near the power station site. The aseismatic design of the Hamaoka Nuclear Power Station assumes accelerations up to 400 gal, but this may not be enough. The Hamaoka Nuclear Power Station is intentionally neglected in the estimate of damages in Shizuoka Prefecture. (Kako, I.)

  14. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Science's Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts—an early warning system that would give a few seconds warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  15. A Deterministic Approach to Earthquake Prediction

    Directory of Open Access Journals (Sweden)

    Vittorio Sgrigna

    2012-01-01

Full Text Available The paper aims at giving suggestions for a deterministic approach to investigate possible earthquake prediction and warning. A fundamental contribution can come from observations and physical modeling of earthquake precursors, aiming at seeing the earthquake phenomenon in perspective within the framework of a unified theory able to explain the causes of its genesis, and the dynamics, rheology, and microphysics of its preparation, occurrence, postseismic relaxation, and interseismic phases. Studies based on combined ground and space observations of earthquake precursors are essential to address the issue. Unfortunately, what has been lacking up to now is the demonstration of a causal relationship, with explained physical processes, and of a correlation between data gathered simultaneously and continuously by space observations and ground-based measurements. In doing this, modern and/or new methods and technologies have to be adopted to try to solve the problem. Coordinated space- and ground-based observations imply available test sites on the Earth's surface to correlate ground data, collected by appropriate networks of instruments, with space data detected on board Low-Earth-Orbit (LEO) satellites. Moreover, a new, strong theoretical scientific effort is necessary to try to understand the physics of the earthquake.

  16. Roaming earthquakes in China highlight midcontinental hazards

    Science.gov (United States)

    Liu, Mian; Wang, Hui

    2012-11-01

    Before dawn on 28 July 1976, a magnitude (M) 7.8 earthquake struck Tangshan, a Chinese industrial city only 150 kilometers from Beijing (Figure 1a). In a brief moment, the earthquake destroyed the entire city and killed more than 242,000 people [Chen et al., 1988]. More than 30 years have passed, and upon the ruins a new Tangshan city has been built. However, the memory of devastation remains fresh. For this reason, a sequence of recent small earthquakes in the Tangshan region, including an M 4.8 event on 28 May and an M 4.0 event on 18 June 2012, has caused widespread concerns and heated debate in China. In the science community, the debate is whether the recent Tangshan earthquakes are the aftershocks of the 1976 earthquake despite the long gap in time since the main shock or harbingers of a new period of active seismicity in Tangshan and the rest of North China, where seismic activity seems to fluctuate between highs and lows over periods of a few decades [Ma, 1989].

  17. Electrostatically actuated resonant switches for earthquake detection

    KAUST Repository

    Ramini, Abdallah H.

    2013-04-01

The modeling and design of electrostatically actuated resonant switches (EARS) for earthquake and seismic applications are presented. The basic concept is to operate an electrically actuated resonator close to instability bands of frequency, where it is forced to collapse (pull-in) if operated within these bands. By careful tuning, the resonator can be made to enter the instability zone upon the detection of the earthquake signal, thereby pulling in as a switch. Such a switching action can serve useful functions, such as shutting off gas pipelines in the case of earthquakes, or can be used to activate a network of sensors for seismic activity recording in health monitoring applications. By placing a resonator on a printed circuit board (PCB) with a natural frequency close to that of the earthquake's frequency, we show a significant improvement in the detection limit of the EARS, lowering it to less than 60% of that of the EARS by itself without the PCB. © 2013 IEEE.

  18. Earthquake risk assessment of building structures

    International Nuclear Information System (INIS)

    Ellingwood, Bruce R.

    2001-01-01

During the past two decades, probabilistic risk analysis tools have been applied to assess the performance of new and existing building structural systems. Structural design and evaluation of buildings and other facilities with regard to their ability to withstand the effects of earthquakes requires special considerations that are not normally a part of such evaluations for other occupancy, service and environmental loads. This paper reviews some of these special considerations, specifically as they pertain to probability-based codified design and reliability-based condition assessment of existing buildings. Difficulties experienced in implementing probability-based limit states design criteria for earthquakes are summarized. Comparisons of predicted and observed building damage highlight the limitations of using current deterministic approaches for post-earthquake building condition assessment. The importance of inherent randomness and modeling uncertainty in forecasting building performance is examined through a building fragility assessment of a steel frame with welded connections that was damaged during the Northridge Earthquake of 1994. The prospects for future improvements in earthquake-resistant design procedures based on a more rational probability-based treatment of uncertainty are examined.

  19. Metrics for comparing dynamic earthquake rupture simulations

    Science.gov (United States)

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation make it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
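The abstract does not specify the metrics themselves; as an illustration of the general idea, a normalized RMS misfit between two codes' slip-rate time series (a hypothetical metric for this sketch, not necessarily one the authors use) could be computed as follows:

```python
import numpy as np

def rms_misfit(reference, candidate):
    """Normalized RMS misfit between two equally sampled time series.

    Returns 0.0 when the series are identical; larger values indicate
    larger disagreement between the two simulation codes.
    """
    a = np.asarray(reference, dtype=float)
    b = np.asarray(candidate, dtype=float)
    energy = np.sqrt(np.mean(a ** 2))
    if energy == 0.0:
        raise ValueError("reference series has zero energy")
    return np.sqrt(np.mean((a - b) ** 2)) / energy

# synthetic slip-rate histories from two hypothetical rupture codes
t = np.linspace(0.0, 8.0, 801)
code_a = np.exp(-t) * np.sin(2.0 * np.pi * t)
code_b = 1.05 * code_a  # code B with a 5% amplitude bias
print(rms_misfit(code_a, code_b))  # about 0.05
```

In community benchmark exercises, metrics of this kind would typically be evaluated at many receiver locations and combined; the misfit above is only one plausible ingredient.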

  20. Earthquake chemical precursors in groundwater: a review

    Science.gov (United States)

    Paudel, Shukra Raj; Banjara, Sushant Prasad; Wagle, Amrita; Freund, Friedemann T.

    2018-03-01

    We review changes in groundwater chemistry as precursory signs for earthquakes. In particular, we discuss pH, total dissolved solids (TDS), electrical conductivity, and dissolved gases in relation to their significance for earthquake prediction or forecasting. These parameters are widely believed to vary in response to seismic and pre-seismic activity. However, the same parameters also vary in response to non-seismic processes. The inability to reliably distinguish changes caused by seismic or pre-seismic activity from changes caused by non-seismic activity has impeded progress in earthquake science. Short-term earthquake prediction is unlikely to be achieved, however, by pH, TDS, electrical conductivity, and dissolved gas measurements alone. On the other hand, the production of free hydroxyl radicals (•OH) and subsequent reactions, such as the formation of H2O2 and the oxidation of As(III) to As(V) in groundwater, have distinctive precursory characteristics. This study deviates from the prevailing mechanical mantra: it addresses earthquake-related non-seismic mechanisms, focusing on the stress-induced electrification of rocks, the generation of positive hole charge carriers and their long-distance propagation through the rock column, and on electrochemical processes at the rock-water interface.

  1. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  2. A preliminary regional assessment of earthquake-induced landslide susceptibility for Vrancea Seismic Region

    Science.gov (United States)

    Micu, Mihai; Balteanu, Dan; Ionescu, Constantin; Havenith, Hans; Radulian, Mircea; van Westen, Cees; Damen, Michiel; Jurchescu, Marta

    2015-04-01

    ) with head scarps near mountain tops and close to faults is similar to the one of large mass movements for which a seismic origin is proved (such as in the Tien Shan, Pamir, Longmenshan, etc.). Thus, correlations between landslide occurrence and combined seismotectonic and climatic factors are needed to support a regional multi-hazard risk assessment. The purpose of this paper is to harmonize for the first time at a regional scale the landslide predisposing factors and seismotectonic triggers and to present a first qualitative insight into the earthquake-induced landslide susceptibility for the Vrancea Seismic Region in terms of a GIS-based analysis of Newmark displacement (ND). In this way, it aims at better defining spatial and temporal distribution patterns of earthquake-triggered landslides. The Arias Intensity calculation involved in the assessment considers both regional seismic hazard aspects and singular earthquake scenarios (adjusted by topography amplification factors). The known distribution of landslides mapped through digital stereographic interpretation of high-resolution aerial photos is compared with digital active fault maps and the computed ND maps to statistically outline the seismotectonic influence on slope stability in the study area. The importance of this approach resides in two main outputs. The first one, of a fundamental nature, by providing the first regional insight into the seismic landslide triggering framework, allows us to understand whether deep-focus earthquakes may trigger massive slope failures in an area with relatively smooth relief (compared to the high mountain regions in Central Asia, the Himalayas), considering possible geologic and topographic site effects. 
The second one, more applied, will allow better accelerometer instrumentation and monitoring of slopes and will also provide a first correlation of different levels of seismic shaking with precipitation recurrences, an important relationship within a multi-hazard risk
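The Newmark-displacement analysis described above rests on two computations: the Arias intensity of the ground motion, Ia = (π/2g) ∫ a(t)² dt, and the cumulative displacement of a rigid block that slides whenever shaking exceeds a critical acceleration a_c. A minimal sketch with a synthetic accelerogram (the parameter values are illustrative, not the paper's GIS implementation):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def arias_intensity(acc, dt):
    """Arias intensity Ia = (pi / (2 g)) * integral of a(t)^2 dt, in m/s."""
    return math.pi / (2.0 * G) * sum(a * a for a in acc) * dt

def newmark_displacement(acc, dt, a_c):
    """Rigid sliding-block (Newmark) displacement in metres.

    The block starts sliding when ground acceleration exceeds the critical
    acceleration a_c, and keeps sliding until its relative velocity decays
    back to zero.
    """
    v = 0.0  # relative velocity of the block, m/s
    d = 0.0  # accumulated permanent displacement, m
    for a in acc:
        if v > 0.0 or a > a_c:
            v = max(v + (a - a_c) * dt, 0.0)
            d += v * dt
    return d

# synthetic accelerogram: a 1 s pulse at 3 m/s^2 followed by quiescence
dt = 0.01
acc = [3.0] * 100 + [0.0] * 400
print(arias_intensity(acc, dt))
print(newmark_displacement(acc, dt, a_c=1.0))
```

In a regional GIS analysis, a_c would be derived cell-by-cell from slope geometry and material strength, and the ground motion from the seismic scenario under study.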

  3. Damage instability and Earthquake nucleation

    Science.gov (United States)

    Ionescu, I. R.; Gomez, Q.; Campillo, M.; Jia, X.

    2017-12-01

    Earthquake nucleation (initiation) is usually associated with the loss of stability of the geological structure under a slip-weakening friction acting on the fault. The key parameters involved in the stability of the fault are the stress drop and the critical slip distance, but also the elastic stiffness of the surrounding materials (rocks). We want to explore here how the nucleation phenomena are correlated to material softening during damage accumulation by dynamic and/or quasi-static processes. Since damage models describe micro-crack growth, which is generally an unstable phenomenon, it is natural to expect some loss of stability in the associated micro-mechanics-based models. If the model accurately captures the material behavior, then this can be due to the unstable nature of the brittle material itself. We obtained stability criteria at the microscopic scale, which are related to a large class of damage models. We show that for a given continuous strain history the quasi-static or dynamic problems are unstable or ill-posed (multiplicity of material responses) and, whatever selection rule is adopted, shocks (time discontinuities) will occur. We show that the quasi-static equilibria chosen by the "perfect delay convention" are always stable. These stability criteria are used to analyze how the NIC (Non-Interacting Crack) effective elasticity associated with the "self-similar growth" model works in some special configurations (one family of micro-cracks in mode I, II and III, in plane strain or plane stress). In each case we determine a critical crack density parameter and critical micro-crack radius (length) which distinguish between stable and unstable behaviors. This critical crack density depends only on the chosen configuration and on the Poisson ratio.

  4. Calculation of earthquake rupture histories using a hybrid global search algorithm: Application to the 1992 Landers, California, earthquake

    Science.gov (United States)

    Hartzell, S.; Liu, P.

    1996-01-01

    A method is presented for the simultaneous calculation of slip amplitudes and rupture times for a finite fault using a hybrid global search algorithm. The method we use combines simulated annealing with the downhill simplex method to produce a more efficient search algorithm than either of the two constituent parts. This formulation has advantages over traditional iterative or linearized approaches to the problem because it is able to escape local minima in its search through model space for the global optimum. We apply this global search method to the calculation of the rupture history for the Landers, California, earthquake. The rupture is modeled using three separate finite-fault planes to represent the three main fault segments that failed during this earthquake. Both the slip amplitude and the time of slip are calculated for a grid work of subfaults. The data used consist of digital, teleseismic P and SH body waves. Long-period, broadband, and short-period records are utilized to obtain a wideband characterization of the source. The results of the global search inversion are compared with a more traditional linear-least-squares inversion for only slip amplitudes. We use a multi-time-window linear analysis to relax the constraints on rupture time and rise time in the least-squares inversion. Both inversions produce similar slip distributions, although the linear-least-squares solution has a 10% larger moment (7.3 × 10^26 dyne-cm compared with 6.6 × 10^26 dyne-cm). Both inversions fit the data equally well and point out the importance of (1) using a parameterization with sufficient spatial and temporal flexibility to encompass likely complexities in the rupture process, (2) including suitable physically based constraints on the inversion to reduce instabilities in the solution, and (3) focusing on those robust rupture characteristics that rise above the details of the parameterization and data set.
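The hybrid idea above, simulated annealing for global exploration followed by a downhill step for local refinement, can be sketched on a toy two-parameter misfit function. This is an illustrative simplification: the greedy coordinate-descent "polish" below stands in for the full Nelder-Mead downhill simplex, and the misfit is synthetic rather than a waveform inversion.

```python
import math
import random

random.seed(0)

def misfit(x):
    # synthetic misfit surface: a rippled bowl with its minimum at (1, 2)
    u = x[0] - 1.0
    return u * u + (x[1] - 2.0) ** 2 + 0.3 * (1.0 - math.cos(5.0 * u))

def downhill_polish(x, step=0.05, iters=200):
    """Greedy local descent standing in for the downhill simplex stage."""
    x = list(x)
    for _ in range(iters):
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                if misfit(trial) < misfit(x):
                    x = trial
    return x

def hybrid_search(n_iter=2000, t0=1.0):
    """Simulated annealing for exploration, then a local downhill polish."""
    x = [random.uniform(-5.0, 5.0), random.uniform(-5.0, 5.0)]
    best = list(x)
    for k in range(n_iter):
        temp = t0 * (1.0 - k / n_iter) + 1e-6  # linear cooling schedule
        trial = [xi + random.gauss(0.0, temp) for xi in x]
        # accept improvements always; accept worse moves with Boltzmann probability
        if (misfit(trial) < misfit(x)
                or random.random() < math.exp((misfit(x) - misfit(trial)) / temp)):
            x = trial
            if misfit(x) < misfit(best):
                best = list(x)
    return downhill_polish(best)

solution = hybrid_search()
print(solution)  # close to [1.0, 2.0]
```

Replacing the polish stage with an actual Nelder-Mead step (e.g., scipy.optimize.minimize with method="Nelder-Mead") would follow the paper's formulation more closely.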

  5. Bridge seismic retrofit measures considering subduction zone earthquakes.

    Science.gov (United States)

    2015-07-01

    Over the years, earthquakes have exposed the vulnerability of reinforced concrete structures under : seismic loads. The recent occurrence of highly devastating earthquakes near instrumented regions, e.g. 2010 Maule, Chile : and 2011 Tohoku, Japan, ha...

  6. Parent Guidelines for Helping Children After an Earthquake

    Science.gov (United States)

    Parent Guidelines for Helping Children after an Earthquake Being in an earthquake is very frightening, and the days, weeks, and months following are very stressful. Your children and family will recover ...

  7. United States Earthquake Intensity Database, 1638-1985

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The United States Earthquake Intensity Database is a collection of damage and felt reports for over 23,000 U.S. earthquakes from 1638-1985. The majority of...

  8. Tilt Precursors before Earthquakes on the San Andreas Fault, California.

    Science.gov (United States)

    Johnston, M J; Mortensen, C E

    1974-12-13

    An array of 14 biaxial shallow-borehole tiltmeters (at 10^-7 radian sensitivity) has been installed along 85 kilometers of the San Andreas fault during the past year. Earthquake-related changes in tilt have been simultaneously observed on up to four independent instruments. At earthquake distances greater than 10 earthquake source dimensions, there are few clear indications of tilt change. For the four instruments with the longest records (>10 months), 26 earthquakes have occurred since July 1973 with at least one instrument closer than 10 source dimensions and 8 earthquakes with more than one instrument within that distance. Precursors in tilt direction have been observed before more than 10 earthquakes or groups of earthquakes, and no similar effect has yet been seen without the occurrence of an earthquake.

  9. Digital disruption 'syndromes'.

    Science.gov (United States)

    Sullivan, Clair; Staib, Andrew

    2017-05-18

    The digital transformation of hospitals in Australia is occurring rapidly in order to facilitate innovation and improve efficiency. Rapid transformation can cause temporary disruption of hospital workflows and staff as processes are adapted to the new digital workflows. The aim of this paper is to outline various types of digital disruption and some strategies for effective management. A large tertiary university hospital recently underwent a rapid, successful roll-out of an integrated electronic medical record (EMR). We observed this transformation and propose several digital disruption "syndromes" to assist with understanding and management during digital transformation: digital deceleration, digital transparency, digital hypervigilance, data discordance, digital churn and post-digital 'depression'. These 'syndromes' are defined and discussed in detail. Successful management of this temporary digital disruption is important to ensure a successful transition to a digital platform. What is known about this topic? Digital disruption is defined as the changes facilitated by digital technologies that occur at a pace and magnitude that disrupt established ways of value creation, social interactions, doing business and more generally our thinking. Increasing numbers of Australian hospitals are implementing digital solutions to replace traditional paper-based systems for patient care in order to create opportunities for improved care and efficiencies. Such large scale change has the potential to create transient disruption to workflows and staff. Managing this temporary disruption effectively is an important factor in the successful implementation of an EMR. What does this paper add? A large tertiary university hospital recently underwent a successful rapid roll-out of an integrated electronic medical record (EMR) to become Australia's largest digital hospital over a 3-week period. We observed and assisted with the management of several cultural, behavioural and

  10. Ageing and digital games

    DEFF Research Database (Denmark)

    Iversen, Sara Mosberg

    Digital games are still to a great degree considered a medium mainly for young boys. However, available statistics on Western media use show that this is far from the case. Increasingly, people of all ages and genders play digital games, including older adults in their early 60s and beyond. The aim of the book is to examine, analyse and discuss: 1) what older adults do with digital games and what meanings the use of digital games takes on in the everyday life of older adults; 2) how older adults are perceived by society in relation to digital games; 3) how play and games can be used both…

  11. Digital Living at Home

    DEFF Research Database (Denmark)

    Andersen, Pernille Viktoria Kathja; Christiansen, Ellen Tove

    2013-01-01

    Does living with digital technology inevitably lead to digital living? Users talking about a digital home control system, which they have had in their homes for eight years, indicate that there is more to living with digital technology than a functional-operational grip on regulation. Our analysis of these user voices has directed us towards a ‘home-keeping’ design discourse, which opens new horizons for the design of digital home control systems by allowing users to perform as self-determined controllers and groomers of their habitat. The paper concludes by outlining the implications of a ‘home-keeping’ design discourse.

  12. Automatic recognition of seismic intensity based on RS and GIS: a case study in Wenchuan Ms8.0 earthquake of China.

    Science.gov (United States)

    Zhang, Qiuwen; Zhang, Yan; Yang, Xiaohong; Su, Bin

    2014-01-01

    In recent years, earthquakes have occurred frequently all over the world, causing huge casualties and economic losses. It is necessary and urgent to obtain the seismic intensity map quickly so as to determine the distribution of the disaster and provide support for rapid earthquake relief. Compared with traditional methods of drawing seismic intensity maps, which require extensive field investigation in the earthquake area or depend heavily on empirical formulas, spatial information technologies such as Remote Sensing (RS) and Geographical Information System (GIS) provide a fast and economical way to automatically recognize seismic intensity. With the integrated application of RS and GIS, this paper proposes a RS/GIS-based approach for automatic recognition of seismic intensity, in which RS is used to retrieve and extract information on damage caused by the earthquake, and GIS is applied to manage and display the seismic intensity data. A case study of the Wenchuan Ms8.0 earthquake in China shows that information on seismic intensity can be extracted automatically from remotely sensed images soon after earthquake occurrence, and that the Digital Intensity Model (DIM) can be used to visually query and display the distribution of seismic intensity.

  13. On a report that the 2012 M 6.0 earthquake in Italy was predicted after seeing an unusual cloud formation

    Science.gov (United States)

    Thomas, J.N.; Masci, F; Love, Jeffrey J.

    2015-01-01

    Several recently published reports have suggested that semi-stationary linear-cloud formations might be causally precursory to earthquakes. We examine the report of Guangmeng and Jie (2013), who claim to have predicted the 2012 M 6.0 earthquake in the Po Valley of northern Italy after seeing a satellite photograph (a digital image) showing a linear-cloud formation over the eastern Apennine Mountains of central Italy. From inspection of 4 years of satellite images we find numerous examples of linear-cloud formations over Italy. A simple test shows no obvious statistical relationship between the occurrence of these cloud formations and earthquakes that occurred in and around Italy. All of the linear-cloud formations we have identified in satellite images, including that which Guangmeng and Jie (2013) claim to have used to predict the 2012 earthquake, appear to be orographic – formed by the interaction of moisture-laden wind flowing over mountains. Guangmeng and Jie (2013) have not clearly stated how linear-cloud formations can be used to predict the size, location, and time of an earthquake, and they have not published an account of all of their predictions (including any unsuccessful predictions). We are skeptical of the validity of the claim by Guangmeng and Jie (2013) that they have managed to predict any earthquakes.

  14. Automatic Recognition of Seismic Intensity Based on RS and GIS: A Case Study in Wenchuan Ms8.0 Earthquake of China

    Directory of Open Access Journals (Sweden)

    Qiuwen Zhang

    2014-01-01

    In recent years, earthquakes have occurred frequently all over the world, causing huge casualties and economic losses. It is necessary and urgent to obtain the seismic intensity map quickly so as to determine the distribution of the disaster and provide support for rapid earthquake relief. Compared with traditional methods of drawing seismic intensity maps, which require extensive field investigation in the earthquake area or depend heavily on empirical formulas, spatial information technologies such as Remote Sensing (RS) and Geographical Information System (GIS) provide a fast and economical way to automatically recognize seismic intensity. With the integrated application of RS and GIS, this paper proposes a RS/GIS-based approach for automatic recognition of seismic intensity, in which RS is used to retrieve and extract information on damage caused by the earthquake, and GIS is applied to manage and display the seismic intensity data. A case study of the Wenchuan Ms8.0 earthquake in China shows that information on seismic intensity can be extracted automatically from remotely sensed images soon after earthquake occurrence, and that the Digital Intensity Model (DIM) can be used to visually query and display the distribution of seismic intensity.

  15. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
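Published descriptions of PAGER's empirical model express country-specific fatality rates as a lognormal function of shaking intensity. The sketch below illustrates that general structure with made-up θ and β values and a hypothetical exposure table, not PAGER's calibrated country coefficients:

```python
import math

def fatality_rate(intensity, theta, beta):
    """Lognormal fatality-rate curve: Phi(ln(S / theta) / beta), where Phi is
    the standard normal CDF. theta and beta here are illustrative placeholders,
    not calibrated values."""
    z = math.log(intensity / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_fatalities(exposure_by_intensity, theta, beta):
    """Sum, over intensity bins, of exposed population times fatality rate."""
    return sum(population * fatality_rate(intensity, theta, beta)
               for intensity, population in exposure_by_intensity.items())

# hypothetical population exposure at shaking intensities (MMI) 6, 7 and 8
exposure = {6.0: 500_000, 7.0: 120_000, 8.0: 10_000}
print(expected_fatalities(exposure, theta=12.0, beta=0.2))
```

The real system combines such curves with ShakeMap intensity grids and gridded population data, and reports a probability distribution over loss ranges rather than a single number.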

  16. Evidence for strong Holocene earthquake(s) in the Wabash Valley seismic zone

    International Nuclear Information System (INIS)

    Obermeier, S.

    1991-01-01

    Many small and slightly damaging earthquakes have taken place in the region of the lower Wabash River Valley of Indiana and Illinois during the 200 years of historic record. Seismologists have long suspected the Wabash Valley seismic zone to be capable of producing earthquakes much stronger than the largest of record (mb 5.8). The seismic zone contains the poorly defined Wabash Valley fault zone and also appears to contain other vaguely defined faults at depths from which the strongest earthquakes presently originate. Faults near the surface are generally covered with thick alluvium in lowlands and a veneer of loess in uplands, which makes direct observation of faults difficult. Partly because of this difficulty, a search for paleoliquefaction features was begun in 1990. Conclusions of the study are as follows: (1) an earthquake much stronger than any historic earthquake struck the lower Wabash Valley between 1,500 and 7,500 years ago; (2) the epicentral region of the prehistoric strong earthquake was the Wabash Valley seismic zone; (3) apparent sites have been located where 1811-12 earthquake accelerations can be bracketed.

  17. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents. [Figure: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate locally elevated hazard.]
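The Gutenberg-Richter scaling invoked above implies that, above a completeness magnitude m_min, magnitudes are exponentially distributed, so the b-value can be recovered with Aki's maximum-likelihood estimator b = log10(e) / (m̄ − m_min). A sketch with synthetic magnitudes (the catalog here is simulated, not a real one):

```python
import math
import random

random.seed(42)

def sample_gr_magnitudes(n, b, m_min):
    """Draw n magnitudes from a Gutenberg-Richter law truncated at m_min."""
    beta = b * math.log(10.0)  # G-R exponent in natural-log form
    return [m_min - math.log(1.0 - random.random()) / beta for _ in range(n)]

def b_value_mle(magnitudes, m_min):
    """Aki (1965) maximum-likelihood estimate of the G-R b-value."""
    mean_m = sum(magnitudes) / len(magnitudes)
    return math.log10(math.e) / (mean_m - m_min)

catalog = sample_gr_magnitudes(10_000, b=1.0, m_min=2.5)
print(b_value_mle(catalog, 2.5))  # close to the true b-value of 1.0
```

Catalog partitioning of the kind the abstract describes would apply such estimates bin by bin, with m_min chosen from the local completeness of the catalog.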

  18. Impact of the Christchurch earthquakes on hospital staff.

    Science.gov (United States)

    Tovaranonte, Pleayo; Cawood, Tom J

    2013-06-01

    On September 4, 2010 a major earthquake caused widespread damage, but no loss of life, to Christchurch city and surrounding areas. There were numerous aftershocks, including on February 22, 2011 which, in contrast, caused substantial loss of life and major damage to the city. The research aim was to assess how these two earthquakes affected the staff in the General Medicine Department at Christchurch Hospital. Problem: To date there have been no published data assessing the impact of this type of natural disaster on hospital staff in Australasia. A questionnaire that examined seven domains (demographics, personal impact, psychological impact, emotional impact, impact on care for patients, work impact, and coping strategies) was handed out to General Medicine staff and students nine days after the September 2010 earthquake and 14 days after the February 2011 earthquake. Response rates were ≥ 99%. Sixty percent of responders were earthquakes, respectively. A fifth to a third of people had to find an alternative route of transport to get to work but only eight percent to 18% took time off work. Financial impact was more severe following the February earthquake, with 46% reporting damage of >NZ $1,000, compared with 15% following the September earthquake (P earthquake than the September earthquake (42% vs 69%, P earthquake but this rose to 53% after the February earthquake (12/53 vs 45/85, P earthquake but this dropped significantly to 15% following the February earthquake (27/53 vs 13/62, P earthquakes upon General Medicine hospital staff. The effect was widespread with minor financial impact during the first but much more during the second earthquake. Moderate psychological impact was experienced in both earthquakes. This data may be useful to help prepare plans for future natural disasters.

  19. The 2010 Chile Earthquake: Rapid Assessments of Tsunami

    OpenAIRE

    Michelini, A.; Lauciani, V.; Selvaggi, G.; Lomax, A.

    2010-01-01

    After an earthquake underwater, rapid real-time assessment of earthquake parameters is important for emergency response related to infrastructure damage and, perhaps more exigently, for issuing warnings of the possibility of an impending tsunami. Since 2005, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) has worked on the rapid quantification of earthquake magnitude and tsunami potential, especially for the Mediterranean area. This work includes quantification of earthquake size fr...

  20. The Klamath Falls, Oregon, earthquakes on September 20, 1993

    Science.gov (United States)

    Brantley, S.R.

    1993-01-01

    The strongest earthquake to strike Oregon in more than 50 years struck the southern part of the state on September 20, 1993. These shocks, a magnitude 5.9 earthquake at 8:28 pm and a magnitude 6.0 earthquake at 10:45 pm, were the opening salvo in a swarm of earthquakes that continued for more than three months. During this period, several thousand aftershocks, many strong enough to be felt, were recorded by seismographs.