WorldWideScience

Sample records for earthquake engineers release

  1. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on basic design processes important to both non-specialists and engineers, so that readers become suitably well informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers go about designing against, mitigating and remediating these effects. The coverage spans buildings, foundations, underground construction, lifelines and bridges, roads, embankments and slopes. The encycl...

  2. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  3. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  4. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the study of (1) earthquakes; (2) the origin, propagation, and energy of seismic phenomena; (3) the prediction of these phenomena; and (4) the structure of the earth. Earthquake engineering or engineering seismology includes the…

  5. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the 1906 San Francisco and Valparaiso earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the 1908 Messina-Reggio Calabria earthquake, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today, which survived centennial earthquakes, represents a challenge to better understand the limitations of the earthquake design methods currently in use. Of the three centennial earthquakes considered, only the 1906 Valparaiso earthquake has been repeated, as the 1985 Central Chile earthquake (Ms = 7.8). In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study the only three centennial 3-story buildings that survived both earthquakes almost undamaged were identified. Since accelerograms of the 1985 earthquake were recorded both at El Almendral soil conditions and on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended that more suitable instrumental parameters, such as the destructiveness potential factor, be used in the future to describe earthquake demand.

  6. Elastic energy release in great earthquakes and eruptions

    Directory of Open Access Journals (Sweden)

    Agust Gudmundsson

    2014-05-01

    The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = pe(-dVc). This formula can be used as a basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km3), the volume of the feeder-dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials, and the elastic energy is approximately the excess pressure multiplied by the erupted volume. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Columbia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km3, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
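
    As a rough numerical cross-check of the figures quoted in this abstract, the sketch below evaluates the reconstructed relation dU ≈ pe·V (excess pressure times erupted volume) for the stated 5 MPa and ~5000 km3; the variable names and unit conversions are illustrative assumptions, not taken from the paper.

```python
# Order-of-magnitude check, assuming dU ~ p_e * V_erupted (excess pressure
# times erupted volume), with the values quoted in the abstract.
p_e = 5e6                 # typical excess pressure at rupture: 5 MPa, in Pa
V_erupted = 5000 * 1e9    # ~5000 km^3 of erupted material, in m^3

dU = p_e * V_erupted      # elastic energy in joules
print(f"dU ~ {dU:.1e} J = {dU / 1e18:.0f} EJ")   # ~2.5e19 J, i.e. of order 10 EJ

# For comparison, the abstract quotes ~100 ZJ (1e23 J) seismic moment and
# ~10 EJ associated elastic energy for the M9.5 1960 Chile earthquake.
```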

  7. Real-time earthquake monitoring using a search engine method.

    Science.gov (United States)

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
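
    The abstract does not give implementation details, so the following is only a minimal sketch of the underlying idea: match an observed long-period record against a precomputed database of synthetic seismograms and return the source parameters of the best-fitting template. The brute-force correlation search, the data layout and the names below are assumptions for illustration; the actual system relies on a far faster indexed search.

```python
import numpy as np

def best_match(observed, database):
    """Return the source parameters of the template best matching the record.

    observed : 1-D array, the long-period waveform of the new event
    database : list of (template_waveform, source_params) pairs of precomputed
               synthetic seismograms (hypothetical layout, for illustration only)
    """
    obs = (observed - observed.mean()) / (observed.std() + 1e-12)
    best_score, best_params = -np.inf, None
    for template, params in database:
        tpl = (template - template.mean()) / (template.std() + 1e-12)
        n = min(len(obs), len(tpl))
        score = float(np.dot(obs[:n], tpl[:n]) / n)   # normalized correlation
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

# Synthetic demo: random "templates" plus a noisy copy of one of them standing
# in for the observed record; a real database holds physics-based synthetics.
rng = np.random.default_rng(0)
db = [(rng.standard_normal(600), {"Mw": 5.0 + 0.1 * i, "strike": 10 * i})
      for i in range(20)]
obs = db[7][0] + 0.1 * rng.standard_normal(600)
params, score = best_match(obs, db)
print(params, round(score, 3))
```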

  8. Elements of earthquake engineering and structural dynamics. 2. ed.

    International Nuclear Information System (INIS)

    Filiatrault, A.

    2002-01-01

    This book is written for practising engineers, senior undergraduate and junior structural-engineering students, and university educators. Its main goal is to provide basic knowledge to structural engineers who have no previous knowledge about earthquake engineering and structural dynamics. Earthquake engineering is a multidisciplinary science. This book is not limited to structural analysis and design. The basics of other relevant topics (such as geology, seismology, and geotechnical engineering) are also covered to ensure that structural engineers can interact efficiently with other specialists during a construction project in a seismic zone

  9. Modern earthquake engineering offshore and land-based structures

    CERN Document Server

    Jia, Junbo

    2017-01-01

    This book addresses applications of earthquake engineering for both offshore and land-based structures. It is self-contained as a reference work and covers a wide range of topics, including engineering seismology, geotechnical earthquake engineering and structural engineering, as well as special content dedicated to design philosophy, determination of ground motions, shock waves, tsunamis, earthquake damage, seismic response of offshore and arctic structures, spatially varied ground motions, simplified and advanced seismic analysis methods, sudden subsidence of offshore platforms, tank liquid impacts during earthquakes, seismic resistance of non-structural elements, and various types of mitigation measures. The target readership includes professionals in offshore and civil engineering, officials and regulators, as well as researchers and students in this field.

  10. Earthquake engineering development before and after the March 4, 1977, Vrancea, Romania earthquake

    International Nuclear Information System (INIS)

    Georgescu, E.-S.

    2002-01-01

    Twenty-five years after the Vrancea earthquake of March 4, 1977, we can analyze in an open and critical way its impact on the evolution of earthquake engineering codes and protection policies in Romania. The earthquake (M_G-R = 7.2; M_w = 7.5) produced 1,570 casualties and more than 11,300 injured persons (90% of the victims in Bucharest), and seismic losses were estimated at more than USD 2 billion. The 1977 earthquake represented a significant 20th-century episode for the seismic zones of Romania and neighboring countries. The INCERC seismic record of March 4, 1977 revealed, for the first time, the spectral content of long-period seismic motions of Vrancea earthquakes, their duration, the number of cycles and the values of actual accelerations, with important overloading effects upon flexible structures. The seismic coefficients k_s, the spectral curve (the dynamic coefficient β_r), the seismic zonation map and the requirements in the antiseismic design norms were drastically changed, the microzonation maps of the time ceased to be used, and the specific Vrancea earthquake recurrence was reconsidered based on hazard studies. Thus, the paper emphasises: - the existing engineering knowledge, earthquake code and zoning map requirements until 1977, as well as seismology and structural lessons learned since 1977; - recent aspects of implementing the Earthquake Code P.100/1992 and its harmonization with Eurocodes, in conjunction with the specifics of urban and rural seismic risk and enforcement policies on strengthening of existing buildings; - a strategic view of disaster prevention, using earthquake scenarios and loss assessments, insurance, earthquake education and training; - the need for a closer transfer of knowledge between seismologists, engineers and officials in charge of disaster prevention public policies. (author)

  11. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to integrate it into a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined in the context of the seismic design of nuclear power plants vis-a-vis current state-of-the-art techniques. The degree to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines to a great extent the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  12. Basic earthquake engineering from seismology to analysis and design

    CERN Document Server

    Sucuoğlu, Halûk

    2014-01-01

    This book provides senior undergraduate students, master's students and structural engineers who do not have a background in the field with core knowledge of structural earthquake engineering that will be invaluable in their professional lives. The basics of seismotectonics, including the causes, magnitude, and intensity of earthquakes, are first explained. Then the book introduces basic elements of seismic hazard analysis and presents the concept of a seismic hazard map for use in seismic design. Subsequent chapters cover key aspects of the response analysis of simple systems and building structures to earthquake ground motions, the design spectrum, the adoption of seismic analysis procedures in seismic design codes, seismic design principles and the seismic design of reinforced concrete structures. Helpful worked examples on the seismic analysis of linear, nonlinear and base-isolated buildings, and the earthquake-resistant design of frame and frame-shear wall systems are included, most of which can be solved using a hand calcu...

  13. Controlled drug release for tissue engineering.

    Science.gov (United States)

    Rambhia, Kunal J; Ma, Peter X

    2015-12-10

    Tissue engineering is often referred to as a three-pronged discipline, with each prong corresponding to 1) a 3D material matrix (scaffold), 2) drugs that act on molecular signaling, and 3) regenerative living cells. Herein we focus on reviewing advances in controlled release of drugs from tissue engineering platforms. This review addresses advances in hydrogels and porous scaffolds that are synthesized from natural materials and synthetic polymers for the purposes of controlled release in tissue engineering. We pay special attention to efforts to reduce the burst release effect and to provide sustained and long-term release. Finally, novel approaches to controlled release are described, including devices that allow for pulsatile and sequential delivery. In addition to recent advances, limitations of current approaches and areas of further research are discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Introduction: seismology and earthquake engineering in Central and South America.

    Science.gov (United States)

    Espinosa, A.F.

    1983-01-01

    Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author

  15. Engineering Seismic Base Layer for Defining Design Earthquake Motion

    International Nuclear Information System (INIS)

    Yoshida, Nozomu

    2008-01-01

    The engineering common-sense assumption that the incident wave is the same over a widespread area at the engineering seismic base layer is shown not to be correct. An illustrative example is first shown, which indicates that the earthquake motion at the ground surface evaluated by an analysis that considers the ground from the seismic bedrock to the ground surface as a whole (continuous analysis) differs from that obtained by an analysis in which the ground is separated at the engineering seismic base layer and analyzed separately (separate analysis). The reason is investigated by several approaches. Investigation based on the eigenvalue problem indicates that the first predominant period in the continuous analysis cannot be found in the separate analysis, and that the higher-order predominant periods of the upper and lower ground do not match in the separate analysis. The earthquake response analysis indicates that the reflected wave at the engineering seismic base layer is not zero, which indicates that the conventional engineering seismic base layer does not behave as the term ''base'' suggests. All these results indicate that waves that travel down to depth after reflecting in the surface layer and reflect again at the seismic bedrock cannot be neglected in evaluating the response at the ground surface. In other words, interaction between the surface layer and the layers between the seismic bedrock and the engineering seismic base layer cannot be neglected in evaluating the earthquake motion at the ground surface.

  16. Current earthquake engineering practice for Japanese nuclear power plants

    International Nuclear Information System (INIS)

    Hofmayer, C.H.; Park, Y.J.; Costello, J.F.

    1992-01-01

    This paper provides a brief overview of seismic research being conducted in Japan and describes USNRC efforts to understand Japanese seismic practice. Current earthquake engineering practice for Japanese nuclear power plants is described in JEAG 4601-1987, ''Technical Guidelines for Aseismic Design of Nuclear Power Plants.'' The USNRC has sponsored BNL to translate this document into English. Efforts are underway to study and understand JEAG 4601-1987 and to make the translation more readily available in the United States.

  17. Does Modern Ideology of Earthquake Engineering Ensure the Declared Levels of Damage of Structures at Earthquakes?

    International Nuclear Information System (INIS)

    Gabrichidze, G.

    2011-01-01

    The basic position of the modern ideology of earthquake engineering rests on the idea that a structure should be designed so that it suffers almost no damage in the earthquake whose occurrence is most probable in the given area during the lifetime of the structure. This statement is essentially based on so-called Performance-Based Design, the ideology of the 21st century. In the article, attention is focused on the fact that the modern ideology of earthquake engineering assigns structures to a dangerous zone in which their behavior is defined by processes of damage and destruction of materials, which are nonequilibrium processes and demand the application of special refined methods of research. In such conditions, the use of ratios that correspond to static conditions of loading to describe the process of damage of materials appears to be unfounded. The article raises the question of the necessity of working out a new mathematical model of the behavior of materials and structures under rapid, intense impact. (authors)

  18. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and the respective model parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy expected to be released per year (a × 10^20 ergs/year) exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), the 13 April 2016 Myanmar earthquake (M 6.9) and the 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area; this shows that the proposed techniques and models yield good forecasting accuracy.
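
    The catalog values are not reproduced in the abstract, so the sketch below uses synthetic energies purely to illustrate the workflow described: fit the four candidate families by maximum likelihood, compare log-likelihoods, evaluate a conditional exceedance probability, and estimate the waiting time as expected event energy divided by expected annual energy release. The distribution parameters, the energy level E and all data are assumptions, not results from the paper.

```python
import numpy as np
from scipy import stats

# Synthetic stand-ins for the catalog quantities used in the paper (the real
# values come from the Mw >= 6.0 catalog, 1737-2015); units of 1e20 erg.
rng = np.random.default_rng(1)
event_energy = 3.0 * rng.weibull(1.2, size=40)    # energy per earthquake
annual_energy = 0.8 * rng.weibull(1.1, size=60)   # energy released per year

candidates = {
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
    "log-logistic": stats.fisk,       # scipy's name for the log-logistic family
}

def best_fit(sample):
    """Fit each candidate family by maximum likelihood; keep the highest ln L."""
    scored = {}
    for name, dist in candidates.items():
        params = dist.fit(sample, floc=0)                     # location fixed at 0
        scored[name] = (np.sum(dist.logpdf(sample, *params)), dist, params)
    return max(scored.items(), key=lambda kv: kv[1][0])

name_e, (lnL_e, dist_e, par_e) = best_fit(event_energy)
name_a, (lnL_a, dist_a, par_a) = best_fit(annual_energy)

# Conditional exceedance probability P(e > E) for a chosen energy level E, and a
# crude waiting-time forecast: expected event energy / expected annual release.
E = 5.0
print(f"{name_e}: P(e > {E} x 1e20 erg) = {dist_e.sf(E, *par_e):.2f}")
print(f"forecast waiting time ~ {dist_e.mean(*par_e) / dist_a.mean(*par_a):.1f} yr")
```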

  19. The HayWired earthquake scenario—Engineering implications

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2018-04-18

    The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks. Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average east-bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.

  20. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  1. Introduction: seismology and earthquake engineering in Mexico and Central and South America.

    Science.gov (United States)

    Espinosa, A.F.

    1982-01-01

    The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin American seismologists. The article describes the development of seismology in Latin America and the seismological interests of the OAS. -P.N.Chroston

  2. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... nuclear power plant structures, systems, and components important to safety to withstand the effects of...

  3. Engineering aspects of earthquake risk mitigation: Lessons from management of recent earthquakes, and consequential mudflows and landslides

    International Nuclear Information System (INIS)

    1992-01-01

    The Proceedings contain 30 selected presentations given at the Second and Third UNDRO/USSR Training Seminars: Engineering Aspects of Earthquake Risk Assessment and Mitigation of Losses, held in Dushanbe, October 1988; and Lessons from Management of Recent Earthquakes, and Consequential Mudflows and Landslides, held in Moscow, October 1989. The annexes to the document provide information on the participants, the work programme and the resolution adopted at each of the seminars. Refs, figs and tabs

  4. Earthquake engineering and structural dynamics studies at Bhabha Atomic Research Centre

    International Nuclear Information System (INIS)

    Reddy, G.R.; Parulekar, Y.M.; Sharma, A.; Dubey, P.N.; Vaity, K.N.; Kukreja, Mukhesh; Vaze, K.K.; Ghosh, A.K.; Kushwaha, H.S.

    2007-01-01

    Earthquake engineering and structural dynamics have gained the attention of many researchers throughout the world, and extensive research work has been performed. The linear behaviour of structures, systems and components (SSCs) subjected to earthquake/dynamic loading is clearly understood. However, the nonlinear behaviour of SSCs subjected to earthquake/dynamic loading needs to be understood clearly, and design methods need to be validated experimentally. In view of this, three major areas identified for research in earthquake engineering and structural dynamics include: design and development of passive devices to control the seismic/dynamic response of SSCs, nonlinear behaviour of piping systems subjected to earthquake loading, and nonlinear behaviour of RCC structures under seismic excitation or dynamic loading. BARC has performed extensive work in the above-identified areas, and this work is continuing. The work performed is helping to provide a clearer understanding of the nonlinear behaviour of SSCs as well as to develop new schemes, methodologies and devices to control the earthquake response of SSCs. (author)

  5. Building Infrastructure for Preservation and Publication of Earthquake Engineering Research Data

    Directory of Open Access Journals (Sweden)

    Stanislav Pejša

    2014-10-01

    The objective of this paper is to showcase the progress of the earthquake engineering community during a decade-long effort supported by the National Science Foundation in the George E. Brown Jr. Network for Earthquake Engineering Simulation (NEES). During the four years that NEES network operations have been headquartered at Purdue University, the NEEScomm management team has facilitated an unprecedented cultural change in the ways research is performed in earthquake engineering. NEES has not only played a major role in advancing the cyberinfrastructure required for transformative engineering research, but NEES research outcomes are making an impact by contributing to safer structures throughout the USA and abroad. This paper reflects on some of the developments and initiatives that helped instil change in the ways that the earthquake engineering and tsunami community share and reuse data and collaborate in general.

  6. Prevent recurrence of nuclear disaster (3). Agenda on nuclear safety from earthquake engineering

    International Nuclear Information System (INIS)

    Kameda, Hiroyuki; Takada, Tsuyoshi; Ebisawa, Katsumi; Nakamura, Susumu

    2012-01-01

    Based on the results of the activities of the committee on seismic safety of nuclear power plants (NPPs) of the Japan Association for Earthquake Engineering, which started its activities after the Chuetsu-oki earthquake and then experienced the Great East Japan Earthquake (in close collaboration with the corresponding committee of the Atomic Energy Society of Japan, which started its activities simultaneously), and taking account of further development of the underlying concepts, an agenda on nuclear safety is proposed from the standpoint of earthquake engineering. In order to prevent the recurrence of nuclear disaster, individual technical issues of earthquake engineering and comprehensive issues of integration technology, multidisciplinary collaboration and the establishment of technology governance based on them are of prime importance. This article describes important problems to be solved: (1) technical issues and the mission of seismic safety of NPPs, (2) decision making based on risk assessment - the basis of technical governance, (3) the framework of risk, design and regulation - the framework of the required technology governance, (4) technical issues of earthquake engineering for nuclear safety, (5) the role of earthquake engineering in nuclear power risk communication and (6) the importance of multidisciplinary collaboration. The responsibility of engineering lies in the establishment of technology governance, the cultivation of individual and integration technologies, and social communication. (T. Tanaka)

  7. Release, transport and toxicity of engineered nanoparticles.

    Science.gov (United States)

    Soni, Deepika; Naoghare, Pravin K; Saravanadevi, Sivanesan; Pandey, Ram Avatar

    2015-01-01

    Recent developments in nanotechnology have facilitated the synthesis of novel engineered nanoparticles (ENPs) that possess new and different physicochemical properties. These ENPs have been extensively used in various commercial sectors to achieve both social and economic benefits. However, the increasing production and consumption of ENPs by many different industries has raised concerns about their possible release and accumulation in the environment. Released ENPs may either remain suspended in the atmosphere for several years or may accumulate and eventually be modified into other substances. Settled nanoparticles can be easily washed away during rains, and therefore may easily enter the food chain via water and soil. Thus, ENPs can contaminate air, water and soil and can subsequently pose adverse risks to the health of different organisms. Studies to date indicate that ENP transport to and within the ecosystem depends on their chemical and physical properties (viz., size, shape and solubility). Therefore, ENPs display variable behavior in the environment because of their individual properties that affect their tendency for adsorption, absorption, diffusional and colloidal interaction. The transport of ENPs also influences their fate and chemical transformation in ecosystems. The adsorption, absorption and colloidal interaction of ENPs affect their capacity to be degraded or transformed, whereas the tendency of ENPs to agglomerate fosters their sedimentation. How widely ENPs are transported and their environmental fate influence how toxic they may become to environmental organisms. One barrier to fully understanding how ENPs are transformed in the environment, and how best to characterize their toxicity, is related to the nature of their ultrafine structure. Experiments with different animals, plants, and cell lines have revealed that ENPs induce toxicity via several cellular pathways that are linked to the size, shape, surface area

  8. Global life cycle releases of engineered nanomaterials

    International Nuclear Information System (INIS)

    Keller, Arturo A.; McFerran, Suzanne; Lazareva, Anastasiya; Suh, Sangwon

    2013-01-01

    Engineered nanomaterials (ENMs) are now becoming a significant fraction of the material flows in the global economy. We are already reaping the benefits of improved energy efficiency, material use reduction, and better performance in many existing and new applications that have been enabled by these technological advances. As ENMs pervade the global economy, however, it becomes important to understand their environmental implications. As a first step, we combined ENM market information and material flow modeling to produce the first global assessment of the likely ENM emissions to the environment and landfills. The top ten most produced ENMs by mass were analyzed in a dozen major applications. Emissions during the manufacturing, use, and disposal stages were estimated, including intermediate steps through wastewater treatment plants and waste incineration plants. In 2010, silica, titania, alumina, and iron and zinc oxides dominate the ENM market in terms of mass flow through the global economy, used mostly in coatings/paints/pigments, electronics and optics, cosmetics, energy and environmental applications, and as catalysts. We estimate that 63–91 % of over 260,000–309,000 metric tons of global ENM production in 2010 ended up in landfills, with the balance released into soils (8–28 %), water bodies (0.4–7 %), and atmosphere (0.1–1.5 %). While there are considerable uncertainties in the estimates, the framework for estimating emissions can be easily improved as better data become available. The material flow estimates can be used to quantify emissions at the local level, as inputs for fate and transport models to estimate concentrations in different environmental compartments.
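
    To make the quoted shares concrete, the short sketch below converts the reported percentage ranges into tonnage bounds for each compartment; pairing the low production figure with the low share and the high with the high is a simplification for illustration, not a calculation from the paper.

```python
# Rough tonnage bounds implied by the 2010 figures quoted in the abstract:
# 260,000-309,000 t of ENMs produced, split across environmental compartments.
production_t = (260_000, 309_000)          # global ENM production in 2010, metric tons
shares = {                                  # fractional ranges from the abstract
    "landfill":   (0.63, 0.91),
    "soil":       (0.08, 0.28),
    "water":      (0.004, 0.07),
    "atmosphere": (0.001, 0.015),
}
for compartment, (lo, hi) in shares.items():
    print(f"{compartment:>10}: {lo * production_t[0]:>9,.0f} - {hi * production_t[1]:>9,.0f} t")
```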

  9. Structural performance of the DOE's Idaho National Engineering Laboratory during the 1983 Borah Peak Earthquake

    International Nuclear Information System (INIS)

    Guenzler, R.C.; Gorman, V.W.

    1985-01-01

    The 1983 Borah Peak Earthquake (7.3 Richter magnitude) was the largest earthquake ever experienced by the DOE's Idaho National Engineering Laboratory (INEL). Reactor and plant facilities are generally located about 90 to 110 km (60 miles) from the epicenter. Several reactors were operating normally at the time of the earthquake. Based on detailed inspections, comparisons of measured accelerations with design levels, and instrumental seismograph information, it was concluded that the 1983 Borah Peak Earthquake created no safety problems for INEL reactors or other facilities. 10 references, 16 figures, 2 tables

  10. Revolutionising engineering education in the Middle East region to promote earthquake-disaster mitigation

    Science.gov (United States)

    Baytiyeh, Hoda; Naja, Mohamad K.

    2014-09-01

    Due to the high market demands for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enrol on engineering courses through lenient admission policies that do not compromise academic standards. This strategy has generated an influx of students who must be carefully educated to enhance their professional knowledge and social capital to assist in future earthquake-disaster risk-reduction efforts. However, the majority of Middle Eastern engineering students are unaware of the value of their acquired engineering skills and knowledge in building the resilience of their communities to earthquake disasters. As the majority of the countries in the Middle East are exposed to seismic hazards and are vulnerable to destructive earthquakes, engineers have become indispensable assets and the first line of defence against earthquake threats. This article highlights the contributions of engineering innovations in advancing technologies and techniques for effective disaster mitigation, and it calls for the incorporation of earthquake-disaster-mitigation education into academic engineering programmes in the Eastern Mediterranean region.

  11. Regional distribution of released earthquake energy in northern Egypt along with Inahass area

    International Nuclear Information System (INIS)

    El-hemamy, S.T.; Adel, A.A. Othman

    1999-01-01

    A review of the seismic history of Egypt indicates some areas of high activity concentrated along Oligocene-Miocene faults. These areas support the idea of a recent activation of the Oligocene-Miocene stress cycle. There are similarities in the spatial distribution of recent and historical epicenters. From the tectonic map of Egypt, the distribution of intensity and magnitude shows strong activity along the Nile Delta. This is due to the presence of thick layers of recent alluvial sediments. The energy released by earthquakes affects structures. The present study deals with the computed released energies of the reported earthquakes in Egypt and around the Inshas area, and their effect on the urban and nuclear facilities inside the Inshas site is considered. Special consideration is given to the old and new waste repository sites. The application of the determined released energy reveals that the Inshas site is affected by seismic activity from five seismo-tectonic source zones, namely the Red Sea, Nile Delta, El-Faiyum, Mediterranean Sea and Gulf of Aqaba seismo-tectonic zones. The El-Faiyum seismo-tectonic source zone has the maximum effect on the site, with a high released energy reaching 5.4 × 10^21 erg.

  12. A self-referential HOWTO on release engineering

    Energy Technology Data Exchange (ETDEWEB)

    Galassi, Mark C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-31

    Release engineering is a fundamental part of the software development cycle: it is the point at which quality control is exercised and bug fixes are integrated. The way in which software is released also gives the end user her first experience of a software package, while in scientific computing release engineering can guarantee reproducibility. For these reasons and others, the release process is a good indicator of the maturity and organization of a development team. Software teams often do not put in place a release process at the beginning. This is unfortunate because the team does not have early and continuous execution of test suites, and it does not exercise the software in the same conditions as the end users. I describe an approach to release engineering based on the software tools developed and used by the GNU project, together with several specific proposals related to packaging and distribution. I do this in a step-by-step manner, demonstrating how this very paper is written and built using proper release engineering methods. Because many aspects of release engineering are not exercised in the building of the paper, the accompanying software repository also contains examples of software libraries.

  13. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    Science.gov (United States)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking by earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has faced many small and large earthquakes in the past, e.g. the 1934 Bihar-Nepal earthquake (Ms 8.2), the large 1905 Kangra earthquake (Ms 7.8) and the 2015 Gorkha earthquake (Mw 7.8). The Mw 7.9 Gorkha earthquake occurred on and around the Main Himalayan Thrust at a hypocentral depth of 15 km (GEER 2015), followed by a Mw 7.3 aftershock near Kodari, causing 8,700+ deaths and leaving hundreds of thousands homeless. Most of the 3,000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake in the affected areas are distinctive, including topographic effects, liquefaction and land subsidence. More than 5,000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  14. An Ilustrative Nuclide Release Behavior from an HLW Repository due to an Earthquake Event

    International Nuclear Information System (INIS)

    Lee, Youn-Myoung; Hwang, Yong-Soo; Choi, Jong-Won

    2008-01-01

    This work concerns a program for the evaluation of a conceptually modeled high-level waste (HLW) repository. During the last few years, programs developed with the aid of AMBER and GoldSim, by which nuclide transport in the near field and far field of a repository, as well as transport through the biosphere, can be modeled and evaluated under various normal and disruptive release scenarios, have been continuously demonstrated. To show their usability, and similarly to what was done for the natural groundwater flow scheme, the influence of a possible disruptive event caused naturally by an earthquake on the nuclide release behavior from an HLW repository system has been investigated and illustrated with the newly developed GoldSim program.

  15. Estimate of airborne release of plutonium from Babcock and Wilcox plant as a result of severe wind hazard and earthquake

    International Nuclear Information System (INIS)

    Mishima, J.; Schwendiman, L.C.; Ayer, J.E.

    1978-10-01

    As part of an interdisciplinary study to evaluate the potential radiological consequences of wind hazard and earthquake upon existing commercial mixed oxide fuel fabrication plants, the potential mass airborne releases of plutonium (source terms) from such events are estimated. The estimated source terms are based upon the fraction of enclosures damaged to each of three levels of severity (crush, puncture-penetrate, and loss of external filter, in order of decreasing severity), called the damage ratio, and the airborne release that would occur if all enclosures suffered that level of damage. The discussion of damage scenarios and source terms is divided into wind hazard and earthquake scenarios in order of increasing severity. The largest airborne releases from the building were for cases involving the catastrophic collapse of the roof over the major production areas: wind hazard at 110 mph and earthquakes with peak ground accelerations of 0.20 to 0.29 g. Wind hazards at higher air velocities and earthquakes with higher ground accelerations do not result in significantly greater source terms. The source terms were calculated as the additional mass of respirable particles released with time up to 4 days; under these assumptions, approximately 98% of the mass of material of concern is made airborne from 2 h to 4 days after the event. The overall building source terms from the damage scenarios evaluated are shown in a table. The contribution of individual areas to the overall building source term is presented in order of increasing severity for wind hazard and earthquake.
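
    The source-term bookkeeping described above lends itself to a one-line calculation; the sketch below illustrates it with placeholder numbers (the damage ratios and release masses are invented for the example and are not values from the report).

```python
# Minimal sketch of the source-term bookkeeping described above: for each damage
# severity level, multiply the damage ratio (fraction of enclosures damaged to
# that level) by the airborne release that would occur if all enclosures
# suffered that damage. All numbers below are placeholders, not report values.
damage_ratio = {"crush": 0.05, "puncture": 0.15, "filter_loss": 0.40}
release_if_all_damaged_g = {"crush": 120.0, "puncture": 30.0, "filter_loss": 5.0}

source_term_g = sum(damage_ratio[lvl] * release_if_all_damaged_g[lvl]
                    for lvl in damage_ratio)
print(f"building source term ~ {source_term_g:.1f} g of respirable material")
```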

  16. Advancing Integrated STEM Learning through Engineering Design: Sixth-Grade Students' Design and Construction of Earthquake Resistant Buildings

    Science.gov (United States)

    English, Lyn D.; King, Donna; Smeed, Joanna

    2017-01-01

    As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…

  17. Road Surfaces And Earthquake Engineering: A Theoretical And Experimental Study

    International Nuclear Information System (INIS)

    Pratico, Filippo Giammaria

    2008-01-01

    As is well known, road surfaces greatly affect vehicle-road interaction. As a consequence, road surfaces have a paramount influence on road safety and pavement management systems. On the other hand, earthquakes produce deformations able to modify road surface structure, properties and performance. In the light of these facts, the main goal of this paper is confined to the modelling of the road surface before, during and after a seismic event. The fundamentals of road surface texture theory are stated in a general formulation. Models for road profile generation and theoretical properties before, during and after an earthquake are formulated and discussed. Practical applications can be hypothesised in the field of vehicle-road interaction, as a result of road surface texture changes derived from the deformations and accelerations caused by seismic or similar events.

  18. Designing an Earthquake-Proof Art Museum: An Arts- and Engineering-Integrated Science Lesson

    Science.gov (United States)

    Carignan, Anastasia; Hussain, Mahjabeen

    2016-01-01

    In this practical arts-integrated science and engineering lesson, an inquiry-based approach was adopted to teach a class of fourth graders in a Midwest elementary school about the scientific concepts of plate tectonics and earthquakes. Lessons were prepared following the 5E instructional model. Next Generation Science Standards (4-ESS3-2) and the…

  19. Electromagnetic Energy Released in the Subduction (Benioff) Zone in Weeks Previous to Earthquake Occurrence in Central Peru and the Estimation of Earthquake Magnitudes.

    Science.gov (United States)

    Heraud, J. A.; Centa, V. A.; Bleier, T.

    2017-12-01

    During the past four years, magnetometers deployed along the Peruvian coast have been providing evidence that the ULF pulses received are indeed generated at the subduction, or Benioff, zone and are connected with the occurrence of earthquakes within a few kilometers of the source of such pulses. This evidence was presented at the AGU 2015 Fall Meeting, showing the results of triangulation of pulses from two magnetometers located in the central area of Peru, using data collected during a two-year period. Additional work has been done, and the method has now been expanded to provide the instantaneous energy released at the stress areas on the Benioff zone during the precursory stage, before an earthquake occurs. Data collected from several events and in other parts of the country will be shown in a sequential animated form that illustrates the way energy is released in the ULF part of the electromagnetic spectrum. The process has been extended in time and in geographical coverage. Only pulses associated with the occurrence of earthquakes are taken into account, in an area strongly associated with subduction-zone seismic events, and several pulse parameters have been used to estimate a function relating the magnitude of the earthquake to the value of a function generated from those parameters. The results shown, including the animated data video, constitute additional work towards the estimation of the magnitude of an earthquake about to occur, based on electromagnetic pulses that originated at the subduction zone. The method is providing clearer evidence that electromagnetic precursors in effect convey physical and useful information prior to the advent of a seismic event.

  20. Load-Unload Response Ratio and Accelerating Moment/Energy Release Critical Region Scaling and Earthquake Prediction

    Science.gov (United States)

    Yin, X. C.; Mora, P.; Peng, K.; Wang, Y. C.; Weatherley, D.

    The main idea of the Load-Unload Response Ratio (LURR) is that when a system is stable, its response to loading corresponds to its response to unloading, whereas when the system is approaching an unstable state, the response to loading and unloading becomes quite different. High LURR values and observations of Accelerating Moment/Energy Release (AMR/AER) prior to large earthquakes have led different research groups to suggest intermediate-term earthquake prediction is possible and imply that the LURR and AMR/AER observations may have a similar physical origin. To study this possibility, we conducted a retrospective examination of several Australian and Chinese earthquakes with magnitudes ranging from 5.0 to 7.9, including Australia's deadly Newcastle earthquake and the devastating Tangshan earthquake. Both LURR values and best-fit power-law time-to-failure functions were computed using data within a range of distances from the epicenter. Like the best-fit power-law fits in AMR/AER, the LURR value was optimal using data within a certain epicentral distance implying a critical region for LURR. Furthermore, LURR critical region size scales with mainshock magnitude and is similar to the AMR/AER critical region size. These results suggest a common physical origin for both the AMR/AER and LURR observations. Further research may provide clues that yield an understanding of this mechanism and help lead to a solid foundation for intermediate-term earthquake prediction.
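
    As a purely illustrative companion to the definition above, the sketch below computes a LURR-style ratio from a set of event energies and loading/unloading flags; the Benioff-strain exponent, the loading criterion and all numbers are assumptions, not the procedure used in the cited study.

```python
import numpy as np

def lurr(event_energies, is_loading, exponent=0.5):
    """Load-Unload Response Ratio for one time window.

    event_energies : seismic energies of the events in the window
    is_loading     : booleans, True if the event occurred while the (e.g. tidally
                     induced) stress on the fault was increasing
    exponent       : 1/2 gives a Benioff-strain-like measure; illustrative choice
    """
    e = np.asarray(event_energies, dtype=float) ** exponent
    flags = np.asarray(is_loading, dtype=bool)
    loading = e[flags].sum()
    unloading = e[~flags].sum()
    return loading / unloading if unloading > 0 else np.inf

# Demo with made-up numbers: a ratio near 1 suggests a stable regime,
# values well above 1 are interpreted as approaching instability.
energies = [2e10, 5e9, 8e10, 1e9, 3e10, 6e9]
loading_flags = [True, False, True, False, True, True]
print(round(lurr(energies, loading_flags), 2))
```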

  1. Idaho National Engineering Laboratory release criteria for decontamination and decommissioning

    International Nuclear Information System (INIS)

    Dolenc, M.R.; Case, M.J.

    1986-01-01

    Criteria have been developed for release of Idaho National Engineering Laboratory (INEL) facilities and land areas following decontamination and decommissioning (D and D). Decommissioning release criteria in the form of dose guidelines were proposed by the US Nuclear Regulatory Commission as early as 1980. These criteria were used on an interim basis for INEL D and D projects. However, dose guidelines alone do not adequately cover the criteria necessary to release sites for unrestricted use. In actual practice, other parameters such as pathways analyses, sampling and instrumentation techniques, and implementation procedures are required to develop the basis for unrestricted release of a site. Thus, a rigorous approach for evaluating these other parameters is needed to develop acceptable D and D release criteria. Because of the complex and sensitive nature of the dose and pathways analyses work, a thorough review by experts in those respective fields was desired. Input and support in preparing or reviewing each part of the criteria development task was solicited from several DOE field offices. Experts were identified and contracted to assist in preparing portions of the release criteria, or to serve on a peer-review committee. Thus, the entire release criteria development task was thoroughly reviewed by recognized experts from each DOE field office, to validate technical content of the INEL site-specific document

  2. Earthquake-enhanced permeability – evidence from carbon dioxide release following the ML3.5 earthquake in West Bohemia

    Czech Academy of Sciences Publication Activity Database

    Fischer, Tomáš; Matyska, C.; Heinicke, J.

    2017-01-01

    Vol. 460, February (2017), pp. 60-67. ISSN 0012-821X. R&D Projects: GA MŠk LM2010008. Institutional support: RVO:67985530. Keywords: earthquake swarms; fluid triggering; crustal CO2; fault valve. Subject RIV: DC - Seismology, Volcanology, Earth Structure. OECD field: Volcanology. Impact factor: 4.409, year: 2016

  3. Addressing earthquakes strong ground motion issues at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, I.G.; Silva, W.J.; Stark, C.L.; Jackson, S.; Smith, R.P.

    1991-01-01

    In the course of reassessing seismic hazards at the Idaho National Engineering Laboratory (INEL), several key issues have been raised concerning the effects of the earthquake source and site geology on potential strong ground motions that might be generated by a large earthquake. The design earthquake for the INEL is an approximate moment magnitude (Mw) 7 event that may occur on the southern portion of the Lemhi fault, a Basin and Range normal fault that is located on the northwestern boundary of the eastern Snake River Plain and the INEL, within 10 to 27 km of several major facilities. Because the locations of these facilities place them at close distances to a large earthquake and generally along strike of the causative fault, the effects of source rupture dynamics (e.g., directivity) could be critical in enhancing potential ground shaking at the INEL. An additional source issue that has been addressed is the value of stress drop to use in ground motion predictions. In terms of site geology, it has been questioned whether the interbedded volcanic stratigraphy beneath the ESRP and the INEL attenuates ground motions to a greater degree than a typical rock site in the western US. These three issues have been investigated employing a stochastic ground motion methodology which incorporates the Band-Limited-White-Noise source model for both a point source and finite fault, random vibration theory and an equivalent linear approach to model soil response

  4. Earthquakes, Cities, and Lifelines: lessons integrating tectonics, society, and engineering in middle school Earth Science

    Science.gov (United States)

    Toke, N.; Johnson, A.; Nelson, K.

    2010-12-01

    Earthquakes are among the geologic processes most widely covered by the media. As a result, students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Earthquakes therefore represent not only an attractive topic for engaging students when introducing tectonics, but also a means to help students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one-hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions, such as the Alaskan oil pipeline, which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting both topography and one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, or the locations and types of earthquakes. Using these maps and an accompanying explanation of the data, each group's task is to map plate boundary locations. Each group then presents a ~10 minute summary of the type of data they used and their interpretation of the tectonic plates, with a poster and their mapping results. Finally, the instructor facilitates a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent. Throughout the exercise we record student preconceptions

  5. A refined Frequency Domain Decomposition tool for structural modal monitoring in earthquake engineering

    Science.gov (United States)

    Pioldi, Fabio; Rizzi, Egidio

    2017-07-01

    Output-only structural identification is developed by a refined Frequency Domain Decomposition (rFDD) approach, towards assessing the current modal properties of heavily damped buildings (a challenging identification scenario) under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context, use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, demonstrating the ability of the developed algorithm to provide prompt and accurate estimates of all current modal parameters under strong ground motion. At this stage, such an analysis tool may be employed for convenient application in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
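
    As a companion to the abstract above, the following sketch shows the core Frequency Domain Decomposition step in a generic form: a Chebyshev Type II band-pass pre-filter, cross-spectral density estimation, and singular value decomposition at each frequency line. It is not the authors' refined rFDD implementation; the filter order, stop-band attenuation, band limits and peak-picking threshold are illustrative assumptions.

    ```python
    # Generic Frequency Domain Decomposition sketch (not the authors' rFDD code).
    import numpy as np
    from scipy import signal

    def fdd_natural_frequencies(acc, fs, band=(0.2, 20.0), nperseg=2048):
        """acc: array of shape (n_channels, n_samples) with structural responses."""
        sos = signal.cheby2(8, 40.0, band, btype="bandpass", fs=fs, output="sos")
        acc = signal.sosfiltfilt(sos, acc, axis=-1)
        n = acc.shape[0]
        freqs, G = None, None
        for i in range(n):                      # assemble the cross-spectral density matrix G(f)
            for j in range(n):
                freqs, pij = signal.csd(acc[i], acc[j], fs=fs, nperseg=nperseg)
                if G is None:
                    G = np.zeros((n, n, freqs.size), dtype=complex)
                G[i, j] = pij
        s1 = np.array([np.linalg.svd(G[:, :, k], compute_uv=False)[0]
                       for k in range(freqs.size)])          # first singular value per line
        peaks, _ = signal.find_peaks(s1, prominence=0.05 * s1.max())
        return freqs, s1, freqs[peaks]                       # peaks ~ natural frequencies
    ```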

  6. Earthquake strong ground motion studies at the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Wong, Ivan; Silva, W.; Darragh, R.; Stark, C.; Wright, D.; Jackson, S.; Carpenter, G.; Smith, R.; Anderson, D.; Gilbert, H.; Scott, D.

    1989-01-01

    Site-specific strong earthquake ground motions have been estimated for the Idaho National Engineering Laboratory assuming that an event similar to the 1983 Ms 7.3 Borah Peak earthquake occurs at epicentral distances of 10 to 28 km. The strong ground motion parameters have been estimated based on a methodology incorporating the Band-Limited-White-Noise ground motion model coupled with Random Vibration Theory. A 16-station seismic attenuation and site response survey utilizing three-component portable digital seismographs was also performed for a five-month period in 1989. Based on the recordings of regional earthquakes, the effects of seismic attenuation in the shallow crust and along the propagation path and local site response were evaluated. These data, combined with a detailed geologic profile developed for each site based principally on borehole data, were used in the estimation of the strong ground motion parameters. The preliminary peak horizontal ground accelerations for individual sites range from approximately 0.15 to 0.35 g. Based on the authors' analysis, the thick sedimentary interbeds (greater than 20 m) in the basalt section attenuate ground motions as speculated upon in a number of previous studies.

  7. Principles for selecting earthquake motions in engineering design of large dams

    Science.gov (United States)

    Krinitzsky, E.L.; Marcuson, William F.

    1983-01-01

    This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable for other facilities. Because no standard procedure exists for selecting earthquake motions in the engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analysis is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and their spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives best results. Each part of the site investigation requires a number of decisions. In some cases, the decision to use a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful. For example, peak motions at

  8. Seismic ground motion modelling and damage earthquake scenarios: A bridge between seismologists and seismic engineers

    International Nuclear Information System (INIS)

    Panza, G.F.; Romanelli, F.; Vaccari, F. (E-mails: Luis.Decanini@uniroma1.it; Fabrizio.Mollaioli@uniroma1.it)

    2002-07-01

    The input for the seismic risk analysis can be expressed with a description of 'groundshaking scenarios', or with probabilistic maps of perhaps relevant parameters. The probabilistic approach, unavoidably based upon rough assumptions and models (e.g. recurrence and attenuation laws), can be misleading, as it cannot take into account, with satisfactory accuracy, some of the most important aspects like rupture process, directivity and site effects. This is evidenced by the comparison of recent recordings with the values predicted by the probabilistic methods. We prefer a scenario-based, deterministic approach in view of the limited seismological data, of the local irregularity of the occurrence of strong earthquakes, and of the multiscale seismicity model, which is capable of reconciling two apparently conflicting ideas: the Characteristic Earthquake concept and the Self Organized Criticality paradigm. Where the numerical modeling is successfully compared with records, the synthetic seismograms permit microzoning, based upon a set of possible scenario earthquakes. Where no recordings are available, the synthetic signals can be used to estimate the ground motion without having to wait for a strong earthquake to occur (pre-disaster microzonation). In both cases the use of modeling is necessary, since the so-called local site effects can be strongly dependent upon the properties of the seismic source and can be properly defined only by means of envelopes. The joint use of reliable synthetic signals and observations permits the computation of advanced hazard indicators (e.g. damaging potential) that take into account local soil properties. The envelope of synthetic elastic energy spectra reproduces the distribution of the energy demand in the most relevant frequency range for seismic engineering. The synthetic accelerograms can be fruitfully used for design and strengthening of structures, also when innovative techniques, like seismic isolation, are employed. For these

  9. Engineering works for increasing earthquake resistance of Hamaoka nuclear power plant

    International Nuclear Information System (INIS)

    Oonishi, Yoshihiro; Kondou, Makoto; Hattori, Kazushi

    2007-01-01

    The ground improvement works for the outdoor piping and duct system of Hamaoka-3, one of the engineering works carried out to increase the earthquake resistance of the plant, are reported. The movable outdoor piping systems were relocated. The SJ method, a high-pressure jet mixing method, was used to improve the ground between the duct and the unmoved light oil tank on the western side, as well as the surrounding ground. The other locations were improved by concrete replacement works. The ground treated by the SJ method showed good stiffness and continuity. The outline of the engineering works, the execution of the concrete replacement works, the high-pressure jet mixing (SJ) method, quality control, and the treatment of the mud generated by the SJ method are reported. A seismic response analysis, the execution facilities, the construction planning, working diagrams, the improvement work conditions of the three methods, and the steps of the SJ method are illustrated. (S.Y.)

  10. Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering

    Science.gov (United States)

    Cavlazoglu, Baki; Stuessy, Carol

    2018-02-01

    The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.

  11. Estimation of recurrence interval of large earthquakes on the central Longmen Shan fault zone based on seismic moment accumulation/release model.

    Science.gov (United States)

    Ren, Junjie; Zhang, Shimin

    2013-01-01

    The recurrence interval of large earthquakes on an active fault zone is an important parameter in assessing seismic hazard. The 2008 Wenchuan earthquake (Mw 7.9) occurred on the central Longmen Shan fault zone and ruptured the Yingxiu-Beichuan fault (YBF) and the Guanxian-Jiangyou fault (GJF). However, there is a considerable discrepancy between preseismic and postseismic estimates of the recurrence interval of large earthquakes based on slip rate and paleoseismologic results. Post-seismic trenches showed that the central Longmen Shan fault zone probably experienced an event similar to the 2008 quake, suggesting a characteristic earthquake model. In this paper, we use the published seismogenic model of the 2008 earthquake based on Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) data and construct a characteristic seismic moment accumulation/release model to estimate the recurrence interval of large earthquakes on the central Longmen Shan fault zone. Our results show that the seismogenic zone accommodates a moment rate of (2.7 ± 0.3) × 10¹⁷ N m/yr, and a recurrence interval of 3900 ± 400 yrs is necessary for accumulation of strain energy equivalent to the 2008 earthquake. This study provides a preferred interval estimation of large earthquakes for seismic hazard analysis in the Longmen Shan region.
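
    For readers who want to reproduce the order of magnitude of the quoted recurrence interval, the sketch below simply divides an assumed coseismic moment for the 2008 event by the moment accumulation rate given in the abstract. The coseismic moment value (~1.05 × 10²¹ N m) is an illustrative assumption, not the figure used in the paper, which derives its moment from the GPS/InSAR slip model.

    ```python
    # Order-of-magnitude check only; the coseismic moment below is an assumed
    # illustrative value, not the one derived in the paper.
    import math

    moment_rate = 2.7e17            # N*m per year, from the abstract
    coseismic_moment = 1.05e21      # N*m, assumed for illustration

    recurrence_yr = coseismic_moment / moment_rate
    mw = (2.0 / 3.0) * (math.log10(coseismic_moment) - 9.05)   # Hanks-Kanamori
    print(f"recurrence ~ {recurrence_yr:.0f} yr, Mw ~ {mw:.1f}")
    # -> recurrence ~ 3889 yr, Mw ~ 8.0, consistent with the 3900 +/- 400 yr estimate
    ```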

  12. Estimation of Recurrence Interval of Large Earthquakes on the Central Longmen Shan Fault Zone Based on Seismic Moment Accumulation/Release Model

    Directory of Open Access Journals (Sweden)

    Junjie Ren

    2013-01-01

    Full Text Available The recurrence interval of large earthquakes on an active fault zone is an important parameter in assessing seismic hazard. The 2008 Wenchuan earthquake (Mw 7.9) occurred on the central Longmen Shan fault zone and ruptured the Yingxiu-Beichuan fault (YBF) and the Guanxian-Jiangyou fault (GJF). However, there is a considerable discrepancy between preseismic and postseismic estimates of the recurrence interval of large earthquakes based on slip rate and paleoseismologic results. Post-seismic trenches showed that the central Longmen Shan fault zone probably experienced an event similar to the 2008 quake, suggesting a characteristic earthquake model. In this paper, we use the published seismogenic model of the 2008 earthquake based on Global Positioning System (GPS) and Interferometric Synthetic Aperture Radar (InSAR) data and construct a characteristic seismic moment accumulation/release model to estimate the recurrence interval of large earthquakes on the central Longmen Shan fault zone. Our results show that the seismogenic zone accommodates a moment rate of (2.7 ± 0.3) × 10¹⁷ N m/yr, and a recurrence interval of 3900 ± 400 yrs is necessary for accumulation of strain energy equivalent to the 2008 earthquake. This study provides a preferred interval estimation of large earthquakes for seismic hazard analysis in the Longmen Shan region.

  13. 2001 Bhuj, India, earthquake engineering seismoscope recordings and Eastern North America ground-motion attenuation relations

    Science.gov (United States)

    Cramer, C.H.; Kumar, A.

    2003-01-01

    Engineering seismoscope data collected at distances less than 300 km for the M 7.7 Bhuj, India, mainshock are compatible with ground-motion attenuation in eastern North America (ENA). The mainshock ground-motion data have been corrected to a common geological site condition using the factors of Joyner and Boore (2000) and a classification scheme of Quaternary or Tertiary sediments or rock. We then compare these data to ENA ground-motion attenuation relations. Despite uncertainties in recording method, geological site corrections, common tectonic setting, and the amount of regional seismic attenuation, the corrected Bhuj dataset agrees with the collective predictions by ENA ground-motion attenuation relations within a factor of 2. This level of agreement is within the dataset uncertainties and the normal variance for recorded earthquake ground motions.

  14. Impact of Dissociation and Sensible Heat Release on Pulse Detonation and Gas Turbine Engine Performance

    Science.gov (United States)

    Povinelli, Louis A.

    2001-01-01

    A thermodynamic cycle analysis of the effect of sensible heat release on the relative performance of pulse detonation and gas turbine engines is presented. Dissociation losses in the PDE (Pulse Detonation Engine) are found to cause a substantial decrease in engine performance parameters.

  15. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    International Nuclear Information System (INIS)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.
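
    The fragility-curve step described above is sketched below in its most common generic form, a lognormal fragility evaluated at the demand taken from the building response spectra. The median capacities, log-standard deviations and demand value are placeholders chosen for illustration; they are not the PREPP fragility data, and the real analysis combines many more top events through the event tree.

    ```python
    # Generic lognormal fragility evaluation; median capacities, beta values and
    # the demand level are placeholders, not the PREPP data.
    import math
    from scipy.stats import norm

    def pfail_lognormal(demand, median_capacity, beta):
        """Conditional failure probability given a seismic demand (same units as capacity)."""
        return norm.cdf(math.log(demand / median_capacity) / beta)

    a_demand = 0.6                                     # g, illustrative floor spectral acceleration
    p1 = pfail_lognormal(a_demand, median_capacity=1.2, beta=0.45)
    p2 = pfail_lognormal(a_demand, median_capacity=0.9, beta=0.40)
    p_top = 1.0 - (1.0 - p1) * (1.0 - p2)              # top event occurs if either component fails
    ```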

  16. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    Energy Technology Data Exchange (ETDEWEB)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  17. State of the art of earthquake engineering in nuclear power plant design

    International Nuclear Information System (INIS)

    Schildknecht, P.O.

    1976-12-01

    A brief outline of definitions is given, based on the USNRC Seismic and Geologic Siting Criteria for Nuclear Power Plants and on plate tectonics and earthquake terminology. An introduction to plate tectonics and the associated earthquake phenomena is then presented. Ground motion characteristics are described in connection with the selection of design earthquakes. Mathematical methods of dynamic structural analysis are discussed for linear and nonlinear systems. Response analysis techniques for nuclear power plants are explained, considering soil-structure interaction effects. (Auth.)

  18. Fuel effects on knock, heat releases and CARS temperatures in a spark ignition engine

    NARCIS (Netherlands)

    Kalghatgi, G.T.; Golombok, M.; Snowdon, P.

    1995-01-01

    Net heat release, knock characteristics and temperature were derived from in-cylinder pressure and end-gas CARS measurements for different fuels in a single-cylinder engine. The maximum net heat release rate resulting from the final phase of autoignition is closely associated with knock intensity.

  19. Current problems and subjects on numerical analysis of earthquake geotechnical engineering. For seamless analysis

    International Nuclear Information System (INIS)

    Yoshida, Taiki

    2016-01-01

    Both continuum and discontinuum analyses are used in the evaluation of the seismic stability of the slopes surrounding nuclear power plant facilities. However, such seismic stability cannot be evaluated rationally because of the excessively conservative margins in the results of each type of analysis. If the behavior from small to large deformation can be simulated by hybridizing the two, this would contribute not only to the rationalization of the slope stability evaluation but also to the enhancement of the precision of the numerical analysis. In this review, previous numerical analyses and their application cases in earthquake geotechnical engineering were classified into three categories, namely continuum analysis, discontinuum analysis, and the process of hybridizing them, in order to identify the relevant research themes. The present review has revealed that the research themes are the standardization of the conditions for conversion, the construction of a technique to determine the parameters related to conversion, and a reasonable set of physical properties for the DEM (Distinct Element Method) after conversion. Our future work will be the development of a numerical analysis code hybridizing continuum and discontinuum analyses based on the identified research themes. (author)

  20. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  1. A new mechanistic and engineering fission gas release model for a uranium dioxide fuel

    International Nuclear Information System (INIS)

    Lee, Chan Bock; Yang, Yong Sik; Kim, Dae Ho; Kim, Sun Ki; Bang, Je Geun

    2008-01-01

    A mechanistic and engineering fission gas release model (MEGA) for uranium dioxide (UO2) fuel was developed. It is based upon the diffusional release of fission gases from inside the grain to the grain boundary and the release of fission gases from the grain boundary to the external surface through the interconnection of fission gas bubbles in the grain boundary. The capability of the MEGA model was validated by comparison with the fission gas release database and by sensitivity analyses of the parameters. It was found that the MEGA model correctly predicts fission gas release over a broad range of fuel burnups up to 98 MWd/kgU. In particular, the enhancement of fission gas release in high-burnup fuel, and its reduction at high burnup by increasing the UO2 grain size, were correctly predicted by the MEGA model without the use of any artificial factor. (author)
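
    To make the diffusional building block concrete, the sketch below evaluates the classical Booth equivalent-sphere approximation for the fractional release from a grain, with an assumed effective diffusion parameter. This idealisation is only the starting point of models such as MEGA, which add grain-boundary bubble interconnection and other mechanisms not reproduced here.

    ```python
    # Classical Booth equivalent-sphere approximation for diffusional release from
    # a single grain; D' = D / a**2 is an assumed effective diffusion parameter.
    import math

    def booth_fractional_release(d_prime_per_s, time_s):
        """Approximate fraction of gas released from an equivalent sphere."""
        tau = d_prime_per_s * time_s                 # dimensionless diffusion time
        if tau < 0.1:                                # short-time expansion
            return 6.0 * math.sqrt(tau / math.pi) - 3.0 * tau
        return 1.0 - (6.0 / math.pi ** 2) * math.exp(-math.pi ** 2 * tau)  # leading long-time term

    f_release = booth_fractional_release(d_prime_per_s=1.0e-10, time_s=3.0e7)  # ~1 year at power
    ```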

  2. Fossil intermediate-depth earthquakes in subducting slabs linked to differential stress release

    Science.gov (United States)

    Scambelluri, Marco; Pennacchioni, Giorgio; Gilio, Mattia; Bestmann, Michel; Plümper, Oliver; Nestola, Fabrizio

    2017-12-01

    The cause of intermediate-depth (50-300 km) seismicity in subduction zones is uncertain. It is typically attributed either to rock embrittlement associated with fluid pressurization, or to thermal runaway instabilities. Here we document glassy pseudotachylyte fault rocks—the products of frictional melting during coseismic faulting—in the Lanzo Massif ophiolite in the Italian Western Alps. These pseudotachylytes formed at subduction-zone depths of 60-70 km in poorly hydrated to dry oceanic gabbro and mantle peridotite. This rock suite is a fossil analogue to an oceanic lithospheric mantle that undergoes present-day subduction. The pseudotachylytes locally preserve high-pressure minerals that indicate an intermediate-depth seismic environment. These pseudotachylytes are important because they are hosted in a near-anhydrous lithosphere free of coeval ductile deformation, which excludes an origin by dehydration embrittlement or thermal runaway processes. Instead, our observations indicate that seismicity in cold subducting slabs can be explained by the release of differential stresses accumulated in strong dry metastable rocks.

  3. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    Science.gov (United States)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural resources exploitation and economic growth versus geomechanical pollution and the risks of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes have been documented worldwide with a seismic moment magnitude of 4.5 [...] losses of triggered earthquakes. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment identifies that the financial loss due to earthquake damage has reduced mining profits that have been re-invested in the Newcastle region for over two centuries, beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all earthquakes recorded in Australia between 1967 and 1999. In conclusion, the increasing number and size of geoengineering activities, such as coal mining near Newcastle or planned carbon dioxide geosequestration initiatives, represent a growing hazard potential, which can negatively affect socio-economic growth and sustainable development. Finally, hazard and risk degrees, based on geomechanical-mathematical models, can be forecast in space and over time for urban planning, in order to prevent economic losses from human-triggered earthquakes in the future.

  4. Mimicking Neurotransmitter Release in Chemical Synapses via Hysteresis Engineering in MoS2 Transistors.

    Science.gov (United States)

    Arnold, Andrew J; Razavieh, Ali; Nasr, Joseph R; Schulman, Daniel S; Eichfeld, Chad M; Das, Saptarshi

    2017-03-28

    Neurotransmitter release in chemical synapses is fundamental to diverse brain functions such as motor action, learning, cognition, emotion, perception, and consciousness. Moreover, improper functioning or abnormal release of neurotransmitter is associated with numerous neurological disorders such as epilepsy, sclerosis, schizophrenia, Alzheimer's disease, and Parkinson's disease. We have utilized hysteresis engineering in a back-gated MoS2 field effect transistor (FET) in order to mimic such neurotransmitter release dynamics in chemical synapses. All three essential features, i.e., quantal, stochastic, and excitatory or inhibitory nature of neurotransmitter release, were accurately captured in our experimental demonstration. We also mimicked an important phenomenon called long-term potentiation (LTP), which forms the basis of human memory. Finally, we demonstrated how to engineer the LTP time by operating the MoS2 FET in different regimes. Our findings could provide a critical component toward the design of next-generation smart and intelligent human-like machines and human-machine interfaces.

  5. Dynamic Model for the Stocks and Release Flows of Engineered Nanomaterials.

    Science.gov (United States)

    Song, Runsheng; Qin, Yuwei; Suh, Sangwon; Keller, Arturo A

    2017-11-07

    Most existing life-cycle release models for engineered nanomaterials (ENM) are static, ignoring the dynamics of stocks and flows of ENMs. Our model, nanoRelease, estimates the annual releases of ENMs from manufacturing, use, and disposal of a product, explicitly taking stock and flow dynamics into account. Given the variabilities in key parameters (e.g., service life of products and annual release rate during use), nanoRelease is designed as a stochastic model. We apply nanoRelease to three ENMs (TiO2, SiO2 and FeOx) used in paints and coatings through seven product applications, including construction and building, household and furniture, and automotive, for the period from 2000 to 2020, using production volume and market projection information. We also consider model uncertainties using Monte Carlo simulation. Compared with 2016, the total annual releases of ENMs in 2020 will increase by 34-40%, and the stock will increase by 28-34%. The fraction of the end-of-life release among total release flows will increase from 11% in 2002 to 43% in 2020. As compared to static models, our dynamic model predicts about an order of magnitude lower values for the amount of ENM released from this sector in the near term, while stock continues to build up in the system.
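
    The sketch below illustrates the kind of stock-and-flow bookkeeping described above in a deliberately simplified, deterministic form: annual ENM input into products, a fractional release from the in-use stock each year, and an end-of-life release when a cohort retires after a fixed service life. The production series, release rates and service life are assumptions for illustration; nanoRelease itself treats these as distributions and samples them by Monte Carlo.

    ```python
    # Deterministic toy version of a stock-and-flow release model; all parameters
    # are illustrative assumptions (nanoRelease samples them stochastically).
    import numpy as np

    def dynamic_release(annual_input_t, use_release_rate=0.02,
                        service_life_yr=8, eol_release_fraction=0.1):
        """annual_input_t: ENM mass entering products each year (e.g. tonnes)."""
        n = len(annual_input_t)
        stock = np.zeros(n)
        use_release = np.zeros(n)
        eol_release = np.zeros(n)
        in_use = 0.0
        for y in range(n):
            in_use += annual_input_t[y]                      # new products enter use
            use_release[y] = use_release_rate * in_use       # release from in-use stock
            in_use -= use_release[y]
            retired = annual_input_t[y - service_life_yr] if y >= service_life_yr else 0.0
            retired = min(retired, in_use)                   # cohort leaves use after its service life
            eol_release[y] = eol_release_fraction * retired  # part of it is released at end of life
            in_use -= retired
            stock[y] = in_use
        return stock, use_release, eol_release

    production = np.linspace(100.0, 400.0, 21)               # assumed tonnes/yr, 2000-2020
    stock, use_rel, eol_rel = dynamic_release(production)
    ```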

  6. Estimated airborne release of plutonium from the 102 Building at the General Electric Vallecitos Nuclear Center, Vallecitos, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.; Hays, I.D.

    1980-12-01

    This report estimates the potential airborne releases of plutonium as a consequence of various severities of earthquake and wind hazard postulated for the 102 Building at the General Electric Vallecitos Nuclear Center in California. The releases are based on damage scenarios developed by other specialists. The hazard severities considered range up to a nominal wind velocity of 230 mph and linear accelerations in excess of 0.8 g for earthquakes. The consequences of thrust faulting are considered. The approaches and factors used to estimate the releases are discussed. Release estimates range from 0.003 to 3 g Pu.
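
    Airborne release estimates of this kind are typically built from a simple product of factors (material at risk, damage ratio, airborne release fraction, respirable fraction). The sketch below shows that bookkeeping with placeholder values chosen only for illustration; they are not the figures or scenarios of the Vallecitos report.

    ```python
    # Placeholder bookkeeping only; none of these numbers come from the report.
    material_at_risk_g = 300.0            # g Pu assumed present in the affected area
    damage_ratio = 0.1                    # fraction of that material affected by the scenario
    airborne_release_fraction = 1.0e-3    # fraction of affected material made airborne
    respirable_fraction = 0.5             # airborne fraction in the respirable size range

    airborne_release_g = (material_at_risk_g * damage_ratio
                          * airborne_release_fraction * respirable_fraction)
    print(f"{airborne_release_g:.3f} g Pu airborne")   # -> 0.015 g Pu for these values
    ```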

  7. Prediction and Validation of Heat Release Direct Injection Diesel Engine Using Multi-Zone Model

    Science.gov (United States)

    Anang Nugroho, Bagus; Sugiarto, Bambang; Prawoto; Shalahuddin, Lukman

    2014-04-01

    The objective of this study is to develop a simulation model capable of predicting the heat release of diesel combustion accurately and in an efficient computation time. A multi-zone packet model has been applied to resolve the combustion phenomena inside the diesel cylinder. The model formulations are presented first, and the numerical results are then validated against a single-cylinder direct-injection diesel engine at various engine speeds and injection timings. The model was found to be promising in fulfilling the objective above.
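
    The validation step above relies on comparing predicted heat release with that deduced from measured cylinder pressure. The sketch below shows the standard single-zone apparent heat release rate commonly used for that purpose, with a constant ratio of specific heats as a simplifying assumption; it is not the authors' multi-zone packet model.

    ```python
    # Minimal sketch of the single-zone apparent heat release rate,
    # dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta,
    # assuming a constant ratio of specific heats. Inputs would be measured or
    # simulated cylinder pressure and the cylinder volume at each crank angle.
    import numpy as np

    def apparent_heat_release_rate(theta_deg, p_pa, v_m3, gamma=1.35):
        """Return dQ/dtheta in J per radian of crank angle."""
        theta_rad = np.radians(theta_deg)
        dv = np.gradient(v_m3, theta_rad)
        dp = np.gradient(p_pa, theta_rad)
        return (gamma / (gamma - 1.0)) * p_pa * dv + (1.0 / (gamma - 1.0)) * v_m3 * dp
    ```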

  8. Performance and efficiency evaluation and heat release study of a direct-injection stratified-charge rotary engine

    Science.gov (United States)

    Nguyen, H. L.; Addy, H. E.; Bond, T. H.; Lee, C. M.; Chun, K. S.

    1987-01-01

    A computer simulation that models the performance of Direct Injection Stratified Charge (DISC) rotary engines was used to study the effect of variations in engine design and operating parameters on the performance and efficiency of an Outboard Marine Corporation (OMC) experimental rotary combustion engine. Engine pressure data were used in a heat release analysis to study the effects of heat transfer, leakage, and crevice flows. Predicted engine data were compared with experimental test data over a range of engine speeds and loads. An examination of methods to improve the performance of the rotary engine using advanced heat engine concepts such as faster combustion, reduced leakage, and turbocharging is also presented.

  9. Source Release Modeling for the Idaho National Engineering and Environmental Laboratory's Subsurface Disposal Area

    International Nuclear Information System (INIS)

    Becker, B.H.

    2002-01-01

    A source release model was developed to determine the release of contaminants into the shallow subsurface as part of the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) evaluation at the Idaho National Engineering and Environmental Laboratory's (INEEL) Subsurface Disposal Area (SDA). The output of the source release model is used as input to the subsurface transport and biotic uptake models. The model allows the waste to be separated into areas that match the actual disposal units, which permits quantitative evaluation of the relative contribution of each unit to the total risk and evaluation of selective remediation of the disposal units within the SDA.

  10. Proceedings of the Regional Seminar on Earthquake Engineering (13th) Held in Istanbul, Turkey on 14-24 September 1987.

    Science.gov (United States)

    1987-09-01

    Earthquake Engineering Conference held in San Francisco in July 198 . It is an international collaboration programme designed to mitigate the damage... [Figure: field data showing restoring processes of lifeline systems; time axis in days.] [Figure: kitchen fire sources, kerosene heaters.] ...and another, intermediate, narrow lobby serving as entrance, kitchen, and so on. As a matter of fact, statistics indicate that the ratio of 3-room

  11. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    Science.gov (United States)

    Bostenaru Dan, M.

    2009-04-01

    mitigation will be presented. The session includes contributions showing methodological and modelling approaches from scientists in geophysical/seismological, hydrological, remote sensing, civil engineering, insurance, and urbanism fields, amongst others, as well as presentations from practitioners working on specific case studies, regarding the analysis of recent events and their impact on cities as well as the re-evaluation of past events from the point of view of long-term recovery. The 2005 call for contributions stated: Most strategies for both preparedness and emergency management in disaster mitigation are related to urban planning. While the natural, engineering and social sciences contribute to the evaluation of the impact of earthquakes and their secondary events (including tsunamis, earthquake-triggered landslides, or fire), floods, landslides, high winds, and volcanic eruptions on urban areas, it is the instruments of urban planning that are to be employed both for visualisation and for the development and implementation of strategy concepts for pre- and post-disaster intervention. The evolution of natural systems towards extreme conditions is taken into consideration insofar as it concerns the damaging impact on urban areas and infrastructure, and the impact on the natural environment of interventions to reduce such damage.

  12. Performance and heat release analysis of a pilot-ignited natural gas engine

    Energy Technology Data Exchange (ETDEWEB)

    Krishnan, S.R.; Biruduganti, M.; Mo, Y.; Bell, S.R.; Midkiff, K.C. [Alabama Univ., Dept. of Mechanical Engineering, Tuscaloosa, AL (United States)

    2002-09-01

    The influence of engine operating variables on the performance, emissions and heat release in a compression ignition engine operating in normal diesel and dual-fuel modes (with natural gas fuelling) was investigated. Substantial reductions in NOx emissions were obtained with dual-fuel engine operation. There was a corresponding increase in unburned hydrocarbon emissions as the substitution of natural gas was increased. Brake specific energy consumption decreased with natural gas substitution at high loads but increased at low loads. Experimental results at fixed pilot injection timing have also established the importance of intake manifold pressure and temperature in improving dual-fuel performance and emissions at part load. (Author)

  13. In vivo drug release behavior and osseointegration of a doxorubicin-loaded tissue-engineered scaffold

    DEFF Research Database (Denmark)

    Sun, Ming; Chen, Muwan; Wang, Miao

    2016-01-01

    Bone tissue-engineered scaffolds with therapeutic effects must meet the basic requirements of supporting bone healing at the defect site and releasing an effective drug within the therapeutic window. Here, a rapid prototyped PCL scaffold embedded with chitosan/nanoclay/β-tricalcium phosphate

  14. Approaches to evaluating weathering effects on release of engineered nanomaterials from solid matrices

    Science.gov (United States)

    Increased production and use of engineered nanomaterials (ENMs) over the past decade has increased the potential for the transport and release of these materials into the environment. Here we present results of two separate studies designed to simulate the effects of weathering o...

  15. IAEA safety guides in the light of recent developments in earthquake engineering

    International Nuclear Information System (INIS)

    Gurpinar, A.

    1988-11-01

    The IAEA safety guides 50-SG-S1 and 50-SG-S2 emphasize the determination of the design basis earthquake ground motion and earthquake-resistant design considerations for nuclear power plants, respectively. Years have elapsed since the elaboration of these safety guides, and a review of some of their concepts is necessary, taking into account the information collected and the technical developments since then. In this article, topics within the scope of these safety guides are discussed. In particular, the results of some recent research which may have a bearing on the nuclear industry are highlighted. Conclusions and recommendations are presented. 6 fig., 19 refs. (F.M.)

  16. Release mechanisms from shallow engineered trenches used as repositories for radioactive wastes

    International Nuclear Information System (INIS)

    Locke, J.; Wood, E.

    1987-05-01

    This report has been written for the Department of the Environment as part of their radioactive waste management research programme. The aim has been to identify the release mechanisms of radioactivity from fully engineered trenches of the LAND 2 type and to identify the data needed for their assessment. No direct experimental work has been involved. The report starts with a brief background to UK strategy and outlines a basic disposal system. It reviews existing experience of low-level radioactive waste disposal from LAND 1 trenches and UK experience of toxic waste disposal, to provide a practical basis for the next section, which covers the implications of the identified release mechanisms for the design requirements of an engineered trench. From these design requirements and their interaction with potential site conditions (both saturated and unsaturated zone sites are considered), an assessment of radionuclide release mechanisms is made. (author)

  17. Release of engineered nanomaterials from polymer nanocomposites: the effect of matrix degradation.

    Science.gov (United States)

    Duncan, Timothy V

    2015-01-14

    Polymer nanocomposites-polymer-based materials that incorporate filler elements possessing at least one dimension in the nanometer range-are increasingly being developed for commercial applications ranging from building infrastructure to food packaging to biomedical devices and implants. Despite a wide range of intended applications, it is also important to understand the potential for exposure to these nanofillers, which could be released during routine use or abuse of these materials so that it can be determined whether they pose a risk to human health or the environment. This article is the second of a pair that review what is known about the release of engineered nanomaterials (ENMs) from polymer nanocomposites. Two roughly separate ENM release paradigms are considered in this series: the release of ENMs via passive diffusion, desorption, and dissolution into external liquid media and the release of ENMs assisted by matrix degradation. The present article is focused primarily on the second paradigm and includes a thorough, critical review of the associated body of peer-reviewed literature on ENM release by matrix degradation mechanisms, including photodegradation, thermal decomposition, mechanical wear, and hydrolysis. These release mechanisms may be especially relevant to nanocomposites that are likely to be subjected to weathering, including construction and infrastructural materials, sporting equipment, and materials that might potentially end up in landfills. This review pays particular attention to studies that shed light on specific release mechanisms and synergistic mechanistic relationships. The review concludes with a short section on knowledge gaps and future research needs.

  18. Surface-Engineered Nanocontainers Based on Molecular Self-Assembly and Their Release of Methenamine

    Directory of Open Access Journals (Sweden)

    Minghui Zhang

    2018-02-01

    Full Text Available The mixing of polymers and nanoparticles is opening pathways for engineering flexible composites that exhibit advantageous functional properties. To fabricate controllably assembling nanocomposites for efficiently encapsulating methenamine and releasing it on demand, we selectively functionalized the surface of natural halloysite nanotubes (HNTs) with a polymerizable gemini surfactant which has peculiar aggregation behavior, aiming at endowing the nanomaterials with self-assembly and stimuli-responsive characteristics. The micromorphology, grafted components and functional groups were identified using transmission electron microscopy (TEM), thermogravimetric analysis (TGA), Fourier transform infrared (FTIR) spectroscopy, and X-ray photoelectron spectroscopy (XPS). The created nanocomposites presented various methenamine release characteristics depending on differences in the surface composition. It is particularly worth mentioning that the controlled release was more efficient with increasing geminized monomer proportion, which is reasonably attributed to the amphiphilic, positively charged and strongly hydrophobic geminized moieties interacting with the outer and inner surfaces in different ways: fabricating a polymeric shell that acts as a release stopper at the nanotube ends and forming a polymer brush inside the nanotube lumen for guest immobilization. Meanwhile, the nanocomposites present temperature- and salinity-responsive characteristics for the release of methenamine. The combination of HNTs with conjugated functional polymers will open pathways for engineering flexible composites which are promising for application in controlled release fields.

  19. [Engineering aspects of seismic behavior of health-care facilities: lessons from California earthquakes].

    Science.gov (United States)

    Rutenberg, A

    1995-03-15

    The construction of health-care facilities is similar to that of other buildings. Yet the need to function immediately after an earthquake, the helplessness of the many patients and the high and continuous occupancy of these buildings, require that special attention be paid to their seismic performance. Here the lessons from the California experience are invaluable. In this paper the behavior of California hospitals during destructive earthquakes is briefly described. Adequate structural design and execution, and securing of nonstructural elements are required to ensure both safety of occupants, and practically uninterrupted functioning of equipment, mechanical and electrical services and other vital systems. Criteria for post-earthquake functioning are listed. In view of the hazards to Israeli hospitals, in particular those located along the Jordan Valley and the Arava, a program for the seismic evaluation of medical facilities should be initiated. This evaluation should consider the hazards from nonstructural elements, the safety of equipment and systems, and their ability to function after a severe earthquake. It should not merely concentrate on safety-related structural behavior.

  20. Drug-releasing nano-engineered titanium implants: therapeutic efficacy in 3D cell culture model, controlled release and stability

    Energy Technology Data Exchange (ETDEWEB)

    Gulati, Karan [School of Chemical Engineering, The University of Adelaide, SA 5005 (Australia); Kogawa, Masakazu; Prideaux, Matthew; Findlay, David M. [Discipline of Orthopaedics and Trauma, The University of Adelaide, SA 5005 (Australia); Atkins, Gerald J., E-mail: gerald.atkins@adelaide.edu.au [Discipline of Orthopaedics and Trauma, The University of Adelaide, SA 5005 (Australia); Losic, Dusan, E-mail: dusan.losic@adelaide.edu.au [School of Chemical Engineering, The University of Adelaide, SA 5005 (Australia)

    2016-12-01

    There is an ongoing demand for new approaches for treating localized bone pathologies. Here we propose a new strategy for treatment of such conditions, via local delivery of hormones/drugs to the trauma site using drug-releasing nano-engineered implants. The proposed implants were prepared in the form of small Ti wires/needles with a nano-engineered oxide layer composed of an array of titania nanotubes (TNTs). The TNT implants were inserted into a 3D collagen gel matrix containing human osteoblast-like cells, and the results confirmed cell migration onto the implants and their attachment and spread. To investigate therapeutic efficacy, TNT/Ti wires loaded with parathyroid hormone (PTH), an approved anabolic therapeutic for the treatment of severe bone fractures, were inserted into 3D gels containing osteoblast-like cells. Gene expression studies revealed a suppression of SOST (sclerostin) and an increase in RANKL (receptor activator of nuclear factor kappa-B ligand) mRNA expression, confirming the release of PTH from TNTs at concentrations sufficient to alter cell function. The performance of the TNT/Ti wire implants is also demonstrated using an example of a drug needed at relatively high concentrations, the anti-inflammatory drug indomethacin. Finally, the mechanical stability of the prepared implants was tested by their insertion into bovine trabecular bone cores ex vivo followed by retrieval, which confirmed the robustness of the TNT structures. This study provides proof of principle for the suitability of the TNT/Ti wire implants for localized bone therapy, which can be customized to cater for specific therapeutic requirements. - Highlights: • Ti wires with titania nanotubes (TNTs) are proposed as ‘in-bone’ therapeutic implants. • A 3D cell culture model is used to confirm the therapeutic efficacy of drug-releasing implants. • Osteoblasts migrated and firmly attached to the TNTs and the micro-scale cracks. • Tailorable drug loading from a few nanograms to several hundred

  1. Revolutionising Engineering Education in the Middle East Region to Promote Earthquake-Disaster Mitigation

    Science.gov (United States)

    Baytiyeh, Hoda; Naja, Mohamad K.

    2014-01-01

    Due to the high market demands for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enroll on engineering courses through lenient…

  2. New geological perspectives on earthquake recurrence models

    International Nuclear Information System (INIS)

    Schwartz, D.P.

    1997-01-01

    In most areas of the world the record of historical seismicity is too short or uncertain to accurately characterize the future distribution of earthquakes of different sizes in time and space. Most faults have not ruptured once, let alone repeatedly. Ultimately, the ability to correctly forecast the magnitude, location, and probability of future earthquakes depends on how well one can quantify the past behavior of earthquake sources. Paleoseismological trenching of active faults, historical surface ruptures, liquefaction features, and shaking-induced ground deformation structures provides fundamental information on the past behavior of earthquake sources. These studies quantify (a) the timing of individual past earthquakes and fault slip rates, which lead to estimates of recurrence intervals and the development of recurrence models, and (b) the amount of displacement during individual events, which allows estimates of the sizes of past earthquakes on a fault. When timing and slip per event are combined with information on fault zone geometry and structure, models that define individual rupture segments can be developed. Paleoseismicity data, in the form of timing and size of past events, provide a window into the driving mechanism of the earthquake engine: the cycle of stress build-up and release.
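
    As a minimal numerical illustration of how slip per event and slip rate translate into a first-order recurrence estimate, consider the sketch below; the values are assumptions, not figures from the abstract.

    ```python
    # Illustrative arithmetic only; the slip per event and slip rate below are
    # assumed values, not data from the record above.
    slip_per_event_m = 2.0            # characteristic displacement per earthquake, m
    slip_rate_mm_yr = 0.5             # long-term geologic slip rate, mm/yr

    recurrence_yr = slip_per_event_m * 1000.0 / slip_rate_mm_yr
    print(f"recurrence ~ {recurrence_yr:.0f} yr")   # -> recurrence ~ 4000 yr
    ```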

  3. Influences of use activities and waste management on environmental releases of engineered nanomaterials

    International Nuclear Information System (INIS)

    Wigger, Henning; Hackmann, Stephan; Zimmermann, Till; Köser, Jan; Thöming, Jorg; Gleich, Arnim von

    2015-01-01

    Engineered nanomaterials (ENM) offer enhanced or new functionalities and properties that are used in various products. This also entails potential environmental risks in terms of hazard and exposure. However, hazard and exposure assessment for ENM still suffer from insufficient knowledge particularly for product-related releases and environmental fate and behavior. This study therefore analyzes the multiple impacts of the product use, the properties of the matrix material, and the related waste management system (WMS) on the predicted environmental concentration (PEC) by applying nine prospective life cycle release scenarios based on reasonable assumptions. The products studied here are clothing textiles treated with silver nanoparticles (AgNPs), since they constitute a controversial application. Surprisingly, the results show counter-intuitive increases by a factor of 2.6 in PEC values for the air compartment in minimal AgNP release scenarios. Also, air releases can shift from washing to wearing activity; their associated release points may shift accordingly, potentially altering release hot spots. Additionally, at end-of-life, the fraction of AgNP-residues contained on exported textiles can be increased by 350% when assuming short product lifespans and globalized WMS. It becomes evident that certain combinations of use activities, matrix material characteristics, and WMS can influence the regional PEC by several orders of magnitude. Thus, in the light of the findings and expected ENM market potential, future assessments should consider these aspects to derive precautionary design alternatives and to enable prospective global and regional risk assessments. - Highlights: • Textile use activities and two waste management systems (WMSs) are investigated. • Matrix material and use activities determine the ENM release. • Counter-intuitive shifts of releases to air can happen during usage. • WMS export can increase by 350% in case of short service life and

  4. Influences of use activities and waste management on environmental releases of engineered nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Wigger, Henning, E-mail: hwigger@uni-bremen.de [Faculty of Production Engineering, Department of Technological Design and Development, University of Bremen, Badgasteiner Str. 1, 28359 Bremen (Germany); Hackmann, Stephan [UFT Center for Environmental Research and Sustainable Technology, Department of General and Theoretical Ecology, University of Bremen, Leobener Str., 28359 Bremen (Germany); Zimmermann, Till [Faculty of Production Engineering, Department of Technological Design and Development, University of Bremen, Badgasteiner Str. 1, 28359 Bremen (Germany); ARTEC — Research Center for Sustainability Studies, Enrique-Schmidt-Str. 7, 28359 Bremen (Germany); Köser, Jan [UFT Center for Environmental Research and Sustainable Technology, Department of Sustainable Chemistry, University of Bremen, Leobener Str., 28359 Bremen (Germany); Thöming, Jorg [UFT Center for Environmental Research and Sustainable Technology, Department of Sustainable Chemical Engineering, University of Bremen, Leobener Str., 28359 Bremen (Germany); Gleich, Arnim von [Faculty of Production Engineering, Department of Technological Design and Development, University of Bremen, Badgasteiner Str. 1, 28359 Bremen (Germany); ARTEC — Research Center for Sustainability Studies, Enrique-Schmidt-Str. 7, 28359 Bremen (Germany)

    2015-12-01

    Engineered nanomaterials (ENM) offer enhanced or new functionalities and properties that are used in various products. This also entails potential environmental risks in terms of hazard and exposure. However, hazard and exposure assessment for ENM still suffer from insufficient knowledge particularly for product-related releases and environmental fate and behavior. This study therefore analyzes the multiple impacts of the product use, the properties of the matrix material, and the related waste management system (WMS) on the predicted environmental concentration (PEC) by applying nine prospective life cycle release scenarios based on reasonable assumptions. The products studied here are clothing textiles treated with silver nanoparticles (AgNPs), since they constitute a controversial application. Surprisingly, the results show counter-intuitive increases by a factor of 2.6 in PEC values for the air compartment in minimal AgNP release scenarios. Also, air releases can shift from washing to wearing activity; their associated release points may shift accordingly, potentially altering release hot spots. Additionally, at end-of-life, the fraction of AgNP-residues contained on exported textiles can be increased by 350% when assuming short product lifespans and globalized WMS. It becomes evident that certain combinations of use activities, matrix material characteristics, and WMS can influence the regional PEC by several orders of magnitude. Thus, in the light of the findings and expected ENM market potential, future assessments should consider these aspects to derive precautionary design alternatives and to enable prospective global and regional risk assessments. - Highlights: • Textile use activities and two waste management systems (WMSs) are investigated. • Matrix material and use activities determine the ENM release. • Counter-intuitive shifts of releases to air can happen during usage. • WMS export can increase by 350% in case of short service life and

  5. Evaluating protein incorporation and release in electrospun composite scaffolds for bone tissue engineering applications.

    Science.gov (United States)

    Briggs, Tonye; Matos, Jeffrey; Collins, George; Arinzeh, Treena Livingston

    2015-10-01

    Electrospun polymer/ceramic composites have gained interest for use as scaffolds for bone tissue engineering applications. In this study, we investigated methods to incorporate Platelet Derived Growth Factor-BB (PDGF-BB) in electrospun polycaprolactone (PCL) or PCL prepared with polyethylene oxide (PEO), where both contained varying levels (up to 30 wt %) of ceramic composed of biphasic calcium phosphates, hydroxyapatite (HA)/β-tricalcium phosphate (TCP). Using a model protein, lysozyme, we compared two methods of protein incorporation, adsorption and emulsion electrospinning. Adsorption of lysozyme on scaffolds with ceramic resulted in minimal release of lysozyme over time. Using emulsion electrospinning, lysozyme released from scaffolds containing a high concentration of ceramic where the majority of the release occurred at later time points. We investigated the effect of reducing the electrostatic interaction between the protein and the ceramic on protein release with the addition of the cationic surfactant, cetyl trimethylammonium bromide (CTAB). In vitro release studies demonstrated that electrospun scaffolds prepared with CTAB released more lysozyme or PDGF-BB compared with scaffolds without the cationic surfactant. Human mesenchymal stem cells (MSCs) on composite scaffolds containing PDGF-BB incorporated through emulsion electrospinning expressed higher levels of osteogenic markers compared to scaffolds without PDGF-BB, indicating that the bioactivity of the growth factor was maintained. This study revealed methods for incorporating growth factors in polymer/ceramic scaffolds to promote osteoinduction and thereby facilitate bone regeneration. © 2015 Wiley Periodicals, Inc.

  6. Fabrication and characterization of a rapid prototyped tissue engineering scaffold with embedded multicomponent matrix for controlled drug release

    DEFF Research Database (Denmark)

    Chen, Muwan; Le, Dang Q S; Hein, San

    2012-01-01

    Bone tissue engineering implants with sustained local drug delivery provide an opportunity for better postoperative care for bone tumor patients because these implants offer sustained drug release at the tumor site and reduce systemic side effects. A rapid prototyped macroporous polycaprolactone......, this scaffold can fulfill the requirements for both bone tissue engineering and local sustained release of an anticancer drug in vitro. These results suggest that the scaffold can be used clinically in reconstructive surgery after bone tumor resection. Moreover, by changing the composition and amount...... of individual components, the scaffold can find application in other tissue engineering areas that need local sustained release of drug....

  7. Effect of Engineered Nanoparticles on Exopolymeric Substances Release from Marine Phytoplankton

    Science.gov (United States)

    Chiu, Meng-Hsuen; Khan, Zafir A.; Garcia, Santiago G.; Le, Andre D.; Kagiri, Agnes; Ramos, Javier; Tsai, Shih-Ming; Drobenaire, Hunter W.; Santschi, Peter H.; Quigg, Antonietta; Chin, Wei-Chun

    2017-12-01

    Engineered nanoparticles (ENPs), products of modern nanotechnologies, can potentially impact the marine environment and pose serious threats to marine ecosystems. However, the cellular responses of marine phytoplankton to ENPs are still not well established. Here, we investigate four different diatom species (Odontella mobiliensis, Skeletonema grethae, Phaeodactylum tricornutum, Thalassiosira pseudonana) and one green alga (Dunaliella tertiolecta) for their extracellular polymeric substances (EPS) release under model ENP treatments: 25 nm titanium dioxide (TiO2), 10-20 nm silicon dioxide (SiO2), and 15-30 nm cerium dioxide (CeO2). We found that SiO2 ENPs can significantly stimulate EPS release from these algae (200-800%), while TiO2 ENP exposure induced the lowest release. Furthermore, the increase of intracellular Ca2+ concentration can be triggered by ENPs, suggesting that the EPS release process is mediated through Ca2+ signaling pathways. With a better understanding of the cellular mechanisms mediating ENP-induced EPS release, potential preventative and safety measures can be developed to mitigate negative impacts on the marine ecosystem.

  8. An Experimental Investigation on the Combustion and Heat Release Characteristics of an Opposed-Piston Folded-Cranktrain Diesel Engine

    Directory of Open Access Journals (Sweden)

    Fukang Ma

    2015-06-01

    Full Text Available In opposed-piston folded-cranktrain diesel engines, the relative motion of the opposed pistons, the combustion chamber components and the injector position differ from those of conventional diesel engines. The combustion and heat release characteristics of an opposed-piston folded-cranktrain diesel engine under different operating conditions were investigated. The heat release process of the engine is described in four phases: ignition delay, premixed combustion, diffusion combustion and after-combustion. Load changes have a small effect on the premixed combustion duration but influence the diffusion combustion duration significantly. The heat release process shows more pronounced isochoric and isobaric combustion than in a conventional diesel engine, except at high exhaust pressure and temperature, owing to the engine's two-stroke and uniflow scavenging characteristics. Meanwhile, relatively high-quality exhaust heat energy is produced in opposed-piston folded-cranktrain diesel engines.
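    The four-phase description above is typically derived from an apparent heat-release analysis of the measured cylinder pressure. The abstract does not give the formula used, so the snippet below shows the standard single-zone, first-law form as a reasonable stand-in for the approach; the constant ratio of specific heats and the numerical differentiation are simplifying assumptions.

```python
import numpy as np

def apparent_heat_release_rate(theta_deg, p_pa, v_m3, gamma=1.35):
    """Single-zone apparent (net) heat release rate dQ/dtheta in J/deg:
    dQ = gamma/(gamma-1) * p * dV + 1/(gamma-1) * V * dp,
    computed from the cylinder pressure and instantaneous cylinder volume traces."""
    dv = np.gradient(v_m3, theta_deg)
    dp = np.gradient(p_pa, theta_deg)
    return (gamma / (gamma - 1.0)) * p_pa * dv + (1.0 / (gamma - 1.0)) * v_m3 * dp

# Phase boundaries (ignition delay, premixed, diffusion, after-combustion) can then
# be read off the cumulative heat release, e.g. np.cumsum(hrr) * crank_angle_step.
```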

  9. Development of criteria for release of Idaho National Engineering Laboratory sites following decontamination and decommissioning

    International Nuclear Information System (INIS)

    Kirol, L.

    1986-08-01

    Criteria have been developed for release of Idaho National Engineering Laboratory (INEL) facilities and land areas following decontamination and decommissioning (D and D). Although these facilities and land areas are not currently being returned to the public domain, and no plans exist for doing so, criteria suitable for unrestricted release to the public were desired. Midway through this study, the implementation of Department of Energy (DOE) Order 5820.2, Radioactive Waste Management, required development of site specific release criteria for use on D and D projects. These criteria will help prevent remedial actions from being required if INEL reuse considerations change in the future. Development of criteria for release of INEL facilities following D and D comprised four study areas: pathways analysis, dose and concentration guidelines, sampling and instrumentation, and implementation procedures. Because of the complex and sensitive nature of the first three categories, a thorough review by experts in those respective fields was desired. Input and support in preparing or reviewing each part of the criteria development task was solicited from several DOE field offices. Experts were identified and contracted to assist in preparing portions of the release criteria, or to serve on a peer-review committee. Thus, the entire release criteria development task was thoroughly reviewed by recognized experts from contractors at several DOE field offices, to validate technical content of the document. Each of the above four study areas was developed originally as an individual task, and a report was generated from each. These reports are combined here to form this document. This release criteria document includes INEL-specific pathways analysis, instrumentation requirements, sampling procedures, the basis for selection of dose and concentration guidelines, and cost-risk-benefit procedures

  10. Open field release of genetically engineered sterile male Aedes aegypti in Malaysia.

    Directory of Open Access Journals (Sweden)

    Renaud Lacroix

    Full Text Available BACKGROUND: Dengue is the most important mosquito-borne viral disease. In the absence of specific drugs or vaccines, control focuses on suppressing the principal mosquito vector, Aedes aegypti, yet current methods have not proven adequate to control the disease. New methods are therefore urgently needed, for example genetics-based sterile-male-release methods. However, this requires that lab-reared, modified mosquitoes be able to survive and disperse adequately in the field. METHODOLOGY/PRINCIPAL FINDINGS: Adult male mosquitoes were released into an uninhabited forested area of Pahang, Malaysia. Their survival and dispersal was assessed by use of a network of traps. Two strains were used, an engineered 'genetically sterile' (OX513A) and a wild-type laboratory strain, to give both absolute and relative data about the performance of the modified mosquitoes. The two strains had similar maximum dispersal distances (220 m), but mean distance travelled of the OX513A strain was lower (52 vs. 100 m). Life expectancy was similar (2.0 vs. 2.2 days). Recapture rates were high for both strains, possibly because of the uninhabited nature of the site. CONCLUSIONS/SIGNIFICANCE: After extensive contained studies and regulatory scrutiny, a field release of engineered mosquitoes was safely and successfully conducted in Malaysia. The engineered strain showed similar field longevity to an unmodified counterpart, though in this setting dispersal was reduced relative to the unmodified strain. These data are encouraging for the future testing and implementation of genetic control strategies and will help guide future field use of this and other engineered strains.

  11. Composite microsphere-functionalized scaffold for the controlled release of small molecules in tissue engineering

    Directory of Open Access Journals (Sweden)

    Laura Pandolfi

    2016-01-01

    Full Text Available Current tissue engineering strategies focus on restoring damaged tissue architectures using biologically active scaffolds. The ideal scaffold would mimic the extracellular matrix of any tissue of interest, promoting cell proliferation and de novo extracellular matrix deposition. A plethora of techniques have been evaluated to engineer scaffolds for the controlled and targeted release of bioactive molecules to provide a functional structure for tissue growth and remodeling, as well as enhance recruitment and proliferation of autologous cells within the implant. Recently, novel approaches using small molecules, instead of growth factors, have been exploited to regulate tissue regeneration. The use of small synthetic molecules could be very advantageous because of their stability, tunability, and low cost. Herein, we propose a chitosan–gelatin scaffold functionalized with composite microspheres consisting of mesoporous silicon microparticles and poly(dl-lactic-co-glycolic acid) for the controlled release of sphingosine-1-phosphate, a small molecule of interest. We characterized the platform with scanning electron microscopy, Fourier transform infrared spectroscopy, and confocal microscopy. Finally, the biocompatibility of this multiscale system was analyzed by culturing human mesenchymal stem cells onto the scaffold. The presented strategy establishes the basis of a versatile scaffold for the controlled release of small molecules and for culturing mesenchymal stem cells for regenerative medicine applications.

  12. High-performance computing for structural mechanics and earthquake/tsunami engineering

    CERN Document Server

    Hori, Muneo; Ohsaki, Makoto

    2016-01-01

    Huge earthquakes and tsunamis have caused serious damage to important structures such as civil infrastructure elements, buildings and power plants around the globe. To quantitatively evaluate such damage processes and to design effective prevention and mitigation measures, the latest high-performance computational mechanics technologies, which include terascale to petascale computers, can offer powerful tools. The phenomena covered in this book include seismic wave propagation in the crust and soil, seismic response of infrastructure elements such as tunnels considering soil-structure interactions, seismic response of high-rise buildings, seismic response of nuclear power plants, tsunami run-up over coastal towns and tsunami inundation considering fluid-structure interactions. The book provides all necessary information for addressing these phenomena, ranging from the fundamentals of high-performance computing for finite element methods, key algorithms of accurate dynamic structural analysis, fluid flows ...

  13. Combustion Heat Release Rate Comparison of Algae Hydroprocessed Renewable Diesel to F-76 in a Two-Stroke Diesel Engine

    Science.gov (United States)

    2013-06-01

    Thesis by John H. Petersen, June 2013. The indexed excerpt consists of report-documentation and figure fragments rather than a continuous abstract; the recoverable content concerns strain-gauge calibration of the mechanical injector rocker arm (Figure 14, "Mechanical Injector Rocker Arm Strain Gauge") used in the heat release rate comparison of algae hydroprocessed renewable diesel to F-76 in a two-stroke diesel engine.

  14. Sensitivity of the engineered barrier system (EBS) release rate to alternative conceptual models of advective release from waste packages under dripping fractures

    International Nuclear Information System (INIS)

    Lee, J.H.; Atkins, J.E.; McNeish, J.A.; Vallikat, V.

    1996-01-01

    Simulations were conducted to analyze the sensitivity of the engineered barrier system (EBS) release rate to alternative conceptual models of the advective release from waste packages under dripping fractures. The first conceptual model assumed that dripping water directly contacts the waste form inside the 'failed' waste package, and radionuclides are released from the EBS by advection. The second conceptual model assumed that dripping water is diverted around the 'failed' waste package (because of the presence of corrosion products plugging the perforations) and dripping water is prevented from directly contacting the waste form. In the second model, radionuclides were assumed to be transported through the perforations by diffusion, and, once outside the waste package, to be released from the EBS by advection. The second model was intended to incorporate more realism into the EBS release calculations. For the case with the second EBS release model, most radionuclides had significantly lower peak EBS release rates (from at least one to several orders of magnitude) than with the first EBS release model. The impacts of the alternative EBS release models were greater for the radionuclides with a low solubility (or solubility-limited radionuclides) than for the radionuclides with a high solubility (or waste form dissolution-limited radionuclides). The analyses indicated that the EBS release model representing advection through a 'failed' waste package (the first EBS release model) may be too conservative in predicting the EBS performance. One major implication from this sensitivity study was that a 'failed' waste package container with multiple perforations may still be able to perform effectively as an important barrier to radionuclide release. (author)

  15. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  16. Airborne engineered nanomaterials in the workplace—a review of release and worker exposure during nanomaterial production and handling processes

    Energy Technology Data Exchange (ETDEWEB)

    Ding, Yaobo [Institute for Work and Health (IST), Universities of Lausanne and Geneva, Route de la Corniche 2, 1066, Epalinges (Switzerland); Kuhlbusch, Thomas A.J. [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Centre for Nanointegration (CENIDE), University Duisburg-Essen, Duisburg (Germany); Van Tongeren, Martie; Jiménez, Araceli Sánchez [Centre for Human Exposure Science, Institute of Occupational Medicine (IOM), Research Avenue North, Edinburgh EH14 4AP (United Kingdom); Tuinman, Ilse [TNO, Lange Kleiweg 137, Rijswijk (Netherlands); Chen, Rui [CAS Key Laboratory for Biomedical Effects of Nanomaterials and Nanosafety & CAS Center for Excellence in Nanoscience, National Center for Nanoscience and Technology of China, Beijing 100190 (China); Alvarez, Iñigo Larraza [ACCIONA Infrastructure, Materials Area, Innovation Division, C/Valportillo II 8, 28108, Alcobendas (Spain); Mikolajczyk, Urszula [Nofer Institute of Occupational Medicine, Lodz (Poland); Nickel, Carmen; Meyer, Jessica; Kaminski, Heinz [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Wohlleben, Wendel [Dept. Material Physics, BASF SE, Advanced Materials Research, Ludwigshafen (Germany); Stahlmecke, Burkhard [Institute of Energy and Environmental Technology (IUTA), Air Quality & Sustainable Nanotechnology Unit, Bliersheimer Straße 58-60, 47229 Duisburg (Germany); Clavaguera, Simon [NanoSafety Platform, Commissariat à l’Energie Atomique et aux Energies Alternatives (CEA), Univ. Grenoble Alpes, Grenoble, 38054 (France); and others

    2017-01-15

    Highlights: • Release characteristics can be grouped by the type of occupational activities. • Release levels may be linked to process energy. • A better data reporting practice will facilitate exposure assessment. • The results help prioritize industrial processes for human risk assessment. - Abstract: For exposure and risk assessment in occupational settings involving engineered nanomaterials (ENMs), it is important to understand the mechanisms of release and how they are influenced by the ENM, the matrix material, and process characteristics. This review summarizes studies providing ENM release information in occupational settings, during different industrial activities and using various nanomaterials. It also assesses the contextual information — such as the amounts of materials handled, protective measures, and measurement strategies — to understand which release scenarios can result in exposure. High-energy processes such as synthesis, spraying, and machining were associated with the release of large numbers of predominantly small-sized particles. Low-energy processes, including laboratory handling, cleaning, and industrial bagging activities, usually resulted in slight or moderate releases of relatively large agglomerates. The present analysis suggests that process-based release potential can be ranked, thus helping to prioritize release assessments, which is useful for tiered exposure assessment approaches and for guiding the implementation of workplace safety strategies. The contextual information provided in the literature was often insufficient to directly link release to exposure. The studies that did allow an analysis suggested that significant worker exposure might mainly occur when engineering safeguards and personal protection strategies were not carried out as recommended.

  17. Airborne engineered nanomaterials in the workplace—a review of release and worker exposure during nanomaterial production and handling processes

    International Nuclear Information System (INIS)

    Ding, Yaobo; Kuhlbusch, Thomas A.J.; Van Tongeren, Martie; Jiménez, Araceli Sánchez; Tuinman, Ilse; Chen, Rui; Alvarez, Iñigo Larraza; Mikolajczyk, Urszula; Nickel, Carmen; Meyer, Jessica; Kaminski, Heinz; Wohlleben, Wendel; Stahlmecke, Burkhard; Clavaguera, Simon

    2017-01-01

    Highlights: • Release characteristics can be grouped by the type of occupational activities. • Release levels may be linked to process energy. • A better data reporting practice will facilitate exposure assessment. • The results help prioritize industrial processes for human risk assessment. - Abstract: For exposure and risk assessment in occupational settings involving engineered nanomaterials (ENMs), it is important to understand the mechanisms of release and how they are influenced by the ENM, the matrix material, and process characteristics. This review summarizes studies providing ENM release information in occupational settings, during different industrial activities and using various nanomaterials. It also assesses the contextual information — such as the amounts of materials handled, protective measures, and measurement strategies — to understand which release scenarios can result in exposure. High-energy processes such as synthesis, spraying, and machining were associated with the release of large numbers of predominantly small-sized particles. Low-energy processes, including laboratory handling, cleaning, and industrial bagging activities, usually resulted in slight or moderate releases of relatively large agglomerates. The present analysis suggests that process-based release potential can be ranked, thus helping to prioritize release assessments, which is useful for tiered exposure assessment approaches and for guiding the implementation of workplace safety strategies. The contextual information provided in the literature was often insufficient to directly link release to exposure. The studies that did allow an analysis suggested that significant worker exposure might mainly occur when engineering safeguards and personal protection strategies were not carried out as recommended.

  18. [Construction and evaluation of the tissue engineered nerve of bFGF-PLGA sustained release microspheres].

    Science.gov (United States)

    Wang, Guanglin; Lin, Wei; Gao, Weiqiang; Xiao, Yuhua; Dong, Changchao

    2008-12-01

    To study the outcomes of nerve defect repair with the tissue engineered nerve, which is composed of the complex of SCs, 30% ECM gel, bFGF-PLGA sustained release microspheres, PLGA microfilaments and permeable poly(D, L-lactic acid) (PDLLA) catheters. SCs were cultured and purified from the sciatic nerves of 1-day-old neonatal SD rats. The 1st passage cells were compounded with bFGF-PLGA sustained release microspheres and ECM gel, and then were injected into permeable PDLLA catheters with PLGA microfilaments inside. In this way, the tissue engineered nerve was constructed. Sixty SD rats were included. The model of 15-mm sciatic nerve defects was made, and then the rats were randomly divided into 5 groups, with 12 rats in each. In group A, autograft was adopted. In group B, the blank PDLLA catheters with PBS inside were used. In group C, PDLLA catheters, with PLGA microfilaments and 30% ECM gel inside, were used. In group D, PDLLA catheters, with PLGA microfilaments, SCs and 30% ECM gel inside, were used. In group E, the tissue engineered nerve was applied. After the operation, observation was made for general conditions of the rats. The sciatic function index (SFI) analysis was performed at 12, 16, 20 and 24 weeks after the operation, respectively. Electrophysiological detection and histological observation were performed at 12 and 24 weeks after the operation, respectively. All rats survived to the end of the experiment. At 12 and 16 weeks after the operation, group E was significantly different from group B in SFI (P < 0.05). The regenerated nerve fibers in group E were significantly different from those in groups A, B and C: they were smaller than those in group A but larger than those in groups B and C (P < 0.05). The tissue engineered nerve with the complex of SCs, ECM gel, bFGF-PLGA sustained release microspheres, PLGA microfilaments and permeable PDLLA catheters promote

  19. Sustained release of sphingosine 1-phosphate for therapeutic arteriogenesis and bone tissue engineering.

    Science.gov (United States)

    Sefcik, Lauren S; Petrie Aronin, Caren E; Wieghaus, Kristen A; Botchwey, Edward A

    2008-07-01

    Sphingosine 1-phosphate (S1P) is a bioactive phospholipid that impacts migration, proliferation, and survival in diverse cell types, including endothelial cells, smooth muscle cells, and osteoblast-like cells. In this study, we investigated the effects of sustained release of S1P on microvascular remodeling and associated bone defect healing in vivo. The murine dorsal skinfold window chamber model was used to evaluate the structural remodeling response of the microvasculature. Our results demonstrated that 1:400 (w/w) loading and subsequent sustained release of S1P from poly(lactic-co-glycolic acid) (PLAGA) significantly enhanced lumenal diameter expansion of arterioles and venules after 3 and 7 days. Incorporation of 5-bromo-2-deoxyuridine (BrdU) at day 7 revealed significant increases in mural cell proliferation in response to S1P delivery. Additionally, three-dimensional (3D) scaffolds loaded with S1P (1:400) were implanted into critical-size rat calvarial defects, and healing of bony defects was assessed by radiograph X-ray, microcomputed tomography (muCT), and histology. Sustained release of S1P significantly increased the formation of new bone after 2 and 6 weeks of healing and histological results suggest increased numbers of blood vessels in the defect site. Taken together, these experiments support the use of S1P delivery for promoting microvessel diameter expansion and improving the healing outcomes of tissue-engineered therapies.

  20. Heat release and engine performance effects of soybean oil ethyl ester blending into diesel fuel

    International Nuclear Information System (INIS)

    Bueno, Andre Valente; Velasquez, Jose Antonio; Milanez, Luiz Fernando

    2011-01-01

    The engine performance impact of soybean oil ethyl ester blending into diesel fuel was analyzed employing heat release analysis, in-cylinder exergy balances and dynamometric tests. Blends with concentrations of up to 30% of soybean oil ethyl ester in volume were used in steady-state experiments conducted in a high speed turbocharged direct injection engine. Modifications in fuel heat value, fuel-air equivalence ratio and combustion temperature were found to govern the impact resulting from the addition of biodiesel on engine performance. For the analyzed fuels, the 20% biodiesel blend presented the best results of brake thermal efficiency, while the 10% biodiesel blend presented the best results of brake power and sfc (specific fuel consumption). In relation to mineral diesel and in full load conditions, an average increase of 4.16% was observed in brake thermal efficiency with B20 blend. In the same conditions, an average gain of 1.15% in brake power and a reduction of 1.73% in sfc was observed with B10 blend.
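    The efficiency and sfc comparisons above follow directly from the measured brake power, fuel mass flow, and the heating value of each blend. A minimal sketch of those two definitions is given below; the numerical inputs and blend heating values are invented for illustration, not the values measured in the study.

```python
def brake_thermal_efficiency(brake_power_kw, fuel_flow_kg_per_h, lhv_mj_per_kg):
    """Brake thermal efficiency = brake power / fuel chemical power."""
    fuel_power_kw = fuel_flow_kg_per_h / 3600.0 * lhv_mj_per_kg * 1000.0
    return brake_power_kw / fuel_power_kw

def sfc_g_per_kwh(brake_power_kw, fuel_flow_kg_per_h):
    """Specific fuel consumption in g/kWh."""
    return fuel_flow_kg_per_h * 1000.0 / brake_power_kw

# Illustrative only: a biodiesel blend with a lower heating value needs slightly
# more fuel mass for the same brake power, which shows up as a higher sfc.
print(brake_thermal_efficiency(50.0, 11.0, 42.5), sfc_g_per_kwh(50.0, 11.0))  # diesel
print(brake_thermal_efficiency(50.0, 11.2, 41.6), sfc_g_per_kwh(50.0, 11.2))  # ~B20 (assumed LHV)
```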

  1. Market-implied spread for earthquake CAT bonds: financial implications of engineering decisions.

    Science.gov (United States)

    Damnjanovic, Ivan; Aslan, Zafer; Mander, John

    2010-12-01

    In the event of natural and man-made disasters, owners of large-scale infrastructure facilities (assets) need contingency plans to effectively restore the operations within the acceptable timescales. Traditionally, the insurance sector provides the coverage against potential losses. However, there are many problems associated with this traditional approach to risk transfer including counterparty risk and litigation. Recently, a number of innovative risk mitigation methods, termed alternative risk transfer (ART) methods, have been introduced to address these problems. One of the most important ART methods is catastrophe (CAT) bonds. The objective of this article is to develop an integrative model that links engineering design parameters with financial indicators including spread and bond rating. The developed framework is based on a four-step structural loss model and transformed survival model to determine expected excess returns. We illustrate the framework for a seismically designed bridge using two unique CAT bond contracts. The results show a nonlinear relationship between engineering design parameters and market-implied spread. © 2010 Society for Risk Analysis.
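    The article links engineering design parameters to spread through a four-step structural loss model and a transformed survival model; those details cannot be reproduced from the abstract. As a stand-in, the sketch below prices a CAT bond spread as expected annual loss plus a volatility-based risk load, with a simulated loss distribution replacing the structural model. The function name, the 2% annual event probability, the damage-ratio distribution, and the risk load are all assumptions, not the authors' method.

```python
import numpy as np

def simple_cat_bond_spread(annual_loss_fractions, risk_load=0.5):
    """Very simplified sketch: spread (fraction of notional per year) taken as
    expected annual loss plus a risk load proportional to loss volatility.
    This is NOT the transformed-survival-model pricing used in the article."""
    losses = np.asarray(annual_loss_fractions)
    return losses.mean() + risk_load * losses.std()

rng = np.random.default_rng(0)
quake = rng.random(10_000) < 0.02        # assumed 2% annual event probability
severity = rng.beta(2, 5, 10_000)        # assumed damage-ratio distribution
print(f"implied spread: {simple_cat_bond_spread(quake * severity):.4f}")
```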

  2. Applications of human factors engineering to LNG release prevention and control

    Energy Technology Data Exchange (ETDEWEB)

    Shikiar, R.; Rankin, W.L.; Rideout, T.B.

    1982-06-01

    The results of an investigation of human factors engineering and human reliability applications to LNG release prevention and control are reported. The report includes a discussion of possible human error contributions to previous LNG accidents and incidents, and a discussion of generic HF considerations for peakshaving plants. More specific recommendations for improving HF practices at peakshaving plants are offered based on visits to six facilities. The HF aspects of the recently promulgated DOT regulations are reviewed, and recommendations are made concerning how these regulations can be implemented utilizing standard HF practices. Finally, the integration of HF considerations into overall system safety is illustrated by a presentation of human error probabilities applicable to LNG operations and by an expanded fault tree analysis which explicitly recognizes man-machine interfaces.

  3. Sensitivity of performance assessment of the engineered barriers to nuances of release rate criteria

    International Nuclear Information System (INIS)

    Oliver, D.L.R.

    1987-01-01

    The United States Nuclear Regulatory Commission (NRC) has established criteria for the long-term performance of proposed high-level waste repositories. As with any regulation, the criteria may be interpreted in several ways. Due to the high capital costs and the emotional political climate associated with any high-level radioactive waste repository, it is important that there be an early consensus regarding interpretations of the criteria, and what assumptions may be used to demonstrate compliance with them. This work uses analytic solutions of mass transport theory to demonstrate how sensitive performance analyses are to various nuances of the NRC release rate criterion for the engineered barriers. The analysis is directed at the proposed repository in basalt at the Hanford site in Washington State

  4. Neural tissue engineering scaffold with sustained RAPA release relieves neuropathic pain in rats.

    Science.gov (United States)

    Ding, Tan; Zhu, Chao; Kou, Zhen-Zhen; Yin, Jun-Bin; Zhang, Ting; Lu, Ya-Cheng; Wang, Li-Ying; Luo, Zhuo-Jing; Li, Yun-Qing

    2014-09-01

    To investigate the effect of locally slow-released rapamycin (RAPA) from a bionic peripheral nerve stent in reducing the incidence of neuropathic pain or mitigating the degree of pain after nerve injury. We constructed a neural tissue engineering scaffold with sustained release of RAPA to repair 20 mm defects in rat sciatic nerves. Four presurgical and postsurgical time windows were selected to monitor the changes in the expression of pain-related dorsal root ganglion (DRG) voltage-gated sodium channels 1.3 (Nav1.3), 1.7 (Nav1.7), and 1.8 (Nav1.8) through immunohistochemistry (IHC) and Western blot, along with the observation of postsurgical pathological pain in rats by pain-related behavior approaches. Relatively small upregulation of DRG sodium channels was observed in the experimental group (RAPA+poly(lactic-co-glycolic acid) (PLGA)+stent) after surgery, along with low degrees of neuropathic pain and anxiety, which were similar to those in the autologous nerve graft group. The results suggest that autoimmune inflammatory response plays a leading role in the occurrence of post-traumatic neuropathic pain, and that RAPA significantly inhibits the abnormal upregulation of sodium channels to reduce pain by alleviating the inflammatory response. Copyright © 2014 Elsevier Inc. All rights reserved.

  5. Simulating Exposure Concentrations of Engineered Nanomaterials in Surface Water Systems: Release of WASP8

    Science.gov (United States)

    Knightes, C. D.; Bouchard, D.; Zepp, R. G.; Henderson, W. M.; Han, Y.; Hsieh, H. S.; Avant, B. K.; Acrey, B.; Spear, J.

    2017-12-01

    The unique properties of engineered nanomaterials have led to their increased production and potential release into the environment. Currently available environmental fate models developed for traditional contaminants are limited in their ability to simulate nanomaterials' environmental behavior. This is due to an incomplete understanding and representation of the processes governing nanomaterial distribution in the environment and to scarce empirical data quantifying the interaction of nanomaterials with environmental surfaces. The well-known Water Quality Analysis Simulation Program (WASP) was updated to incorporate nanomaterial-specific processes, specifically hetero-aggregation with particulate matter. In parallel with this effort, laboratory studies were used to quantify the parameter values necessary for the governing processes in surface waters. This presentation will discuss the recent developments in the new architecture for WASP8 and the newly constructed Advanced Toxicant Module. The module includes advanced algorithms for increased numbers of state variables: chemicals, solids, dissolved organic matter, pathogens, temperature, and salinity. This presentation will focus specifically on the incorporation of nanomaterials, with applications to the fate and transport of hypothetical releases of Multi-Walled Carbon Nanotubes (MWCNT) and Graphene Oxide (GO) into the headwaters of a southeastern US coastal plains river. While this presentation focuses on nanomaterials, the Advanced Toxicant Module can also simulate metals and organic contaminants.
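    Hetero-aggregation with suspended particulate matter is the key nanomaterial-specific process mentioned above. The abstract does not give WASP8's formulation, so the sketch below uses a common pseudo-first-order simplification (attachment efficiency times a collision rate) just to show how such a process removes freely dispersed particles from the water column; the function name and all parameter values are assumptions.

```python
import numpy as np

def free_enm_concentration(c0, alpha_het, collision_rate_per_s, t_s):
    """Pseudo-first-order hetero-aggregation sketch: freely dispersed ENM attach
    to suspended particulate matter at rate k = alpha_het * collision_rate.
    A simplification, not the actual WASP8 formulation."""
    k = alpha_het * collision_rate_per_s
    return c0 * np.exp(-k * t_s)

t = np.linspace(0.0, 7 * 86400.0, 8)                     # one week, daily steps
print(free_enm_concentration(1.0, 0.01, 1e-4, t))        # assumed efficiency and rate
```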

  6. Novel cobalt releasing sol-gel derived bioactive glass for bone tissue engineering

    International Nuclear Information System (INIS)

    Oliveira, Ana Celeste Ximenes; Barrioni, Breno Rocha; Leite, Maria de Fatima; Pereira, Marivalda Magalhaes

    2016-01-01

    Full text: Bone defects are caused by traumas, congenital disorders or infections, and bone grafts are the usual treatment. However, limitations of this therapy have led to the advancement of tissue engineering approaches. Bioactive glasses (BG) are an attractive bioactive ceramic for bone repair [1], due to their osteogenic properties and capability of releasing different ions, inducing specific biological responses. Tissue repair also depends on blood vessel formation. Among angiogenic agents, the cobalt ion has been regarded as a strategic component to incorporate into ion-releasing materials. In this study, a 5% (molar) cobalt-releasing BG was synthesized by the sol-gel method. To characterize the material, powder samples were evaluated by FTIR and XRD. To assess the cytotoxic effects, MTT and LIVE/DEAD tests were performed on osteoblasts exposed to the ionic product of the material (100 μg/mL) for 72 h. FTIR analysis reveals typical absorption bands of the groups present in BG. The X-ray diffractogram confirmed the amorphous character of the BG, without the occurrence of recrystallization of the cobalt precursor, suggesting that cobalt incorporation was successful. The MTT test showed that cells exposed to the ionic product presented high levels of metabolic activity. The LIVE/DEAD assay evidenced that cell membrane integrity and intracellular esterase activity were preserved. Both cytotoxicity tests showed that the cobalt-BG material generated a cell-friendly environment. This work shows that BG with the cobalt agent presented proper structural features and non-cytotoxic behaviour. Reference: [1] Hench LL, J Mater Sci Mater Med 17(11), 967-78 (2006). (author)

  7. Metabolic engineering of a diazotrophic bacterium improves ammonium release and biofertilization of plants and microalgae.

    Science.gov (United States)

    Ambrosio, Rafael; Ortiz-Marquez, Juan Cesar Federico; Curatti, Leonardo

    2017-03-01

    The biological nitrogen fixation carried out by some Bacteria and Archaea is one of the most attractive alternatives to synthetic nitrogen fertilizers. However, with the exception of the symbiotic rhizobia-legumes system, progress towards a more extensive realization of this goal has been slow. In this study we manipulated the endogenous regulation of both nitrogen fixation and assimilation in the aerobic bacterium Azotobacter vinelandii. Substituting an exogenously inducible promoter for the native promoter of glutamine synthetase produced conditional lethal mutant strains unable to grow diazotrophically in the absence of the inducer. This mutant phenotype could be reverted in a double mutant strain bearing a deletion in the nifL gene that resulted in constitutive expression of nif genes and increased production of ammonium. Under GS non-inducing conditions both the single and the double mutant strains consistently released very high levels of ammonium (>20 mM) into the growth medium. The double mutant strain grew and excreted high levels of ammonium under a wider range of concentrations of the inducer than the single mutant strain. Induced mutant cells could be loaded with glutamine synthetase at different levels, which resulted in different patterns of extracellular ammonium accumulation afterwards. Inoculation of the engineered bacteria into a microalgal culture in the absence of sources of C and N other than N2 and CO2 from the air, resulted in a strong proliferation of microalgae that was suppressed upon addition of the inducer. Both single and double mutant strains also promoted growth of cucumber plants in the absence of added N-fertilizer, while this property was only marginal in the parental strain. This study provides a simple synthetic genetic circuit that might inspire engineering of optimized inoculants that efficiently channel N2 from the air into crops. Copyright © 2017 International Metabolic Engineering Society. Published by Elsevier Inc. All

  8. Fabrication and characterization of a rapid prototyped tissue engineering scaffold with embedded multicomponent matrix for controlled drug release

    Science.gov (United States)

    Chen, Muwan; Le, Dang QS; Hein, San; Li, Pengcheng; Nygaard, Jens V; Kassem, Moustapha; Kjems, Jørgen; Besenbacher, Flemming; Bünger, Cody

    2012-01-01

    Bone tissue engineering implants with sustained local drug delivery provide an opportunity for better postoperative care for bone tumor patients because these implants offer sustained drug release at the tumor site and reduce systemic side effects. A rapid prototyped macroporous polycaprolactone scaffold was embedded with a porous matrix composed of chitosan, nanoclay, and β-tricalcium phosphate by freeze-drying. This composite scaffold was evaluated on its ability to deliver an anthracycline antibiotic and to promote formation of mineralized matrix in vitro. Scanning electronic microscopy, confocal imaging, and DNA quantification confirmed that immortalized human bone marrow-derived mesenchymal stem cells (hMSC-TERT) cultured in the scaffold showed high cell viability and growth, and good cell infiltration to the pores of the scaffold. Alkaline phosphatase activity and osteocalcin staining showed that the scaffold was osteoinductive. The drug-release kinetics was investigated by loading doxorubicin into the scaffold. The scaffolds comprising nanoclay released up to 45% of the drug for up to 2 months, while the scaffold without nanoclay released 95% of the drug within 4 days. Therefore, this scaffold can fulfill the requirements for both bone tissue engineering and local sustained release of an anticancer drug in vitro. These results suggest that the scaffold can be used clinically in reconstructive surgery after bone tumor resection. Moreover, by changing the composition and amount of individual components, the scaffold can find application in other tissue engineering areas that need local sustained release of drug. PMID:22904634
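    The reported release kinetics (roughly 95% within 4 days without nanoclay versus about 45% over two months with nanoclay) can be illustrated with a simple first-order release curve whose rate constant is back-calculated from those two endpoints. This is only an illustration of the numbers quoted above, not the model or fit used by the authors.

```python
import numpy as np

def first_order_release(t_days, k_per_day):
    """Cumulative fraction released under a simple first-order model."""
    return 1.0 - np.exp(-k_per_day * t_days)

# Rate constants chosen so the curves roughly match the reported endpoints:
# ~95% within 4 days without nanoclay, ~45% over ~60 days with nanoclay.
k_fast = -np.log(1 - 0.95) / 4.0      # ~0.75 per day
k_slow = -np.log(1 - 0.45) / 60.0     # ~0.010 per day
for day in (1, 4, 30, 60):
    print(day, round(first_order_release(day, k_fast), 2),
          round(first_order_release(day, k_slow), 2))
```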

  9. Engineered collagen hydrogels for the sustained release of biomolecules and imaging agents: promoting the growth of human gingival cells.

    Science.gov (United States)

    Choi, Jonghoon; Park, Hoyoung; Kim, Taeho; Jeong, Yoon; Oh, Myoung Hwan; Hyeon, Taeghwan; Gilad, Assaf A; Lee, Kwan Hyi

    2014-01-01

    We present here the in vitro release profiles of either fluorescently labeled biomolecules or computed tomography contrast nanoagents from engineered collagen hydrogels under physiological conditions. The collagen constructs were designed as potential biocompatible inserts into wounded human gingiva. The collagen hydrogels were fabricated under a variety of conditions in order to optimize the release profile of biomolecules and nanoparticles for the desired duration and amount. The collagen constructs containing biomolecules/nanoconstructs were incubated under physiological conditions (ie, 37°C and 5% CO2) for 24 hours, and the release profile was tuned from 20% to 70% of initially loaded materials by varying the gelation conditions of the collagen constructs. The amounts of released biomolecules and nanoparticles were quantified respectively by measuring the intensity of fluorescence and X-ray scattering. The collagen hydrogel we fabricated may serve as an efficient platform for the controlled release of biomolecules and imaging agents in human gingiva to facilitate the regeneration of oral tissues.

  10. Dealing with uncertainty in Earthquake Engineering: a discussion on the application of the Theory of Open Dynamical Systems

    OpenAIRE

    Quintana-Gallo, Patricio; Rebolledo, Rolando; Allan, George

    2013-01-01

    Earthquakes, as a natural phenomenon, and their consequences upon structures have been addressed from deterministic, pseudo-empirical and primary statistical-probabilistic points of view. In the latter approach, 'primary' is meant to suggest that randomness has been artificially introduced into the variables of investigation. An alternative view has been advanced by a number of researchers that have classified earthquakes as chaotic from an ontological perspective. Their arguments are founded ...

  11. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti, quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  12. Intermediate temperature heat release in an HCCI engine fueled by ethanol/n-heptane mixtures: An experimental and modeling study

    KAUST Repository

    Vuilleumier, David

    2014-03-01

    This study examines intermediate temperature heat release (ITHR) in homogeneous charge compression ignition (HCCI) engines using blends of ethanol and n-heptane. Experiments were performed over the range of 0-50% n-heptane liquid volume fractions, at equivalence ratios 0.4 and 0.5, and intake pressures from 1.4bar to 2.2bar. ITHR was induced in the mixtures containing predominantly ethanol through the addition of small amounts of n-heptane. After a critical threshold, additional n-heptane content yielded low temperature heat release (LTHR). A method for quantifying the amount of heat released during ITHR was developed by examining the second derivative of heat release, and this method was then used to identify trends in the engine data. The combustion process inside the engine was modeled using a single-zone HCCI model, and good qualitative agreement of pre-ignition pressure rise and heat release rate was found between experimental and modeling results using a detailed n-heptane/ethanol chemical kinetic model. The simulation results were used to identify the dominant reaction pathways contributing to ITHR, as well as to verify the chemical basis behind the quantification of the amount of ITHR in the experimental analysis. The dominant reaction pathways contributing to ITHR were found to be H-atom abstraction from n-heptane by OH and the addition of fuel radicals to O2. © 2013 The Combustion Institute.
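    The paper quantifies ITHR by examining the second derivative of the heat release; the exact criterion is not spelled out in the abstract. The sketch below is one plausible reading: integrate the heat-release rate from the first sustained positive value up to the crank angle where its second derivative crosses a threshold marking the onset of the main high-temperature heat release. The threshold choice and the windowing logic are assumptions, not necessarily the authors' implementation.

```python
import numpy as np

def ithr_heat(theta_deg, hrr_j_per_deg, d2_threshold):
    """Rough ITHR quantification sketch: integrate HRR from the first positive
    value up to the crank angle where the second derivative of HRR first exceeds
    d2_threshold (taken as the onset of the main heat release). The threshold
    must be tuned to the data; if it is never exceeded the result is zero."""
    d2 = np.gradient(np.gradient(hrr_j_per_deg, theta_deg), theta_deg)
    start = np.argmax(hrr_j_per_deg > 0.0)
    main = start + np.argmax(d2[start:] > d2_threshold)
    return np.trapz(hrr_j_per_deg[start:main], theta_deg[start:main])
```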

  13. Intermediate temperature heat release in an HCCI engine fueled by ethanol/n-heptane mixtures: An experimental and modeling study

    KAUST Repository

    Vuilleumier, David; Kozarac, Darko; Mehl, Marco; Saxena, Samveg; Pitz, William J.; Dibble, Robert W.; Chen, Jyhyuan; Sarathy, Mani

    2014-01-01

    This study examines intermediate temperature heat release (ITHR) in homogeneous charge compression ignition (HCCI) engines using blends of ethanol and n-heptane. Experiments were performed over the range of 0-50% n-heptane liquid volume fractions, at equivalence ratios 0.4 and 0.5, and intake pressures from 1.4bar to 2.2bar. ITHR was induced in the mixtures containing predominantly ethanol through the addition of small amounts of n-heptane. After a critical threshold, additional n-heptane content yielded low temperature heat release (LTHR). A method for quantifying the amount of heat released during ITHR was developed by examining the second derivative of heat release, and this method was then used to identify trends in the engine data. The combustion process inside the engine was modeled using a single-zone HCCI model, and good qualitative agreement of pre-ignition pressure rise and heat release rate was found between experimental and modeling results using a detailed n-heptane/ethanol chemical kinetic model. The simulation results were used to identify the dominant reaction pathways contributing to ITHR, as well as to verify the chemical basis behind the quantification of the amount of ITHR in the experimental analysis. The dominant reaction pathways contributing to ITHR were found to be H-atom abstraction from n-heptane by OH and the addition of fuel radicals to O2. © 2013 The Combustion Institute.

  14. Concrete release protocol case studies for decommissioning work at the Idaho National Engineering and Environmental Laboratory

    International Nuclear Information System (INIS)

    Kamboj, S.; Arnish, J.; Chen, S-Y; Parker, F. L.; Phillips, A. M.; Tripp, J. L.; Meservey, R. H.

    2000-01-01

    The US Department of Energy (DOE) Order 5400.5, ''Radiation Protection of the Public and Environment'' contains provisions pertinent to releasing potentially radioactive materials from DOE facilities for reuse or recycle. A process of authorized release for materials recovered from radiation areas is permitted under Order 5400.5 and the proposed rule in Title 10, Part 834, of the Code of Federal Regulations (10 CFR Part 834). A generic disposition protocol to facilitate release of concrete under these provisions has been developed. This report analyzes the application of that generic protocol to site-specific cases at the Idaho National Engineering and Environmental Laboratory (INEEL). The potential radiological doses and costs for several concrete disposition alternatives for the sewage treatment plant (STP) at the Central Facilities Area (CFA) of INEEL were evaluated in this analysis. Five disposition alternatives were analyzed for the concrete: (A) decontaminate, crush, and reuse; (B) crush and reuse without decontamination; (C) decontaminate, demolish, and dispose of at a nonradiological landfill; (D) demolish and dispose of at a nonradiological landfill without decontamination; and (E) demolish and dispose of at a low-level radioactive waste (LLW) facility. The analysis was performed for disposition of concrete from four INEEL structures: (1) trickle filter, (2) primary clarifier, (3) secondary clarifier, and (4) CFA-691 pumphouse for a generic case (based on default parameters from the disposition protocol) and an INEEL-specific case (based on INEEL-specific parameters). The results of the analysis indicated that Alternatives B and D would incur the lowest cost and result in a dose less than 1 mrem/yr (except for the trickle filter, the dose for which was estimated at 1.9 mrem/yr) for nonradiological workers. The analysis indicated that the main contributor to the radiological dose would be cobalt-60 contamination in the concrete. A characterization conducted

  15. Modelling the effect of injection pressure on heat release parameters and nitrogen oxides in direct injection diesel engines

    Directory of Open Access Journals (Sweden)

    Yüksek Levent

    2014-01-01

    Full Text Available Investigating and modelling the effect of injection pressure on heat release parameters and engine-out nitrogen oxides is the main aim of this study. A zero-dimensional, multi-zone cylinder model was developed for estimating the effect of an injection pressure rise on the performance parameters of a diesel engine. A double-Wiebe global rate-of-heat-release model was used to describe fuel combustion. The extended Zeldovich mechanism and a partial equilibrium approach were used for modelling the formation of nitrogen oxides. A single-cylinder, high-pressure, direct-injection, electronically controlled research engine bench was used for model calibration. Fuel injection pressures of 1000 and 1200 bar were investigated while the injection advance, injected fuel quantity and engine speed were kept constant. The ignition delay of the injected fuel was reduced by 0.4 crank angle degrees at 1200 bar injection pressure, and a similar effect was observed in the premixed combustion phase duration, which was reduced by 0.2 crank angle degrees. The rate of heat release in the premixed combustion phase increased by 1.75% with 1200 bar injection pressure. The multi-zone cylinder model showed good agreement with the experimental in-cylinder pressure data. It was also seen that the NOx formation model predicted the engine-out NOx emissions well for both operating modes.
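    The double-Wiebe model referenced above superimposes a premixed and a diffusion burn fraction, each following the standard Wiebe functional form. A generic implementation of that form is sketched below; the efficiency and shape parameters and the example phasing and duration values are typical textbook choices, not the calibrated values from this study, and the NOx submodel is not included.

```python
import numpy as np

def wiebe_burn_fraction(theta, theta0, duration, a=6.9, m=2.0):
    """Single Wiebe mass-fraction-burned curve; zero before start of combustion."""
    x = np.clip((theta - theta0) / duration, 0.0, None)
    return 1.0 - np.exp(-a * x ** (m + 1.0))

def double_wiebe_hrr(theta, q_total_j, premixed_frac,
                     theta0_p, dur_p, theta0_d, dur_d, m_p=2.0, m_d=1.0):
    """Double-Wiebe rate of heat release [J/deg]: premixed plus diffusion term."""
    xb = (premixed_frac * wiebe_burn_fraction(theta, theta0_p, dur_p, m=m_p)
          + (1.0 - premixed_frac) * wiebe_burn_fraction(theta, theta0_d, dur_d, m=m_d))
    return q_total_j * np.gradient(xb, theta)

theta = np.linspace(-20.0, 60.0, 801)                    # crank angle, deg
hrr = double_wiebe_hrr(theta, 1500.0, 0.3, -2.0, 8.0, 2.0, 40.0)  # illustrative values
```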

  16. Fabrication and characterization of a rapid prototyped tissue engineering scaffold with embedded multicomponent matrix for controlled drug release

    Directory of Open Access Journals (Sweden)

    Chen M

    2012-08-01

    Full Text Available Muwan Chen,1,2 Dang QS Le,1,2 San Hein,2 Pengcheng Li,1 Jens V Nygaard,2 Moustapha Kassem,3 Jørgen Kjems,2 Flemming Besenbacher,2 Cody Bünger1 1Orthopaedic Research Lab, Aarhus University Hospital, Aarhus C, Denmark; 2Interdisciplinary Nanoscience Center (iNANO), Aarhus University, Aarhus C, Denmark; 3Department of Endocrinology and Metabolism, Odense University Hospital, Odense C, Denmark. Abstract: Bone tissue engineering implants with sustained local drug delivery provide an opportunity for better postoperative care for bone tumor patients because these implants offer sustained drug release at the tumor site and reduce systemic side effects. A rapid prototyped macroporous polycaprolactone scaffold was embedded with a porous matrix composed of chitosan, nanoclay, and β-tricalcium phosphate by freeze-drying. This composite scaffold was evaluated on its ability to deliver an anthracycline antibiotic and to promote formation of mineralized matrix in vitro. Scanning electronic microscopy, confocal imaging, and DNA quantification confirmed that immortalized human bone marrow-derived mesenchymal stem cells (hMSC-TERT) cultured in the scaffold showed high cell viability and growth, and good cell infiltration to the pores of the scaffold. Alkaline phosphatase activity and osteocalcin staining showed that the scaffold was osteoinductive. The drug-release kinetics was investigated by loading doxorubicin into the scaffold. The scaffolds comprising nanoclay released up to 45% of the drug for up to 2 months, while the scaffold without nanoclay released 95% of the drug within 4 days. Therefore, this scaffold can fulfill the requirements for both bone tissue engineering and local sustained release of an anticancer drug in vitro. These results suggest that the scaffold can be used clinically in reconstructive surgery after bone tumor resection. Moreover, by changing the composition and amount of individual components, the scaffold can find application in other

  17. Earthquake Engineering Support

    Science.gov (United States)

    1999-11-01

    The indexed excerpt consists of fragments of a centrifuge test matrix (Nevada sand specimens in the ESB #2 container, prepared loose at roughly 49-54% relative density and dense at roughly 79-80%, tested in 1998) rather than a continuous abstract. The pore pressure transducers used in the experiments were manufactured by Druck, and are widely used in centrifuge modelling.

  18. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  19. Utilisation of VOC in Diesel Engines. Ignition and combustion of VOC released in crude oil tankers

    International Nuclear Information System (INIS)

    Melhus, Oeyvin

    2002-01-01

    the radiation of visible light from the diffusion combustion of diesel oil and VOC Fuel (i.e. propane, iso-butane and n-butane) are quite different. First, this radiation disturbs the Schlieren image and second, the radiation from the combustion of diesel oil is far more intense than that of the VOC Fuel. The light VOC fraction of the vent gas - methane and ethane - is not utilised in the concept of ''Condensate Diesel Process''. This fraction represents about 15 % of the total energy in the VOC release when loading crude oil at the Statfjord field. At other fields as Gullfaks, this fraction can represent up to 50% or more of the total energy. After the VOC Fuel is produced, a residual VOC consisting of methane, ethane, some propane and inert gas is lost. A useful and simple way of utilising even this fraction is to mix it with the charge air at low pressure and feed the mixture into the cylinder where a pilot fuel spray ignites the charge. The method is found to have potential of being a suitable way, at least theoretically, to utilise the light VOC fraction. Some practical difficulties, however, may restrict the use of this fraction to medium and high engine loads. At lower loads the ignition delay increases due to the dilution with great quantities of inert gas. Another option to utilise the light VOC fraction is by capturing the gas in hydrates. No real study of this concept has been carried out, but an initial survey of possible solutions is described. A final conclusion of the potential of this concept cannot be drawn until more detailed work has been carried out. However, simply using the light VOC fraction extracted by melting the hydrate will be the most likely way. As a main conclusion it can be stated that the use of VOC Fuel in a ''Condensate Diesel Process'' is a feasible way of utilising energy otherwise lost

  20. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments
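    The magnitude side of this distinction is tied to radiated seismic energy through the Gutenberg-Richter energy relation, log10 E[J] = 4.8 + 1.5 M, so each whole-magnitude step corresponds to roughly a 32-fold increase in released energy. A small worked example of that relation:

```python
def radiated_energy_joules(magnitude):
    """Gutenberg-Richter energy relation: log10(E in joules) = 4.8 + 1.5 * M.
    Illustrates why magnitude (energy at the source) and intensity (observed
    effects at a site) are not interchangeable measures."""
    return 10 ** (4.8 + 1.5 * magnitude)

for m in (5.0, 6.0, 7.0):
    print(m, f"{radiated_energy_joules(m):.2e} J")
# each whole-magnitude step releases about 10**1.5, i.e. roughly 32 times more energy
```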

  1. Size-fractionated characterization and quantification of nanoparticle release rates from a consumer spray product containing engineered nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Hagendorfer, Harald, E-mail: Harald.Hagendorfer@empa.ch [EMPA, Swiss Federal Laboratories for Materials Testing and Research (Switzerland); Lorenz, Christiane, E-mail: Christiane.Lorenz@chem.ethz.ch [ETHZ, Swiss Federal Institute of Technology Zurich (Switzerland); Kaegi, Ralf, E-mail: Ralf.Kaegi@eawag.ch; Sinnet, Brian, E-mail: Brian.Sinnet@eawag.ch [EAWAG, Swiss Federal Institute of Aquatic Science and Technology (Switzerland); Gehrig, Robert, E-mail: Robert.Gehrig@empa.ch [EMPA, Swiss Federal Laboratories for Materials Testing and Research (Switzerland); Goetz, Natalie V., E-mail: Natalie.vonGoetz@chem.ethz.ch; Scheringer, Martin, E-mail: Martin.Scheringer@chem.ethz.ch [ETHZ, Swiss Federal Institute of Technology Zurich (Switzerland); Ludwig, Christian, E-mail: Christian.Ludwig@psi.ch [PSI, Paul Scherrer Institute (Switzerland); Ulrich, Andrea, E-mail: Andrea.Ulrich@empa.ch [EMPA, Swiss Federal Laboratories for Materials Testing and Research (Switzerland)

    2010-09-15

    This study describes methods developed for reliable quantification of size- and element-specific release of engineered nanoparticles (ENP) from consumer spray products. A modified glove box setup was designed to allow controlled spray experiments in a particle-minimized environment. Time dependence of the particle size distribution in a size range of 10-500 nm and ENP release rates were studied using a scanning mobility particle sizer (SMPS). In parallel, the aerosol was transferred to a size-calibrated electrostatic TEM sampler. The deposited particles were investigated using electron microscopy techniques in combination with image processing software. This approach enables the chemical and morphological characterization as well as quantification of released nanoparticles from a spray product. The differentiation of solid ENP from the released nano-sized droplets was achieved by applying a thermo-desorbing unit. After optimization, the setup was applied to investigate different spray situations using both pump and gas propellant spray dispensers for a commercially available water-based nano-silver spray. The pump spray situation showed no measurable nanoparticle release, whereas in the case of the gas spray, a significant release was observed. From the results it can be assumed that the homogeneously distributed ENP from the original dispersion grow in size and change morphology during and after the spray process but still exist as nanometer particles of size <100 nm. Furthermore, it seems that the release of ENP correlates with the generated aerosol droplet size distribution produced by the spray vessel type used. This is the first study presenting results concerning the release of ENP from spray products.
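
    For readers unfamiliar with SMPS data reduction, the following sketch shows one common way to turn a measured size distribution (dN/dlogDp per bin) into a total number concentration and a crude release rate. It is a generic illustration under simplifying assumptions (well-mixed chamber, wall losses and dilution neglected); the bin edges, chamber volume and function names are hypothetical and not taken from the study.

```python
import numpy as np

def total_number_concentration(dN_dlogDp, bin_edges_nm):
    """Integrate an SMPS size distribution (dN/dlogDp in #/cm^3) over the
    logarithmic bin widths to get the total number concentration (#/cm^3)."""
    dlogDp = np.diff(np.log10(np.asarray(bin_edges_nm, float)))
    return float(np.sum(np.asarray(dN_dlogDp, float) * dlogDp))

def particle_release_rate(conc_before, conc_after, chamber_volume_cm3, dt_s):
    """Crude release rate (particles/s) from the change in number concentration
    of a well-mixed chamber, ignoring wall losses and dilution air."""
    return (conc_after - conc_before) * chamber_volume_cm3 / dt_s

# Hypothetical example: three bins spanning 10-500 nm, one scan before and one after spraying.
edges = [10.0, 50.0, 150.0, 500.0]
before = total_number_concentration([200.0, 150.0, 50.0], edges)
after = total_number_concentration([2500.0, 1800.0, 400.0], edges)
print(particle_release_rate(before, after, chamber_volume_cm3=0.5e6, dt_s=120.0))
```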

  2. Size-fractionated characterization and quantification of nanoparticle release rates from a consumer spray product containing engineered nanoparticles

    International Nuclear Information System (INIS)

    Hagendorfer, Harald; Lorenz, Christiane; Kaegi, Ralf; Sinnet, Brian; Gehrig, Robert; Goetz, Natalie V.; Scheringer, Martin; Ludwig, Christian; Ulrich, Andrea

    2010-01-01

    This study describes methods developed for reliable quantification of size- and element-specific release of engineered nanoparticles (ENP) from consumer spray products. A modified glove box setup was designed to allow controlled spray experiments in a particle-minimized environment. Time dependence of the particle size distribution in a size range of 10-500 nm and ENP release rates were studied using a scanning mobility particle sizer (SMPS). In parallel, the aerosol was transferred to a size-calibrated electrostatic TEM sampler. The deposited particles were investigated using electron microscopy techniques in combination with image processing software. This approach enables the chemical and morphological characterization as well as quantification of released nanoparticles from a spray product. The differentiation of solid ENP from the released nano-sized droplets was achieved by applying a thermo-desorbing unit. After optimization, the setup was applied to investigate different spray situations using both pump and gas propellant spray dispensers for a commercially available water-based nano-silver spray. The pump spray situation showed no measurable nanoparticle release, whereas in the case of the gas spray, a significant release was observed. From the results it can be assumed that the homogeneously distributed ENP from the original dispersion grow in size and change morphology during and after the spray process but still exist as nanometer particles of size <100 nm. Furthermore, it seems that the release of ENP correlates with the generated aerosol droplet size distribution produced by the spray vessel type used. This is the first study presenting results concerning the release of ENP from spray products.

  3. Estimated airborne release of radionuclides from the Battelle Memorial Institute Columbus Laboratories JN-1b building at the West Jefferson site as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.

    1981-11-01

    The potential airborne releases of radionuclides (source terms) that could result from wind and earthquake damage are estimated for the Battelle Memorial Institute Columbus Laboratories JN-1b Building at the West Jefferson site in Ohio. The estimated source terms are based on the damage to barriers containing the radionuclides, the inventory of radionuclides at risk, and the fraction of the inventory made airborne as a result of the loss of containment. In an attempt to provide a realistic range of potential source terms that include most of the normal operating conditions, a best estimate bounded by upper and lower limits is calculated by combining the upper-bound, best-estimate, and lower-bound inventories-at-risk with an airborne release factor (upper-bound, best-estimate, and lower-bound if possible) for the situation. The factors used to evaluate the fractional airborne release of materials and the exchange rates between enclosed and exterior atmospheres are discussed. The postulated damage and source terms are discussed for wind and earthquake hazard scenarios in order of their increasing severity.

  4. Cylinder pressure, performance parameters, heat release, specific heats ratio and duration of combustion for spark ignition engine

    International Nuclear Information System (INIS)

    Shehata, M.S.

    2010-01-01

    Experimental work was conducted to investigate cylinder pressure, performance parameters, heat release, specific heat ratio and duration of combustion for a multi-cylinder spark ignition engine (SIE). Cylinder pressure was measured for gasoline, kerosene and Liquefied Petroleum Gas (LPG), each used separately as the fuel for the SIE. Fast Fourier Transformation (FFT) was used to transform the cylinder pressure data from the time domain into the frequency domain and to develop an empirical correlation for calculating cylinder pressures at different engine speeds and with different fuels. In addition, Inverse Fast Fourier Transformation (IFFT) was used to reconstruct the cylinder pressure in the time domain. The results showed good agreement between the measured and the reconstructed cylinder pressure in the time domain at different engine speeds and with different fuels. The measured cylinder pressure and a hydraulic dynamometer were the sources of data for calculating engine performance parameters. The first law of thermodynamics and a single-zone heat release model with temperature-dependent specific heat ratio γ(T) were the main tools for calculating heat release and heat transfer to the cylinder walls. A third-order empirical correlation for calculating γ(T) was one of the main contributions of the present study; the correlation agrees well with other researchers' results over a wide temperature range. For kerosene, cylinder pressure is higher than for gasoline and LPG because of high volumetric efficiency, since kerosene density (mass/volume ratio) is higher than that of gasoline and LPG. In addition, the kerosene heating value is higher than that of gasoline, which contributes to the heat release rate and pressure increases. Duration of combustion at different engine speeds was determined using four different methods: (I) mass of fuel burnt, (II) entropy change, (III) temperature-dependent specific heat ratio γ(T), and (IV) logarithmic scale of (P and V). The duration of combustion for kerosene is smaller than for gasoline and LPG due to high
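
    The two computational steps described in the abstract, FFT-based handling of the pressure trace and a single-zone heat release analysis with a temperature-dependent γ(T), can be sketched as follows. This is a generic illustration and not the author's code: the γ(T) polynomial coefficients are placeholders rather than the published third-order correlation, and the low-pass FFT reconstruction stands in for the empirical frequency-domain correlation developed in the paper.

```python
import numpy as np

def gamma_of_T(T):
    """Temperature-dependent specific heat ratio: a third-order polynomial form,
    as in the study, but with placeholder coefficients (not the published fit)."""
    a0, a1, a2, a3 = 1.40, -6.0e-5, 1.0e-8, -1.0e-12
    T = np.asarray(T, float)
    return a0 + a1 * T + a2 * T**2 + a3 * T**3

def apparent_heat_release_rate(p, V, T, dtheta):
    """Single-zone (first-law) apparent heat release rate per crank-angle step:
    dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta."""
    p = np.asarray(p, float)
    V = np.asarray(V, float)
    g = gamma_of_T(T)
    dV = np.gradient(V, dtheta)
    dp = np.gradient(p, dtheta)
    return g / (g - 1.0) * p * dV + 1.0 / (g - 1.0) * V * dp

def reconstruct_pressure(p, keep=32):
    """Reconstruct a cylinder pressure trace from its lowest `keep` harmonics:
    forward FFT, truncate the spectrum, inverse FFT back to the crank-angle domain."""
    P = np.fft.rfft(np.asarray(p, float))
    P[keep:] = 0.0
    return np.fft.irfft(P, n=len(p))
```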

  5. Cylinder pressure, performance parameters, heat release, specific heats ratio and duration of combustion for spark ignition engine

    Energy Technology Data Exchange (ETDEWEB)

    Shehata, M.S. [Mechanical Engineering Technology Department, Higher Institute of Technology, Banha University, 4Zagalol Street, Benha, Galubia 1235 Z (Egypt)

    2010-12-15

    Experimental work was conducted to investigate cylinder pressure, performance parameters, heat release, specific heat ratio and duration of combustion for a multi-cylinder spark ignition engine (SIE). Cylinder pressure was measured for gasoline, kerosene and Liquefied Petroleum Gas (LPG), each used separately as the fuel for the SIE. Fast Fourier Transformation (FFT) was used to transform the cylinder pressure data from the time domain into the frequency domain and to develop an empirical correlation for calculating cylinder pressures at different engine speeds and with different fuels. In addition, Inverse Fast Fourier Transformation (IFFT) was used to reconstruct the cylinder pressure in the time domain. The results showed good agreement between the measured and the reconstructed cylinder pressure in the time domain at different engine speeds and with different fuels. The measured cylinder pressure and a hydraulic dynamometer were the source of data for calculating engine performance parameters. The first law of thermodynamics and a single-zone heat release model with temperature-dependent specific heat ratio γ(T) were the main tools for calculating heat release and heat transfer to the cylinder walls. A third-order empirical correlation for calculating γ(T) was one of the main contributions of the present study; the correlation agrees well with other researchers' results over a wide temperature range. For kerosene, cylinder pressure is higher than for gasoline and LPG because of high volumetric efficiency, since kerosene density (mass/volume ratio) is higher than that of gasoline and LPG. In addition, the kerosene heating value is higher than that of gasoline, which contributes to the heat release rate and pressure increases. Duration of combustion at different engine speeds was determined using four different methods: (I) mass of fuel burnt, (II) entropy change, (III) temperature-dependent specific heat ratio γ(T), and (IV) logarithmic scale of (P and V). The duration of combustion for kerosene is smaller than for gasoline and

  6. Modified n-HA/PA66 scaffolds with chitosan coating for bone tissue engineering: cell stimulation and drug release.

    Science.gov (United States)

    Zou, Qin; Li, Junfeng; Niu, Lulu; Zuo, Yi; Li, Jidong; Li, Yubao

    2017-09-01

    A dipping-drying procedure and a cross-linking method were used to form a drug-loaded chitosan (CS) coating on a nano-hydroxyapatite/polyamide66 (nHA/PA66) composite porous scaffold, endowing the scaffold with controlled drug-release functionality. The prefabricated scaffold was immersed in an aqueous drug/CS solution under vacuum and then crosslinked with vanillin. The structure, porosity, composition, compressive strength, swelling ratio, drug release and cytocompatibility of the pristine and coated scaffolds were investigated. After coating, the scaffold porosity and pore interconnection were slightly decreased. Cytocompatibility was assessed in vitro through cell attachment and the MTT assay with MG63 cells, which revealed positive cell viability and increasing proliferation over the 11-day culture period. The drug was effectively released from the coated scaffold in a controlled fashion; the release rate was sustained over a long period and was highly dependent on coating swelling, indicating the possibility of controlled drug release. Our results demonstrate that a drug-loaded crosslinked CS coating is a simple technique for rendering the surfaces of synthetic scaffolds active, enabling them to serve as a promising high-performance biomaterial in bone tissue engineering.

  7. Controllable mineral coatings on scaffolds as carriers for growth factor release for bone tissue engineering

    Science.gov (United States)

    Saurez-Gonzalez, Darilis

    The work presented in this document focused on the development and characterization of mineral coatings on scaffold materials to serve as templates for growth factor binding and release. Mineral coatings were formed using a biomimetic approach that consisted of incubating scaffolds in modified simulated body fluids (mSBF). To modulate the properties of the mineral coating, which we hypothesized would dictate growth factor release, we used carbonate (HCO3) concentrations in mSBF of 4.2 mM, 25 mM, and 100 mM. Scanning electron microscopy of the mineral coatings indicated growth of a continuous layer of mineral with different morphologies. X-ray diffraction analysis showed peaks associated with hydroxyapatite. FTIR data confirmed the substitution of HCO3 in the mineral. As the extent of HCO3 substitution increased, the coating exhibited more rapid dissolution kinetics in an environment deficient in calcium and phosphate. The mineral coatings provided an effective mechanism for bioactive growth factor binding and release. Peptide versions of vascular endothelial growth factor (VEGF) and bone morphogenetic protein 2 (BMP2) were bound with efficiencies up to 90% to mineral-coated PCL scaffolds. Recombinant human vascular endothelial growth factor (rhVEGF) also bound to mineral-coated scaffolds, with lower efficiency (20%), and was released with faster kinetics than the peptide growth factors. Released rhVEGF induced human umbilical vein endothelial cell (HUVEC) proliferation in vitro and enhanced blood vessel formation in vivo in an intramuscular sheep model. In addition to using the mineral coatings for single growth factor release, we expanded the concept and bound both an angiogenic (rhVEGF) and an osteogenic (mBMP2) growth factor by a simple double-dipping process. Sustained release of both growth factors was demonstrated for over 60 days. Released rhVEGF enhanced blood vessel formation in vivo in sheep and its biological activity was

  8. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the former Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly, along with the most important studies, and, last but not least, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented as an example of a recently carried out case study. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  9. 33 CFR 222.4 - Reporting earthquake effects.

    Science.gov (United States)

    2010-07-01

    ... 33 Navigation and Navigable Waters (2010-07-01): Reporting earthquake effects. 222..., DEPARTMENT OF DEFENSE, ENGINEERING AND DESIGN, § 222.4 Reporting earthquake effects. (a) Purpose. This... significant earthquakes. It primarily concerns damage surveys following the occurrences of earthquakes. (b...

  10. An Fc engineering approach that modulates antibody-dependent cytokine release without altering cell-killing functions.

    Science.gov (United States)

    Kinder, Michelle; Greenplate, Allison R; Strohl, William R; Jordan, Robert E; Brezski, Randall J

    2015-01-01

    Cytotoxic therapeutic monoclonal antibodies (mAbs) often mediate target cell-killing by eliciting immune effector functions via Fc region interactions with cellular and humoral components of the immune system. Key functions include antibody-dependent cell-mediated cytotoxicity (ADCC), antibody-dependent cellular phagocytosis (ADCP), and complement-dependent cytotoxicity (CDC). However, there has been increased appreciation that along with cell-killing functions, the induction of antibody-dependent cytokine release (ADCR) can also influence disease microenvironments and therapeutic outcomes. Historically, most Fc engineering approaches have been aimed toward modulating ADCC, ADCP, or CDC. In the present study, we describe an Fc engineering approach that, while not resulting in impaired ADCC or ADCP, profoundly affects ADCR. As such, when peripheral blood mononuclear cells are used as effector cells against mAb-opsonized tumor cells, the described mAb variants elicit a similar profile and quantity of cytokines as IgG1. In contrast, although the variants elicit similar levels of tumor cell-killing as IgG1 with macrophage effector cells, the variants do not elicit macrophage-mediated ADCR against mAb-opsonized tumor cells. This study demonstrates that Fc engineering approaches can be employed to uncouple macrophage-mediated phagocytic and subsequent cell-killing functions from cytokine release.

  11. Characterization of nanoparticles released during construction of photocatalytic pavements using engineered nanoparticles

    International Nuclear Information System (INIS)

    Dylla, Heather; Hassan, Marwa M.

    2012-01-01

    With the increasing use of titanium dioxide (TiO2) nanoparticles in self-cleaning materials such as photocatalytic concrete pavements, the release of nanoparticles into the environment is inevitable. Nanoparticle concentration, particle size, surface area, elemental composition, and surface morphology are pertinent to determine the associated risks. In this study, the potential of exposure to synthetic nanoparticles released during construction activities for application of photocatalytic pavements was measured during laboratory-simulated construction activities of photocatalytic mortar overlays and in an actual field application of photocatalytic spray coat. A scanning mobility particle sizer system measured the size distribution of nanoparticles released during laboratory and field activities. Since incidental nanoparticles are released during construction activities, nanoparticle emissions were compared to those from similar activities without nano-TiO2. Nanoparticle counts and size distribution suggest that synthetic nanoparticles are released during application of photocatalytic pavements. In order to identify the nanoparticle source, nanoparticles were also collected for offline characterization using transmission electron microscopy. However, positive identification of synthetic nanoparticles was not possible due to difficulties in obtaining high-resolution images. As a result, further research is recommended to identify nanoparticle composition and sources.

  12. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    Science.gov (United States)

    Yue, Z.

    2013-12-01

    A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds and over distances of ten to hundreds of kilometers in the Earth's crust and on the ground surface. It also generates seismic waves that can be recorded globally, and co-seismic ground damage such as co-seismic ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and in the ground soils cannot be known or predicted a few weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot at present predict earthquakes. Therefore, damaging earthquakes have caused, and will continue to cause, huge disasters, fatalities and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes in addition to the conventional explanation of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean and caused many casualties and devastating damage to the environment. The author will give a brief review of the impacts of the mega-earthquakes that happened in recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrences of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, the 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, as well as the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch Earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena, and will use these phenomena and their common cause to show that the earthquakes were caused by the rapid migration and expansion of highly compressed and

  13. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
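
    As a rough illustration of the approach (not the authors' energy-release model, which follows Adler-Lomnitz and Lomnitz), the sketch below assumes individual magnitudes follow a simple exponential (Gutenberg-Richter) law, takes the N-th power of that CDF to obtain the distribution of the maximum magnitude, and fits the distribution parameter to observed annual maxima by least squares using a plain grid search; all inputs and parameter ranges are hypothetical.

```python
import numpy as np

def gr_cdf(m, beta, m_min):
    """CDF of individual magnitudes under an exponential (Gutenberg-Richter)
    law: F(m) = 1 - exp(-beta*(m - m_min)) for m >= m_min."""
    return 1.0 - np.exp(-beta * (np.asarray(m, float) - m_min))

def max_magnitude_cdf(m, beta, m_min, n_events):
    """CDF of the maximum magnitude among n_events independent shocks."""
    return gr_cdf(m, beta, m_min) ** n_events

def fit_beta(annual_maxima, m_min, n_events, betas=np.linspace(0.5, 4.0, 351)):
    """Least-squares fit of beta: match the model CDF of the maximum to the
    empirical (plotting-position) CDF of the observed annual maxima."""
    x = np.sort(np.asarray(annual_maxima, float))
    F_emp = np.arange(1, x.size + 1) / (x.size + 1.0)
    sse = [np.sum((max_magnitude_cdf(x, b, m_min, n_events) - F_emp) ** 2) for b in betas]
    return float(betas[int(np.argmin(sse))])

# Hypothetical usage: 20 annual maxima, assuming ~100 shallow events per year above M5.
print(fit_beta([6.1, 6.8, 7.2, 6.5, 7.9, 6.3, 6.9, 7.4, 6.0, 7.1,
                6.6, 7.7, 6.2, 6.4, 7.0, 6.7, 7.3, 6.1, 6.8, 7.5],
               m_min=5.0, n_events=100))
```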

  14. Estimated airborne release of plutonium from Atomics International's Nuclear Materials Development Facility in the Santa Susana site, California, as a result of postulated damage from severe wind and earthquake hazard

    International Nuclear Information System (INIS)

    Mishima, J.; Ayer, J.E.

    1981-09-01

    The potential mass of airborne releases of plutonium (source term) that could result from wind and seismic damage is estimated for the Atomics International Company's Nuclear Materials Development Facility (NMDF) at the Santa Susana site in California. The postulated source terms will be useful as the basis for estimating the potential dose to the maximum exposed individual by inhalation and to the total population living within a prescribed radius of the site. The respirable fraction of airborne particles is thus the principal concern. The estimated source terms are based on the damage ratio, and the potential airborne releases if all enclosures suffer particular levels of damage. In an attempt to provide a realistic range of potential source terms that include most of the normal processing conditions, a best estimate bounded by upper and lower limits is provided. The range of source terms is calculated by combining a high best estimate and a low damage ratio, based on a fraction of enclosures suffering crush or perforation, with the airborne release from enclosures based upon an upper limit, average, and lower limit inventory of dispersible materials at risk. Two throughput levels are considered. The factors used to evaluate the fractional airborne release of materials and the exchange rates between enclosed and exterior atmospheres are discussed. The postulated damage and source terms are discussed for wind and earthquake hazard scenarios in order of their increasing severity

  15. Tissue-engineered matrices as functional delivery systems: adsorption and release of bioactive proteins from degradable composite scaffolds.

    Science.gov (United States)

    Cushnie, Emily K; Khan, Yusuf M; Laurencin, Cato T

    2010-08-01

    A tissue-engineered bone graft should imitate the ideal autograft in both form and function. However, biomaterials that have appropriate chemical and mechanical properties for grafting applications often lack biological components that may enhance regeneration. The concept of adding proteins such as growth factors to scaffolds has therefore emerged as a possible solution to improve overall graft design. In this study, we investigated this concept by loading porous hydroxyapatite-poly(lactide-co-glycolide) (HA-PLAGA) scaffolds with a model protein, cytochrome c, and then studying its release in a phosphate-buffered saline solution. The HA-PLAGA scaffold has previously been shown to be bioactive, osteoconductive, and to have appropriate physical properties for tissue engineering applications. The loading experiments demonstrated that the HA-PLAGA scaffold could also function effectively as a substrate for protein adsorption and release. Scaffold protein adsorptive loading (as opposed to physical entrapment within the matrix) was directly related to levels of scaffold HA-content. The HA phase of the scaffold facilitated protein retention in the matrix following incubation in aqueous buffer for periods up to 8 weeks. Greater levels of protein retention time may improve the protein's effective activity by increasing the probability for protein-cell interactions. The ability to control protein loading and delivery simply via composition of the HA-PLAGA scaffold offers the potential of forming robust functionalized bone grafts. (c) 2010 Wiley Periodicals, Inc.

  16. Effect of Engineered Nanoparticles on Exopolymeric Substances Release from Marine Phytoplankton

    OpenAIRE

    Chiu, Meng-Hsuen; Khan, Zafir A.; Garcia, Santiago G.; Le, Andre D.; Kagiri, Agnes; Ramos, Javier; Tsai, Shih-Ming; Drobenaire, Hunter W.; Santschi, Peter H.; Quigg, Antonietta; Chin, Wei-Chun

    2017-01-01

    Engineered nanoparticles (ENPs), products of modern nanotechnologies, can potentially impact the marine environment and pose serious threats to marine ecosystems. However, the cellular responses of marine phytoplankton to ENPs are still not well established. Here, we investigate four different diatom species (Odontella mobiliensis, Skeletonema grethae, Phaeodactylum tricornutum, Thalassiosira pseudonana) and one green alga (Dunaliella tertiolecta) for their extracellular polymeric substance...

  17. Using mass-release of engineered insects to manage insecticide resistance

    International Nuclear Information System (INIS)

    Alphey, Nina; Coleman, Paul G.; Donnelly, Christl A.

    2006-01-01

    Transgenic crops expressing insecticidal toxins derived from Bacillus thuringiensis (Bt) are widely used to control insect pests. The benefits of such crops would be lost if resistance to the toxins spread to a significant proportion of the pest population. The main resistance management method, mandatory in the US, is the high-dose/refuge strategy, requiring nearby refuges of toxin-free crops, and the use of toxin doses sufficiently high to kill not only wild type insects but also insects heterozygous for a resistance allele, thereby rendering the resistance functionally recessive. We propose that mass-release of harmless toxin-sensitive insects could substantially delay or even reverse the spread of resistance. Mass-release of such insects is an integral part of RIDL, a genetics-based method of pest control related to the Sterile Insect Technique. We used a population genetic mathematical model to analyze the effects of releasing male insects homozygous for a female-specific dominant lethal genetic construct, and concluded that this RIDL strategy could form an effective component of a resistance management scheme for insecticidal plants and other toxins. (author)
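
    The population-genetic argument can be made concrete with a deliberately simplified, single-locus toy model (this is not the authors' published model): susceptible (SS) males are released at a fixed ratio to wild males and dilute the resistance allele in the male mating pool, while the Bt crop kills all non-RR offspring (high-dose assumption) and a refuge fraction imposes no selection. The female-specific lethal carried by RIDL males and all ecological detail are ignored; parameter values are illustrative only.

```python
def next_R_frequency(q, release_ratio, bt_fraction):
    """Advance the resistance (R) allele frequency by one generation in a toy
    single-locus model with mass-release of toxin-susceptible (SS) males."""
    q_f = q                            # R frequency among wild females
    q_m = q / (1.0 + release_ratio)    # R frequency among males, diluted by releases
    rr = q_f * q_m                     # offspring genotype frequencies (random mating)
    rs = q_f * (1.0 - q_m) + q_m * (1.0 - q_f)
    ss = (1.0 - q_f) * (1.0 - q_m)
    w_rr = 1.0                         # RR survive on the Bt crop and in the refuge
    w_rs = w_ss = 1.0 - bt_fraction    # RS/SS survive only in the refuge (high dose)
    total = rr * w_rr + rs * w_rs + ss * w_ss
    return (rr * w_rr + 0.5 * rs * w_rs) / total

if __name__ == "__main__":
    for ratio in (0.0, 1.0, 5.0):      # no releases vs. 1:1 and 5:1 release ratios
        q = 0.01
        for _ in range(20):
            q = next_R_frequency(q, release_ratio=ratio, bt_fraction=0.8)
        print(f"release ratio {ratio}: R frequency after 20 generations = {q:.3f}")
```

    Even in this stripped-down form, raising the release ratio visibly slows the increase of the resistance allele, which is the qualitative effect the abstract describes.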

  18. Using mass-release of engineered insects to manage insecticide resistance

    Energy Technology Data Exchange (ETDEWEB)

    Alphey, Nina [University of Oxford (United Kingdom). Dept. of Zoology; Alphey, Luke [Oxitec Limited, Oxford (United Kingdom); Coleman, Paul G [London School of Hygiene and Tropical Medicine (United Kingdom). Dept. of Infectious and Tropical Diseases; Donnelly, Christl A [Imperial College Faculty of Medicine, London (United Kingdom). Dept. of Infectious Disease Epidemiology

    2006-07-01

    Transgenic crops expressing insecticidal toxins derived from Bacillus thuringiensis (Bt) are widely used to control insect pests. The benefits of such crops would be lost if resistance to the toxins spread to a significant proportion of the pest population. The main resistance management method, mandatory in the US, is the high-dose/refuge strategy, requiring nearby refuges of toxin-free crops, and the use of toxin doses sufficiently high to kill not only wild type insects but also insects heterozygous for a resistance allele, thereby rendering the resistance functionally recessive. We propose that mass-release of harmless toxin-sensitive insects could substantially delay or even reverse the spread of resistance. Mass-release of such insects is an integral part of RIDL, a genetics-based method of pest control related to the Sterile Insect Technique. We used a population genetic mathematical model to analyze the effects of releasing male insects homozygous for a female-specific dominant lethal genetic construct, and concluded that this RIDL strategy could form an effective component of a resistance management scheme for insecticidal plants and other toxins. (author)

  19. Biodegradable hyaluronic acid hydrogels to control release of dexamethasone through aqueous Diels–Alder chemistry for adipose tissue engineering

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Ming; Ma, Ye; Zhang, Ziwei; Mao, Jiahui [School of Materials Science and Engineering, Nanjing University of Science and Technology, Nanjing (China); Tan, Huaping, E-mail: hptan@njust.edu.cn [School of Materials Science and Engineering, Nanjing University of Science and Technology, Nanjing (China); Hu, Xiaohong [School of Material Engineering, Jinling Institute of Technology, Nanjing (China)

    2015-11-01

    A robust synthetic strategy of biopolymer-based hydrogels has been developed where hyaluronic acid derivatives reacted through aqueous Diels–Alder chemistry without the involvement of chemical catalysts, allowing for controlled and sustained release of dexamethasone. To conjugate the hydrogel, furan and maleimide functionalized hyaluronic acid were synthesized, respectively, as well as furan functionalized dexamethasone, for the covalent immobilization. Chemical structure, gelation time, morphologies, swelling kinetics, weight loss, compressive modulus and dexamethasone release of the hydrogel system in PBS at 37 °C were studied. The results demonstrated that the aqueous Diels–Alder chemistry provides an extremely selective reaction and proceeds with high efficiency for hydrogel conjugation and covalent immobilization of dexamethasone. Cell culture results showed that the dexamethasone immobilized hydrogel was noncytotoxic and preserved proliferation of entrapped human adipose-derived stem cells. This synthetic approach uniquely allows for the direct fabrication of biologically functionalized gel scaffolds with ideal structures for adipose tissue engineering, which provides a competitive alternative to conventional conjugation techniques such as copper mediated click chemistry. - Highlights: • A biodegradable hyaluronic acid hydrogel was crosslinked via aqueous Diels–Alder chemistry. • Dexamethasone was covalently immobilized into the hyaluronic acid hydrogel via aqueous Diels–Alder chemistry. • Dexamethasone could be released from the Diels–Alder hyaluronic acid hydrogel in a controlled fashion.

  20. Biodegradable hyaluronic acid hydrogels to control release of dexamethasone through aqueous Diels–Alder chemistry for adipose tissue engineering

    International Nuclear Information System (INIS)

    Fan, Ming; Ma, Ye; Zhang, Ziwei; Mao, Jiahui; Tan, Huaping; Hu, Xiaohong

    2015-01-01

    A robust synthetic strategy of biopolymer-based hydrogels has been developed where hyaluronic acid derivatives reacted through aqueous Diels–Alder chemistry without the involvement of chemical catalysts, allowing for controlled and sustained release of dexamethasone. To conjugate the hydrogel, furan and maleimide functionalized hyaluronic acid were synthesized, respectively, as well as furan functionalized dexamethasone, for the covalent immobilization. Chemical structure, gelation time, morphologies, swelling kinetics, weight loss, compressive modulus and dexamethasone release of the hydrogel system in PBS at 37 °C were studied. The results demonstrated that the aqueous Diels–Alder chemistry provides an extremely selective reaction and proceeds with high efficiency for hydrogel conjugation and covalent immobilization of dexamethasone. Cell culture results showed that the dexamethasone immobilized hydrogel was noncytotoxic and preserved proliferation of entrapped human adipose-derived stem cells. This synthetic approach uniquely allows for the direct fabrication of biologically functionalized gel scaffolds with ideal structures for adipose tissue engineering, which provides a competitive alternative to conventional conjugation techniques such as copper mediated click chemistry. - Highlights: • A biodegradable hyaluronic acid hydrogel was crosslinked via aqueous Diels–Alder chemistry. • Dexamethasone was covalently immobilized into the hyaluronic acid hydrogel via aqueous Diels–Alder chemistry. • Dexamethasone could be released from the Diels–Alder hyaluronic acid hydrogel in a controlled fashion

  1. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  2. Analysis of heat release dynamics in an internal combustion engine using multifractals and wavelets

    International Nuclear Information System (INIS)

    Sen, A.K.; Litak, G.; Finney, C.E.A.; Daw, C.S.; Wagner, R.M.

    2010-01-01

    In this paper we analyze data from previously reported experimental measurements of cycle-to-cycle combustion variations in a lean-fueled, multi-cylinder spark-ignition (SI) engine. We characterize the changes in the observed combustion dynamics with as-fed fuel-air ratio using conventional histograms and statistical moments, and we further characterize the shifts in combustion complexity in terms of multifractals and wavelet decomposition. Changes in the conventional statistics and multifractal structure indicate trends with fuel-air ratio that parallel earlier reported observations. Wavelet decompositions reveal persistent, non-stochastic oscillation modes at higher fuel-air ratios that were not obvious in previous analyses. Recognition of these long-time-scale, non-stochastic oscillations is expected to be useful for improving modelling and control of engine combustion variations and multi-cylinder balancing.
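
    A minimal sketch of the kind of wavelet decomposition described, assuming a cycle-resolved heat-release series is available as a 1-D array. It uses the PyWavelets package; the wavelet family ('db4') and decomposition level are illustrative choices, not necessarily those used in the paper, and the multifractal part of the analysis is not shown.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_energy_by_scale(heat_release_per_cycle, wavelet="db4", level=5):
    """Discrete wavelet decomposition of a cycle-resolved heat-release series.
    Returns the energy (sum of squared coefficients) at each scale; persistent
    oscillation modes show up as energy concentrated in particular detail levels."""
    x = np.asarray(heat_release_per_cycle, float)
    coeffs = pywt.wavedec(x - x.mean(), wavelet, level=level)
    return [float(np.sum(c ** 2)) for c in coeffs]

# Hypothetical usage with a synthetic series: noise plus a slow oscillation every 64 cycles.
cycles = np.arange(2048)
series = 1.0 + 0.05 * np.random.randn(cycles.size) + 0.1 * np.sin(2 * np.pi * cycles / 64)
print(wavelet_energy_by_scale(series))
```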

  3. Spot the difference: engineered and natural nanoparticles in the environment--release, behavior, and fate.

    Science.gov (United States)

    Wagner, Stephan; Gondikas, Andreas; Neubauer, Elisabeth; Hofmann, Thilo; von der Kammer, Frank

    2014-11-10

    The production and use of nanoparticles leads to the emission of manufactured or engineered nanoparticles into the environment. Those particles undergo many possible reactions and interactions in the environment they are exposed to. These reactions and the resulting behavior and fate of nanoparticles in the environment have been studied for decades through naturally occurring nanoparticulate (1-100 nm) and colloidal (1-1000 nm) substances. The knowledge gained from these investigations is nowhere near sufficiently complete to create a detailed model of the behavior and fate of engineered nanoparticles in the environment, but is a valuable starting point for the risk assessment of these novel materials. It is the aim of this Review to critically compare naturally observed processes with those found for engineered systems to identify the "nanospecific" properties of manufactured particles and describe critical knowledge gaps relevant for the risk assessment of manufactured nanomaterials in the environment. © 2014 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA. This is an open access article under the terms of the Creative Commons Attribution Non-Commercial NoDerivs License, which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made.

  4. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    problems which have begun to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, the earthquake-induced shock may be a common mechanism of the simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that ‘… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system in which fractured zones and zones of seismic and volcanic activity interact. Darwin formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his works on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. The material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly-cohesive materials also support his observations and ideas. On the other hand, we developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows the most important aspects of Darwin's earthquake reports to be explained. This is achieved by simplifying the fundamental governing equations of the problems considered to strongly-nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly-nonlinear wave phenomena that arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.

  5. Pathways and mechanisms for product release in the engineered haloalkane dehalogenases explored using classical and random acceleration molecular dynamics simulations.

    Science.gov (United States)

    Klvana, Martin; Pavlova, Martina; Koudelakova, Tana; Chaloupkova, Radka; Dvorak, Pavel; Prokop, Zbynek; Stsiapanava, Alena; Kuty, Michal; Kuta-Smatanova, Ivana; Dohnalek, Jan; Kulhanek, Petr; Wade, Rebecca C; Damborsky, Jiri

    2009-10-09

    Eight mutants of the DhaA haloalkane dehalogenase carrying mutations at the residues lining two tunnels, previously observed by protein X-ray crystallography, were constructed and biochemically characterized. The mutants showed distinct catalytic efficiencies with the halogenated substrate 1,2,3-trichloropropane. Release pathways for the two dehalogenation products, 2,3-dichloropropane-1-ol and the chloride ion, and exchange pathways for water molecules, were studied using classical and random acceleration molecular dynamics simulations. Five different pathways, denoted p1, p2a, p2b, p2c, and p3, were identified. The individual pathways showed differing selectivity for the products: the chloride ion releases solely through p1, whereas the alcohol releases through all five pathways. Water molecules play a crucial role for release of both products by breakage of their hydrogen-bonding interactions with the active-site residues and shielding the charged chloride ion during its passage through a hydrophobic tunnel. Exchange of the chloride ions, the alcohol product, and the waters between the buried active site and the bulk solvent can be realized by three different mechanisms: (i) passage through a permanent tunnel, (ii) passage through a transient tunnel, and (iii) migration through a protein matrix. We demonstrate that the accessibility of the pathways and the mechanisms of ligand exchange were modified by mutations. Insertion of bulky aromatic residues in the tunnel corresponding to pathway p1 leads to reduced accessibility to the ligands and a change in mechanism of opening from permanent to transient. We propose that engineering the accessibility of tunnels and the mechanisms of ligand exchange is a powerful strategy for modification of the functional properties of enzymes with buried active sites.

  6. Development and optimization of press coated tablets of release engineered valsartan for pulsatile delivery.

    Science.gov (United States)

    Shah, Sunny; Patel, Romik; Soniwala, Moinuddin; Chavda, Jayant

    2015-01-01

    The present work aims to develop and optimize pulsatile delivery during dissolution of an improved formulation of valsartan to coordinate drug release with the circadian rhythm. Preliminary studies suggested that β cyclodextrin could improve the solubility of valsartan and showed an AL-type solubility curve. A 1:1 stoichiometric ratio of valsartan to β cyclodextrin was revealed from phase solubility studies and Job's plot. The prepared complex showed significantly better dissolution efficiency (p valsartan β cyclodextrin complex was significantly higher (p valsartan β cyclodextrin complex were subsequently prepared and application of the Plackett-Burman screening design revealed that HPMC K4M and EC had a significant effect on lag time. A 3^2 full factorial design was used to measure the response of HPMC K4M and EC on lag time and the time taken for 90% drug release (T90). The optimized batch, prepared according to the levels obtained from the desirability function, had a lag time of 6 h, consisted of HPMC K4M:ethylcellulose in a 1:1.5 ratio with 180 mg of coating, and showed close agreement between observed and predicted values (R^2 = 0.9694).
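
    To make the design-of-experiments step concrete, the sketch below builds the coded design matrix for a 3^2 full factorial in the two coating factors and fits a quadratic response-surface model for lag time by ordinary least squares. The nine response values are placeholders for illustration only, not the study's data, and the desirability optimization is not reproduced.

```python
import numpy as np
from itertools import product

# Coded levels for the two factors: x1 = HPMC K4M amount, x2 = ethylcellulose amount.
runs = np.array(list(product((-1, 0, 1), repeat=2)), dtype=float)
x1, x2 = runs[:, 0], runs[:, 1]

# Placeholder lag times (h) for the nine runs -- illustrative values, NOT the published data.
lag_time = np.array([3.1, 4.0, 4.8, 4.2, 5.1, 6.0, 5.3, 6.2, 7.1])

# Quadratic response-surface model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coeffs, *_ = np.linalg.lstsq(X, lag_time, rcond=None)
print(dict(zip(["b0", "b1", "b2", "b12", "b11", "b22"], np.round(coeffs, 3))))
```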

  7. Development of proposed free release criteria for Idaho National Engineering Laboratory lead

    International Nuclear Information System (INIS)

    Losinski, S.J.

    1994-01-01

    The INEL Lead Management Project (LMP) performed an investigation of the origin of lead used as shielding at the INEL and developed radiological profile information that was then used to establish a baseline for the DOE ''no-rad-added'' standard. Primary findings of the investigation include the following: (a) Much of the lead at the INEL was obtained from a DOE lead bank; (b) Lead inventory at the DOE lead bank was derived primarily from recycled sources and was most likely in the form of pure lead; (c) Secondary lead (lead from recycled sources), available in today's market, is expected to have radiological characteristics similar to those of the DOE lead bank; (d) Highly sensitive radiological testing of 20 samples of lead from secondary sources revealed the lead to be radiologically pristine. Beta-, gamma-, and alpha-emitting radionuclide concentrations were all found to be less than detectable, except for a very small quantity of lead-210 (an alpha emitter), which is a naturally occurring isotope of lead. Given the pristine nature of the lead, a proposed free release criterion for lead was developed using a statistical null-hypothesis approach. The free release criterion compares the natural background count of a clean lead standard with the natural background count of a sample. When the sample background count cannot be distinguished from the standard background count at the 95% confidence level, the sample is considered radiologically clean.
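
    A minimal sketch of the statistical idea behind such a 'no-rad-added' comparison, assuming both the clean-lead standard and the candidate sample are counted for the same live time and that the normal approximation to Poisson counting statistics applies. The 1.645 critical value corresponds to a one-sided 95% confidence level; the function name and example counts are hypothetical, and the actual INEL procedure may differ in detail.

```python
import math

def distinguishable_from_standard(sample_counts, standard_counts, z_crit=1.645):
    """One-sided test at ~95% confidence: is the gross count of a lead sample
    statistically higher than that of the clean-lead standard counted for the
    same time? If False, the sample is indistinguishable from clean lead."""
    excess = sample_counts - standard_counts
    sigma = math.sqrt(sample_counts + standard_counts)  # Poisson, normal approximation
    return excess > z_crit * sigma

# Hypothetical counts: the sample is only 150 counts above a ~10,300-count background.
print(distinguishable_from_standard(sample_counts=10450, standard_counts=10300))  # -> False
```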

  8. Plant immunity triggered by engineered in vivo release of oligogalacturonides, damage-associated molecular patterns.

    Science.gov (United States)

    Benedetti, Manuel; Pontiggia, Daniela; Raggi, Sara; Cheng, Zhenyu; Scaloni, Flavio; Ferrari, Simone; Ausubel, Frederick M; Cervone, Felice; De Lorenzo, Giulia

    2015-04-28

    Oligogalacturonides (OGs) are fragments of pectin that activate plant innate immunity by functioning as damage-associated molecular patterns (DAMPs). We set out to test the hypothesis that OGs are generated in planta by partial inhibition of pathogen-encoded polygalacturonases (PGs). A gene encoding a fungal PG was fused with a gene encoding a plant polygalacturonase-inhibiting protein (PGIP) and expressed in transgenic Arabidopsis plants. We show that expression of the PGIP-PG chimera results in the in vivo production of OGs that can be detected by mass spectrometric analysis. Transgenic plants expressing the chimera under control of a pathogen-inducible promoter are more resistant to the phytopathogens Botrytis cinerea, Pectobacterium carotovorum, and Pseudomonas syringae. These data provide strong evidence for the hypothesis that OGs released in vivo act as a DAMP signal to trigger plant immunity and suggest that controlled release of these molecules upon infection may be a valuable tool to protect plants against infectious diseases. On the other hand, elevated levels of expression of the chimera cause the accumulation of salicylic acid, reduced growth, and eventually lead to plant death, consistent with the current notion that trade-off occurs between growth and defense.

  9. Sol-gel derived manganese-releasing bioactive glass as a therapeutical approach for bone tissue engineering

    Energy Technology Data Exchange (ETDEWEB)

    Barrioni, B.R.; Oliveira, A.C.; Leite, M.F.; Pereira, M.M. [Universidade Federal de Minas Gerais (UFMG), MG (Brazil)

    2016-07-01

    Full text: Bioactive glasses (BG) have been highlighted in tissue engineering, due to their high bioactivity and biocompatibility, being potential materials for bone tissue repair. Their composition is variable and quite flexible, allowing the incorporation of therapeutic metallic ions, which has been regarded as a promising approach in the development of BG with superior properties for tissue engineering. These ions can be released in a controlled manner during the dissolution process of the glass, with the advantage of being released exactly at the implant site where they are needed, thus optimizing the therapeutic efficacy and reducing undesired side effects in the patient. Among the several ions that have been studied, manganese (Mn) has been shown to favor osteogenic differentiation. In addition, this ion is a cofactor for several enzymes involved in remodeling of the extracellular matrix and plays an important role in cell adhesion. Therefore, it is very important to study the role of Mn in the BG network and its influence on the glass bioactivity. In this context, new bioactive glass compositions derived from 58S (60%SiO2-36%CaO-4%P2O5, mol%) were synthesized in this work, using the sol-gel method, by incorporating Mn into their structure. FTIR and Raman spectra showed the presence of typical BG chemical groups, whereas the amorphous structure typical of these materials was confirmed by XRD analysis, which also indicated that the incorporation of Mn into the glass network was successful, as its precursor did not recrystallize. The role of Mn in the glass network was also evaluated by XPS. The influence of Mn on carbonated hydroxyapatite layer formation after different periods of immersion of the BG powder in Simulated Body Fluid was evaluated using zeta potential, SEM, EDS and FTIR, whereas the controlled ion release was measured through ICP-OES. MTT assay revealed that Mn-containing BG showed no cytotoxic effect on cell culture. All these results indicate

  10. Sol-gel derived manganese-releasing bioactive glass as a therapeutical approach for bone tissue engineering

    International Nuclear Information System (INIS)

    Barrioni, B.R.; Oliveira, A.C.; Leite, M.F.; Pereira, M.M.

    2016-01-01

    Full text: Bioactive glasses (BG) have been highlighted in tissue engineering, due to their high bioactivity and biocompatibility, being potential materials for bone tissue repair. Their composition is variable and quite flexible, allowing the incorporation of therapeutic metallic ions, which has been regarded as a promising approach in the development of BG with superior properties for tissue engineering. These ions can be released in a controlled manner during the dissolution process of the glass, with the advantage of being released exactly at the implant site where they are needed, thus optimizing the therapeutic efficacy and reducing undesired side effects in the patient. Among the several ions that have been studied, manganese (Mn) has been shown to favor osteogenic differentiation. In addition, this ion is a cofactor for several enzymes involved in remodeling of the extracellular matrix and plays an important role in cell adhesion. Therefore, it is very important to study the role of Mn in the BG network and its influence on the glass bioactivity. In this context, new bioactive glass compositions derived from 58S (60%SiO2-36%CaO-4%P2O5, mol%) were synthesized in this work, using the sol-gel method, by incorporating Mn into their structure. FTIR and Raman spectra showed the presence of typical BG chemical groups, whereas the amorphous structure typical of these materials was confirmed by XRD analysis, which also indicated that the incorporation of Mn into the glass network was successful, as its precursor did not recrystallize. The role of Mn in the glass network was also evaluated by XPS. The influence of Mn on carbonated hydroxyapatite layer formation after different periods of immersion of the BG powder in Simulated Body Fluid was evaluated using zeta potential, SEM, EDS and FTIR, whereas the controlled ion release was measured through ICP-OES. MTT assay revealed that Mn-containing BG showed no cytotoxic effect on cell culture. All these results indicate

  11. Microcapsule Technology for Controlled Growth Factor Release in Musculoskeletal Tissue Engineering.

    Science.gov (United States)

    Della Porta, Giovanna; Ciardulli, Maria C; Maffulli, Nicola

    2018-06-01

    Tissue engineering strategies have relied on engineered 3-dimensional (3D) scaffolds to provide architectural templates that can mimic the native cell environment. Among the several technologies proposed for the fabrication of 3D scaffolds attractive for stem cell cultivation and differentiation, moulding or bioplotting of hydrogels allows the stratification of layers loaded with cells and with specific additives to obtain a predefined microstructural organization. Particularly with bioplotting technology, living cells (the bio-ink) and additives, such as biopolymer microdevices/nanodevices for the controlled delivery of growth factors or biosignals, can be organized spatially into a predesigned 3D pattern by automated fabrication from computer-aided digital files. The technologies for biopolymer microcarrier/nanocarrier fabrication can be strategic in providing controlled spatiotemporal delivery of specific biosignals within a microenvironment that can better or more rapidly direct the stem cells loaded within it. In this review, some examples of controlled growth factor delivery by biopolymer microdevices/nanodevices embedded within 3D hydrogel scaffolds are described, with the aim of achieving a bioengineered 3D interactive microenvironment for stem cell differentiation. Conventional and recently proposed technologies for biopolymer microcapsule fabrication for controlled delivery over several days are also illustrated and critically discussed.

  12. Sensitivity of the engineered barrier system (EBS) release rate to alternative conceptual models of advective release from waste packages under dripping fractures

    International Nuclear Information System (INIS)

    Lee, J.H.; Atkins, J.E.; McNeish, J.A.; Vallikat, V.

    1996-01-01

    The first model assumed that dripping water directly contacts the waste form inside the ''failed'' waste package and radionuclides are released from the EBS by advection. The second model assumed that dripping water is diverted around the package (because of corrosion products plugging the perforations), thereby being prevented from directly contacting the waste form. In the second model, radionuclides were assumed to diffuse through the perforations, and, once outside the waste package, to be released from the EBS by advection. For the case with the second EBS release model, most radionuclides had lower peak EBS release rates than with the first model. Impacts of the alternative EBS release models were greater for the radionuclides with low solubility. The analysis indicated that the EBS release model representing advection through a ''failed'' waste package (the first model) may be too conservative; thus a ''failed'' waste package container with multiple perforations may still be an important barrier to radionuclide release
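
    The contrast between the two conceptual models can be sketched with order-of-magnitude, solubility-limited release expressions. These are illustrative bounding formulas, not the actual performance-assessment abstraction used in the study, and all parameter names and example values are hypothetical.

```python
def advective_release_g_per_yr(drip_rate_m3_per_yr, solubility_g_per_m3):
    """Model 1: dripping water contacts the waste form directly, so the
    solubility-limited advective release is roughly drip flow times the
    dissolved concentration."""
    return drip_rate_m3_per_yr * solubility_g_per_m3

def diffusive_release_g_per_yr(n_perforations, hole_area_m2, wall_thickness_m,
                               diffusion_coeff_m2_per_yr, solubility_g_per_m3):
    """Model 2: water is diverted around the package and radionuclides must
    diffuse through the perforations; steady-state Fickian release is roughly
    n * D * (A / L) * concentration difference (outside concentration ~ 0)."""
    return (n_perforations * diffusion_coeff_m2_per_yr * hole_area_m2
            / wall_thickness_m * solubility_g_per_m3)

# Hypothetical numbers illustrating why the diffusive pathway gives much lower peak releases:
print(advective_release_g_per_yr(0.1, 1e-3))
print(diffusive_release_g_per_yr(10, 1e-6, 0.02, 0.03, 1e-3))
```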

  13. Release of tensile strain on engineered human tendon tissue disturbs cell adhesions, changes matrix architecture, and induces an inflammatory phenotype

    DEFF Research Database (Denmark)

    Bayer, Monika L; Schjerling, Peter; Herchenhan, Andreas

    2014-01-01

    Mechanical loading of tendon cells results in an upregulation of mechanotransduction signaling pathways, cell-matrix adhesion and collagen synthesis, but whether unloading removes these responses is unclear. We investigated the response to tension release, with regard to matrix proteins, pro-inflammatory mediators and tendon phenotype-specific molecules, in an in vitro model where tendon-like tissue was engineered from human tendon cells. Tissue sampling was performed 1, 2, 4 and 6 days after surgical de-tensioning of the tendon construct. When the tensile stimulus was removed, integrin-type collagen receptors ... were upregulated. Stimulation with the cytokine TGF-β1 had distinct effects on some tendon-related genes in both tensioned and de-tensioned tissue. These findings indicate an important role of mechanical loading for cellular and matrix responses in tendon, including that loss of tension leads...

  14. Biodegradable hyaluronic acid hydrogels to control release of dexamethasone through aqueous Diels-Alder chemistry for adipose tissue engineering.

    Science.gov (United States)

    Fan, Ming; Ma, Ye; Zhang, Ziwei; Mao, Jiahui; Tan, Huaping; Hu, Xiaohong

    2015-11-01

    A robust synthetic strategy of biopolymer-based hydrogels has been developed where hyaluronic acid derivatives reacted through aqueous Diels-Alder chemistry without the involvement of chemical catalysts, allowing for controlled and sustained release of dexamethasone. To conjugate the hydrogel, furan and maleimide functionalized hyaluronic acid were synthesized, respectively, as well as furan functionalized dexamethasone, for the covalent immobilization. Chemical structure, gelation time, morphologies, swelling kinetics, weight loss, compressive modulus and dexamethasone release of the hydrogel system in PBS at 37°C were studied. The results demonstrated that the aqueous Diels-Alder chemistry provides an extremely selective reaction and proceeds with high efficiency for hydrogel conjugation and covalent immobilization of dexamethasone. Cell culture results showed that the dexamethasone immobilized hydrogel was noncytotoxic and preserved proliferation of entrapped human adipose-derived stem cells. This synthetic approach uniquely allows for the direct fabrication of biologically functionalized gel scaffolds with ideal structures for adipose tissue engineering, which provides a competitive alternative to conventional conjugation techniques such as copper mediated click chemistry. Copyright © 2015. Published by Elsevier B.V.

  15. Refresher Course on Physics of Earthquakes -98 ...

    Indian Academy of Sciences (India)

    The objective of this course is to help teachers gain an understanding of the earthquake phenomenon and the physical processes involved in its genesis, as well as of the earthquake waves, which propagate the energy released by the earthquake rupture outward from the source. The Course will begin with mathematical ...

  16. Controlled field release of a bioluminescent genetically engineered microorganism for bioremediation process monitoring and control

    Energy Technology Data Exchange (ETDEWEB)

    Ripp, S.; Nivens, D.E.; Ahn, Y.; Werner, C.; Jarrell, J. IV; Easter, J.P.; Cox, C.D.; Burlage, R.S.; Sayler, G.S.

    2000-03-01

    Pseudomonas fluorescens HK44 represents the first genetically engineered microorganism approved for field testing in the United States for bioremediation purposes. Strain HK44 harbors an introduced lux gene fused within a naphthalene degradative pathway, thereby allowing this recombinant microbe to bioluminesce as it degrades specific polyaromatic hydrocarbons such as naphthalene. The bioremediation process can therefore be monitored by the detection of light. P. fluorescens HK44 was inoculated into the vadose zone of intermediate-scale, semicontained soil lysimeters contaminated with naphthalene, anthracene, and phenanthrene, and the population dynamics were followed over an approximate 2-year period in order to assess the long-term efficacy of using strain HK44 for monitoring and controlling bioremediation processes. Results showed that P. fluorescens HK44 was capable of surviving initial inoculation into both hydrocarbon-contaminated and uncontaminated soils and was recoverable from these soils 660 days post-inoculation. It was also demonstrated that strain HK44 was capable of generating bioluminescence in response to soil hydrocarbon bioavailability. Bioluminescence approaching 166,000 counts/s was detected in fiber optic-based biosensor devices responding to volatile polyaromatic hydrocarbons, while a portable photomultiplier module detected bioluminescence at an average of 4300 counts/s directly from soil-borne HK44 cells within localized treatment areas. The utilization of lux-based bioreporter microorganisms therefore promises to be a viable option for in situ determination of environmental contaminant bioavailability and biodegradation process monitoring and control.

  17. Moment-ratio imaging of seismic regions for earthquake prediction

    Science.gov (United States)

    Lomnitz, Cinna

    1993-10-01

    An algorithm for predicting large earthquakes is proposed. The reciprocal ratio (mri) of the residual seismic moment to the total moment release in a region is used for imaging seismic moment precursors. Peaks in mri predict recent major earthquakes, including the 1985 Michoacan, 1985 central Chile, and 1992 Eureka, California earthquakes.
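
    The record does not reproduce the algorithm in detail, so the sketch below is only one plausible reading of a moment-ratio indicator: cumulative seismic moment is computed from catalog magnitudes, compared with a long-term linear trend, and the ratio of total release to the residual is tracked in time. The Kanamori moment-magnitude relation is standard; everything else (the trend fit and the toy catalog) is an assumption for illustration, not the paper's definition of mri.

```python
import numpy as np

def seismic_moment(mw):
    """Scalar moment (N*m) from moment magnitude (Kanamori relation)."""
    return 10.0 ** (1.5 * np.asarray(mw) + 9.1)

def moment_ratio_series(times_yr, mags):
    """For each event, ratio of total cumulative moment to the residual between
    a linear long-term trend and the observed cumulative moment (one possible
    reading of a 'moment ratio' indicator; illustrative only)."""
    m0 = np.cumsum(seismic_moment(mags))
    trend = np.polyval(np.polyfit(times_yr, m0, 1), times_yr)
    residual = np.abs(trend - m0) + 1e-20   # avoid division by zero
    return m0 / residual

# toy catalog: event times (yr) and magnitudes, invented for illustration
t = np.array([0.5, 1.2, 2.0, 3.1, 4.4, 5.0, 6.3, 7.8])
mw = np.array([5.1, 5.3, 5.0, 5.6, 5.2, 5.4, 5.1, 6.2])
print(np.round(moment_ratio_series(t, mw), 2))
```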

  18. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    Science.gov (United States)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008, as the ShakeOut Scenario, a description of the impacts and consequences of an M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  19. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  20. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  1. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  2. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    earthquakes and deep-focus earthquakes are the energy release caused by the slip or flow of rocks following a jamming-unjamming transition. (4) The energetics and impending precursors of earthquake: The energy of earthquake is the kinetic energy released from the jamming-unjamming transition. Calculation shows that the kinetic energy of seismic rock sliding is comparable with the total work demanded for rocks’ shear failure and overcoming of frictional resistance. There will be no heat flow paradox. Meanwhile, some valuable seismic precursors are likely to be identified by observing the accumulation of additional tectonic forces, local geological changes, as well as the effect of rock state changes, etc.

  3. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    event response products and by expanded use of geodetic imaging data to assess fault rupture and source parameters. Uncertainties in the NSHM, and in regional earthquake models, are reduced by fully incorporating geodetic data into earthquake probability calculations. Geodetic networks and data are integrated into the operations and earthquake information products of the Advanced National Seismic System (ANSS). Earthquake early warnings are improved by more rapidly assessing ground displacement and the dynamic faulting process for the largest earthquakes using real-time geodetic data. Methodology for probabilistic earthquake forecasting is refined by including geodetic data when calculating evolving moment release during aftershock sequences and by better understanding the implications of transient deformation for earthquake likelihood. A geodesy program that encompasses a balanced mix of activities to sustain mission-critical capabilities, grows new competencies through the continuum of fundamental to applied research, and ensures sufficient resources for these endeavors provides a foundation by which the EHP can be a leader in the application of geodesy to earthquake science. With this in mind the following objectives provide a framework to guide EHP efforts: Fully utilize geodetic information to improve key products, such as the NSHM and EEW, and to address new ventures like the USGS Subduction Zone Science Plan. Expand the variety, accuracy, and timeliness of post-earthquake information products, such as PAGER (Prompt Assessment of Global Earthquakes for Response), through incorporation of geodetic observations. Determine if geodetic measurements of transient deformation can significantly improve estimates of earthquake probability. Maintain an observational strategy aligned with the target outcomes of this document that includes continuous monitoring, recording of ephemeral observations, focused data collection for use in research, and application-driven data processing and

  4. The HayWired Earthquake Scenario

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  5. Earthquake Facts

    Science.gov (United States)

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  6. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  7. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors and quantifying risk in terms of physical parameters, which are suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base, which is required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  8. The Effect of Ethanol Addition to Gasoline on Low- and Intermediate-Temperature Heat Release under Boosted Conditions in Kinetically Controlled Engines

    Science.gov (United States)

    Vuilleumier, David Malcolm

    The detailed study of chemical kinetics in engines has become required to further advance engine efficiency while simultaneously lowering engine emissions. This push for higher efficiency engines is not caused by a lack of oil, but by efforts to reduce anthropogenic carbon dioxide emissions, that cause global warming. To operate in more efficient manners while reducing traditional pollutant emissions, modern internal combustion piston engines are forced to operate in regimes in which combustion is no longer fully transport limited, and instead is at least partially governed by chemical kinetics of combusting mixtures. Kinetically-controlled combustion allows the operation of piston engines at high compression ratios, with partially-premixed dilute charges; these operating conditions simultaneously provide high thermodynamic efficiency and low pollutant formation. The investigations presented in this dissertation study the effect of ethanol addition on the low-temperature chemistry of gasoline type fuels in engines. These investigations are carried out both in a simplified, fundamental engine experiment, named Homogeneous Charge Compression Ignition, as well as in more applied engine systems, named Gasoline Compression Ignition engines and Partial Fuel Stratification engines. These experimental investigations, and the accompanying modeling work, show that ethanol is an effective scavenger of radicals at low temperatures, and this inhibits the low temperature pathways of gasoline oxidation. Further, the investigations measure the sensitivity of gasoline auto-ignition to system pressure at conditions that are relevant to modern engines. It is shown that at pressures above 40 bar and temperatures below 850 Kelvin, gasoline begins to exhibit Low-Temperature Heat Release. However, the addition of 20% ethanol raises the pressure requirement to 60 bar, while the temperature requirement remains unchanged. These findings have major implications for a range of modern engines
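
    The quoted pressure and temperature thresholds lend themselves to a compact illustration. The function below simply encodes the reported rule of thumb (it is not a kinetic model, and the numeric limits are taken from the abstract rather than derived).

```python
def lthr_expected(pressure_bar: float, temperature_k: float,
                  ethanol_vol_frac: float = 0.0) -> bool:
    """Coarse rule of thumb for whether low-temperature heat release (LTHR)
    is expected, based on the thresholds reported above: gasoline needs
    p > ~40 bar and T < ~850 K; blending ~20% ethanol raises the pressure
    requirement to ~60 bar while the temperature limit stays unchanged."""
    p_threshold = 60.0 if ethanol_vol_frac >= 0.20 else 40.0
    return pressure_bar > p_threshold and temperature_k < 850.0

print(lthr_expected(45, 800))          # True  (neat gasoline)
print(lthr_expected(45, 800, 0.20))    # False (E20 needs higher pressure)
```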

  9. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.
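
    For readers unfamiliar with how a response spectrum is obtained from a strong-motion record, the sketch below integrates a linear single-degree-of-freedom oscillator over a range of natural periods using the Newmark average-acceleration method and reports the pseudo-spectral acceleration. The ground-motion record is synthetic and the 5% damping value is a conventional assumption, not content taken from the slides.

```python
import numpy as np

def response_spectrum(ag, dt, periods, zeta=0.05):
    """Pseudo-acceleration response spectrum of a ground acceleration record,
    via Newmark average-acceleration integration of a linear SDOF oscillator
    for each natural period."""
    ag = np.asarray(ag, dtype=float)
    sa = np.empty(len(periods))
    gamma, beta = 0.5, 0.25
    for j, tn in enumerate(periods):
        w = 2.0 * np.pi / tn
        m, c, k = 1.0, 2.0 * zeta * w, w * w
        u = v = 0.0
        a = -ag[0]                      # initial relative acceleration
        a1 = m / (beta * dt**2) + gamma * c / (beta * dt)
        a2 = m / (beta * dt) + (gamma / beta - 1.0) * c
        a3 = (1.0 / (2.0 * beta) - 1.0) * m + dt * (gamma / (2.0 * beta) - 1.0) * c
        k_hat = k + a1
        umax = 0.0
        for i in range(len(ag) - 1):
            p_next = -m * ag[i + 1]
            u_next = (p_next + a1 * u + a2 * v + a3 * a) / k_hat
            v_next = (gamma / (beta * dt)) * (u_next - u) \
                     + (1.0 - gamma / beta) * v \
                     + dt * (1.0 - gamma / (2.0 * beta)) * a
            a_next = (u_next - u) / (beta * dt**2) - v / (beta * dt) \
                     - (1.0 / (2.0 * beta) - 1.0) * a
            u, v, a = u_next, v_next, a_next
            umax = max(umax, abs(u))
        sa[j] = umax * w * w            # pseudo-spectral acceleration
    return sa

# toy record: 5 s of a decaying 2 Hz sinusoid sampled at 100 Hz (not real data)
dt = 0.01
t = np.arange(0, 5, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.5 * t)
periods = np.linspace(0.05, 2.0, 40)
sa = response_spectrum(ag, dt, periods)
print(f"peak Sa = {sa.max():.2f} m/s^2 near T = {periods[sa.argmax()]:.2f} s")
```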

  10. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete earthquake information and being up-to-date, user-friendly and rapidly accessible. One main component of the database consists of the catalog of earthquakes that occurred in Romania from 984 up to the present. The catalog contains information related to locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes where instrumental data were recorded. Different parameters characterizing strong-motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response-spectrum-based intensities, were computed and recorded in this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions ('free-field' or on buildings). Through the volume and quality of the gathered data, and through its user-friendly interface, the Romanian earthquake database provides a very useful tool for geosciences and civil engineering in their effort towards reducing seismic risk in Romania. (authors)
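
    A minimal sketch of what one record of such a catalog might look like, with a query for strong intermediate-depth Vrancea events, is given below. The field names, the geographic box, and the two example entries (approximate parameters of well-known Vrancea events) are illustrative and do not reflect the actual schema of the Romanian database.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CatalogEvent:
    """Minimal catalog record of the kind described above (field names are
    illustrative, not the actual schema of the Romanian database)."""
    origin_time: datetime
    latitude: float
    longitude: float
    depth_km: float
    magnitude: float
    waveform_url: Optional[str] = None   # link to waveforms, if available

def vrancea_intermediate_depth(events, min_mag=6.0):
    """Select strong intermediate-depth events in a rough Vrancea box."""
    return [e for e in events
            if 45.2 <= e.latitude <= 46.0 and 26.0 <= e.longitude <= 27.0
            and 60.0 <= e.depth_km <= 180.0 and e.magnitude >= min_mag]

# approximate parameters of two well-known Vrancea events, for illustration only
events = [CatalogEvent(datetime(1977, 3, 4, 19, 21), 45.77, 26.76, 94.0, 7.4),
          CatalogEvent(datetime(1990, 5, 30, 10, 40), 45.83, 26.89, 89.0, 6.9)]
print(len(vrancea_intermediate_depth(events)))  # 2
```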

  11. Underground water stress release models

    Science.gov (United States)

    Li, Yong; Dang, Shenjun; Lü, Shaochuan

    2011-08-01

    The accumulation of tectonic stress may cause earthquakes at some epochs. However, in most cases, it leads to crustal deformation. Underground water level is a sensitive indicator of crustal deformation. We incorporate underground water level information into the stress release model (SRM) to obtain the underground water stress release model (USRM). We apply the USRM to earthquakes that occurred in the Tangshan region. The analysis shows that the underground water stress release model outperforms both the Poisson model and the stress release model. Monte Carlo simulation shows that the seismicity simulated by the USRM is very close to the observed seismicity.
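
    A hedged sketch of how a stress release model can be extended with a water-level covariate is shown below: the conditional intensity drops after large events (which release accumulated stress) and is modulated by an underground-water anomaly term. The functional form, the Benioff-type stress measure, and all parameter values are assumptions for illustration; the paper's exact USRM formulation may differ.

```python
import numpy as np

def usrm_intensity(t, event_times, event_mags, a, b, c, d=0.0,
                   water_anom=0.0, m0=4.0):
    """Conditional intensity (events per year) at time t of a stress release
    model with an optional water-level term:
        lambda(t) = exp(a + b*(t - c*S(t)) + d*W(t)),
    where S(t) is a Benioff-type measure of stress released by events before t
    and W(t) is an underground-water-level anomaly. Illustrative only."""
    past = np.asarray(event_times) < t
    s_t = np.sum(10.0 ** (0.75 * (np.asarray(event_mags)[past] - m0)))
    return np.exp(a + b * (t - c * s_t) + d * water_anom)

# toy catalog: event times in years since an arbitrary reference epoch
times = np.array([0.3, 6.6, 11.1, 20.8])
mags = np.array([5.4, 7.8, 5.9, 5.6])
for t in (5.0, 10.0, 30.0):
    print(t, round(usrm_intensity(t, times, mags, a=-1.0, b=0.05, c=0.2), 5))
```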

  12. Housing Damage Following Earthquake

    Science.gov (United States)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  13. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  14. Modeling of release of radionuclides from an engineered disposal facility for shallow-land disposal of low-level radioactive wastes

    International Nuclear Information System (INIS)

    Matsuzuru, H.; Suzuki, A.

    1989-01-01

    The computer code, ENBAR-1, for the simulation of radionuclide releases from an engineered disposal facility has been developed to evaluate the source term for subsequent migration of radionuclides in and through a natural barrier. The system considered here is that a waste package (waste form and container) is placed, together with backfill materials, into a concrete pit as a disposal unit for shallow-land disposal of low-level radioactive wastes. The code developed includes the following modules: water penetration into a concrete pit, corrosion of a drum as a container, leaching of radionuclides from a waste form, migration of radionuclides in backfill materials, release of radionuclides from the pit. The code has the advantage of its simplicity of operation and presentation while still allowing comprehensive evaluation of each element of an engineered disposal facility to be treated. The performance and source term of the facility might be readily estimated with a few key parameters to define the problem
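
    The abstract lists the modules of ENBAR-1 without giving their equations, so the toy pipeline below only illustrates how such modules might be chained: no release before the container is breached, first-order decay of the inventory, a constant leach fraction, and the backfill treated as a pure time delay. All names and numbers are placeholders and are not taken from ENBAR-1.

```python
import math

def pit_release_rate(t_yr, infiltration_m3_per_yr=0.05, corrosion_delay_yr=50.0,
                     leach_fraction_per_yr=1e-3, inventory_bq=1e12,
                     decay_const_per_yr=0.023, backfill_delay_yr=20.0):
    """Very simplified chain of the modules listed above: no release before the
    container is breached; afterwards a constant leach fraction of the decayed
    inventory enters the backfill, whose effect is approximated as a pure time
    delay. Returns a release rate from the pit in Bq/yr. All parameters are
    illustrative placeholders."""
    t_eff = t_yr - corrosion_delay_yr - backfill_delay_yr
    if t_eff <= 0.0:
        return 0.0
    remaining = inventory_bq * math.exp(-decay_const_per_yr * t_yr)
    # leaching is assumed proportional to water contact, hence to infiltration
    return leach_fraction_per_yr * remaining * min(1.0, infiltration_m3_per_yr / 0.05)

for t in (10, 80, 200, 500):
    print(t, f"{pit_release_rate(t):.3e}")
```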

  15. Geological and historical evidence of irregular recurrent earthquakes in Japan.

    Science.gov (United States)

    Satake, Kenji

    2015-10-28

    Great (M∼8) earthquakes repeatedly occur along the subduction zones around Japan and cause fault slip of a few to several metres releasing strains accumulated from decades to centuries of plate motions. Assuming a simple 'characteristic earthquake' model that similar earthquakes repeat at regular intervals, probabilities of future earthquake occurrence have been calculated by a government committee. However, recent studies on past earthquakes including geological traces from giant (M∼9) earthquakes indicate a variety of size and recurrence interval of interplate earthquakes. Along the Kuril Trench off Hokkaido, limited historical records indicate that average recurrence interval of great earthquakes is approximately 100 years, but the tsunami deposits show that giant earthquakes occurred at a much longer interval of approximately 400 years. Along the Japan Trench off northern Honshu, recurrence of giant earthquakes similar to the 2011 Tohoku earthquake with an interval of approximately 600 years is inferred from historical records and tsunami deposits. Along the Sagami Trough near Tokyo, two types of Kanto earthquakes with recurrence interval of a few hundred years and a few thousand years had been recognized, but studies show that the recent three Kanto earthquakes had different source extents. Along the Nankai Trough off western Japan, recurrence of great earthquakes with an interval of approximately 100 years has been identified from historical literature, but tsunami deposits indicate that the sizes of the recurrent earthquakes are variable. Such variability makes it difficult to apply a simple 'characteristic earthquake' model for the long-term forecast, and several attempts such as use of geological data for the evaluation of future earthquake probabilities or the estimation of maximum earthquake size in each subduction zone are being conducted by government committees. © 2015 The Author(s).
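
    To make the contrast between a time-dependent ('characteristic earthquake') forecast and a memoryless one concrete, the sketch below evaluates the conditional probability of an event in a 30-year window under a lognormal renewal model and compares it with a Poisson model. This is a generic textbook calculation, not the method of the government committees cited above; the mean interval, aperiodicity, and elapsed time are toy numbers.

```python
import math

def lognormal_cdf(t, mean_interval, aperiodicity):
    """CDF of a lognormal renewal model parameterised by the mean recurrence
    interval and a coefficient of variation (aperiodicity)."""
    sigma = math.sqrt(math.log(1.0 + aperiodicity**2))
    mu = math.log(mean_interval) - 0.5 * sigma**2
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_probability(elapsed, window, mean_interval, aperiodicity=0.5):
    """P(next event within `window` years | `elapsed` years have passed)."""
    f_e = lognormal_cdf(elapsed, mean_interval, aperiodicity)
    f_w = lognormal_cdf(elapsed + window, mean_interval, aperiodicity)
    return (f_w - f_e) / (1.0 - f_e)

# toy numbers: ~100-yr mean interval, 90 yr elapsed, 30-yr forecast window
p_renewal = conditional_probability(elapsed=90.0, window=30.0, mean_interval=100.0)
p_poisson = 1.0 - math.exp(-30.0 / 100.0)   # memoryless comparison
print(f"renewal: {p_renewal:.2f}, Poisson: {p_poisson:.2f}")
```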

  16. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from 1901 to 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  17. Ionospheric Anomaly before Kyushu|Japan Earthquake

    Directory of Open Access Journals (Sweden)

    YANG Li

    2017-05-01

    GIM data released by IGS are used in this article, and a new method combining the sliding time window method with correlation analysis of ionospheric TEC at adjacent grid points is proposed to study the relationship between pre-earthquake ionospheric anomalies and the earthquake. By analyzing the TEC changes at the five grid points around the seismic region, abnormal changes in ionospheric TEC are found before the earthquake, and the correlation between the TEC sequences of adjacent grid points is significantly affected by the earthquake. Based on the analysis of the spatial distribution of the TEC anomaly, anomalies lasting 6 h, 12 h and 6 h were found near the epicenter three days before the earthquake. Finally, ionospheric tomography is used to invert the electron density, and the distribution of the electron density within the ionospheric anomaly is further analyzed.
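
    The sliding-time-window idea can be illustrated with a few lines of Python: each TEC sample is compared against interquartile-range bounds computed from the preceding window. The window length, the bound multiplier, and the synthetic TEC series are assumptions; the paper's exact anomaly criterion and its grid-point correlation step are not reproduced here.

```python
import numpy as np

def sliding_window_anomalies(tec, window=30, k=1.5):
    """Flag TEC values lying outside [Q1 - k*IQR, Q3 + k*IQR] computed over the
    preceding `window` samples (a common sliding-window bound; illustrative)."""
    tec = np.asarray(tec, dtype=float)
    flags = np.zeros(len(tec), dtype=bool)
    for i in range(window, len(tec)):
        q1, q3 = np.percentile(tec[i - window:i], [25, 75])
        iqr = q3 - q1
        flags[i] = tec[i] < q1 - k * iqr or tec[i] > q3 + k * iqr
    return flags

rng = np.random.default_rng(0)
tec = 20.0 + 2.0 * np.sin(np.linspace(0, 12 * np.pi, 360)) + rng.normal(0, 0.5, 360)
tec[300:303] += 8.0                      # synthetic positive anomaly
print(np.nonzero(sliding_window_anomalies(tec))[0])
```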

  18. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    level of shaking intensity with empirical models of fatality losses calibrated on past earthquakes in each country. Non-seismic detections and macroseismic questionnaires collected online are combined to identify as many as possible of the felt earthquakes regardless of their magnitude. Non-seismic detections include Twitter earthquake detections, developed by the US Geological Survey, where the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. Altogether, we estimate that the number of detected felt earthquakes is around 1 000 per year, compared with the 35 000 earthquakes annually reported by the EMSC! Felt events are already the subject of the web page "Latest significant earthquakes" on the EMSC website (http://www.emsc-csem.org/Earthquake/significant_earthquakes.php) and of a dedicated Twitter service @LastQuake. We will present the identification process of the earthquakes that matter, the smartphone application itself (to be released in May) and its future evolutions.

  19. The impact of the accuracy of indicator diagrams on the heat release characteristics calculation, used in the diagnosis of marine diesel engine

    Directory of Open Access Journals (Sweden)

    Witkowski Kazimierz

    2017-01-01

    The paper analyzes the possibility of using electronic indicators in the diagnosis of marine engines. It has been shown that in-depth analysis of indicator diagrams would be useful, in particular the calculation of heat release characteristics. To make this possible, indicating systems should meet a number of important requirements in order to ensure that they can be used for diagnostic purposes. These include high-precision sensors for the measurement of cylinder pressure, and high speed and accuracy in measuring and recording the measured values. They also include reliable determination of the piston top dead center (TDC). To demonstrate the impact of a TDC positioning error, a simulation study was conducted using indicator diagrams obtained from a medium-speed four-stroke marine diesel engine (type A25/30) and a low-speed two-stroke marine diesel engine (Sulzer type RTA76).
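
    The heat release characteristics referred to above are commonly computed from the indicator diagram with the single-zone first-law expression dQ/dθ = γ/(γ−1)·p·dV/dθ + 1/(γ−1)·V·dp/dθ. The sketch below applies it to a synthetic pressure trace and shifts the crank-angle axis by one degree to mimic a TDC determination error; the cylinder geometry, the constant γ, and the pressure trace are illustrative assumptions, not data from the engines named in the paper.

```python
import numpy as np

GAMMA = 1.35        # assumed constant ratio of specific heats

def cylinder_volume(theta_deg, vd=0.00175, rc=14.0, conrod_ratio=3.5):
    """Instantaneous cylinder volume (m^3) from slider-crank kinematics.
    vd: displaced volume, rc: compression ratio, conrod_ratio: l/a."""
    th = np.radians(theta_deg)
    vc = vd / (rc - 1.0)
    return vc + 0.5 * vd * (conrod_ratio + 1.0 - np.cos(th)
                            - np.sqrt(conrod_ratio**2 - np.sin(th)**2))

def apparent_heat_release(theta_deg, p_pa, **geom):
    """Single-zone net apparent heat release rate dQ/dtheta (J/deg) from a
    cylinder pressure trace: dQ = gamma/(gamma-1) p dV + 1/(gamma-1) V dp."""
    v = cylinder_volume(theta_deg, **geom)
    dv = np.gradient(v, theta_deg)
    dp = np.gradient(p_pa, theta_deg)
    return GAMMA / (GAMMA - 1.0) * p_pa * dv + 1.0 / (GAMMA - 1.0) * v * dp

# shifting the crank-angle axis mimics a TDC determination error
theta = np.linspace(-60, 60, 241)
p = 4e6 * np.exp(-((theta - 8.0) / 18.0) ** 2) + 2e5     # synthetic trace, Pa
q_true = apparent_heat_release(theta, p)
q_biased = apparent_heat_release(theta + 1.0, p)         # 1 deg TDC error
print(f"peak dQ/dtheta shift: {abs(q_true.max() - q_biased.max()):.1f} J/deg")
```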

  20. COMBUSTION HEAT RELEASE RATE ANALYSIS OF C.I. ENGINE WITH SECONDARY CO-INJECTION OF DEE-H2O SOLUTION - A VIBRATIONAL APPROACH

    Directory of Open Access Journals (Sweden)

    Y. V. V. SATYANARAYANA MURTHY

    2015-08-01

    This paper discusses the combustion propensity of a single-cylinder direct injection engine fueled with palm kernel methyl ester (PKME), which is a non-edible oil, and a secondary co-injection of saturated diethyl ether (DEE) with water. DEE along with water is fumigated through a high-pressure nozzle fitted to the inlet manifold of the engine, and the flow rate of the secondary injection was electronically controlled. DEE is known to improve the cold starting problem in engines when used in straight diesel fuel. However, its application in emulsion form is little known. Experimental results show that for 5% DEE-H2O solution injection, the occurrence of the maximum net heat release rate is delayed owing to controlled premixed combustion, which helps improve torque conversion while the piston is accelerating. Vibration measurements in the frequency range of 900 Hz to 1300 Hz revealed that a new mode of combustion has taken place, with different excitation frequencies.

  1. Authorized Limits for the Release of a 25 Ton Locomotive, Serial Number 21547, at the Area 25 Engine Maintenance, Assembly, and Disassembly Facility, Nevada Test Site, Nevada

    International Nuclear Information System (INIS)

    Gwin, Jeremy; Frenette, Douglas

    2010-01-01

    This document contains process knowledge and radiological data and analysis to support approval for release of the 25-ton locomotive, Serial Number 21547, at the Area 25 Engine Maintenance, Assembly, and Disassembly (EMAD) Facility, located on the Nevada Test Site (NTS). The 25-ton locomotive is a small, one-of-a-kind locomotive used to move railcars in support of the Nuclear Engine for Rocket Vehicle Application project. This locomotive was identified as having significant historical value by the Nevada State Railroad Museum in Boulder City, Nevada, where it will be used as a display piece. A substantial effort to characterize the radiological conditions of the locomotive was undertaken by the NTS Management and Operations Contractor, National Security Technologies, LLC (NSTec). During this characterization process, seven small areas on the locomotive had contamination levels that exceeded the NTS release criteria (limits consistent with U.S. Department of Energy (DOE) Order DOE O 5400.5, 'Radiation Protection of the Public and the Environment'). The decision was made to perform radiological decontamination of these known accessible impacted areas to further the release process. On February 9, 2010, NSTec personnel completed decontamination of these seven areas to within the NTS release criteria. Although all accessible areas of the locomotive had been successfully decontaminated to within NTS release criteria, it was plausible that inaccessible areas of the locomotive (i.e., those areas on the locomotive where it was not possible to perform radiological surveys) could potentially have contamination above unrestricted release limits. To access the majority of these inaccessible areas, the locomotive would have to be disassembled. A complete disassembly for a full radiological survey could have permanently destroyed parts and would have ruined the historical value of the locomotive. Complete disassembly would also add an unreasonable financial burden for the

  2. The fast release of stem cells from alginate-fibrin microbeads in injectable scaffolds for bone tissue engineering

    Science.gov (United States)

    Zhou, Hongzhi; Xu, Hockin H. K.

    2011-01-01

    Stem cell-encapsulating hydrogel microbeads of several hundred microns in size suitable for injection, that could quickly degrade to release the cells, are currently unavailable. The objectives of this study were to: (1) develop oxidized alginate-fibrin microbeads encapsulating human umbilical cord mesenchymal stem cells (hUCMSCs); (2) investigate microbead degradation, cell release, and osteogenic differentiation of the released cells for the first time. Three types of microbeads were fabricated to encapsulate hUCMSCs: (1) Alginate microbeads; (2) oxidized alginate microbeads; (3) oxidized alginate-fibrin microbeads. Microbeads with sizes of about 100–500 µm were fabricated with 1×106 hUCMSCs/mL of alginate. For the alginate group, there was little microbead degradation, with very few cells released at 21 d. For oxidized alginate, the microbeads started to slightly degrade at 14 d. In contrast, the oxidized alginate-fibrin microbeads started to degrade at 4 d and released the cells. At 7 d, the number of released cells greatly increased and showed a healthy polygonal morphology. At 21 d, the oxidized alginate-fibrin group had a live cell density that was 4-fold that of the oxidized alginate group, and 15-fold that of the alginate group. The released cells had osteodifferentiation, exhibiting highly elevated bone marker gene expressions of ALP, OC, collagen I, and Runx2. Alizarin staining confirmed the synthesis of bone minerals by hUCMSCs, with the mineral concentration at 21 d being 10-fold that at 7 d. In conclusion, fast-degradable alginate-fibrin microbeads with hUCMSC encapsulation were developed that could start to degrade and release the cells at 4 d. The released hUCMSCs had excellent proliferation, osteodifferentiation, and bone mineral synthesis. The alginate-fibrin microbeads are promising to deliver stem cells inside injectable scaffolds to promote tissue regeneration. PMID:21757229

  3. How fault geometry controls earthquake magnitude

    Science.gov (United States)

    Bletery, Q.; Thomas, A.; Karlstrom, L.; Rempel, A. W.; Sladen, A.; De Barros, L.

    2016-12-01

    Recent large megathrust earthquakes, such as the Mw9.3 Sumatra-Andaman earthquake in 2004 and the Mw9.0 Tohoku-Oki earthquake in 2011, astonished the scientific community. The first event occurred in a relatively low-convergence-rate subduction zone where events of its size were unexpected. The second event involved 60 m of shallow slip in a region thought to be aseismically creeping and hence incapable of hosting very large magnitude earthquakes. These earthquakes highlight gaps in our understanding of mega-earthquake rupture processes and the factors controlling their global distribution. Here we show that gradients in dip angle exert a primary control on mega-earthquake occurrence. We calculate the curvature along the major subduction zones of the world and show that past mega-earthquakes occurred on flat (low-curvature) interfaces. A simplified analytic model demonstrates that shear strength heterogeneity increases with curvature. Stress loading on flat megathrusts is more homogeneous and hence more likely to be released simultaneously over large areas than on highly-curved faults. Therefore, the absence of asperities on large faults might counter-intuitively be a source of higher hazard.
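
    A rough sketch of the kind of geometric calculation described (computing dip and its along-surface gradient from a gridded slab-depth model) is given below. The toy depth surface and grid spacing are assumptions; the authors' actual curvature metric and slab models are not reproduced.

```python
import numpy as np

def dip_and_dip_gradient(depth_km, dx_km, dy_km):
    """Dip angle (deg) of a gridded slab-depth surface and the magnitude of its
    gradient (deg/km), used here as a rough proxy for the interface curvature
    discussed above."""
    dz_dy, dz_dx = np.gradient(depth_km, dy_km, dx_km)
    dip = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    ddip_dy, ddip_dx = np.gradient(dip, dy_km, dx_km)
    return dip, np.hypot(ddip_dx, ddip_dy)

# toy slab: depth increases landward, with a gentle along-dip bend
x = np.linspace(0, 200, 101)       # km, trench-normal
y = np.linspace(0, 300, 151)       # km, along-strike
X, _ = np.meshgrid(x, y)
depth = 5 + 0.12 * X + 0.0008 * X**2          # km
dip, ddip = dip_and_dip_gradient(depth, x[1] - x[0], y[1] - y[0])
print(f"dip range {dip.min():.1f}-{dip.max():.1f} deg, "
      f"max |grad dip| {ddip.max():.3f} deg/km")
```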

  4. A field release of genetically engineered gypsy moth (Lymantria dispar L.) Nuclear Polyhedrosis Virus (LdNPV)

    Science.gov (United States)

    Vincent D' Amico; Joseph S. Elkinton; John D. Podgwaite; James M. Slavicek; Michael L. McManus; John P. Burand

    1999-01-01

    The gypsy moth (Lymantria dispar L.) nuclear polyhedrosis virus was genetically engineered for nonpersistence by removal of the gene coding for polyhedrin production and stabilized using a coocclusion process. A β-galactosidase marker gene was inserted into the genetically engineered virus (LdGEV) so that infected larvae could be tested for...

  5. Radiological effluents released from nuclear rocket and ramjet engine tests at the Nevada Test Site 1959 through 1969: Fact Book

    Energy Technology Data Exchange (ETDEWEB)

    Friesen, H.N.

    1995-06-01

    Nuclear rocket and ramjet engine tests were conducted on the Nevada Test Site (NTS) in Area 25 and Area 26, about 80 miles northwest of Las Vegas, Nevada, from July 1959 through September 1969. This document presents a brief history of the nuclear rocket engine tests, information on the off-site radiological monitoring, and descriptions of the tests.

  6. Modeling of heat release and emissions from droplet combustion of multi component fuels in compression ignition engines

    DEFF Research Database (Denmark)

    Ivarsson, Anders

    emissions from the compression ignition engines (CI engines or diesel engines) are continuously increased. To comply with this, better modeling tools for the diesel combustion process are desired from the engine developers. The complex combustion process of a compression ignition engine may be divided … it is well suited for optical line-of-sight diagnostics in both pre- and post-combustion regions. The work also includes some preliminary studies of radiant emissions from helium stabilized ethylene/air and methane/oxygen flames. It is demonstrated that nanoparticles below the sooting threshold actually … of ethylene/air flames well known from the experimental work, was used for the model validation. Two cases were helium stabilized flames with φ = 1 and 2.14. The third case was an unstable flame with φ = 2.14. The unstable case was used to test whether a transient model would be able to predict the frequency …

  7. Environmental pH-controlled loading and release of protein on mesoporous hydroxyapatite nanoparticles for bone tissue engineering.

    Science.gov (United States)

    Zhang, Ning; Gao, Tianlin; Wang, Yu; Wang, Zongliang; Zhang, Peibiao; Liu, Jianguo

    2015-01-01

    To explore the controlled delivery of protein drugs in the micro-environment established by osteoblasts or osteoclasts, the pH-dependent loading/release properties of bovine serum albumin (BSA) were assessed. The adsorption amounts over mesoporous hydroxyapatite (MHA) or hydroxyapatite (HA) decreased as the pH increased, negatively correlating with zeta-potential values. The adsorption behavior over MHA fits well with the Freundlich and Langmuir models at different pHs. The results suggest that the adsorbed amount of protein on MHA or HA depended on the pH of the protein solution. MHA that adsorbed BSA at basic pH (MHApH 8.4) exhibited different release kinetics compared with those in acidic and neutral environments (MHApH 4.7 and MHApH 7.4), indicating that protein release could be regulated by the environmental pH at which the MHAs adsorb protein. MHApH 8.4 showed sustained release for 6 h before a gradual release when immersed in an acidic environment, which is 2 h longer than in a neutral environment. This suggests that MHApH 8.4 shows more sustained release in an acidic environment, such as that established by osteoclasts. The variation of adsorption strength between protein and MHA may be responsible for these behaviors. Our findings may be very useful for the development of MHA applications in both bone repair and protein delivery. Copyright © 2014. Published by Elsevier B.V.
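
    For readers unfamiliar with the isotherm models mentioned above, the sketch below fits Langmuir and Freundlich isotherms to a synthetic adsorption data set with scipy.optimize.curve_fit. The equilibrium concentrations and adsorbed amounts are placeholders, not the measurements of the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, k):
    """Langmuir isotherm: monolayer adsorption with affinity constant k."""
    return q_max * k * c / (1.0 + k * c)

def freundlich(c, k_f, n):
    """Freundlich isotherm: empirical power-law adsorption."""
    return k_f * c ** n

# synthetic equilibrium data (mg/mL in solution vs mg adsorbed per g sorbent);
# placeholder values, not measurements from the study
c_eq = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 4.0])
q_ads = np.array([18.0, 35.0, 55.0, 78.0, 96.0, 108.0])

popt_l, _ = curve_fit(langmuir, c_eq, q_ads, p0=[120.0, 1.0])
popt_f, _ = curve_fit(freundlich, c_eq, q_ads, p0=[60.0, 0.5])
print(f"Langmuir:   q_max={popt_l[0]:.1f}, K={popt_l[1]:.2f}")
print(f"Freundlich: Kf={popt_f[0]:.1f}, n={popt_f[1]:.2f}")
```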

  8. The Road to Total Earthquake Safety

    Science.gov (United States)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  9. A Decade of Giant Earthquakes - What does it mean?

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Terry C. Jr. [Los Alamos National Laboratory

    2012-07-16

    On December 26, 2004 the largest earthquake since 1964 occurred near Aceh, Indonesia. The magnitude 9.2 earthquake and subsequent tsunami killed a quarter of a million people; it also marked the beginning of a period of extraordinary seismicity. Since the Aceh earthquake there have been 16 magnitude 8 earthquakes globally, including 2 this last April. For the 100 years previous to 2004 there was an average of 1 magnitude 8 earthquake every 2.2 years; since 2004 there have been 2 per year. Since magnitude 8 earthquakes dominate global seismic energy release, this period of seismicity has seismologists rethinking what they understand about plate tectonics and the connectivity between giant earthquakes. This talk will explore this remarkable period of time and its possible implications.
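
    A quick back-of-the-envelope check of how unusual the post-2004 cluster looks under a purely random (Poisson) model, ignoring any physical triggering or catalogue issues, can be written as follows; the event count and time span are taken loosely from the abstract.

```python
import math

def poisson_tail(k_obs, lam):
    """P(X >= k_obs) for a Poisson random variable with mean lam."""
    return 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                     for k in range(k_obs))

years = 7.5                     # roughly December 2004 to mid 2012
background_rate = 1.0 / 2.2     # M>=8 events per year, pre-2004 average
lam = background_rate * years
print(f"expected events: {lam:.1f}, observed: 16, "
      f"P(>=16 | Poisson) = {poisson_tail(16, lam):.2e}")
```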

  10. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  11. Influence of environmental parameters and of their interactions on the release of metal(loid)s from a construction material in hydraulic engineering.

    Science.gov (United States)

    Schmukat, A; Duester, L; Goryunova, E; Ecker, D; Heininger, P; Ternes, T A

    2016-03-05

    Besides the leaching behaviour of a construction material under standardised, test-specific conditions with laboratory water, for some construction materials it is advisable to also test their environmental behaviour under close-to-end-use conditions. The envisaged end use, combined with the product characteristics (e.g. mineral phases), is decisive for the choice of environmental factors that may change the release of substances that potentially cause adverse environmental effects (e.g. fertilisation or ecotoxicity). At the moment an experimental link is missing between mono-factorial standardised test systems and non-standardised complex incubation experiments, such as mesocosms, which are closer to environmental conditions. Multi-factorial batch experiments may have the potential to close this gap. To verify this, batch experiments were performed with copper slag, which is used as armour stone in hydraulic engineering. Design of experiments (DoE) was applied to evaluate the impact of pH, ionic strength, temperature and sediment content on the release of As, Cu, Mo, Ni, Pb, Sb and Zn. The study shows that the release and the sediment-eluent partitioning of metal(loid)s are affected by interactions between the studied factors. Under the prevalent test conditions the sediment acts as a sink, enhancing most strongly the release of elements from the material. Copyright © 2015 Elsevier B.V. All rights reserved.
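
    The design-of-experiments approach mentioned above can be illustrated with a two-level full factorial in the four factors studied. The sketch below generates a synthetic release response and estimates main effects and one interaction in the classical way (mean at the high level minus mean at the low level); the response model and all numbers are invented for illustration.

```python
import itertools
import numpy as np

# coded two-level design for the four factors studied above (illustrative)
factors = ["pH", "ionic_strength", "temperature", "sediment"]
design = np.array(list(itertools.product([-1, 1], repeat=4)), dtype=float)

rng = np.random.default_rng(1)
# synthetic release response: main effects of pH and sediment plus a
# pH x sediment interaction, with noise -- placeholder numbers only
y = 50 - 8 * design[:, 0] + 12 * design[:, 3] - 5 * design[:, 0] * design[:, 3] \
    + rng.normal(0, 1.5, len(design))

def effect(column):
    """Classical factorial effect: mean(y at +1) - mean(y at -1)."""
    return y[column > 0].mean() - y[column < 0].mean()

for i, name in enumerate(factors):
    print(f"main effect {name:>14}: {effect(design[:, i]):+.1f}")
print(f"interaction pH x sediment: {effect(design[:, 0] * design[:, 3]):+.1f}")
```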

  12. Idaho National Engineering Laboratory response to the December 13, 1991, Congressional inquiry on offsite release of hazardous and solid waste containing radioactive materials from Department of Energy facilities

    International Nuclear Information System (INIS)

    Shapiro, C.; Garcia, K.M.; McMurtrey, C.D.; Williams, K.L.; Jordan, P.J.

    1992-05-01

    This report is a response to the December 13, 1991, Congressional inquiry that requested information on all hazardous and solid waste containing radioactive materials sent from Department of Energy facilities to offsite facilities for treatment or disposal since January 1, 1981. This response is for the Idaho National Engineering Laboratory. Other Department of Energy laboratories are preparing responses for their respective operations. The request includes ten questions, which the report divides into three parts, each responding to a related group of questions. Part 1 answers Questions 5, 6, and 7, which call for a description of Department of Energy and contractor documentation governing the release of waste containing radioactive materials to offsite facilities. "Offsite" is defined as non-Department of Energy and non-Department of Defense facilities, such as commercial facilities. Also requested is a description of the review process for relevant release criteria and a list of all Department of Energy and contractor documents concerning release criteria as of January 1, 1981. Part 2 answers Questions 4, 8, and 9, which call for information about actual releases of waste containing radioactive materials to offsite facilities from 1981 to the present, including radiation levels and pertinent documentation. Part 3 answers Question 10, which requests a description of the process for selecting offsite facilities for treatment or disposal of waste from Department of Energy facilities. In accordance with instructions from the Department of Energy, the report does not address Questions 1, 2, and 3.

  13. On the experimental approaches for the assessment of the release of engineered nanomaterials from nanocomposites by physical degradation processes

    International Nuclear Information System (INIS)

    Blázquez, M; Unzueta, I; Egizabal, A

    2014-01-01

    The LIFE+ Project SIRENA, Simulation of the release of nanomaterials from consumer products for environmental exposure assessment (LIFE11 ENV/ES/596), has set up a Technological Surveillance System (TSS) to trace technical references at the worldwide level related to nanocomposites and the release from nanocomposites. So far a total of seventy-three items of different nature (from peer-reviewed articles to presentations and contributions to congresses) have been selected and classified as 'nanomaterials release simulation technologies'. In the present document, different approaches for the simulation of different life cycle stages through the physical degradation of polymer nanocomposites at laboratory scale are assessed. In the absence of a reference methodology, the comparison of the different protocols used still remains a challenge.

  14. On the experimental approaches for the assessment of the release of engineered nanomaterials from nanocomposites by physical degradation processes

    Science.gov (United States)

    Blázquez, M.; Egizabal, A.; Unzueta, I.

    2014-08-01

    The LIFE+ Project SIRENA, Simulation of the release of nanomaterials from consumer products for environmental exposure assessment (LIFE11 ENV/ES/596), has set up a Technological Surveillance System (TSS) to trace technical references at the worldwide level related to nanocomposites and the release from nanocomposites. So far a total of seventy-three items of different nature (from peer-reviewed articles to presentations and contributions to congresses) have been selected and classified as "nanomaterials release simulation technologies". In the present document, different approaches for the simulation of different life cycle stages through the physical degradation of polymer nanocomposites at laboratory scale are assessed. In the absence of a reference methodology, the comparison of the different protocols used still remains a challenge.

  15. Pathways and mechanisms for product release in the engineered haloalkane dehalogenases explored using classical and random acceleration molecular dynamics simulations

    Czech Academy of Sciences Publication Activity Database

    Klvana, M.; Pavlová, M.; Koudeláková, T.; Chaloupková, R.; Dvořák, P.; Prokop, Z.; Stsiapanava, A.; Kutý, Michal; Kutá-Smatanová, Ivana; Dohnálek, Jan; Kulhánek, P.; Damborský, J.

    2009-01-01

    Vol. 392, No. 5 (2009), pp. 1339-1356. ISSN 0022-2836. R&D Projects: GA MŠk(CZ) LC06010. Institutional research plan: CEZ:AV0Z40500505; CEZ:AV0Z60870520. Keywords: haloalkane dehalogenase * product release * random acceleration molecular dynamics. Subject RIV: CD - Macromolecular Chemistry. Impact factor: 3.871, year: 2009

  16. The January 17, 1994 Northridge Earthquake: Effects on selected industrial facilities and lifelines

    Energy Technology Data Exchange (ETDEWEB)

    Eli, M.W.; Sommer, S.C. [Lawrence Livermore National Lab., CA (United States); Roche, T.R.; Merz, K.L.

    1995-02-01

    Revision 0 of this report is being published in February 1995 to closely mark the one-year anniversary of the Northridge Earthquake. A September 1994 Draft version of the report was reviewed by DOE and NRC, and many of the review comments are incorporated into Revision 0. While this revision of the report is not entirely complete, it is being made available for comment, review, and evaluation. Since the report was written by several authors, sections of the report have slightly different styles. Several sections of Revision 0 are not complete, but are planned to be completed in Revision 1. The primary unfinished section is Section 3.3 on Electric Power Transmission. Other sections of Revision 0, such as Section 4.5.2 on the Energy Technology Engineering Center and 3.2 on Electric Power Generation, will be enhanced with further detailed information as it becomes available. In addition, further data, including processed response spectra for investigated facilities and cataloging of relay performance, will be added to Revision 1 depending upon investigation support. While Revision 0 of this report is being published by LLNL, Revision 1 is planned to be published by EPRI. The anticipated release date for Revision 1 is December 1995. Unfortunately, the one-year anniversary of the Northridge Earthquake was also marked by the devastating Hyogo-Ken Nanbu (or Hanshin-Awaji) Earthquake in Kobe, Japan. As compared to the Northridge Earthquake, there were many more deaths, collapsed structures, destroyed lifelines, and fires following the Kobe Earthquake. Lessons from the Kobe Earthquake will both reemphasize topics discussed in this report and provide further issues to be addressed when designing and retrofitting structures, systems, and components for seismic strong motion.

  17. Inventory of Engineered Nanoparticle-Containing Consumer Products Available in the Singapore Retail Market and Likelihood of Release into the Aquatic Environment.

    Science.gov (United States)

    Zhang, Yuanyuan; Leu, Yu-Rui; Aitken, Robert J; Riediker, Michael

    2015-07-24

    Consumer products containing engineered nanoparticles (ENP) are already entering the marketplace. This leads, inter alia, to questions about the potential for release of ENP into the environment from commercial products. We have inventoried the prevalence of ENP-containing consumer products in the Singapore market by carrying out onsite assessments of products sold in all major chains of retail and cosmetic stores. We have assessed their usage patterns and estimated release factors and emission quantities to obtain a better understanding of the quantities of ENP that are released into which compartments of the aquatic environment in Singapore. Products investigated were assessed for their likelihood to contain ENP based on the declaration of ENP by producers, feature descriptions, and the information on particle size from the literature. Among the 1,432 products investigated, 138 were "confirmed" and 293 were "likely" to contain ENP. Product categories included sunscreens, cosmetics, health and fitness, automotive, food, home and garden, clothing and footwear, and eyeglass/lens coatings. Among the 27 different types of nanomaterials identified, SiO2 was predominant, followed by TiO2 and ZnO, Carbon Black, Ag, and Au. The amounts of ENP released into the aquatic system, which was estimated on the basis of typical product use, ENP concentration in the product, daily use quantity, release factor, and market share, were in the range of several hundred tons per year. As these quantities are likely to increase, it will be important to further study the fate of ENP that reach the aquatic environment in Singapore.
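
    The release estimate described in the abstract is essentially a product of a few factors. The sketch below writes that arithmetic out for a single hypothetical product category; the numbers are placeholders and are not the study's estimates.

```python
def annual_enp_release_kg(enp_mass_fraction, daily_use_g_per_person,
                          release_factor, market_share, users):
    """Annual mass of engineered nanoparticles (kg/yr) reaching wastewater from
    one product category, following the factors listed in the abstract.
    All example numbers below are illustrative, not the study's estimates."""
    g_per_year = (enp_mass_fraction * daily_use_g_per_person * release_factor
                  * market_share * users * 365.0)
    return g_per_year / 1000.0

# e.g. a rinse-off sunscreen containing ~5% TiO2, assumed fully washed off
print(f"{annual_enp_release_kg(0.05, 2.0, 1.0, 0.3, 3_000_000):.0f} kg/yr")
```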

  18. Inventory of Engineered Nanoparticle-Containing Consumer Products Available in the Singapore Retail Market and Likelihood of Release into the Aquatic Environment

    Directory of Open Access Journals (Sweden)

    Yuanyuan Zhang

    2015-07-01

    Full Text Available Consumer products containing engineered nanoparticles (ENP) are already entering the marketplace. This leads, inter alia, to questions about the potential for release of ENP into the environment from commercial products. We have inventoried the prevalence of ENP-containing consumer products in the Singapore market by carrying out onsite assessments of products sold in all major chains of retail and cosmetic stores. We have assessed their usage patterns and estimated release factors and emission quantities to obtain a better understanding of the quantities of ENP that are released into which compartments of the aquatic environment in Singapore. Products investigated were assessed for their likelihood to contain ENP based on the declaration of ENP by producers, feature descriptions, and the information on particle size from the literature. Among the 1,432 products investigated, 138 were “confirmed” and 293 were “likely” to contain ENP. Product categories included sunscreens, cosmetics, health and fitness, automotive, food, home and garden, clothing and footwear, and eyeglass/lens coatings. Among the 27 different types of nanomaterials identified, SiO2 was predominant, followed by TiO2 and ZnO, carbon black, Ag, and Au. The amounts of ENP released into the aquatic system, which were estimated on the basis of typical product use, ENP concentration in the product, daily use quantity, release factor, and market share, were in the range of several hundred tons per year. As these quantities are likely to increase, it will be important to further study the fate of ENP that reach the aquatic environment in Singapore.

  19. Influence of environmental parameters and of their interactions on the release of metal(loid)s from a construction material in hydraulic engineering

    International Nuclear Information System (INIS)

    Schmukat, A.; Duester, L.; Goryunova, E.; Ecker, D.; Heininger, P.; Ternes, T.A.

    2016-01-01

    Highlights: • DoE supported multi-factorial study on the metal(loid) release from copper slag. • Interactions of four parameters were studied and weighted. • An effective separation method between slag and sediment was established. • The metal(loid) partitioning between sediment, slag and eluent is described. • The knowledge on the potential environmental impact of copper slag is increased. - Abstract: Besides the leaching behaviour of a construction material under standardised, test-specific conditions with laboratory water, for some construction materials it is advisable to test their environmental behaviour also under close-to-end-use conditions. The envisaged end use combined with the product characteristics (e.g. mineral phases) is decisive for the choice of environmental factors that may change the release of substances that potentially cause adverse environmental effects (e.g. fertilisation or ecotoxicity). At the moment, an experimental link is missing between mono-factorial standardised test systems and non-standardised complex incubation experiments, such as mesocosms, which are closer to environmental conditions. Multi-factorial batch experiments may have the potential to close this gap. To verify this, batch experiments were performed with copper slag, which is used as armour stone in hydraulic engineering. Design of experiments (DoE) was applied to evaluate the impact of pH, ionic strength, temperature and sediment content on the release of As, Cu, Mo, Ni, Pb, Sb and Zn. The study shows that the release and the sediment-eluent partitioning of metal(loid)s are affected by interactions between the studied factors. Under the prevalent test conditions, sediment acts as a sink, most strongly enhancing the release of elements from the material.
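
    The design-of-experiments approach described above lends itself to a compact illustration. Below is a minimal sketch, assuming a hypothetical two-level full-factorial screening of the four factors named in the abstract (pH, ionic strength, temperature, sediment content); the synthetic response and the fitted coefficients are illustrative only, not values from the study.

```python
# Hypothetical two-level full-factorial screening of four leaching factors.
# Factor levels and the synthetic "release" response are illustrative only.
import itertools
import numpy as np

factors = ["pH", "ionic_strength", "temperature", "sediment"]

# Coded design matrix: every combination of low (-1) and high (+1) levels.
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

# Synthetic response standing in for measured Cu release (mg/L); in a real
# study these would be the batch-test results for each factor combination.
rng = np.random.default_rng(0)
response = (
    0.8 * design[:, 0]                   # main effect of pH
    - 0.5 * design[:, 3]                 # sediment acting as a sink
    + 0.3 * design[:, 0] * design[:, 3]  # pH x sediment interaction
    + rng.normal(0, 0.1, len(design))
)

# Model matrix with intercept, main effects and two-factor interactions,
# then estimate the effects by least squares.
columns, names = [np.ones(len(design))], ["intercept"]
for i, name in enumerate(factors):
    columns.append(design[:, i])
    names.append(name)
for i, j in itertools.combinations(range(len(factors)), 2):
    columns.append(design[:, i] * design[:, j])
    names.append(f"{factors[i]}x{factors[j]}")

X = np.column_stack(columns)
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, c in zip(names, coef):
    print(f"{name:>30s}: {c:+.3f}")
```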

  20. Influence of environmental parameters and of their interactions on the release of metal(loid)s from a construction material in hydraulic engineering

    Energy Technology Data Exchange (ETDEWEB)

    Schmukat, A., E-mail: schmukat@harzwasserwerke.de [Harzwasserwerke GmbH, Zur Granetalsperre 8, 38685 Langelsheim (Germany); Federal Institute of Hydrology, Department of Aquatic Chemistry, Am Mainzer Tor 1, 56068 Koblenz (Germany); Duester, L. [Federal Institute of Hydrology, Department of Aquatic Chemistry, Am Mainzer Tor 1, 56068 Koblenz (Germany); Goryunova, E. [Federal Institute of Hydrology, Department of Aquatic Chemistry, Am Mainzer Tor 1, 56068 Koblenz (Germany); KAPP-Chemie GmbH & Co. KG, Industriestr. 2-4, 56357 Miehlen (Germany); Ecker, D.; Heininger, P.; Ternes, T.A. [Federal Institute of Hydrology, Department of Aquatic Chemistry, Am Mainzer Tor 1, 56068 Koblenz (Germany)

    2016-03-05

    Highlights: • DoE supported multi-factorial study on the metal(loid) release from copper slag. • Interactions of four parameters were studied and weighted. • An effective separation method between slag and sediment was established. • The metal(loid) partitioning between sediment, slag and eluent is described. • The knowledge on the potential environmental impact of copper slag is increased. - Abstract: Besides the leaching behaviour of a construction material under standardised, test-specific conditions with laboratory water, for some construction materials it is advisable to test their environmental behaviour also under close-to-end-use conditions. The envisaged end use combined with the product characteristics (e.g. mineral phases) is decisive for the choice of environmental factors that may change the release of substances that potentially cause adverse environmental effects (e.g. fertilisation or ecotoxicity). At the moment, an experimental link is missing between mono-factorial standardised test systems and non-standardised complex incubation experiments, such as mesocosms, which are closer to environmental conditions. Multi-factorial batch experiments may have the potential to close this gap. To verify this, batch experiments were performed with copper slag, which is used as armour stone in hydraulic engineering. Design of experiments (DoE) was applied to evaluate the impact of pH, ionic strength, temperature and sediment content on the release of As, Cu, Mo, Ni, Pb, Sb and Zn. The study shows that the release and the sediment-eluent partitioning of metal(loid)s are affected by interactions between the studied factors. Under the prevalent test conditions, sediment acts as a sink, most strongly enhancing the release of elements from the material.

  1. A prediction of the UO2 fission gas release data of Bellamy and Rich using a model recently developed by combustion engineering

    International Nuclear Information System (INIS)

    Freeburn, H.R.; Pati, S.R.

    1983-01-01

    The trend in the Light Water Reactor industry to higher discharge burnups of UO2 fuel rods has initiated the modification of existing fuel rod models to better account for high burnup effects. A model recently developed by Combustion Engineering, Inc. (C-E) for fission gas release from UO2 fuel recognizes the separate effects of temperature-dependent and temperature-independent release mechanisms. This model accounts for a moderate burnup enhancement that is based on the concept of a saturation inventory for the intra- and inter-granular storage of fission gas within the fuel pellet. The saturation inventory, as modelled, is strongly dependent on the local temperature and the changing grain size of the fuel with burnup. Although the fitting constants of the model were determined solely from more recent gas release data from fuel more typical of the C-E product line, the model nonetheless provides an excellent prediction of the Bellamy and Rich data over the entire burnup range represented by the data (±1.6% gas release at a 1σ level). The ability to obtain a good comparison with this data base provides additional support for the particular separation of the effects of thermal diffusion and burnup enhancement on fission gas release that is embodied in the model. Furthermore, the degree of burnup enhancement in the model is believed to be moderate enough to suggest that this high burnup effect should not impede the extension of discharge burnup limits associated with current design fuel rods for Pressurized Water Reactors

  2. Clustered and transient earthquake sequences in mid-continents

    Science.gov (United States)

    Liu, M.; Stein, S. A.; Wang, H.; Luo, G.

    2012-12-01

    Earthquakes result from the sudden release of strain energy on faults. On plate boundary faults, strain energy is constantly accumulating from steady and relatively rapid relative plate motion, so large earthquakes continue to occur as long as motion continues on the boundary. In contrast, such steady accumulation of strain energy does not occur on faults in mid-continents, because the far-field tectonic loading is not steadily distributed between faults, and because stress perturbations from complex fault interactions and other stress triggers can be significant relative to the slow tectonic stressing. Consequently, mid-continental earthquakes are often temporally clustered and transient, and spatially migrating. This behavior is well illustrated by large earthquakes in North China in the past two millennia, during which no large earthquake repeated on the same fault segment, but moment release between large fault systems was complementary. Slow tectonic loading in mid-continents also causes long aftershock sequences. We show that the recent small earthquakes in the Tangshan region of North China are aftershocks of the 1976 Tangshan earthquake (M 7.5), rather than indicators of a new phase of seismic activity in North China, as many fear. Understanding the transient behavior of mid-continental earthquakes has important implications for assessing earthquake hazards. The sequence of large earthquakes in the New Madrid Seismic Zone (NMSZ) in the central US, which includes a cluster of M~7 events in 1811-1812 and perhaps a few similar ones in the past millennium, is likely a transient process, releasing previously accumulated elastic strain on recently activated faults. If so, this earthquake sequence will eventually end. Using simple analysis and numerical modeling, we show that the large NMSZ earthquakes may be ending now or in the near future.

  3. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Full Text Available Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the
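
    The quoted ratio P/µ ≈ 0.01 at 10 km depth is easy to reproduce. A minimal sketch, assuming a crustal density of about 2700 kg/m³ and a shear modulus of about 30 GPa (typical textbook values, not figures taken from the abstract):

```python
# Rough check of the confining-pressure-to-shear-modulus ratio at 10 km depth.
# Density and shear modulus are assumed textbook values, not from the paper.
rho = 2700.0      # crustal density, kg/m^3
g = 9.81          # gravitational acceleration, m/s^2
depth = 10e3      # depth, m
mu = 30e9         # shear modulus, Pa

P = rho * g * depth                 # lithostatic confining pressure, Pa
print(f"P    = {P/1e6:.0f} MPa")    # ~265 MPa
print(f"P/mu = {P/mu:.3f}")         # ~0.009, i.e. on the order of 0.01
```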

  4. Level Recession Of Emissions Release By Motor-And-Tractor Diesel Engines Through The Application Of Water-Fuel Emulsions

    Science.gov (United States)

    Ivanov, A.; Chikishev, E.

    2017-01-01

    The paper is dedicated to the problem of environmental pollution by emissions of hazardous substances with the exhaust gases of internal combustion engines. It is found that the application of water-fuel emulsions yields the best results in diesel engines, where producing a high-quality fuel-air mixture is the main challenge in organizing the working process. During pilot studies, a patented water-fuel emulsion composition was developed. The developed composition keeps the emulsion stable for 14-18 months, depending on the mass content of its components, whereas the stability of analogous emulsions is 8-12 months. The mode of operation of the pilot unit is described. The methodology and results of a pilot study of diesel engine operation on a water-fuel emulsion are presented. Shortening the droplet combustion time of a water-fuel emulsion improves combustion efficiency and reduces carbon deposits (varnish) on working surfaces. Partial dismantling of the engine after 60 hours of operation showed a visually observable removal of carbon deposits in the cylinder-piston group. It is found that a water-fuel emulsion with a water concentration of 18-20% is optimal for steady diesel operation and for reducing the level of emission of hazardous substances.

  5. The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault

    Science.gov (United States)

    Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.

    2011-01-01

    In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and California Geological Survey that hypothesizes the occurrence and effects of a Mw7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m and up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km2. A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail, in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.

  6. The research committee of Chuetsu-oki earthquake influences to Kashiwazaki-Kariwa Nuclear Power Station. How to press-release and take care of expression in articles at press side

    International Nuclear Information System (INIS)

    Hamamoto, Kazuko; Narabayashi, Tadashi; Kobayashi, Masahide; Akizuki, Teruo; Onishi, Hidetoshi

    2009-01-01

    Regarding the influences of the Chuetsu-Oki Earthquake on the Kashiwazaki-Kariwa Nuclear Power Station, we can conclude that the safety functions of the station ('shutdown', 'cooling down' and 'isolating') worked as designed even against an earthquake beyond the design assumption, so that fundamental nuclear safety was assured. Nevertheless, mass media coverage is said to have been one of the causes of the harmful rumors that spread. For future press reports on nuclear power stations affected by earthquakes or experiencing trouble, we propose that publication be made genuinely useful for residents and citizens and be promoted in a positive and constructive direction, so as not to repeat the same mistake. JSME should aim to implement this proposal in cooperation with other academic societies and organizations. (author)

  7. Cell-surface engineering by a conjugation-and-release approach based on the formation and cleavage of oxime linkages upon mild electrochemical oxidation and reduction.

    Science.gov (United States)

    Pulsipher, Abigail; Dutta, Debjit; Luo, Wei; Yousaf, Muhammad N

    2014-09-01

    We report a strategy to rewire cell surfaces for the dynamic control of ligand composition on cell membranes and the modulation of cell-cell interactions to generate three-dimensional (3D) tissue structures, with applications in stem-cell differentiation, cell-surface tailoring, and tissue engineering. We tailored cell surfaces with bioorthogonal chemical groups on the basis of a liposome-fusion and -delivery method to create dynamic, electroactive, and switchable cell-tissue assemblies through chemoselective conjugation and release chemistry. Each step of cell-surface modification (activation, conjugation, release, and regeneration) can be monitored and modulated by noninvasive, label-free analytical techniques. We demonstrate the utility of this methodology by the conjugation and release of small molecules to and from cell surfaces and by the generation of 3D coculture spheroids and multilayered cell tissues that can be programmed to undergo assembly and disassembly on demand. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. A poly(glycerol sebacate)-coated mesoporous bioactive glass scaffold with adjustable mechanical strength, degradation rate, controlled-release and cell behavior for bone tissue engineering.

    Science.gov (United States)

    Lin, Dan; Yang, Kai; Tang, Wei; Liu, Yutong; Yuan, Yuan; Liu, Changsheng

    2015-07-01

    Various requirements in the field of tissue engineering have motivated the development of three-dimensional scaffolds with adjustable physicochemical properties and biological functions. A series of multiparameter-adjustable mesoporous bioactive glass (MBG) scaffolds with an uncrosslinked poly(glycerol sebacate) (PGS) coating was prepared in this article. The MBG scaffold was prepared by a modified F127/PU co-templating process, and PGS was then coated by a simple adsorption and lyophilization process. By controlling the macropore parameters and the PGS coating amount, the mechanical strength, degradation rate, controlled release and cell behavior of the composite scaffold could be modulated over a wide range. The PGS coating successfully endowed the MBG scaffold with improved toughness and adjustable mechanical strength covering the bearing range of trabecular bone (2-12 MPa). Multilevel degradation rates of the scaffold and controlled-release rates of protein from the mesopores could be achieved, with little impact on the protein activity owing to an "ultralow-solvent" coating and "nano-cavity entrapment" immobilization method. In vitro studies indicated that the PGS coating promoted cell attachment and proliferation in a dose-dependent manner, without affecting the osteogenic induction capacity of the MBG substrate. These results provide the first strong evidence that uncrosslinked PGS can also yield extraordinary achievements in a traditional MBG scaffold. With this multiparameter adjustability, the composite MBG/PGS scaffolds have a promising prospect in bone tissue engineering. The design considerations and coating method of this study can also be extended to other ceramic-based artificial scaffolds and are expected to provide new thoughts on the development of future tissue engineering materials. Copyright © 2015 Elsevier B.V. All rights reserved.

  9. The 2016 Kumamoto earthquake sequence.

    Science.gov (United States)

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  10. The 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  11. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  12. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  13. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies of ionospheric earthquake precursors, the development of a program on processes in the earthquake focus, and the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space-based system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  14. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training about earthquakes received in primary schools is considered…

  15. POST Earthquake Debris Management — AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency-management perspective in this geotechnical approach, which takes into account the different criteria related to the operation execution, is proposed by highlighting the key issues concerning the handling of the construction

  16. POST Earthquake Debris Management - AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency-management perspective in this geotechnical approach, which takes into account the different criteria related to the operation execution, is proposed by highlighting the key issues concerning the handling of the construction

  17. Mechanical properties, biological activity and protein controlled release by poly(vinyl alcohol)–bioglass/chitosan–collagen composite scaffolds: A bone tissue engineering applications

    Energy Technology Data Exchange (ETDEWEB)

    Pon-On, Weeraphat, E-mail: fsciwpp@ku.ac.th [Department of Physics, Faculty of Science, Kasetsart University, Bangkok 10900 (Thailand); Charoenphandhu, Narattaphol; Teerapornpuntakit, Jarinthorn; Thongbunchoo, Jirawan; Krishnamra, Nateetip [Center of Calcium and Bone Research (COCAB), Faculty of Science, Mahidol University (Thailand); Department of Physiology, Faculty of Science, Mahidol University (Thailand); Tang, I-Ming [ThEP Center, Commission of Higher Education, 328 Si Ayutthaya Rd. (Thailand); Department of Materials Science, Faculty of Science, Kasetsart University, Bangkok 10900 (Thailand)

    2014-05-01

    In the present study, composite scaffolds made with different weight ratios (0.5:1, 1:1 and 2:1) of bioactive glass (15Ca:80Si:5P) (BG)/polyvinyl alcohol (PVA) (PVABG) and chitosan (Chi)/collagen (Col) (ChiCol) were prepared by three mechanical freeze–thaw cycles followed by freeze-drying to obtain the porous scaffolds. The mechanical properties and the in vitro biocompatibility of the composite scaffolds in simulated body fluid (SBF) and with rat osteoblast-like UMR-106 cells were investigated. The results indicated that the porosity and compressive strength were controlled by the weight ratio of PVABG:ChiCol. The highest compressive modulus of the composites made was 214.64 MPa, obtained for the 1:1 weight ratio PVABG:ChiCol. A mineralization study in SBF showed the formation of apatite crystals on the PVABG:ChiCol surface after 7 days of incubation. In vitro cell viability and proliferation tests confirmed osteoblast attachment and growth on the PVABG:ChiCol surface. MTT and ALP tests on the 1:1 weight ratio PVABG:ChiCol composite indicated that the UMR-106 cells were viable. Alkaline phosphatase activity was found to increase with increasing culturing time. In addition, we showed the potential of PVABG:ChiCol for drug delivery through PBS solution studies: a BSA loading of 81.14% was achieved and controlled release over four weeks was observed. Our results indicated that the PVABG:ChiCol composites, especially the 1:1 weight ratio composite, exhibited significantly improved mechanical, mineral deposition and biological properties as well as controlled release. This makes them potential candidates for bone tissue engineering applications. - Graphical abstract: Mechanical properties, biological activity and protein controlled release by poly(vinyl alcohol)–bioglass/chitosan–collagen composite scaffolds for bone tissue engineering applications. - Highlights: • Preparation of PVABG:ChiCol hybrid composites and their bioactivities • Mechanical

  18. Mechanical properties, biological activity and protein controlled release by poly(vinyl alcohol)–bioglass/chitosan–collagen composite scaffolds: A bone tissue engineering applications

    International Nuclear Information System (INIS)

    Pon-On, Weeraphat; Charoenphandhu, Narattaphol; Teerapornpuntakit, Jarinthorn; Thongbunchoo, Jirawan; Krishnamra, Nateetip; Tang, I-Ming

    2014-01-01

    In the present study, composite scaffolds made with different weight ratios (0.5:1, 1:1 and 2:1) of bioactive glass (15Ca:80Si:5P) (BG)/polyvinyl alcohol (PVA) (PVABG) and chitosan (Chi)/collagen (Col) (ChiCol) were prepared by three mechanical freeze–thaw cycles followed by freeze-drying to obtain the porous scaffolds. The mechanical properties and the in vitro biocompatibility of the composite scaffolds in simulated body fluid (SBF) and with rat osteoblast-like UMR-106 cells were investigated. The results indicated that the porosity and compressive strength were controlled by the weight ratio of PVABG:ChiCol. The highest compressive modulus of the composites made was 214.64 MPa, obtained for the 1:1 weight ratio PVABG:ChiCol. A mineralization study in SBF showed the formation of apatite crystals on the PVABG:ChiCol surface after 7 days of incubation. In vitro cell viability and proliferation tests confirmed osteoblast attachment and growth on the PVABG:ChiCol surface. MTT and ALP tests on the 1:1 weight ratio PVABG:ChiCol composite indicated that the UMR-106 cells were viable. Alkaline phosphatase activity was found to increase with increasing culturing time. In addition, we showed the potential of PVABG:ChiCol for drug delivery through PBS solution studies: a BSA loading of 81.14% was achieved and controlled release over four weeks was observed. Our results indicated that the PVABG:ChiCol composites, especially the 1:1 weight ratio composite, exhibited significantly improved mechanical, mineral deposition and biological properties as well as controlled release. This makes them potential candidates for bone tissue engineering applications. - Graphical abstract: Mechanical properties, biological activity and protein controlled release by poly(vinyl alcohol)–bioglass/chitosan–collagen composite scaffolds for bone tissue engineering applications. - Highlights: • Preparation of PVABG:ChiCol hybrid composites and their bioactivities • Mechanical

  19. Effect of injection pressure on heat release rate and emissions in CI engine using orange skin powder diesel solution

    International Nuclear Information System (INIS)

    Purushothaman, K.; Nagarajan, G.

    2009-01-01

    Experiments have been conducted to study the effect of injection pressure on the combustion process and exhaust emissions of a direct injection diesel engine fueled with Orange Skin Powder Diesel Solution (OSPDS). An earlier investigation by the authors revealed that 30% OSPDS was optimum for better performance and emissions. In the present investigation the injection pressure was varied with 30% OSPDS and the combustion, performance and emission characteristics were compared with those of diesel fuel. The injection pressures studied were 215 bar, 235 bar and 255 bar. The results showed that the cylinder pressure with 30% OSPDS at 235 bar fuel injection pressure was higher than that of diesel fuel, as well as higher than at the other injection pressures. Similarly, the ignition delay was longer and the combustion duration shorter with 30% OSPDS at 235 bar injection pressure. The brake thermal efficiency was better at 235 bar than at the other fuel injection pressures with OSPDS, and lower than that of diesel fuel. The NOx emission with 30% OSPDS was higher at 235 bar. The hydrocarbon and CO emissions were lower with 30% OSPDS at 235 bar. The smoke emission with 30% OSPDS was marginally lower at 235 bar and marginally higher at 215 bar than for diesel fuel. The combustion, performance and emission characteristics of the engine operating on the test fuels at 235 bar injection pressure were better than at the other injection pressures

  20. Experimental investigation of n-butanol/diesel fuel blends and n-butanol fumigation – Evaluation of engine performance, exhaust emissions, heat release and flammability analysis

    International Nuclear Information System (INIS)

    Şahin, Zehra; Durgun, Orhan; Aksu, Orhan N.

    2015-01-01

    Highlights: • n-Butanol/diesel fuel blends and n-butanol fumigation investigated experimentally. • Flammability analysis of n-butanol performed. • Smoke decreases significantly for n-butanol/diesel fuel blends and n-butanol fumigation. • HC emission increases significantly for n-butanol/diesel fuel blends and n-butanol fumigation. • 2% n-Butanol/diesel fuel blend slightly decreases BSFC. - Abstract: The aim of this paper is to investigate and compare the effects of n-butanol/diesel fuel blends (nBDFBs) and n-butanol fumigation (nBF) on engine performance and exhaust emissions in a turbocharged automobile diesel engine. Evaluations based on heat release and flammability analysis have also been performed. Experiments have been performed for various nBDFBs and nBF at different engine speeds and loads. For the nBDFB and nBF tests, the nB2, nB4 and nB6 and the nBF2, nBF4 and nBF6 n-butanol percentages were selected. Here, for example, nB2 and nBF2 contain 2% n-butanol and 98% diesel fuel by volume, respectively. The test results showed that smoke decreases significantly with both methods. However, the decrement ratios of smoke for the fumigation method are higher than those of the blend method. NOx emission decreases for nB2, but increases for nB4 and nB6 at the selected engine speeds and loads. NOx emission generally decreases for nBF. For nB2 and nB4, BSFC decreases slightly, but it increases for nB6. For nBF, BSFC increases at all of the test conditions. Adding n-butanol to diesel fuel is expensive for both methods. For nBDFBs, heat release rate (HRR) diagrams exhibit typical characteristics similar to those of NDF. However, for nBF, the HRR shows a slightly different pattern from NDF and a double peak is observed in the HRR diagram. The first peak occurs earlier than for NDF and the second peak takes place later. In addition, this diagram shows that the first peak becomes larger and the second peak diminishes as the n-butanol ratio is increased. Because of pilot injection of
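
    Heat release rate diagrams of the kind discussed above are normally derived from measured cylinder pressure with the single-zone first-law expression dQ/dθ = γ/(γ-1)·p·dV/dθ + 1/(γ-1)·V·dp/dθ. Below is a minimal sketch, assuming a synthetic pressure trace, an assumed engine geometry and a constant ratio of specific heats; it is not the authors' data-reduction code.

```python
# Single-zone apparent heat release rate from cylinder pressure:
#   dQ/dtheta = gamma/(gamma-1) * p * dV/dtheta + 1/(gamma-1) * V * dp/dtheta
# Pressure trace, engine geometry and gamma below are synthetic assumptions.
import numpy as np

gamma = 1.35                                      # assumed ratio of specific heats
theta = np.radians(np.linspace(-60, 60, 241))     # crank angle, rad

# Slider-crank cylinder volume for an assumed geometry.
bore, stroke, conrod, cr = 0.085, 0.090, 0.145, 17.0
Vd = np.pi / 4 * bore**2 * stroke                 # displaced volume, m^3
Vc = Vd / (cr - 1)                                # clearance volume, m^3
a = stroke / 2
s = a * np.cos(theta) + np.sqrt(conrod**2 - (a * np.sin(theta))**2)
V = Vc + np.pi / 4 * bore**2 * (conrod + a - s)   # instantaneous volume, m^3

# Synthetic pressure trace standing in for a measured, smoothed signal (Pa).
p = 3.5e6 * np.exp(-((np.degrees(theta) - 5.0) / 25.0) ** 2) + 2.0e6

dV = np.gradient(V, theta)
dp = np.gradient(p, theta)
dQ = gamma / (gamma - 1) * p * dV + 1.0 / (gamma - 1) * V * dp   # J/rad

print(f"peak apparent HRR ~ {dQ.max():.1f} J/rad "
      f"at {np.degrees(theta[dQ.argmax()]):.1f} deg CA")
```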

  1. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
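
    The benefit of early warning ultimately comes down to the difference between S-wave travel time and the time needed to detect the event and broadcast an alert. A minimal back-of-the-envelope sketch, assuming generic crustal P- and S-wave speeds and a hypothetical alert latency; these numbers are not taken from the study.

```python
# Back-of-the-envelope earthquake early-warning time:
# warning = (S-wave arrival at the user) - (P-wave detection nearby + latency).
# Wave speeds and latency are generic assumptions, not values from the paper.
vp, vs = 6.0, 3.5          # crustal P- and S-wave speeds, km/s
latency = 5.0              # assumed detection + alert-delivery latency, s
detection_dist = 20.0      # assumed distance from epicenter to detecting sensors, km

for user_dist in (20.0, 50.0, 100.0):          # distance of alert recipient, km
    t_alert = detection_dist / vp + latency    # time until the alert goes out
    t_shaking = user_dist / vs                 # time until strong (S-wave) shaking
    warning = t_shaking - t_alert
    print(f"{user_dist:5.0f} km: ~{warning:5.1f} s of warning")
```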

  2. Scaling and spatial complementarity of tectonic earthquake swarms

    KAUST Repository

    Passarelli, Luigi

    2017-11-10

    Tectonic earthquake swarms (TES) often coincide with aseismic slip and sometimes precede damaging earthquakes. In spite of recent progress in understanding the significance and properties of TES at plate boundaries, their mechanics and scaling are still largely uncertain. Here we evaluate several TES that occurred during the past 20 years on a transform plate boundary in North Iceland. We show that the swarms complement each other spatially with later swarms discouraged from fault segments activated by earlier swarms, which suggests efficient strain release and aseismic slip. The fault area illuminated by earthquakes during swarms may be more representative of the total moment release than the cumulative moment of the swarm earthquakes. We use these findings and other published results from a variety of tectonic settings to discuss general scaling properties for TES. The results indicate that the importance of TES in releasing tectonic strain at plate boundaries may have been underestimated.
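
    The statement that the fault area illuminated by a swarm may represent the total (largely aseismic) moment better than the summed moment of the swarm earthquakes can be made concrete with the standard definition M0 = µ·A·D and the moment-magnitude relation Mw = (2/3)·log10(M0) - 6.07. A minimal sketch with hypothetical numbers, not values from the North Iceland swarms:

```python
# Compare the cumulative seismic moment of swarm earthquakes with the moment
# implied by slip over the fault area they illuminate.
# All numbers are hypothetical illustrations, not data from the study.
import numpy as np

mu = 30e9                              # shear modulus, Pa

def moment_from_mw(mw):
    return 10 ** (1.5 * (mw + 6.07))   # inverse of Mw = 2/3 log10(M0) - 6.07, N*m

def mw_from_moment(m0):
    return 2.0 / 3.0 * np.log10(m0) - 6.07

# Hypothetical swarm: a few hundred small events around Mw 2-4.
rng = np.random.default_rng(1)
swarm_mw = rng.uniform(2.0, 4.0, 300)
m0_seismic = moment_from_mw(swarm_mw).sum()

# Moment implied by, say, 5 cm of (mostly aseismic) slip over a 10 km x 8 km patch.
area = 10e3 * 8e3                      # m^2
slip = 0.05                            # m
m0_geodetic = mu * area * slip

print(f"cumulative seismic moment : Mw {mw_from_moment(m0_seismic):.1f}")
print(f"area-based moment         : Mw {mw_from_moment(m0_geodetic):.1f}")
```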

  3. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all magnitudes ≥ 7.0) in China and the 1978 Izu-Oshima earthquake in Japan is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  4. A bFGF-releasing silk/PLGA-based biohybrid scaffold for ligament/tendon tissue engineering using mesenchymal progenitor cells.

    Science.gov (United States)

    Sahoo, Sambit; Toh, Siew Lok; Goh, James C H

    2010-04-01

    An ideal scaffold that provides a combination of suitable mechanical properties along with biological signals is required for successful ligament/tendon regeneration in mesenchymal stem cell-based tissue engineering strategies. Among the various fibre-based scaffolds that have been used, hybrid fibrous scaffolds comprising both microfibres and nanofibres have been recently shown to be particularly promising. This study developed a biohybrid fibrous scaffold system by coating bioactive bFGF-releasing ultrafine PLGA fibres over mechanically robust slowly-degrading degummed knitted microfibrous silk scaffolds. On the ECM-like biomimetic architecture of ultrafine fibres, sustained release of bFGF mimicked the ECM in function, initially stimulating mesenchymal progenitor cell (MPC) proliferation, and subsequently, their tenogeneic differentiation. The biohybrid scaffold system not only facilitated MPC attachment and promoted cell proliferation, with cells growing both on ultrafine PLGA fibres and silk microfibres, but also stimulated tenogeneic differentiation of seeded MPCs. Upregulated gene expression of ligament/tendon-specific ECM proteins and increased collagen production likely contributed to enhancing mechanical properties of the constructs, generating a ligament/tendon analogue that has the potential to be used to repair injured ligaments/tendons. Copyright 2010 Elsevier Ltd. All rights reserved.

  5. Developing an Internet Oriented Platform for Earthquake Engineering Application and Web-based Virtual Reality Simulation System for Seismic hazards: Towards Disaster Mitigation in Metropolises

    Directory of Open Access Journals (Sweden)

    Ali Alaghehbandian

    2003-04-01

    Full Text Available This paper reviews the state of the art in risk communication to the public, with an emphasis on simulation of seismic hazards using VRML. Rapidly growing computer technologies, especially the Internet, provide new means to deal with engineering and social problems that were hard to solve in traditional ways. This paper presents a prototype of an Internet-based application platform using VR (Virtual Reality) for civil engineering, aimed at building an information system for risk communication on seismic hazards, presented here for the case of a bridge structure.

  6. Design basis earthquakes for critical industrial facilities and their characteristics, and the Southern Hyogo prefecture earthquake, 17 January 1995

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki

    1998-12-01

    This paper deals with how to establish the concept of the design basis earthquake (DBE) for critical industrial facilities such as nuclear power plants, in consideration of disasters such as the Southern Hyogo prefecture earthquake, the so-called Kobe earthquake, in 1995. The author once discussed various DBEs at the 7th World Conference on Earthquake Engineering. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the values of structural accelerations obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument during the Southern Hyogo prefecture earthquake in 1995 exceeded the author's previous assumption, even though the results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third design earthquake, S_s, in addition to the S_1 and S_2 of previous DBEs.

  7. Estimation of Natural Frequencies During Earthquakes

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Rytter, A

    1997-01-01

    This paper presents two different recursive prediction error method (RPEM) implementations of multivariate Auto-Regressive Moving-Average (ARMAV) models for identification of a time-variant civil engineering structure subject to an earthquake. The two techniques are tested on measurements made...
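
    The idea of extracting natural frequencies from an ARMAV-type model can be illustrated with the simplest scalar case: fit an AR(2) model to a response record and convert its complex pole pair to a modal frequency. A minimal sketch with a synthetic single-degree-of-freedom response; it does not reproduce the recursive multivariate implementation described in the paper.

```python
# Estimate a natural frequency from a response signal via an AR(2) fit:
# the complex pole pair z = r*exp(+-i*w) of the AR polynomial maps to
# f_n ~ w * fs / (2*pi). Signal parameters below are synthetic assumptions.
import numpy as np

fs = 100.0                                  # sampling frequency, Hz
t = np.arange(0, 10, 1 / fs)
f_true, zeta = 2.5, 0.02                    # "true" modal frequency and damping
rng = np.random.default_rng(2)
# Lightly damped decaying oscillation plus measurement noise.
y = np.exp(-2 * np.pi * f_true * zeta * t) * np.sin(2 * np.pi * f_true * t)
y = y + 0.01 * rng.standard_normal(len(t))

# Least-squares AR(2) fit: y[n] = a1*y[n-1] + a2*y[n-2] + e[n]
Y = np.column_stack([y[1:-1], y[:-2]])
a1, a2 = np.linalg.lstsq(Y, y[2:], rcond=None)[0]

# Poles of z^2 - a1*z - a2 = 0; their angle gives the modal frequency.
poles = np.roots([1.0, -a1, -a2])
f_est = np.abs(np.angle(poles[0])) * fs / (2 * np.pi)
print(f"estimated natural frequency ~ {f_est:.2f} Hz (true {f_true} Hz)")
```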

  8. Experimental study of structural response to earthquakes

    International Nuclear Information System (INIS)

    Clough, R.W.; Bertero, V.V.; Bouwkamp, J.G.; Popov, E.P.

    1975-01-01

    The objectives, methods, and some of the principal results obtained from experimental studies of the behavior of structures subjected to earthquakes are described. Although such investigations are being conducted in many laboratories throughout the world, the information presented deals specifically with projects being carried out at the Earthquake Engineering Research Center (EERC) of the University of California, Berkeley. A primary purpose of these investigations is to obtain detailed information on the inelastic response mechanisms in typical structural systems so that the experimentally observed performance can be compared with computer-generated analytical predictions. Only by such comparisons can the mathematical models used in dynamic nonlinear analyses be verified and improved. Two experimental procedures for investigating earthquake structural response are discussed: the earthquake simulator facility, which subjects the base of the test structure to acceleration histories similar to those recorded in actual earthquakes, and systems of hydraulic rams, which impose specified displacement histories on the test components, equivalent to motions developed in structures subjected to actual earthquakes. The general concept and performance of the 20-ft-square EERC earthquake simulator is described, and the testing of a two-story concrete frame building is outlined. Correlation of the experimental results with analytical predictions demonstrates that satisfactory agreement can be obtained only if the mathematical model incorporates a stiffness deterioration mechanism which simulates the cracking and other damage suffered by the structure.

  9. Cyclic characteristics of earthquake time histories

    International Nuclear Information System (INIS)

    Hall, J.R. Jr; Shukla, D.K.; Kissenpfennig, J.F.

    1977-01-01

    From an engineering standpoint, an earthquake record may be characterized by a number of parameters, one of which is its 'cyclic characteristics'. The cyclic characteristics are most significant in fatigue analysis of structures and liquefaction analysis of soils where, in addition to the peak motion, cyclic buildup is significant. Whereas duration, peak amplitude and response spectra of earthquakes have been studied extensively, the cyclic characteristics of earthquake records have not received equivalent attention. Present procedures to define the cyclic characteristics are generally based upon counting the number of peaks in various amplitude ranges on a record. This paper presents a computer approach which describes a time history by an amplitude envelope and a phase curve. Using Fast Fourier Transform techniques, an earthquake time history is represented as the projection along the x-axis of a rotating vector: the length of the vector is given by the amplitude envelope, and the angle between the vector and the x-axis is given by the phase curve. Thus one cycle is completed when the vector makes a full rotation. Based upon Miner's cumulative damage concept, the computer code automatically combines the cycles of various amplitudes to obtain the equivalent number of cycles of a given amplitude. To illustrate the overall results, the cyclic characteristics of several real and synthetic earthquake time histories have been studied and are presented in the paper, with the conclusion that this procedure provides a physical interpretation of the cyclic characteristics of earthquakes. (Auth.)
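
    The amplitude-envelope-plus-phase representation described above is essentially the analytic-signal view of a record: an FFT-based Hilbert transform gives an envelope and an unwrapped phase, and each full 2π rotation of the phase corresponds to one cycle of the rotating vector. A minimal sketch on a synthetic record; the equivalent-cycle weighting used in the paper's Miner-rule combination is not reproduced here.

```python
# Amplitude envelope and phase of a ground-motion record via the analytic signal;
# one full 2*pi of unwrapped phase corresponds to one cycle of the rotating vector.
# The synthetic "record" below is only a stand-in for a real accelerogram.
import numpy as np
from scipy.signal import hilbert

fs = 50.0
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(3)
# Synthetic record: band-limited noise shaped by a slowly varying envelope.
shaping = np.exp(-((t - 8.0) / 6.0) ** 2)
record = shaping * np.convolve(rng.standard_normal(len(t)),
                               np.ones(5) / 5.0, mode="same")

analytic = hilbert(record)               # FFT-based analytic signal
envelope = np.abs(analytic)              # amplitude envelope A(t)
phase = np.unwrap(np.angle(analytic))    # phase curve phi(t)

n_cycles = (phase[-1] - phase[0]) / (2 * np.pi)
print(f"peak envelope amplitude : {envelope.max():.3f}")
print(f"approximate cycle count : {n_cycles:.1f}")
```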

  10. Earthquakes trigger the loss of groundwater biodiversity

    Science.gov (United States)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  11. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
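
    The empirical model mentioned above combines population exposure per shaking-intensity bin with a country-calibrated fatality rate; in the published PAGER formulation the rate is a lognormal function of intensity. Below is a minimal sketch with hypothetical exposure counts and hypothetical rate parameters theta and beta; it is not a calibrated PAGER country model.

```python
# Sketch of an empirical fatality estimate: population exposed in each shaking-
# intensity bin times an intensity-dependent fatality rate. The lognormal-CDF
# rate form follows the published PAGER empirical model, but theta, beta and the
# exposure counts here are hypothetical, not calibrated country parameters.
from math import erf, log, sqrt

def fatality_rate(mmi, theta=12.5, beta=0.2):
    """Fatality rate as a lognormal CDF of shaking intensity (hypothetical params)."""
    return 0.5 * (1.0 + erf(log(mmi / theta) / (beta * sqrt(2.0))))

# Hypothetical population exposure per intensity bin (people at MMI VI..IX).
exposure = {6.0: 2_000_000, 7.0: 800_000, 8.0: 150_000, 9.0: 20_000}

estimated_fatalities = sum(pop * fatality_rate(mmi) for mmi, pop in exposure.items())
print(f"estimated fatalities ~ {estimated_fatalities:,.0f}")
```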

  12. An engineered pathway for glyoxylate metabolism in tobacco plants aimed to avoid the release of ammonia in photorespiration

    Directory of Open Access Journals (Sweden)

    Carvalho Josirley de FC

    2011-11-01

    Full Text Available Abstract Background The photorespiratory nitrogen cycle in C3 plants involves an extensive diversion of carbon and nitrogen away from the direct pathways of assimilation. The liberated ammonia is re-assimilated, but up to 25% of the carbon may be released into the atmosphere as CO2. Because of the loss of CO2 and the high energy costs, there has been considerable interest in attempts to decrease the flux through the cycle in C3 plants. Transgenic tobacco plants were generated that contained the genes gcl and hyi from E. coli encoding glyoxylate carboligase (EC 4.1.1.47) and hydroxypyruvate isomerase (EC 5.3.1.22), respectively, targeted to the peroxisomes. It was presumed that the two enzymes could work together and compete with the aminotransferases that convert glyoxylate to glycine, thus avoiding ammonia production in the photorespiratory nitrogen cycle. Results When grown in ambient air, but not in elevated CO2, the transgenic tobacco lines had a distinctive phenotype of necrotic lesions on the leaves. Three of the six lines chosen for a detailed study contained single copies of the gcl gene, two contained single copies of both the gcl and hyi genes and one line contained multiple copies of both the gcl and hyi genes. The gcl protein was detected in the five transgenic lines containing single copies of the gcl gene, but hyi protein was not detected in any of the transgenic lines. The content of soluble amino acids, including glycine and serine, was generally increased in the transgenic lines growing in air, when compared to the wild type. The content of soluble sugars (glucose, fructose and sucrose) in the shoot was decreased in transgenic lines growing in air, consistent with decreased carbon assimilation. Conclusions Tobacco plants have been generated that produce bacterial glyoxylate carboligase but not hydroxypyruvate isomerase. The transgenic plants exhibit a stress response when exposed to air, suggesting that some glyoxylate is diverted away from

  13. Fault roughness and strength heterogeneity control earthquake size and stress drop

    KAUST Repository

    Zielke, Olaf

    2017-01-13

    An earthquake's stress drop is related to the frictional breakdown during sliding and constitutes a fundamental quantity of the rupture process. High-speed laboratory friction experiments that emulate the rupture process imply stress drop values that greatly exceed those commonly reported for natural earthquakes. We hypothesize that this stress drop discrepancy is due to fault-surface roughness and strength heterogeneity: an earthquake's moment release and its recurrence probability depend not only on stress drop and rupture dimension but also on the geometric roughness of the ruptured fault and the location of failing strength asperities along it. Using large-scale numerical simulations for earthquake ruptures under varying roughness and strength conditions, we verify our hypothesis, showing that smoother faults may generate larger earthquakes than rougher faults under identical tectonic loading conditions. We further discuss the potential impact of fault roughness on earthquake recurrence probability. This finding provides important information, also for seismic hazard analysis.
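
    Stress drop as used above is commonly estimated from the seismic moment and the rupture dimension; for a circular rupture the Eshelby relation Δσ = (7/16)·M0/r³ is standard. A minimal sketch with hypothetical moment and radius values, shown only to make the quantity concrete:

```python
# Stress drop of a circular rupture from seismic moment and rupture radius
# (Eshelby relation: delta_sigma = 7/16 * M0 / r^3). Values are hypothetical.
def stress_drop(m0_newton_m, radius_m):
    return 7.0 / 16.0 * m0_newton_m / radius_m**3

m0 = 1.0e18      # seismic moment, N*m (roughly an Mw ~6 event)
for r_km in (3.0, 5.0, 8.0):
    ds = stress_drop(m0, r_km * 1e3)
    print(f"r = {r_km:4.1f} km -> stress drop ~ {ds/1e6:5.1f} MPa")
```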

  14. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake

    Science.gov (United States)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    Recently our understanding of tectonic faulting has been shaken by the discoveries of seismic tremor, low-frequency earthquakes, slow slip events, and other modes of fault slip. These phenomena represent modes of failure that were thought to be non-existent and theoretically impossible only a few years ago. Slow earthquakes are seismic phenomena in which the rupture of geological faults in the earth's crust occurs gradually without creating strong tremors. Despite the growing number of observations of slow earthquakes, their origin remains unresolved. Studies show that the duration of slow earthquakes ranges from a few seconds to a few hundred seconds. While the regular earthquakes with which most people are familiar release a burst of built-up stress in seconds, slow earthquakes release energy in ways that do little damage. This study focuses on the characteristics of the Mw5.6 earthquake that occurred in the Sofia seismic zone on May 22nd, 2012. The Sofia area is the most populated, industrial and cultural region of Bulgaria that faces considerable earthquake risk. The Sofia seismic zone is located in south-western Bulgaria, an area with pronounced tectonic activity and proven crustal movement. In the 19th century the city of Sofia (situated in the centre of the Sofia seismic zone) experienced two strong earthquakes with epicentral intensity of 10 MSK. During the 20th century the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK64). The 2012 quake occurred in an area characterized by a long quiescence (of 95 years) for moderate events. Moreover, a reduced number of small earthquakes have also been registered in the recent past. The Mw5.6 earthquake was widely felt on the territory of Bulgaria and in neighbouring countries. No casualties or severe injuries were reported. Mostly moderate damage was observed in the cities of Pernik and Sofia and their surroundings. These observations could be assumed indicative of a

  15. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at a depth of 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography when the earthquake struck.

  16. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it is investigated how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer harder in terms of economic growth than richer areas. In addition,...

  17. Earthquakes, detecting and understanding them

    International Nuclear Information System (INIS)

    2008-05-01

    The surface of the Earth is continually changing on a geological timescale. The tectonic plates, which make up this surface, are moving in relation to each other. On a human timescale, these movements are expressed as earthquakes, which suddenly release energy accumulated over a period of time. The vibrations they produce propagate through the interior of the Earth: these are seismic waves. However, other phenomena can generate seismic waves, such as volcanoes, quarry blasts, etc. The surf of the ocean waves on the coasts, the wind in the trees and human activity (industry and road traffic) all contribute to the 'seismic background noise'. Sensors are able to detect signals from events, which are then discriminated, analyzed and located. Earthquakes and active volcanoes are not distributed randomly over the surface of the globe: they mainly coincide with mountain chains and ocean trenches and ridges. 'An earthquake results from the abrupt release of the energy accumulated by movements and rubbing of different plates'. The study of the propagation of seismic waves has made it possible to determine the outline of the plates inside the Earth and has highlighted their movements. There are seven major plates which are colliding, diverging or sliding past each other. Each year the continents move several centimeters with respect to one another. This process, known as 'continental drift', was finally explained by plate tectonics. The initial hypothesis for this science dates from the beginning of the 20th century, but it was not confirmed until the 1960s. It explains that convection inside the Earth is the source of the forces required for these movements. As well as explaining these great movements, this science has provided a coherent, unifying and quantitative framework, which unites the explanations for all geophysical phenomena under one mechanism. (authors)

  18. The 2007 Mentawai earthquake sequence on the Sumatra megathrust

    Science.gov (United States)

    Konca, A.; Avouac, J.; Sladen, A.; Meltzner, A. J.; Kositsky, A. P.; Sieh, K.; Fang, P.; Li, Z.; Galetzka, J.; Genrich, J.; Chlieh, M.; Natawidjaja, D. H.; Bock, Y.; Fielding, E. J.; Helmberger, D. V.

    2008-12-01

    The Sumatra Megathrust has recently produced a flurry of large interplate earthquakes starting with the giant Mw 9.15 Aceh earthquake of 2004. All of these earthquakes occurred within the area monitored by the Sumatra Geodetic Array (SuGAr), which provided exceptional records of near-field co-seismic and postseismic ground displacements. The most recent of these major earthquakes, an Mw 8.4 earthquake and an Mw 7.9 earthquake twelve hours later, occurred in the Mentawai islands area where devastating historical earthquakes had happened in 1797 and 1833. The 2007 earthquake sequence provides an exceptional opportunity to understand the variability of earthquakes along megathrusts and their relation to interseismic coupling. The InSAR, GPS and teleseismic modeling shows that the 2007 earthquakes ruptured a fraction of the strongly coupled Mentawai patch of the megathrust, which is itself only a fraction of the 1833 rupture area. The sequence also released a much smaller moment than the one released in 1833, or than the deficit of moment that has accumulated since. Both earthquakes of 2007 consisted of two sub-events located 50 to 100 km apart from each other. On the other hand, the northernmost slip patch of the Mw 8.4 earthquake and the southern slip patch of the Mw 7.9 earthquake abut each other, yet they ruptured 12 hours apart. Sunda megathrust earthquakes of recent years include a rupture of a strongly coupled patch that closely mimics a prior rupture of that patch and which is well correlated with the interseismic coupling pattern (Nias-Simeulue section), as well as a rupture sequence of a strongly coupled patch that differs substantially in the details from its most recent predecessors (Mentawai section). We conclude that (1) seismic asperities are probably persistent features which arise from heterogeneous strain build-up in the interseismic period; and (2) the same portion of a megathrust can rupture in different ways depending on whether asperities break as isolated events or cooperate to produce

  19. Mechanical properties, biological activity and protein controlled release by poly(vinyl alcohol)-bioglass/chitosan-collagen composite scaffolds: a bone tissue engineering applications.

    Science.gov (United States)

    Pon-On, Weeraphat; Charoenphandhu, Narattaphol; Teerapornpuntakit, Jarinthorn; Thongbunchoo, Jirawan; Krishnamra, Nateetip; Tang, I-Ming

    2014-05-01

    In the present study, composite scaffolds made with different weight ratios (0.5:1, 1:1 and 2:1) of bioactive glass (15Ca:80Si:5P) (BG)/polyvinyl alcohol (PVA) (PVABG) and chitosan (Chi)/collagen (Col) (ChiCol) were prepared by three mechanical freeze-thaw cycles followed by freeze-drying to obtain porous scaffolds. The mechanical properties and the in vitro biocompatibility of the composite scaffolds in simulated body fluid (SBF) and with rat osteoblast-like UMR-106 cells were investigated. The results indicated that the porosity and compressive strength were controlled by the weight ratio of PVABG:ChiCol. The highest compressive modulus of the composites made was 214.64 MPa, obtained for the 1:1 weight ratio PVABG:ChiCol. A mineralization study in SBF showed the formation of apatite crystals on the PVABG:ChiCol surface after 7 days of incubation. In vitro cell viability and proliferation tests confirmed osteoblast attachment and growth on the PVABG:ChiCol surface. MTT and ALP tests on the 1:1 weight ratio PVABG:ChiCol composite indicated that the UMR-106 cells were viable. Alkaline phosphatase activity was found to increase with increasing culturing time. In addition, we showed the potential of PVABG:ChiCol for drug delivery through studies in PBS solution: a BSA loading of 81.14% was achieved, and controlled release over four weeks was observed. Our results indicated that the PVABG:ChiCol composites, especially the 1:1 weight ratio composite, exhibited significantly improved mechanical properties, mineral deposition, biological properties and controlled release. This makes them potential candidates for bone tissue engineering applications.
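
    For readers who want to quantify release profiles like the four-week BSA curve described above, a common empirical choice is the Korsmeyer-Peppas power law Mt/Minf = k*t^n. The sketch below fits that model to hypothetical weekly release fractions; the data, the model choice and the function name are illustrative assumptions and are not taken from the paper.

      import numpy as np

      def fit_korsmeyer_peppas(t_days, fraction_released):
          """Fit Mt/Minf = k * t**n on log-log axes and return (k, n)."""
          t = np.asarray(t_days, dtype=float)
          f = np.asarray(fraction_released, dtype=float)
          n, log_k = np.polyfit(np.log(t), np.log(f), 1)
          return float(np.exp(log_k)), float(n)

      # Hypothetical cumulative release fractions at the end of each of four weeks
      k, n = fit_korsmeyer_peppas([7, 14, 21, 28], [0.25, 0.40, 0.52, 0.62])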

  20. Cooperative earthquake research between the United States and the People's Republic of China

    Energy Technology Data Exchange (ETDEWEB)

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  1. Cigarette company trade secrets are not secret: an analysis of reverse engineering reports in internal tobacco industry documents released as a result of litigation.

    Science.gov (United States)

    Velicer, Clayton; Lempert, Lauren K; Glantz, Stanton

    2015-09-01

    Use previously secret tobacco industry documents to assess tobacco companies' routine claims of trade secret protection for information on cigarette ingredients, additives and construction made to regulatory agencies, as well as the companies' refusal to publicly disclose this information. We analysed previously secret tobacco industry documents available at http://legacy.library.ucsf.edu to identify 100 examples of seven major tobacco companies' reverse engineering of their competitors' brands between 1937 and 2001. These reverse engineering reports contain detailed data for 142 different measurements for at least two companies, including physical parameters of the cigarettes, tobacco types, humectants, additives, flavourings, and smoke constituents of competitors' cigarettes. These 100 documents were distributed to 564 employees, including top managers in domestic and foreign offices across multiple departments, including executive leadership, research and design, product development, marketing and legal. These documents reported new competitors' products, measured ingredient changes over time, and informed companies' decisions regarding ingredients in their own products. Because cigarette companies routinely analyse their competitors' cigarettes in great detail, this information is neither secret nor commercially valuable and, thus, does not meet the legal definition of a 'trade secret.' This information is only being kept 'secret' from the people consuming cigarettes and the scientific community. Public agencies should release this detailed information because it would provide valuable information about how ingredients affect addictiveness and toxicity, and would help the public health community and consumers better understand the impact of cigarette design on human health.

  2. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
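
    A toy version of the tweet-frequency detection idea described above is a per-minute keyword counter with a spike threshold. The sketch below is hypothetical; the data format, keyword and threshold are assumptions and do not reflect the USGS implementation.

      from collections import Counter

      def detect_tweet_spikes(tweets, keyword="earthquake", threshold_per_min=20):
          """tweets: iterable of (datetime, text) pairs.
          Return the minutes whose count of keyword-bearing tweets exceeds the threshold,
          mimicking the jump from a <1/hour background to ~150/minute described above."""
          per_minute = Counter(
              t.replace(second=0, microsecond=0)
              for t, text in tweets
              if keyword in text.lower()
          )
          return sorted(minute for minute, n in per_minute.items() if n >= threshold_per_min)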

  3. United States earthquake early warning system: how theory and analysis can save America before the big one happens

    OpenAIRE

    Rockabrand, Ryan

    2017-01-01

    Approved for public release; distribution is unlimited The United States is extremely vulnerable to catastrophic earthquakes. More than 143 million Americans may be threatened by damaging earthquakes in the next 50 years. This thesis argues that the United States is unprepared for the most catastrophic earthquakes the country faces today. Earthquake early warning systems are a major solution in practice to reduce economic risk, to protect property and the environment, and to save lives. Ot...

  4. Real-time earthquake data feasible

    Science.gov (United States)

    Bush, Susan

    Scientists agree that early warning devices and monitoring of both Hurricane Hugo and the Mt. Pinatubo volcanic eruption saved thousands of lives. What would it take to develop this sort of early warning and monitoring system for earthquake activity? Not all that much, claims a panel assigned to study the feasibility, costs, and technology needed to establish a real-time earthquake monitoring (RTEM) system. The panel, drafted by the National Academy of Sciences' Committee on Seismology, has presented its findings in Real-Time Earthquake Monitoring. The recently released report states that “present technology is entirely capable of recording and processing data so as to provide real-time information, enabling people to mitigate somewhat the earthquake disaster.” RTEM systems would consist of two parts: an early warning system that would give a few seconds' warning before severe shaking, and immediate postquake information within minutes of the quake that would give actual measurements of the magnitude. At this time, however, this type of warning system has not been addressed at the national level for the United States and is not included in the National Earthquake Hazard Reduction Program, according to the report.

  5. A 'new generation' earthquake catalogue

    Directory of Open Access Journals (Sweden)

    E. Boschi

    2000-06-01

    Full Text Available In 1995, we published the first release of the Catalogo dei Forti Terremoti in Italia, 461 a.C. - 1980, in Italian (Boschi et al., 1995). Two years later this was followed by a second release, again in Italian, that included more earthquakes, more accurate research and a longer time span, 461 B.C. to 1990 (Boschi et al., 1997). Aware that the record of Italian historical seismicity is probably the most extensive in the whole world, and hence that our catalogue could be of interest for a wider international readership, we recognised that Italian was clearly not the appropriate language to share this experience with colleagues from foreign countries. Three years after publication of the second release, therefore, and after much additional research and fine tuning of methodologies and algorithms, I am proud to introduce this third release in English. All the tools and accessories have been translated along with the texts describing the development of the underlying research strategies and current contents. The English title is Catalogue of Strong Italian Earthquakes, 461 B.C. to 1997. This Preface briefly describes the scientific context within which the Catalogue of Strong Italian Earthquakes was conceived and progressively developed. The catalogue is perhaps the most important outcome of a well-established joint project between the Istituto Nazionale di Geofisica, the leading Italian institute for basic and applied research in seismology and solid earth geophysics, and SGA (Storia Geofisica Ambiente), a private firm specialising in the historical investigation and systematisation of natural phenomena. In her contribution "Method of investigation, typology and taxonomy of the basic data: navigating between seismic effects and historical contexts", Emanuela Guidoboni outlines the general framework of modern historical seismology and its complex relation with instrumental seismology on the one hand and historical research on the other. This presentation also highlights

  6. Who is Responsible for Human Suffering due to Earthquakes?

    Science.gov (United States)

    Wyss, M.

    2012-12-01

    A court in L'Aquila, Italy, convicted seven experts and officials, sentencing them to six years in prison and a combined fine of two million Euros, for not following their "obligation to avoid death, injury and damage, or at least to minimize them," as the prosecution alleged. These men lose their jobs and pensions, and are banned from holding public office. Meanwhile, the town of L'Aquila is teeming with furious citizens, who are preparing additional civil suits against the defendants, whom they hold responsible for the deaths of their loved ones, killed by collapsing buildings during the magnitude 6.3 earthquake of April 6, 2009. Before this shock, an earthquake swarm had scared the inhabitants for several weeks. To calm the population, the vice-director of the Department of Civil Protection (DCP) called a meeting of the Italian Commission of Great Risks (CGR) in L'Aquila to assess the situation on March 31. One hour before this meeting, the vice-director stated in a TV interview that the seismic situation in L'Aquila was "certainly normal" and posed "no danger", and he added that "the scientific community continues to assure me that, to the contrary, it's a favorable situation because of the continuous discharge of energy." This statement is untrue in two ways. Firstly, small earthquakes do not release enough strain energy to reduce the potential for a large shock, and secondly, no seismologist would make such a statement because we know it is not true. However, the population clung to the idea: "the more tremors, the less danger". People who lost relatives allege that they would have left their homes had they not been falsely assured of their safety. The court treated all seven alike, although they had very different functions and obligations. Two were leaders in the DCP, four were members of the CGR, and one was a seismology expert, who brought the latest seismic data. The minutes of the meeting show that none of the experts said anything wrong. They all stated that the probability of a main shock to

  7. National Earthquake Hazards Program at a Crossroads

    Science.gov (United States)

    Showstack, Randy

    The U.S. National Earthquake Hazards Reduction Program, which turns 25 years old on 1 October 2003, is passing through two major transitions, which experts said either could weaken or strengthen the program. On 1 March, a federal government reorganization placed NEHRP's lead agency, the Federal Emergency Management Agency (FEMA), within the new Department of Homeland Security (DHS). A number of earthquake scientists and engineers expressed concern that NEHRP, which already faces budgetary and organizational challenges and lacks visibility, could end up being marginalized in the bureaucratic shuffle. Some experts, though, as well as agency officials, said they hope DHS will recognize synergies between dealing with earthquakes and terrorist attacks.

  8. Lessons learned from the 1994 Northridge Earthquake

    International Nuclear Information System (INIS)

    Eli, M.W.; Sommer, S.C.

    1995-01-01

    Southern California has a history of major earthquakes and also has one of the largest metropolitan areas in the United States. The 1994 Northridge Earthquake challenged the industrial facilities and lifeline infrastructure in the northern Los Angeles (LA) area. Lawrence Livermore National Laboratory (LLNL) sent a team of engineers to conduct an earthquake damage investigation in the Northridge area, on a project funded jointly by the United States Nuclear Regulatory Commission (USNRC) and the United States Department of Energy (USDOE). Many of the structures, systems, and components (SSCs) and lifelines that suffered damage are similar to those found in nuclear power plants and in USDOE facilities. Lessons learned from these experiences can have some applicability at commercial nuclear power plants.

  9. Thermal Radiation Anomalies Associated with Major Earthquakes

    Science.gov (United States)

    Ouzounov, Dimitar; Pulinets, Sergey; Kafatos, Menas C.; Taylor, Patrick

    2017-01-01

    Recent developments of remote sensing methods for Earth satellite data analysis contribute to our understanding of earthquake-related thermal anomalies. It was realized that the thermal heat fluxes over areas of earthquake preparation are a result of air ionization by radon (and other gases) and consequent water vapor condensation on newly formed ions. Latent heat (LH) is released as a result of this process and leads to the formation of local thermal radiation anomalies (TRA), known as OLR (outgoing longwave radiation) anomalies (Ouzounov et al., 2007). We compare the LH energy, obtained by integrating surface latent heat flux (SLHF) over area and time, with the released energies associated with these events. Extended studies of the TRA using data from the most recent major earthquakes allowed the main morphological features to be established. It was also established that the TRA are part of a more complex chain of short-term pre-earthquake processes, which is explained within the framework of lithosphere-atmosphere coupling.
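
    The comparison mentioned above, integrating surface latent heat flux over area and time, amounts to a simple discrete sum. The snippet below is a generic sketch of that integration for assumed gridded inputs; it is not the authors' processing code.

      import numpy as np

      def latent_heat_energy(slhf_w_m2, cell_area_m2, dt_s):
          """Total latent-heat energy (J): slhf_w_m2 is an array (n_times, n_cells) of
          fluxes in W/m^2, cell_area_m2 an array (n_cells,) of grid-cell areas,
          and dt_s the time step in seconds."""
          return float(np.sum(slhf_w_m2 * cell_area_m2[None, :]) * dt_s)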

  10. Earthquake magnitude estimation using the τc and Pd method for earthquake early warning systems

    Science.gov (United States)

    Jin, Xing; Zhang, Hongcai; Li, Jun; Wei, Yongxiang; Ma, Qiang

    2013-10-01

    Earthquake early warning (EEW) systems are one of the most effective ways to reduce earthquake disasters. Earthquake magnitude estimation is one of the most important and also the most difficult parts of the entire EEW system. In this paper, based on 142 earthquake events and 253 seismic records that were recorded by the KiK-net in Japan, and aftershocks of the large Wenchuan earthquake in Sichuan, we obtained earthquake magnitude estimation relationships using the τc and Pd methods. The standard deviations of the magnitude calculations from these two formulas are ±0.65 and ±0.56, respectively. The Pd value can also be used to estimate the peak ground velocity, so that warning information can be released to the public rapidly according to the estimation results. In order to ensure the stability and reliability of the magnitude estimation results, we propose a compatibility test based on the nature of these two parameters. The reliability of the early warning information is significantly improved through this test.
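
    The τc and Pd parameters used above are computed from the first few seconds of the P wave: τc = 2π/√r with r = ∫v²dt / ∫u²dt, and Pd is the peak displacement in the same window. The sketch below shows that computation; the regression coefficients in the magnitude relation are placeholders for illustration, not the coefficients derived in the paper from the KiK-net and Wenchuan data.

      import numpy as np

      def tau_c_and_pd(disp, vel, dt, window_s=3.0):
          """Return tau_c (s) and Pd (m) from displacement/velocity traces starting at the P pick."""
          n = int(window_s / dt)
          u, v = np.asarray(disp[:n]), np.asarray(vel[:n])
          r = np.trapz(v ** 2, dx=dt) / np.trapz(u ** 2, dx=dt)
          return 2.0 * np.pi / np.sqrt(r), float(np.max(np.abs(u)))

      def magnitude_from_tau_c(tau_c, a=3.4, b=5.8):
          """Linear regression M = a*log10(tau_c) + b; a and b are placeholder values."""
          return a * np.log10(tau_c) + b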

  11. Retrospective stress-forecasting of earthquakes

    Science.gov (United States)

    Gao, Yuan; Crampin, Stuart

    2015-04-01

    Observations of changes in azimuthally varying shear-wave splitting (SWS) above swarms of small earthquakes monitor stress-induced changes to the stress-aligned vertical microcracks pervading the upper crust, lower crust, and uppermost ~400 km of the mantle. (The microcracks are intergranular films of hydrolysed melt in the mantle.) Earthquakes release stress, and an appropriate amount of stress for the relevant magnitude must accumulate before each event. Iceland is on an extension of the Mid-Atlantic Ridge, where two transform zones uniquely run onshore. These onshore transform zones provide semi-continuous swarms of small earthquakes, which are the only place worldwide where SWS can be routinely monitored. Elsewhere SWS must be monitored above temporally-active occasional swarms of small earthquakes, or in infrequent SKS and other teleseismic reflections from the mantle. Observations of changes in SWS time-delays are attributed to stress-induced changes in crack aspect-ratios, allowing stress-accumulation and stress-relaxation to be identified. Monitoring SWS in SW Iceland in 1988, stress-accumulation before an impending earthquake was recognised and emails were exchanged between the University of Edinburgh (EU) and the Iceland Meteorological Office (IMO). On 10th November 1988, EU emailed IMO that a M5 earthquake could occur soon on a seismically-active fault plane where seismicity was still continuing following a M5.1 earthquake six months earlier. Three days later, IMO emailed EU that a M5 earthquake had just occurred on the specified fault-plane. We suggest this is a successful earthquake stress-forecast, where we refer to the procedure as stress-forecasting earthquakes, as opposed to predicting or forecasting, to emphasise the different formalism. Lack of funds has prevented us from monitoring SWS on Iceland seismograms; however, we have identified similar characteristic behaviour of SWS time-delays above swarms of small earthquakes which have enabled us to

  12. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  13. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  14. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  15. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

    Stochastic process models are developed for prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled by using the concept of evolutionary processes. Discussion is focused on earthquake motions on bedrock, which are important for the construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed: one (EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (EMP-IIB Model) to account for the successive fault ruptures and the site location relative to the fault of great earthquakes. (Author)
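
    Evolutionary-process models of the kind described above represent ground motion as filtered noise whose amplitude and frequency content vary with time. The sketch below generates a generic nonstationary accelerogram from envelope-modulated resonant noise; it illustrates the concept only, is not the EMP-IB/EMP-IIB formulation, and all parameter values are assumptions.

      import numpy as np

      def synthetic_nonstationary_motion(duration=20.0, dt=0.01, f0=2.5, zeta=0.6, seed=0):
          """White noise shaped by a damped resonance (AR(2) recursion) and a time-varying
          envelope, as a generic illustration of an evolutionary (nonstationary) process."""
          rng = np.random.default_rng(seed)
          t = np.arange(0.0, duration, dt)
          noise = rng.standard_normal(t.size)
          w0 = 2.0 * np.pi * f0
          wd = w0 * np.sqrt(max(1.0 - zeta ** 2, 1e-6))
          a1 = 2.0 * np.exp(-zeta * w0 * dt) * np.cos(wd * dt)
          a2 = -np.exp(-2.0 * zeta * w0 * dt)
          x = np.zeros_like(noise)
          for i in range(2, t.size):
              x[i] = a1 * x[i - 1] + a2 * x[i - 2] + noise[i]
          envelope = (t / 4.0) ** 2 * np.exp(-t / 4.0)  # build-up, strong phase, decay
          return t, envelope * x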

  16. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of an earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs, near-field strong motion must be explicitly considered in design.

  17. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and about one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurring building losses in Istanbul in the occurrence of a large earthquake. The annualized earthquake losses in Istanbul are between 140 and 300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would be calculated directly on the basis of indexed ground motion levels and damage. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  18. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500.

  19. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  20. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed

  1. Prevention of strong earthquakes: Goal or utopia?

    Science.gov (United States)

    Mukhamediev, Sh. A.

    2010-11-01

    In the present paper, we consider ideas suggesting various kinds of industrial impact on the close-to-failure block of the Earth’s crust in order to break a pending strong earthquake (PSE) into a number of smaller quakes or aseismic slips. Among the published proposals on the prevention of a forthcoming strong earthquake, methods based on water injection and vibro influence merit greater attention as they are based on field observations and the results of laboratory tests. In spite of this, the cited proofs are, for various reasons, insufficient to acknowledge the proposed techniques as highly substantiated; in addition, the physical essence of these methods has still not been fully understood. First, the key concept of the methods, namely, the release of the accumulated stresses (or excessive elastic energy) in the source region of a forthcoming strong earthquake, is open to objection. If we treat an earthquake as a phenomenon of a loss in stability, then, the heterogeneities of the physicomechanical properties and stresses along the existing fault or its future trajectory, rather than the absolute values of stresses, play the most important role. In the present paper, this statement is illustrated by the classical examples of stable and unstable fractures and by the examples of the calculated stress fields, which were realized in the source regions of the tsunamigenic earthquakes of December 26, 2004 near the Sumatra Island and of September 29, 2009 near the Samoa Island. Here, just before the earthquakes, there were no excessive stresses in the source regions. Quite the opposite, the maximum shear stresses τmax were close to their minimum value, compared to τmax in the adjacent territory. In the present paper, we provide quantitative examples that falsify the theory of the prevention of PSE in its current form. It is shown that the measures for the prevention of PSE, even when successful for an already existing fault, can trigger or accelerate a catastrophic

  2. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45 degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. [Figure 12 A and B: the left panel is a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.980, Mb 9.0, earthquake count 376); the right panel plots (EMD+SEM) versus GMT for the same data, with all 376 events, including the main event, following the straight-line curve.]
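
    The 45-degree relationship described above implies that, as GMT runs from 0 to 24 hours, (EMD+SEM) sweeps through 360 degrees, i.e. roughly 15 degrees per hour. A minimal check of that claimed relation for a single event, assuming EMD and SEM are already available, could look like the following (the tolerance is an arbitrary assumption):

      def follows_45_degree_line(gmt_hours, emd_plus_sem_deg, tolerance_deg=15.0):
          """True if (EMD+SEM) differs from 15 deg/hour * GMT by less than the tolerance,
          comparing the two angles modulo 360 degrees."""
          residual = (emd_plus_sem_deg - 15.0 * gmt_hours) % 360.0
          return min(residual, 360.0 - residual) <= tolerance_deg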

  3. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  4. GEM - The Global Earthquake Model

    Science.gov (United States)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  5. Effects of intratracheally instilled laser printer-emitted engineered nanoparticles in a mouse model: A case study of toxicological implications from nanomaterials released during consumer use.

    Science.gov (United States)

    Pirela, Sandra V; Lu, Xiaoyan; Miousse, Isabelle; Sisler, Jennifer D; Qian, Yong; Guo, Nancy; Koturbash, Igor; Castranova, Vincent; Thomas, Treye; Godleski, John; Demokritou, Philip

    2016-01-01

    Incorporation of engineered nanomaterials (ENMs) into toners used in laser printers has led to countless quality and performance improvements. However, the release of ENMs during printing (consumer use) has raised concerns about their potential adverse health effects. The aim of this study was to use "real world" printer-emitted particles (PEPs), rather than raw toner powder, and assess the pulmonary responses following exposure by intratracheal instillation. Nine-week-old male Balb/c mice were exposed to various doses of PEPs (0.5, 2.5 and 5 mg/kg body weight) by intratracheal instillation. These exposure doses are comparable to real-world human inhalation exposures ranging from 13.7 to 141.9 h of printing. Toxicological parameters reflecting distinct mechanisms of action were evaluated, including lung membrane integrity, inflammation and regulation of DNA methylation patterns. Results from this in vivo toxicological analysis showed that while intratracheal instillation of PEPs caused no changes in the lung membrane integrity, there was a pulmonary immune response, indicated by an elevation in neutrophil and macrophage percentage over the vehicle control and low dose PEPs groups. Additionally, exposure to PEPs upregulated expression of the Ccl5 (Rantes), Nos1 and Ucp2 genes in the murine lung tissue and modified components of the DNA methylation machinery (Dnmt3a) and expression of the transposable element (TE) LINE-1 compared to the control group. These genes are involved in both the repair process from oxidative damage and the initiation of immune responses to foreign pathogens. The results are in agreement with findings from previous in vitro cellular studies and suggest that PEPs may cause immune responses in addition to modifications in gene expression in the murine lung at doses that can be comparable to real-world exposure scenarios, thereby raising concerns of deleterious health effects.

  6. Incorporating human-triggered earthquake risks into energy and water policies

    Science.gov (United States)

    Klose, C. D.; Seeber, L.; Jacob, K. H.

    2010-12-01

    A comprehensive understanding of earthquake risks in urbanized regions requires an accurate assessment of both urban vulnerabilities and hazards from earthquakes, including ones whose timing might be affected by human activities. Socioeconomic risks associated with human-triggered earthquakes are often misconstrued and receive little scientific, legal, and public attention. Worldwide, more than 200 damaging earthquakes associated with industrialization and urbanization have been documented since the beginning of the 20th century. Geomechanical pollution due to large-scale geoengineering activities can advance the clock of earthquakes, trigger new seismic events or even shut down natural background seismicity. Activities include mining, hydrocarbon production, fluid injections, water reservoir impoundments and deep-well geothermal energy production. This type of geohazard has impacts on human security at a regional and national level. Some planned or considered future engineering projects raise particularly strong concerns about triggered earthquakes, such as, for instance, sequestration of carbon dioxide by injecting it deep underground and large-scale natural gas production in the Marcellus shale in the Appalachian basin. Worldwide examples of earthquakes are discussed, including their associated losses of human life and monetary losses (e.g., the 1989 Newcastle and Volkershausen earthquakes, the 2001 Killari earthquake, the 2006 Basel earthquake, the 2010 Wenchuan earthquake). An overview is given of global statistics of human-triggered earthquakes, including depths and time delay of triggering. Lastly, strategies are described, including risk mitigation measures such as urban planning adaptations and seismic hazard mapping.

  7. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  8. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  9. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    A joint analysis of recordings of the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013 identified unipolar pulses of constant amplitude associated with earthquakes, whose activity is closely correlated with the energy of the electromagnetic field. To explain this, a hypothesis about the cooperative character of these pulses is proposed.

  10. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available A joint analysis of recordings of the electrical component of the Earth's natural electromagnetic field and the catalog of earthquakes in Kamchatka in 2013 identified unipolar pulses of constant amplitude associated with earthquakes, whose activity is closely correlated with the energy of the electromagnetic field. To explain this, a hypothesis about the cooperative character of these pulses is proposed.

  11. Earthquake Early Warning: User Education and Designing Effective Messages

    Science.gov (United States)

    Burkett, E. R.; Sellnow, D. D.; Jones, L.; Sellnow, T. L.

    2014-12-01

    The U.S. Geological Survey (USGS) and partners are transitioning from test-user trials of a demonstration earthquake early warning system (ShakeAlert) to deciding and preparing how to implement the release of earthquake early warning information, alert messages, and products to the public and other stakeholders. An earthquake early warning system uses seismic station networks to rapidly gather information about an occurring earthquake and send notifications to user devices ahead of the arrival of potentially damaging ground shaking at their locations. Earthquake early warning alerts can thereby allow time for actions to protect lives and property before arrival of damaging shaking, if users are properly educated on how to use and react to such notifications. A collaboration team of risk communications researchers and earth scientists is researching the effectiveness of a chosen subset of potential earthquake early warning interface designs and messages, which could be displayed on a device such as a smartphone. Preliminary results indicate, for instance, that users prefer alerts that include 1) a map to relate their location to the earthquake and 2) instructions for what to do in response to the expected level of shaking. A number of important factors must be considered to design a message that will promote appropriate self-protective behavior. While users prefer to see a map, how much information can be processed in limited time? Are graphical representations of wavefronts helpful or confusing? The most important factor to promote a helpful response is the predicted earthquake intensity, or how strong the expected shaking will be at the user's location. Unlike Japanese users of early warning, few Californians are familiar with the earthquake intensity scale, so we are exploring how differentiating instructions between intensity levels (e.g., "Be aware" for lower shaking levels and "Drop, cover, hold on" at high levels) can be paired with self-directed supplemental

  12. Use of earthquake experience data

    International Nuclear Information System (INIS)

    Eder, S.J.; Eli, M.W.

    1991-01-01

    At many of the older existing US Department of Energy (DOE) facilities, the need has arisen for evaluation guidelines for natural phenomena hazard assessment. The effect of a design basis earthquake at most of these facilities is one of the main concerns. Earthquake experience data can provide a basis for the needed seismic evaluation guidelines, resulting in an efficient screening evaluation methodology for several of the items that are in the scope of the DOE facility reviews. The experience-based screening evaluation methodology, when properly established and implemented by trained engineers, has proven to result in sufficient safety margins and focuses on real concerns via facility walkdowns, usually at costs much less than the alternative options of analysis and testing. This paper summarizes a program that is being put into place to establish uniform seismic evaluation guidelines and criteria for evaluation of existing DOE facilities. The intent of the program is to maximize use of past experience, in conjunction with a walkdown screening evaluation process

  13. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    Science.gov (United States)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on rapid estimates of the P-wave magnitude, which generally carry large uncertainties and suffer from the known saturation problem. In the case of the 2011 Mw9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates of the magnitude even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This consequently led to underestimated tsunami heights. By using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test for the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would have been theoretically possible to image the complex rupture process of the 2011 Tohoku earthquake automatically, soon after or even during the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique is helpful for reducing false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  14. The 2016 Central Italy Earthquake: an Overview

    Science.gov (United States)

    Amato, A.

    2016-12-01

    The M6 central Italy earthquake occurred on the seismic backbone of Italy, just in the middle of the highest hazard belt. The shock hit suddenly during the night of August 24, when people were asleep; no foreshocks occurred before the main event. The earthquake ruptured from 10 km depth to the surface and produced more than 17,000 aftershocks (as of Oct. 19) spread over a 40x20 km2 area elongated NW-SE. It is geologically very similar to previous recent events of the Apennines. Both the 2009 L'Aquila earthquake to the south and the 1997 Colfiorito earthquake to the north were characterized by the activation of adjacent fault segments. Despite its magnitude and the well-known seismic hazard of the region, the earthquake produced extensive damage and 297 fatalities. The town of Amatrice, which paid the highest toll, had been classified in zone 1 (the highest) since 1915, but the buildings in this and other villages proved highly vulnerable. In contrast, in the town of Norcia, which also experienced strong ground shaking, no collapses occurred, most likely due to the retrofitting carried out after an earthquake in 1979. Soon after the quake, the INGV Crisis Unit convened at night in the Rome headquarters in order to coordinate the activities. The first field teams reached the epicentral area at 7 am with the portable seismic stations installed to monitor the aftershocks; other teams followed to map surface faults and damage, to measure GPS sites, to install instruments for site response studies, and so on. The INGV Crisis Unit includes the Press office and the INGVterremoti team, in order to manage and coordinate the communication towards the Civil Protection Dept. (DPC), the media and the web. Several tens of reports and updates have been delivered in the first month of the sequence to DPC. Also, due to the controversial situation arising from the L'Aquila earthquake and trials, particular attention was given to communication: continuous and timely information has been released to

  15. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    Science.gov (United States)

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have over two decades stoked seismologists' hopes to successfully predict an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, questioning once more the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrent models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yielded a substantial change in b-values.
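
    The b-value based forecast rate mentioned above follows from the Gutenberg-Richter law. A compact sketch of the standard calculation, using the Aki (1965) maximum-likelihood estimator with the Utsu binning correction, is given below; the catalog, completeness magnitude and binning are assumptions, not the paper's data set.

      import numpy as np

      def b_value_max_likelihood(mags, mc, dm=0.1):
          """Aki (1965) maximum-likelihood b-value for magnitudes >= mc,
          with the dm/2 correction for binned magnitudes (Utsu)."""
          m = np.asarray(mags, dtype=float)
          m = m[m >= mc]
          return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

      def annual_rate_above(mags, catalog_years, mc, m_target, dm=0.1):
          """Extrapolate the Gutenberg-Richter law to a target magnitude, e.g. M6."""
          m = np.asarray(mags, dtype=float)
          b = b_value_max_likelihood(m, mc, dm)
          n_mc = np.sum(m >= mc) / catalog_years
          return n_mc * 10.0 ** (-b * (m_target - mc))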

  16. Failures and suggestions in Earthquake forecasting and prediction

    Science.gov (United States)

    Sacks, S. I.

    2013-12-01

    Seismologists have had poor success in earthquake prediction. However, wide-ranging observations from earlier great earthquakes show that precursory data can exist. In particular, two aspects seem promising. In agreement with simple physical modeling, b-values decrease in highly loaded fault zones for years before failure. Potentially more usefully, in high-stress regions, the breakdown of dilatant patches leading to failure can yield observations related to expelled water. The volume increase (dilatancy) caused by high shear stresses decreases the pore pressure. Eventually, water flows back in, restoring the pore pressure, promoting failure and expelling the extra water. Of course, in a generally stressed region there may be many small patches that fail, such as observed before the 1975 Haicheng earthquake. Only a few days before the major event will most of the dilatancy breakdown occur in the fault zone itself, as for the destructive 1976 Tangshan event. 'Water release' effects have been observed before the 1923 great Kanto earthquake, the 1984 Yamasaki event, the 1975 Haicheng and the 1976 Tangshan earthquakes, and also the 1995 Kobe earthquake. While there are obvious difficulties in water release observations, not least because there is currently no observational network anywhere, historical data do suggest some promise if we broaden our approach to this difficult subject.

  17. New characteristics of intensity assessment of Sichuan Lushan "4.20" Ms 7.0 earthquake

    Science.gov (United States)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao

    2014-08-01

    Rapid and accurate post-earthquake assessment of the macroscopic influence of seismic ground motion is significant for earthquake emergency relief, post-earthquake reconstruction and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M7.0) that occurred in Lushan County, Ya'an City, Sichuan, at 8:02 on April 20, 2013 provides a scientific basis for emergency relief, economic loss assessment and post-earthquake reconstruction. In this paper, the means for blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and intensity review, as well as the corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M7.0 earthquake and their influencing factors are analyzed, providing a reference for future seismic intensity assessments.

  18. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  19. An interdisciplinary approach to study Pre-Earthquake processes

    Science.gov (United States)

    Ouzounov, D.; Pulinets, S. A.; Hattori, K.; Taylor, P. T.

    2017-12-01

    We will summarize a multi-year research effort on wide-ranging observations of pre-earthquake processes. Based on space and ground data, we present some new results relevant to the existence of pre-earthquake signals. Over the past 15-20 years there has been a major revival of interest in pre-earthquake studies in Japan, Russia, China, the EU, Taiwan and elsewhere. Recent large-magnitude earthquakes in Asia and Europe have shown the importance of these studies in the search for earthquake precursors, whether for forecasting or prediction. Some new results were obtained from modeling of the atmosphere-ionosphere connection and analyses of seismic records (foreshocks/aftershocks), geochemical, electromagnetic, and thermodynamic processes related to stress changes in the lithosphere, along with their statistical and physical validation. This cross-disciplinary approach could make an impact on our further understanding of the physics of earthquakes and the phenomena that precede their energy release. We also present the potential impact of these interdisciplinary studies on earthquake predictability. A detailed summary of our approach and that of several international researchers will be part of this session and will be subsequently published in a new AGU/Wiley volume. This book is part of the Geophysical Monograph series and is intended to show the variety of parameters (seismic, atmospheric, geochemical and historical) involved in this important field of research, and to bring this knowledge and awareness to a broader geosciences community.

  20. Experimental evidence that thrust earthquake ruptures might open faults.

    Science.gov (United States)

    Gabuchian, Vahe; Rosakis, Ares J; Bhat, Harsha S; Madariaga, Raúl; Kanamori, Hiroo

    2017-05-18

    Many of Earth's great earthquakes occur on thrust faults. These earthquakes predominantly occur within subduction zones, such as the 2011 moment magnitude 9.0 earthquake in Tohoku-Oki, Japan, or along large collision zones, such as the 1999 moment magnitude 7.7 earthquake in Chi-Chi, Taiwan. Notably, these two earthquakes had a maximum slip that was very close to the surface. This contributed to the destructive tsunami that occurred during the Tohoku-Oki event and to the large amount of structural damage caused by the Chi-Chi event. The mechanism that results in such large slip near the surface is poorly understood, as shallow parts of thrust faults are considered to be frictionally stable. Here we use earthquake rupture experiments to reveal the existence of a torquing mechanism of thrust fault ruptures near the free surface that causes them to unclamp and slip large distances. Complementary numerical modelling of the experiments confirms that the hanging-wall wedge undergoes pronounced rotation in one direction as the earthquake rupture approaches the free surface, and this torque is released as soon as the rupture breaks the free surface, resulting in the unclamping and violent 'flapping' of the hanging-wall wedge. Our results imply that the shallow extent of the seismogenic zone of a subducting interface is not fixed and can extend up to the trench during great earthquakes through a torquing mechanism.

  1. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    Science.gov (United States)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision began around 55 Ma, the Himalaya has accommodated about 2000 km of convergence along its arc. Strain energy accumulates as convergence proceeds at 37-44 mm/yr and is released at times as earthquakes. The Garhwal Himalaya is located on the western side of a seismic gap where a great earthquake has been overdue for at least 200 years. This seismic gap (the Central Seismic Gap: CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, lies on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations in the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can be well described and studied. In the present study, we present a spatio-temporal analysis of relocated local micro-to-moderate earthquakes recorded by a seismicity monitoring network that has been operational since 2007. The earthquake locations are relocated using the HypoDD (double-difference hypocenter method for earthquake relocation) program. The dataset from July 2007 to September 2015 has been used in this study to estimate spatio-temporal relationships, moment tensor (MT) solutions for the earthquakes of M>3.0, stress tensors and their interactions. We have also used composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show a thrust-type mechanism and are located near the mid-crustal ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer Lesser Himalaya and Higher Himalaya regions. The prevailing stress has been identified to be compressional towards NNE-SSW, which is the direction of relative plate motion between the India and Eurasia continental plates. The low friction coefficient estimated along with the stress inversions

  2. Quantifying slip balance in the earthquake cycle: Coseismic slip model constrained by interseismic coupling

    KAUST Repository

    Wang, Lifeng

    2015-11-11

    The long-term slip on faults has to follow, on average, the plate motion, while slip deficit is accumulated over shorter time scales (e.g., between the large earthquakes). Accumulated slip deficits eventually have to be released by earthquakes and aseismic processes. In this study, we propose a new inversion approach for coseismic slip, taking interseismic slip deficit as prior information. We assume a linear correlation between coseismic slip and interseismic slip deficit, and invert for the coefficients that link the coseismic displacements to the required strain accumulation time and seismic release level of the earthquake. We apply our approach to the 2011 M9 Tohoku-Oki earthquake and the 2004 M6 Parkfield earthquake. Under the assumption that the largest slip almost fully releases the local strain (as indicated by borehole measurements, Lin et al., 2013), our results suggest that the strain accumulated along the Tohoku-Oki earthquake segment has been almost fully released during the 2011 M9 rupture. The remaining slip deficit can be attributed to the postseismic processes. Similar conclusions can be drawn for the 2004 M6 Parkfield earthquake. We also estimate the required time of strain accumulation for the 2004 M6 Parkfield earthquake to be ~25 years (confidence interval of [17, 43] years), consistent with the observed average recurrence time of ~22 years for M6 earthquakes in Parkfield. For the Tohoku-Oki earthquake, we estimate the recurrence time of ~500-700 years. This new inversion approach for evaluating slip balance can be generally applied to any earthquake for which dense geodetic measurements are available.
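
    The bookkeeping behind the recurrence-time estimate can be illustrated with a back-of-the-envelope sketch (this is not the authors' inversion): if coseismic slip must on average repay the interseismic slip deficit, the implied accumulation time is the released slip divided by the slip-deficit rate. The numbers used below are hypothetical placeholders, not values taken from the paper.

```python
# Minimal sketch of the slip-balance bookkeeping underlying the abstract
# (not the authors' inversion): the implied strain-accumulation time is
# the slip released divided by the slip-deficit accumulation rate.
# All numbers below are hypothetical placeholders, not values from the paper.

def accumulation_time(coseismic_slip_m, slip_deficit_rate_mm_per_yr,
                      release_level=1.0):
    """Years of interseismic loading needed to supply the observed slip,
    assuming a fraction `release_level` of the local deficit is released."""
    deficit_rate_m_per_yr = slip_deficit_rate_mm_per_yr / 1000.0
    return release_level * coseismic_slip_m / deficit_rate_m_per_yr

# Hypothetical example: ~0.5 m of slip against a 20 mm/yr slip-deficit rate
print(accumulation_time(0.5, 20.0))   # -> 25.0 years of accumulation
```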

  3. Quantifying slip balance in the earthquake cycle: Coseismic slip model constrained by interseismic coupling

    KAUST Repository

    Wang, Lifeng; Hainzl, Sebastian; Mai, Paul Martin

    2015-01-01

    The long-term slip on faults has to follow, on average, the plate motion, while slip deficit is accumulated over shorter time scales (e.g., between the large earthquakes). Accumulated slip deficits eventually have to be released by earthquakes and aseismic processes. In this study, we propose a new inversion approach for coseismic slip, taking interseismic slip deficit as prior information. We assume a linear correlation between coseismic slip and interseismic slip deficit, and invert for the coefficients that link the coseismic displacements to the required strain accumulation time and seismic release level of the earthquake. We apply our approach to the 2011 M9 Tohoku-Oki earthquake and the 2004 M6 Parkfield earthquake. Under the assumption that the largest slip almost fully releases the local strain (as indicated by borehole measurements, Lin et al., 2013), our results suggest that the strain accumulated along the Tohoku-Oki earthquake segment has been almost fully released during the 2011 M9 rupture. The remaining slip deficit can be attributed to the postseismic processes. Similar conclusions can be drawn for the 2004 M6 Parkfield earthquake. We also estimate the required time of strain accumulation for the 2004 M6 Parkfield earthquake to be ~25 years (confidence interval of [17, 43] years), consistent with the observed average recurrence time of ~22 years for M6 earthquakes in Parkfield. For the Tohoku-Oki earthquake, we estimate the recurrence time of ~500-700 years. This new inversion approach for evaluating slip balance can be generally applied to any earthquake for which dense geodetic measurements are available.

  4. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of experience from the Spitak earthquake (Armenia, December 1988), it is shown that an earthquake causes intensive and prolonged radon splashes which, while rapidly dispersing in the open near-surface atmosphere, are contrastingly displayed in enclosed premises (dwellings, schools, kindergartens) even at considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes spans the period from the first foreshock to the last aftershock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured - 25,000 people; radiation-induced diseases - over 300,000 people). The influence of radiation correlates directly with the earthquake force. This conclusion is underpinned by indoor radon monitoring data for Yerevan (120 km from the epicenter) since 1987, comprising 5,450 measurements, and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, levels of indoor radon concentration and effective equivalent dose, the impact of radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered consequences of prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia during the period of more than a year after the earthquake, the prevalence of malignant cancer forms in disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  5. Prediction of the UO2 fission gas release data of Bellamy and Rich using a model recently developed by Combustion Engineering

    International Nuclear Information System (INIS)

    Freeburn, H.R.; Pati, S.R.

    1983-01-01

    The trend in the light water reactor industry to higher discharge burnups of UO2 fuel rods has initiated the modification of existing fuel rod models to better account for high burnup effects. The degree to which fission gas release from UO2 fuel is enhanced at higher burnup is being addressed in the process. Fission gas release modeling should include the separation of the individual effects of thermal diffusion and any burnup enhancement on the release. Although some modelers have interpreted the Bellamy and Rich data on fission gas release from UO2 fuel in this fashion, they have assumed that below about 1250°C the gas release is not temperature-dependent, and this has led them to predict a very strong burnup enhancement of gas release above 20 MWd/kgU. More recent data, however, suggest that an appreciable amount of fission gas is released by a thermal diffusion mechanism at even lower temperatures and will add to the fission gas released due to the temperature-independent mechanisms of knockout and recoil

  6. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    Science.gov (United States)

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  7. Quantification of social contributions to earthquake mortality

    Science.gov (United States)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
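
    A hedged sketch of the decoupling step described above is given below; it is not the authors' statistical model. It fits a simple "physical" regression of log fatalities on log exposed population and treats the per-country residuals as a vulnerability index; the covariate choice and the data are synthetic assumptions for illustration.

```python
# Hedged illustration of the decoupling step described above, not the
# authors' model: fit a simple "physical" regression of log fatalities on
# log exposed population at strong shaking, then treat per-country residuals
# as a socio-economic vulnerability index. Data below are synthetic.
import numpy as np

rng = np.random.default_rng(2)
countries = np.array(["A", "B", "C", "D", "E"] * 8)
log_exposure = rng.uniform(3.0, 7.0, size=countries.size)   # log10 people exposed
country_effect = {"A": -0.8, "B": -0.3, "C": 0.0, "D": 0.4, "E": 0.9}
log_deaths = (-2.0 + 0.9 * log_exposure
              + np.array([country_effect[c] for c in countries])
              + rng.normal(0.0, 0.2, size=countries.size))

# "Physical" model: least-squares fit of log_deaths on log_exposure only
slope, intercept = np.polyfit(log_exposure, log_deaths, 1)
residuals = log_deaths - (intercept + slope * log_exposure)

# Vulnerability index: mean residual per country (positive = more deaths
# than the physical model predicts, i.e. anomalously vulnerable)
for c in sorted(set(countries)):
    print(c, round(residuals[countries == c].mean(), 2))
```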

  8. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)

  9. Short presentation on some researches activities about near field earthquakes

    International Nuclear Information System (INIS)

    Donald, John

    2002-01-01

    The major hazard posed by earthquakes is often thought to be due to moderate to large magnitude events. However, there have been many cases where earthquakes of moderate and even small magnitude have caused very significant destruction when they have coincided with population centres. Even though the area of intense ground shaking caused by such events is generally small, the epicentral motions can be severe enough to cause damage even in well-engineered structures. Two issues are addressed here, the first being the identification of the minimum earthquake magnitude likely to cause damage to engineered structures and the limits of the near-field for small-to-moderate magnitude earthquakes. The second issue addressed is whether features of near-field ground motions such as directivity, which can significantly enhance the destructive potential, occur in small-to-moderate magnitude events. The accelerograms from the 1986 San Salvador (El Salvador) earthquake indicate that it may be non-conservative to assume that near-field directivity effects only need to be considered for earthquakes of moment magnitude M 6.5 and greater. (author)

  10. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
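
    The dispersion comparison at the heart of this kind of test can be sketched as follows (an illustration of the general approach, not the paper's code): bin a catalogue into fixed time windows, compare the variance-to-mean ratio of the counts with the Poisson value of 1, and fit a negative-binomial distribution by the method of moments. The window length and the synthetic clustered catalogue are assumptions for the example.

```python
# Illustrative sketch (not the paper's code): test over-dispersion of
# earthquake counts per time window against the Poisson assumption and fit
# a negative-binomial distribution (NBD) by the method of moments.
import numpy as np
from scipy import stats

def count_dispersion(event_times, window_days=365.25):
    """Counts per fixed window, plus Poisson vs NBD summary statistics."""
    t = np.asarray(event_times)
    edges = np.arange(t.min(), t.max() + window_days, window_days)
    counts, _ = np.histogram(t, bins=edges)
    mean, var = counts.mean(), counts.var(ddof=1)
    # Method-of-moments NBD fit: var = mean + mean**2 / r
    r = mean**2 / (var - mean) if var > mean else np.inf
    p = r / (r + mean) if np.isfinite(r) else 1.0
    return {
        "mean": mean,
        "variance": var,
        "overdispersion": var / mean,              # 1 for a Poisson process
        "nbd_r": r,
        "nbd_p": p,
        "skewness": stats.skew(counts),
        "poisson_skewness": 1.0 / np.sqrt(mean),   # theoretical Poisson value
    }

# Toy usage: clustered synthetic occurrence times (days) are over-dispersed
rng = np.random.default_rng(3)
bursts = np.repeat(rng.uniform(0, 365.25 * 40, size=300), rng.poisson(5, 300))
times = bursts + rng.exponential(20.0, size=bursts.size)
print(count_dispersion(times))
```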

  11. The 2016 Kumamoto Earthquakes: Cascading Geological Hazards and Compounding Risks

    Directory of Open Access Journals (Sweden)

    Katsuichiro Goda

    2016-08-01

    A sequence of two strike-slip earthquakes occurred on 14 and 16 April 2016 in the intraplate region of Kyushu Island, Japan, away from subduction zones, and caused significant damage and disruption to the Kumamoto region. The analyses of the regional seismic catalog and available strong motion recordings reveal striking characteristics of the events, such as migrating seismicity, earthquake surface rupture, and major foreshock-mainshock earthquake sequences. To gain valuable lessons from the events, a UK Earthquake Engineering Field Investigation Team (EEFIT) was dispatched to Kumamoto, and earthquake damage surveys were conducted to relate observed earthquake characteristics to building and infrastructure damage caused by the earthquakes. The lessons learnt from the reconnaissance mission have important implications for current seismic design practice regarding the required seismic resistance of structures under multiple shocks and the seismic design of infrastructure subject to large ground deformation. The observations also highlight the consequences of cascading geological hazards on community resilience. To share the gathered damage data widely, geo-tagged photos are organized using Google Earth and the KMZ file is made publicly available.

  12. ESTIMATION OF AMPLIFICATION FACTOR IN EARTHQUAKE ENGINEERING

    Directory of Open Access Journals (Sweden)

    Nazarov Yuriy Pavlovich

    2015-03-01

    The authors are the developers of the Odyssey software (Eurosoft Co.) for the analysis of seismological data and the computation of seismic loads and their parameters. While communicating with users of the software, the authors have noted some uncertainty about both the understanding of the term "amplification factor (AF)" and the calculation of the amplification factor using various methods. In this article, a simple example shows that determining the amplification factor as the ratio of the spectral acceleration to the maximal ground acceleration follows from the classical definition of the AF as the ratio of the maximal dynamic displacement to the displacement under a static load. Deterministic and probabilistic approaches to calculating the AF are discussed. An example is given of the calculation of AFs and their envelopes for the translational and rotational components of seismic impact using the Odyssey software.
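
    To make the definition concrete, the sketch below computes an AF in the sense discussed above: the peak absolute acceleration of a damped elastic single-degree-of-freedom oscillator divided by the peak ground acceleration, using Newmark average-acceleration time stepping. This is a generic textbook formulation, not the Odyssey implementation; the oscillator period, damping and synthetic accelerogram are assumptions for the example.

```python
# Minimal sketch (not the Odyssey implementation) of the amplification
# factor discussed above: the peak absolute acceleration of a damped elastic
# SDOF oscillator divided by the peak ground acceleration.
import numpy as np

def sdof_peak_abs_accel(ag, dt, period, damping=0.05):
    """Peak absolute acceleration of a linear SDOF oscillator subjected to
    ground acceleration ag (m/s^2), via Newmark average acceleration."""
    wn = 2.0 * np.pi / period
    m, c, k = 1.0, 2.0 * damping * wn, wn**2          # unit mass
    beta, gamma = 0.25, 0.5
    u = v = 0.0
    a = -ag[0]                                        # initial relative accel
    k_hat = k + gamma * c / (beta * dt) + m / (beta * dt**2)
    peak = abs(a + ag[0])
    for i in range(1, len(ag)):
        p = -m * ag[i]
        p_hat = (p
                 + m * (u / (beta * dt**2) + v / (beta * dt) + (1/(2*beta) - 1) * a)
                 + c * (gamma * u / (beta * dt) + (gamma/beta - 1) * v
                        + dt * (gamma/(2*beta) - 1) * a))
        u_new = p_hat / k_hat
        v_new = ((gamma / (beta * dt)) * (u_new - u) + (1 - gamma/beta) * v
                 + dt * (1 - gamma/(2*beta)) * a)
        a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1/(2*beta) - 1) * a
        u, v, a = u_new, v_new, a_new
        peak = max(peak, abs(a + ag[i]))              # absolute acceleration
    return peak

def amplification_factor(ag, dt, period, damping=0.05):
    return sdof_peak_abs_accel(ag, dt, period, damping) / np.max(np.abs(ag))

# Toy usage: a short synthetic accelerogram sampled at 100 Hz
rng = np.random.default_rng(4)
dt, t = 0.01, np.arange(0, 20, 0.01)
ag = np.exp(-0.2 * t) * np.sin(2 * np.pi * 2.0 * t) + 0.05 * rng.normal(size=t.size)
print(amplification_factor(ag, dt, period=0.5))       # AF for a 0.5 s oscillator
```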

  13. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not yet been fully understood. Instead, much former investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components above 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would be indispensable.

  14. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    Science.gov (United States)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas is a challenge that requires collaboration across scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults, developing dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation just after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern, because this plate has caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which had 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that an M7+ earthquake will cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) of economic loss. This earthquake is evaluated to occur with a probability of 70% within 30 years by the Earthquake Research Committee of Japan. In order to mitigate the disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. The results obtained in the respective fields will be integrated before project termination to improve information on strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and

  15. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  16. Earthquakes; May-June 1982

    Science.gov (United States)

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in the Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  17. Correlation of pre-earthquake electromagnetic signals with laboratory and field rock experiments

    Directory of Open Access Journals (Sweden)

    T. Bleier

    2010-09-01

    Analysis of the 2007 M5.4 Alum Rock earthquake near San José, California, showed that magnetic pulsations were present in large numbers and with significant amplitudes during the 2-week period leading up to the event. These pulsations were 1–30 s in duration, had unusual polarities (many with only positive or only negative polarities versus both polarities), and differed from other pulsations observed over 2 years of data in that the pulse sequence was sustained over a 2-week period prior to the quake and then disappeared shortly after the quake. A search for the underlying physics process that might explain these pulses was undertaken, and one theory (Freund, 2002) demonstrated that charge carriers were released when various types of rocks were stressed in a laboratory environment. It was also significant that the observed charge carrier generation was transient and resulted in pulsating current patterns. In an attempt to determine whether this phenomenon occurred outside of the laboratory environment, the authors scaled up the physics experiment from a relatively small rock sample in a dry laboratory setting to a large 7-metric-tonne boulder of Yosemite granite. This boulder was located in a natural, humid, above-ground setting at Bass Lake, CA. The boulder was instrumented with two Zonge Engineering Model ANT4 induction-type magnetometers, two Trifield Air Ion Counters, a surface charge detector, a geophone, a Bruker Model EM27 Fourier Transform Infra-Red (FTIR) spectrometer with Stirling-cycle cooler, and various temperature sensors. The boulder was stressed over about 8 h using expanding concrete (Bustar™) until it fractured into three major pieces. The recorded data showed surface charge build-up, magnetic pulsations, impulsive air conductivity changes, and acoustical cues starting about 5 h before the boulder actually broke. These magnetic and air conductivity pulse signatures resembled both the laboratory

  18. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdahl, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product for use in seismic hazard studies. The new catalogue was necessary because improved seismic hazard studies require earthquake catalogues that are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue, both for earthquakes that occurred after 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as a basis for cross-checking the consistency in location and magnitude of those earthquakes listed in both the ISC-GEM global catalogue and regional catalogues.

  19. Statistical physics approach to earthquake occurrence and forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Arcangelis, Lucilla de [Department of Industrial and Information Engineering, Second University of Naples, Aversa (CE) (Italy); Godano, Cataldo [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy); Grasso, Jean Robert [ISTerre, IRD-CNRS-OSUG, University of Grenoble, Saint Martin d’Héres (France); Lippiello, Eugenio, E-mail: eugenio.lippiello@unina2.it [Department of Mathematics and Physics, Second University of Naples, Caserta (Italy)

    2016-04-25

    There is striking evidence that the dynamics of the Earth's crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth's crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in the size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block model to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for
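
    The branching-process description referred to above can be made concrete with a minimal ETAS-type conditional intensity in time, combining a background rate with Omori-Utsu decay and exponential productivity in magnitude. The sketch below is illustrative of the model class discussed in the review, not a specific fitted model; all parameter values are assumptions.

```python
# Sketch of the branching-process idea referred to above: an ETAS-type
# conditional intensity in time, combining a background rate with
# Omori-Utsu aftershock decay and exponential productivity in magnitude.
# Parameter values are illustrative assumptions, not fitted values.
import numpy as np

def etas_intensity(t, event_times, event_mags, mu=0.2, K=0.02,
                   alpha=1.0, c=0.01, p=1.1, m_ref=3.0):
    """Conditional intensity lambda(t | history) in events per day."""
    t_i = np.asarray(event_times)
    m_i = np.asarray(event_mags)
    past = t_i < t
    contrib = (K * 10.0 ** (alpha * (m_i[past] - m_ref))
               / (t - t_i[past] + c) ** p)
    return mu + contrib.sum()

# Toy usage: intensity one day after a magnitude 6 event at t = 0
print(etas_intensity(1.0, event_times=[0.0], event_mags=[6.0]))
```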

  20. Proposal of the concept of selection of accidents that release large amounts of radioactive substances in the high temperature engineering test reactor

    International Nuclear Information System (INIS)

    Ono, Masato; Honda, Yuki; Takada, Shoji; Sawa, Kazuhiro

    2015-01-01

    Article 53 of the rules on the position, structure and equipment of test and research reactors ("prevention of expansion of accidents that release large amounts of radioactive material") addresses accidents whose frequency of occurrence is lower than that of design basis accidents but which are likely to release a large amount of radioactive material or radiation from the facility; when such an event occurs, the necessary measures must be taken to prevent the spread of the accident. The selection therefore covers accidents of lower frequency than design basis accidents that may release a large amount of radioactive material or radiation. (author)

  1. The Manchester earthquake swarm of October 2002

    Science.gov (United States)

    Baptie, B.; Ottemoeller, L.

    2003-04-01

    An earthquake sequence started in the Greater Manchester area of the United Kingdom on October 19, 2002. This has continued to the time of writing and has consisted of more than 100 discrete earthquakes. Three temporary seismograph stations were installed to supplement existing permanent stations and to better understand the relationship between the seismicity and the local geology. Due to the urban location, the events were felt by a large number of people. The largest event, on October 21, had a magnitude ML 3.9. The activity appears to be an earthquake swarm, since there is no clear distinction between a main shock and aftershocks. However, most of the energy during the sequence was actually released in two earthquakes separated by a few seconds in time, on October 21 at 11:42. Other examples of swarm activity in the UK include Comrie (1788-1801, 1839-46), Glenalmond (1970-72), Doune (1997) and Blackford (1997-98, 2000-01) in central Scotland, Constantine (1981, 1986, 1992-94) in Cornwall, and Johnstonbridge (mid-1980s) and Dumfries (1991, 1999). The clustering of these events in time and space does suggest a causal relationship between the events of the sequence. Joint hypocenter determination was used to simultaneously locate the swarm earthquakes, determine station corrections and improve the relative locations. It seems likely that all events in the sequence originate from a relatively small source volume. This is supported by the similarities in source mechanism and waveform signals between the various events. Focal depths were found to be very shallow, of the order of 2-3 km. Source mechanisms determined for the largest of the events show strike-slip solutions along either northeast-southwest or northwest-southeast striking fault planes. The surface expression of faults in the epicentral area is generally northwest-southeast, suggesting that this is the more likely fault plane.

  2. Calculated concentrations of any radionuclide deposited on the ground by release from underground nuclear detonations, tests of nuclear rockets, and tests of nuclear ramjet engines

    International Nuclear Information System (INIS)

    Hicks, H.G.

    1981-11-01

    This report presents calculated gamma radiation exposure rates and ground deposition of related radionuclides resulting from three types of event that deposited detectable radioactivity outside the Nevada Test Site complex, namely, underground nuclear detonations, tests of nuclear rocket engines and tests of nuclear ramjet engines

  3. Fast rise times and the physical mechanism of deep earthquakes

    Science.gov (United States)

    Houston, H.; Williams, Q.

    1991-01-01

    A systematic global survey of the rise times and stress drops of deep and intermediate earthquakes is reported. When the rise times are scaled to the seismic moment release of the events, their average is nearly twice as fast for events deeper than about 450 km as for shallower events.

  4. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  5. Sequential VEGF and BMP-2 releasing PLA-PEG-PLA scaffolds for bone tissue engineering: I. Design and in vitro tests.

    Science.gov (United States)

    Eğri, Sinan; Eczacıoğlu, Numan

    2017-03-01

    Biodegradable PLA-PEG-PLA block copolymers were synthesized with the desired backbone structures and molecular weights using PEG20000. Rectangular scaffolds were prepared by freeze drying with or without the use of NaCl particles. Bone morphogenetic protein (BMP)-2 was loaded into the matrix after scaffold formation for sustained release, while vascular endothelial growth factor (VEGF) was loaded within the pores with a gelatin solution. VEGF release was quite fast, and almost 60% of it was released within 2 days. However, sequential, sustained release was observed for BMP-2 over the following few months. Incorporation of the VEGF/BMP-2 couple into the scaffolds increased cell adhesion and proliferation. Neither significant cytotoxicity nor apoptosis/necrosis was observed.

  6. Short-Term Forecasting of Taiwanese Earthquakes Using a Universal Model of Fusion-Fission Processes

    Science.gov (United States)

    Cheong, Siew Ann; Tan, Teck Liang; Chen, Chien-Chih; Chang, Wu-Lung; Liu, Zheng; Chew, Lock Yue; Sloot, Peter M. A.; Johnson, Neil F.

    2014-01-01

    Predicting how large an earthquake can be, where and when it will strike remains an elusive goal in spite of the ever-increasing volume of data collected by earth scientists. In this paper, we introduce a universal model of fusion-fission processes that can be used to predict earthquakes starting from catalog data. We show how the equilibrium dynamics of this model very naturally explains the Gutenberg-Richter law. Using the high-resolution earthquake catalog of Taiwan between Jan 1994 and Feb 2009, we illustrate how out-of-equilibrium spatio-temporal signatures in the time interval between earthquakes and the integrated energy released by earthquakes can be used to reliably determine the times, magnitudes, and locations of large earthquakes, as well as the maximum numbers of large aftershocks that would follow. PMID:24406467

  7. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, a type of natural disaster, are among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced to that extent. In particular, good training about earthquakes in primary schools is considered…

  8. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period. This was a magnitude 7.1 in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  9. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m. Beijing (Peking) local time and had a magnitude of 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  10. [Earthquakes in El Salvador].

    Science.gov (United States)

    de Ville de Goyet, C

    2001-02-01

    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

  11. Evaluation of earthquake resistance design for underground structures of nuclear power plant, (1)

    International Nuclear Information System (INIS)

    Tohma, Junichi; Kokusho, Kenji; Iwatate, Takahiro; Ohtomo, Keizo

    1986-01-01

    Regarding the earthquake-resistant design of underground civil engineering structures related to the emergency cooling water system of a nuclear power plant, these structures are required to maintain the function of their highly important facilities during earthquakes, especially under the design earthquake motion. In this study, the shaft, pipeline, pit and duct of cooling seawater facilities were chosen as typical underground structures, and the authors deal with the seismic design method for calculating the principal sectional forces generated in these structures by the design earthquake motion. In particular, comparative investigations of the response displacement method versus dynamic analysis methods (lumped mass analysis and finite element analysis) are discussed. (author)

  12. Earthquake clustering in modern seismicity and its relationship with strong historical earthquakes around Beijing, China

    Science.gov (United States)

    Wang, Jian; Main, Ian G.; Musson, Roger M. W.

    2017-11-01

    Beijing, China's capital city, is located in a typical intraplate seismic belt, with relatively high-quality instrumental catalogue data available since 1970. The Chinese historical earthquake catalogue contains six strong historical earthquakes of Ms ≥ 6 around Beijing, the earliest in 294 AD. This poses a significant potential hazard to one of the most densely populated and economically active parts of China. In some intraplate areas, persistent clusters of events associated with historical events can occur over centuries, for example the ongoing sequence in the New Madrid zone of the eastern US. Here we examine the evidence for such persistent clusters around Beijing. We introduce a metric known as the 'seismic density index' that quantifies the degree of clustering of seismic energy release. For a given map location, this multi-dimensional index depends on the number of events, their magnitudes, and the distances to the locations of the surrounding population of earthquakes. We apply the index to modern instrumental catalogue data between 1970 and 2014, and identify six clear candidate zones. We then compare these locations to earthquake epicentre and seismic intensity data for the six largest historical earthquakes. Each candidate zone contains one of the six historical events, and the location of peak intensity is within 5 km or so of the reported epicentre in five of these cases. In one case, the great Ms 8 earthquake of 1679, the peak is closer to the area of strongest shaking (Intensity XI or more) than the reported epicentre. The present-day event rates are similar to those predicted by the modified Omori law, but there is no evidence of ongoing decay in event rates. Accordingly, the index is more likely to be picking out the location of persistent weaknesses in the lithosphere. Our results imply that zones of high seismic density index could in principle be used to indicate the location of unrecorded historical or palaeoseismic events, in China and
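
    The paper's exact formula for the seismic density index is not reproduced here, so the sketch below shows only one plausible form consistent with the description above: magnitude-scaled contributions of surrounding earthquakes, down-weighted with distance from a grid point. The Gaussian kernel, bandwidth and energy scaling are assumptions for illustration, not the authors' definition.

```python
# Illustrative sketch only: the record above does not give the exact formula,
# so this shows one plausible form of a "seismic density index" -- a relative
# energy measure of surrounding earthquakes, weighted by a distance kernel
# around a grid point. Kernel and energy scaling are assumptions.
import numpy as np

def seismic_density_index(grid_lon, grid_lat, ev_lon, ev_lat, ev_mag,
                          bandwidth_km=20.0):
    """Sum of magnitude-scaled contributions of surrounding earthquakes,
    down-weighted with distance from the grid point (Gaussian kernel)."""
    # Rough planar distances in km (adequate for a ~100 km wide study area)
    dx = (np.asarray(ev_lon) - grid_lon) * 111.0 * np.cos(np.radians(grid_lat))
    dy = (np.asarray(ev_lat) - grid_lat) * 111.0
    r = np.hypot(dx, dy)
    weight = np.exp(-0.5 * (r / bandwidth_km) ** 2)
    energy_proxy = 10.0 ** (1.5 * np.asarray(ev_mag))   # relative energy scale
    return float(np.sum(weight * energy_proxy))

# Toy usage: three events around a grid node near Beijing (approx. 116.4E, 39.9N)
print(seismic_density_index(116.4, 39.9,
                            ev_lon=[116.5, 116.2, 117.0],
                            ev_lat=[39.8, 40.0, 39.5],
                            ev_mag=[3.2, 4.1, 5.0]))
```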

  13. The Alaska earthquake, March 27, 1964: lessons and conclusions

    Science.gov (United States)

    Eckel, Edwin B.

    1970-01-01

    One of the greatest earthquakes of all time struck south-central Alaska on March 27, 1964. Strong motion lasted longer than for most recorded earthquakes, and more land surface was dislocated, vertically and horizontally, than by any known previous temblor. Never before were so many effects on earth processes and on the works of man available for study by scientists and engineers over so great an area. The seismic vibrations, which directly or indirectly caused most of the damage, were but surface manifestations of a great geologic event-the dislocation of a huge segment of the crust along a deeply buried fault whose nature and even exact location are still subjects for speculation. Not only was the land surface tilted by the great tectonic event beneath it, with resultant seismic sea waves that traversed the entire Pacific, but an enormous mass of land and sea floor moved several tens of feet horizontally toward the Gulf of Alaska. Downslope mass movements of rock, earth, and snow were initiated. Subaqueous slides along lake shores and seacoasts, near-horizontal movements of mobilized soil (“landspreading”), and giant translatory slides in sensitive clay did the most damage and provided the most new knowledge as to the origin, mechanics, and possible means of control or avoidance of such movements. The slopes of most of the deltas that slid in 1964, and that produced destructive local waves, are still as steep or steeper than they were before the earthquake and hence would be unstable or metastable in the event of another great earthquake. Rockslide avalanches provided new evidence that such masses may travel on cushions of compressed air, but a widely held theory that glaciers surge after an earthquake has not been substantiated. Innumerable ground fissures, many of them marked by copious emissions of water, caused much damage in towns and along transportation routes. Vibration also consolidated loose granular materials. In some coastal areas, local

  14. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing an earthquake culture. Iran was considered as a research case study, and fifteen large earthquake disasters in Iran were investigated and analyzed over a period of more than a century. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  15. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    International Nuclear Information System (INIS)

    Bergman, W.; Elliott, J.; Wilson, K.

    1995-01-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3% +/- 1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system

  16. Performance of HEPA filters at LLNL following the 1980 and 1989 earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Bergman, W.; Elliott, J.; Wilson, K. [Lawrence Livermore National Laboratory, CA (United States)

    1995-02-01

    The Lawrence Livermore National Laboratory has experienced two significant earthquakes for which data are available to assess the ability of HEPA filters to withstand seismic conditions. A 5.9 magnitude earthquake with an epicenter 10 miles from LLNL struck on January 24, 1980. Estimates of the peak ground accelerations ranged from 0.2 to 0.3 g. A 7.0 magnitude earthquake with an epicenter about 50 miles from LLNL struck on October 17, 1989. Measurements of the ground accelerations at LLNL averaged 0.1 g. The results from the in-place filter tests obtained after each of the earthquakes were compiled and studied to determine if the earthquakes had caused filter leakage. Our study showed that only the 1980 earthquake resulted in a small increase in the number of HEPA filters developing leaks. In the 12 months following the 1980 and 1989 earthquakes, the in-place filter tests showed that 8.0% and 4.1% of all filters, respectively, developed leaks. The average percentage of filters developing leaks from 1980 to 1993 was 3.3% +/- 1.7%. The increase in filter leaks is significant for the 1980 earthquake, but not for the 1989 earthquake. No contamination was detected following the earthquakes that would suggest transient releases from the filtration system.

  17. Assessment of Structural Resistance of building 4862 to Earthquake and Tornado Forces [SEC 1 and 2]

    International Nuclear Information System (INIS)

    METCALF, I.L.

    1999-01-01

    This report presents the results of work done for Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of the resistance of building 4862 to earthquake and tornado forces.

  18. Assessment of Structural Resistance of building 4862 to Earthquake and Tornado Forces [SEC 1 and 2]

    Energy Technology Data Exchange (ETDEWEB)

    METCALF, I.L.

    1999-12-06

    This report presents the results of work done for Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of the resistance of building 4862 to earthquake and tornado forces.

  19. Improving the Earthquake Resilience of Buildings: The worst case approach

    CERN Document Server

    Takewaki, Izuru; Fujita, Kohei

    2013-01-01

    Engineers are always interested in the worst-case scenario. One of the most important and challenging missions of structural engineers may be to narrow the range of unexpected incidents in building structural design. Redundancy, robustness and resilience play an important role in such circumstances. Improving the Earthquake Resilience of Buildings: The worst case approach discusses the importance of a worst-case scenario approach for improved earthquake resilience of buildings and nuclear reactor facilities. The book consists of two parts. The first part deals with the characterization and modeling of worst or critical ground motions on inelastic structures and the related worst-case scenario in the structural design of ordinary simple building structures. The second part of the book focuses on investigating the worst-case scenario for passively controlled and base-isolated buildings. This allows for detailed consideration of a range of topics including...

  20. Initiatives to Reduce Earthquake Risk of Developing Countries

    Science.gov (United States)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. One would oversee the design and construction of

  1. Fighting and preventing post-earthquake fires in nuclear power plant

    International Nuclear Information System (INIS)

    Lu Xuefeng; Zhang Xin

    2011-01-01

    Nuclear power plant post-earthquake fires cause not only personnel injury and severe economic loss, but also serious environmental pollution. At present, nuclear power is undergoing rapid development in China. Considering the earthquake-prone character of the country, it is of great engineering importance to investigate nuclear power plant post-earthquake fires. This article analyzes the causes, influential factors and development characteristics of nuclear power plant post-earthquake fires in detail, and summarizes the three principles that should be followed in fighting and preventing such fires: solving problems in order of importance and urgency, isolation prior to prevention, and immediate repair with regular patrol. It also points out three aspects that require particular attention in fighting and preventing post-earthquake fires. (authors)

  2. The EM Earthquake Precursor

    Science.gov (United States)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list, alluding to a transmission-related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding epicentral locations, nor have there been any attempts to build even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than necessary given modern technological advances. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  3. Earthquakes: no danger for deep underground nuclear waste repositories

    International Nuclear Information System (INIS)

    2010-03-01

    The Earth's continental plates are steadily moving. Principally at the plate boundaries, these shifts produce stresses which are released in the form of earthquakes. The higher the built-up energy, the more violent the shaking. Earthquakes have accompanied mankind since ancient times and unsettle the population. To date, nobody is able to predict where and when they will take place, but there are regions of the Earth where, due to their geological situation, the occurrence of earthquakes is more probable than elsewhere. The impact of a very strong earthquake on structures at the Earth's surface depends on several factors. Besides the ground structure, the density of buildings and the construction style and materials used play an important role. Construction-related technical measures can improve the safety of buildings and, together with correct behaviour of the people concerned, save many lives. Earthquakes are well known in Switzerland, where the stresses are due to the collision of the African and European continental plates that created the Alps. The impact of an earthquake is more limited underground than at the Earth's surface. There is no danger for deep underground repositories.

  4. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
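
    The superposition model described above can be sketched in a few lines of Python; the spectral amplitudes, frequency range, and intensity envelope below are illustrative placeholders and do not reproduce SIMQKE's iterative matching of a target response spectrum.

      # Minimal sketch of the synthesis idea: superpose sinusoids with random
      # phase angles and shape the result with an intensity envelope. Amplitudes
      # and envelope are placeholders, not the SIMQKE algorithm itself.
      import numpy as np

      rng = np.random.default_rng(0)
      dt, duration = 0.01, 20.0                  # time step (s), motion duration (s)
      t = np.arange(0.0, duration, dt)
      freqs = np.linspace(0.2, 25.0, 200)        # contributing frequencies (Hz)
      amps = np.ones_like(freqs)                 # placeholder spectral amplitudes
      phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)

      # Stationary motion: sum of sinusoids with random phase angles
      motion = (amps[:, None] * np.sin(2.0 * np.pi * freqs[:, None] * t + phases[:, None])).sum(axis=0)

      # Trapezoidal intensity envelope (build-up, strong phase, decay)
      envelope = np.interp(t, [0.0, 2.0, 12.0, duration], [0.0, 1.0, 1.0, 0.0])
      accel = envelope * motion / np.abs(motion).max()   # normalized accelerogram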

  5. What is a surprise earthquake? The example of the 2002 San Giuliano (Italy) event

    Directory of Open Access Journals (Sweden)

    M. Mucciarelli

    2005-06-01

    Full Text Available Both in the scientific literature and in the mass media, some earthquakes are defined as «surprise earthquakes». Based on his or her own judgment, probably any geologist, seismologist or engineer has a personal list of past «surprise earthquakes». This paper tries to quantify the underlying individual perception that may lead a scientist to apply such a definition to a seismic event. The meaning is different depending on the disciplinary approach. For geologists, the Italian database of seismogenic sources is still too incomplete to allow for a quantitative estimate of the subjective degree of belief. For seismologists, quantification is possible by defining the distance between an earthquake and its closest previous neighbor. Finally, for engineers, the San Giuliano quake could not be considered a surprise, since probabilistic site hazard estimates reveal that the change before and after the earthquake is just 4%.

  6. Ground Motion Prediction for Great Interplate Earthquakes in Kanto Basin Considering Variation of Source Parameters

    Science.gov (United States)

    Sekiguchi, H.; Yoshimi, M.; Horikawa, H.

    2011-12-01

    Broadband ground motions are estimated in the Kanto sedimentary basin, which contains the Tokyo metropolitan area, for anticipated great interplate earthquakes along the surrounding plate boundaries. Possible scenarios of great earthquakes along the Sagami trough are modeled by combining characteristic properties of the source area with adequate variation in source parameters, in order to evaluate the possible ground motion variation due to the next Kanto earthquake. South of the rupture area of the 2011 Tohoku earthquake along the Japan trench, we consider a possible M8 earthquake. The ground motions are computed with a four-step hybrid technique. We first calculate low-frequency ground motions at the engineering basement. We then calculate higher-frequency ground motions at the same position, and combine the lower- and higher-frequency motions using a matched filter. We finally calculate ground motions at the surface by computing the response of the alluvium-diluvium layers to the combined motions at the engineering basement.
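
    The matched-filter combination step of the hybrid technique can be sketched roughly as below; the 1 Hz crossover frequency and the Butterworth filter design are assumptions for illustration, not the parameters used in the study.

      # Sketch of combining a low-frequency (deterministic) and a high-frequency
      # (stochastic) accelerogram with complementary filters around a crossover
      # frequency. Crossover and filter order are assumed values.
      from scipy.signal import butter, filtfilt

      def combine_broadband(lf_motion, hf_motion, dt, f_cross=1.0, order=4):
          """Merge two equally sampled accelerograms into one broadband record."""
          nyq = 0.5 / dt
          b_lo, a_lo = butter(order, f_cross / nyq, btype="low")
          b_hi, a_hi = butter(order, f_cross / nyq, btype="high")
          return filtfilt(b_lo, a_lo, lf_motion) + filtfilt(b_hi, a_hi, hf_motion)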

  7. Investigation of the relationship between earthquakes and indoor radon concentrations at a building in Gyeongju, Korea

    Directory of Open Access Journals (Sweden)

    Jae Wook Kim

    2018-04-01

    Full Text Available In this article, indoor radon concentrations were measured and analyzed at a university building in Gyeongju, Republic of Korea, to investigate whether there is any relationship between earthquakes and indoor radon concentration. After 12 September 2016, when two earthquakes of magnitude 5.1 and 5.8 occurred, hundreds of aftershocks affected Gyeongju until January 2017. The measurements were made on the ground floor of the Energy Engineering Hall of Dongguk University in Gyeongju between February 2016 and January 2017, using an RAD7 detector on the basis of the US Environmental Protection Agency measurement protocol. Measurements were taken continuously every 30 minutes throughout each monthly measurement period. Among earthquakes with magnitude 2.0 or greater, those whose occurrence fell within the measurement periods were screened for further analysis. We observed similar spike-like patterns between the indoor radon concentration distributions and the earthquakes: a sudden increase in the peak indoor radon concentration 1–4 days before an earthquake, a gradual decrease before the earthquake, and a sudden drop on the day of the earthquake when the interval between successive earthquakes was moderately long (for example, 3 days). Keywords: Earthquakes, Gyeongju, Indoor Radon Concentration, RAD7, Radon Anomaly

  8. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth's surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  9. Small Buildings in Earthquake Areas. Educational Building Digest 2.

    Science.gov (United States)

    Mooij, D.

    This booklet is intended for builders and others who actually construct small buildings in earthquake areas and not for professionally qualified architects or engineers. In outline form with sketches the following topics are discussed: general construction and design principles; foundations; earth walls; brick, block, and stone walls; timber frame…

  10. 75 FR 50749 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Science.gov (United States)

    2010-08-17

    ... accommodate Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov... of Technology, 365 Innovation Drive, Memphis, TN 38152-3115. Please note admittance instructions...: Trends and developments in the science and engineering of earthquake hazards reduction; The effectiveness...

  11. Hysteresis behavior of seismic isolators in earthquakes near a fault ...

    African Journals Online (AJOL)

    Seismic performance and appropriate design of structures located near faults have always been a major concern for design engineers, because during an earthquake the effects of plasticity alter the characteristics of near-field records. These pulsed movements at the beginning of the records will increase the ...

  12. Fracture analysis of concrete gravity dam under earthquake induced ...

    African Journals Online (AJOL)

    Michael Horsfall

    Fracture analysis of concrete gravity dam under earthquake induced loads. 1. ABBAS MANSOURI; 2. ... 1 Civil Engineering, Islamic Azad University (South Branch of Tehran), Tehran, Iran. ... parameter has on the results of numerical calculations. In this analysis ... with the help of Abaqus software (Abaqus theory manual ...

  13. The earthquakes of the Baltic shield

    International Nuclear Information System (INIS)

    Slunga, R.

    1990-06-01

    More than 200 earthquakes in the Baltic Shield area in the size range ML 0.6-4.5 have been studied by dense regional seismic networks. The analysis includes focal depths, dynamic source parameters, and fault plane solutions. In southern Sweden a long part of the Protogene zone marks a change in the seismic activity. The focal depths indicate three crustal layers: upper crust (0-18 km in southern Sweden, 0-13 km in northern Sweden), middle crust down to 35 km, and the quiet lower crust. The fault plane solutions show that strike-slip faulting dominates. Along the Tornquist line significant normal faulting occurs. The stresses released by the earthquakes show a remarkable consistency with a regional principal compression oriented N60W. This indicates that plate-tectonic processes are more important than the land uplift. The spatial distribution is consistent with a model where the earthquakes are breakdowns of asperities on normally stably sliding faults. The aseismic sliding is estimated to be 2000 times more extensive than the seismic sliding. Southern Sweden is estimated to deform horizontally at a rate of 1 mm/year or more. (orig.)

  14. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs

  15. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    Science.gov (United States)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to analyze earthquake activity and extent across many streams must write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real time.
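
    The per-stream windowed query described above can be sketched in plain Python; the window length, threshold, and station identifiers below are illustrative, and the field-stream extension of Apache Spark used in the study has its own API, which is not reproduced here.

      # Plain-Python sketch of the windowed query: for each GPS stream, keep the
      # latest window of displacement samples and flag streams whose maximum
      # exceeds a threshold. Window size and threshold are assumed values.
      from collections import defaultdict, deque

      WINDOW = 30          # samples retained per stream (latest query window)
      THRESHOLD_MM = 20.0  # displacement threshold for flagging (assumed)

      windows = defaultdict(lambda: deque(maxlen=WINDOW))

      def ingest(station_id, displacement_mm):
          """Add one observation and return stations currently exceeding the threshold."""
          windows[station_id].append(displacement_mm)
          return [sid for sid, buf in windows.items() if max(buf) >= THRESHOLD_MM]

      # Example: asynchronous samples from two hypothetical stations
      for sid, d in [("STN_A", 3.1), ("STN_B", 4.0), ("STN_A", 27.5), ("STN_B", 22.3)]:
          exceeding = ingest(sid, d)
      print(exceeding)  # stations whose latest-window maximum exceeds the threshold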

  16. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
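
    The core likelihood score behind such binned forecast tests can be sketched as follows; the bin rates and counts are invented for illustration, and the full RELM L- and R-tests add simulation-based significance levels that are not reproduced here.

      # Hedged sketch: score a binned earthquake-rate forecast against observed
      # counts with a joint Poisson log-likelihood, the quantity underlying
      # RELM-style consistency and comparison tests.
      import math

      def poisson_log_likelihood(forecast_rates, observed_counts):
          """Joint log-likelihood of observed bin counts under independent Poisson rates."""
          total = 0.0
          for rate, n in zip(forecast_rates, observed_counts):
              total += -rate + n * math.log(rate) - math.lgamma(n + 1)
          return total

      # Example: two competing binned forecasts scored against the same catalog
      model_a = [0.10, 0.50, 0.20, 1.20]   # expected events per bin
      model_b = [0.25, 0.25, 0.25, 1.25]
      observed = [0, 1, 0, 2]              # events actually observed per bin
      print(poisson_log_likelihood(model_a, observed),
            poisson_log_likelihood(model_b, observed))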

  17. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, about the probability (approximately 5%) that an M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so the statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare this with social science research about successful crisis communication, to create recommendations for future advisories.

  18. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for

  19. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  20. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Full Text Available Abstract Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results A third of patients reported pain, with a prevalence of 34.6%. More than half of pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  1. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m/s) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty in determining friction by seismological methods, elucidating constraints are derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m/s. The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  2. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  3. Engineering Saccharomyces cerevisiae To Release 3-Mercaptohexan-1-ol during Fermentation through Overexpression of an S. cerevisiae Gene, STR3, for Improvement of Wine Aroma▿

    Science.gov (United States)

    Holt, Sylvester; Cordente, Antonio G.; Williams, Simon J.; Capone, Dimitra L.; Jitjaroen, Wanphen; Menz, Ian R.; Curtin, Chris; Anderson, Peter A.

    2011-01-01

    Sulfur-containing aroma compounds are key contributors to the flavor of a diverse range of foods and beverages. The tropical fruit characters of Vitis vinifera L. cv. Sauvignon blanc wines are attributed to the presence of the aromatic thiols 3-mercaptohexan-1-ol (3MH), 3-mercaptohexan-1-ol-acetate, and 4-mercapto-4-methylpentan-2-one (4MMP). These volatile thiols are found in small amounts in grape juice and are formed from nonvolatile cysteinylated precursors during fermentation. In this study, we overexpressed a Saccharomyces cerevisiae gene, STR3, which led to an increase in 3MH release during fermentation of a V. vinifera L. cv. Sauvignon blanc juice. Characterization of the enzymatic properties of Str3p confirmed it to be a pyridoxal-5′-phosphate-dependent cystathionine β-lyase, and we demonstrated that this enzyme was able to cleave the cysteinylated precursors of 3MH and 4MMP to release the free thiols. These data provide direct evidence for a yeast enzyme able to release aromatic thiols in vitro that can be applied in the development of self-cloned yeast to enhance wine flavor. PMID:21478306

  4. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and earthquake damage analysis, the equivalent static analysis method and its application, dynamic analysis methods such as time-history analysis by mode superposition and by direct integration, and design-spectrum analysis for earthquake-resistant design in Korea. Topics include the analysis model and vibration modes, calculation of the base shear, calculation of story seismic loads, and combination of analysis results.
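
    The equivalent static method mentioned above can be illustrated with a minimal sketch; the seismic coefficient and the linear height-wise distribution of story forces are generic textbook assumptions, not the provisions of the Korean design code discussed in the book.

      # Illustrative sketch: base shear V = Cs * W, distributed over the stories
      # in proportion to story weight times height. The coefficient Cs below is
      # a placeholder, not a code-prescribed value.
      def equivalent_static_forces(story_weights, story_heights, seismic_coeff=0.08):
          total_weight = sum(story_weights)
          base_shear = seismic_coeff * total_weight
          wh = [w * h for w, h in zip(story_weights, story_heights)]
          return base_shear, [base_shear * x / sum(wh) for x in wh]

      V, forces = equivalent_static_forces(
          story_weights=[5000.0, 5000.0, 4000.0],   # kN per story
          story_heights=[3.5, 7.0, 10.5],           # m above the base
      )
      print(V, [round(f, 1) for f in forces])       # base shear and story forces (kN)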

  5. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    Science.gov (United States)

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    , and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  6. Coseismic deformation of the 2001 El Salvador and 2002 Denali fault earthquakes from GPS geodetic measurements

    Science.gov (United States)

    Hreinsdottir, Sigrun

    2005-07-01

    GPS geodetic measurements are used to study two major earthquakes, the 2001 Mw 7.7 El Salvador and 2002 Mw 7.9 Denali Fault earthquakes. The 2001 Mw 7.7 earthquake was a normal fault event in the subducting Cocos plate offshore El Salvador. Coseismic displacements of up to 15 mm were measured at permanent GPS stations in Central America. The GPS data were used to constrain the location of and slip on the normal fault. One month later a Mw 6.6 strike-slip earthquake occurred in the overriding Caribbean plate. Coulomb stress changes estimated from the Mw 7.7 earthquake suggest that it triggered the Mw 6.6 earthquake. Coseismic displacement from the Mw 6.6 earthquake, about 40 mm at a GPS station in El Salvador, indicates that the earthquake triggered additional slip on a fault close to the GPS station. The Mw 6.6 earthquake further changed the stress field in the overriding Caribbean plate, with triggered seismic activity occurring west and possibly also to the east of the rupture in the days to months following the earthquake. The Mw 7.9 Denali Fault earthquake ruptured three faults in the interior of Alaska. It initiated with a thrust motion on the Susitna Glacier fault but then ruptured the Denali and Totschunda faults with predominantly right-lateral strike-slip motion unilaterally from west to east. GPS data measured in the two weeks following the earthquake suggest a complex coseismic rupture along the faults with two main regions of moment release along the Denali fault. A large amount of additional data were collected in the year following the earthquake which greatly improved the resolution on the fault, revealing more details of the slip distribution. We estimate a total moment release of 6.81 x 10^20 Nm in the earthquake with a Mw 7.2 thrust subevent on Susitna Glacier fault. The slip on the Denali fault is highly variable, with 4 main pulses of moment release. The largest moment pulse corresponds to a Mw 7.5 subevent, about 40 km west of the Denali
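
    The quoted total moment can be checked against the reported magnitude with the standard Hanks-Kanamori relation; the relation itself is standard, but treating it as the exact conversion used in the thesis is an assumption.

      # Worked check of the moment quoted above using the Hanks-Kanamori relation
      # Mw = (2/3) * (log10(M0) - 9.1), with the seismic moment M0 in N m.
      import math

      def moment_magnitude(m0_newton_metres):
          return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

      print(round(moment_magnitude(6.81e20), 2))   # ~7.8, consistent with the Mw 7.9 event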

  7. Characteristics of broadband slow earthquakes explained by a Brownian model

    Science.gov (United States)

    Ide, S.; Takeo, A.

    2017-12-01

    The Brownian slow earthquake (BSE) model (Ide, 2008; 2010) is a stochastic model for the temporal change of seismic moment release by slow earthquakes, which can be considered a broadband phenomenon including tectonic tremors, low frequency earthquakes, and very low frequency (VLF) earthquakes in the seismological frequency range, and slow slip events in the geodetic range. Although the concept of the broadband slow earthquake may not have been widely accepted, most recent observations are consistent with it. Here, we review the characteristics of slow earthquakes and how they are explained by the BSE model. In the BSE model, the characteristic size of the slow earthquake source is represented by a random variable, changed by a Gaussian fluctuation added at every time step. The model also includes a time constant, which divides the model behavior into short- and long-time regimes. In nature, the time constant corresponds to the spatial limit of the tremor/SSE zone. In the long-time regime, the seismic moment rate is constant, which explains the moment-duration scaling law (Ide et al., 2007). For a shorter duration, the moment rate increases with size, as often observed for VLF earthquakes (Ide et al., 2008). The ratio between seismic energy and seismic moment is constant, as shown in Japan, Cascadia, and Mexico (Maury et al., 2017). The moment rate spectrum has a section of -1 slope, limited by two frequencies corresponding to the above time constant and the time increment of the stochastic process. Such broadband spectra have been observed for slow earthquakes near the trench axis (Kaneko et al., 2017). This spectrum also explains why we can obtain VLF signals by stacking broadband seismograms relative to tremor occurrence (e.g., Takeo et al., 2010; Ide and Yabe, 2014). The fluctuation in the BSE model can be non-Gaussian, as long as the variance is finite, as supported by the central limit theorem. Recent observations suggest that tremors and LFEs are spatially characteristic
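
    The random-walk idea at the heart of the model can be sketched in a few lines; the boundary handling and the mapping from source size to moment rate below are assumptions made for illustration, not the published BSE formulation.

      # Minimal sketch: a characteristic source size performs a random walk driven
      # by Gaussian increments, and the moment rate is assumed to grow with that
      # size. Reflection at zero and the size-squared scaling are assumptions.
      import numpy as np

      rng = np.random.default_rng(1)
      dt, n_steps = 1.0, 10_000       # time step (s) and number of steps
      sigma = 0.05                    # strength of the Gaussian fluctuation
      size = np.empty(n_steps)
      size[0] = 1.0
      for k in range(1, n_steps):
          step = size[k - 1] + sigma * np.sqrt(dt) * rng.standard_normal()
          size[k] = abs(step)         # keep the characteristic size non-negative

      moment_rate = size**2           # assumed scaling of moment rate with size
      print(moment_rate.mean(), moment_rate.max())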

  8. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings which have high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use for other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
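
    The three-step casualty estimate outlined above can be sketched schematically; all exposure figures, collapse probabilities, and fatality rates below are placeholders, not PAGER's calibrated loss models.

      # Schematic sketch: expected fatalities = sum over building types of
      # (exposed population) x (collapse probability at the given shaking)
      # x (fatality rate given collapse). All numbers are invented.
      def expected_fatalities(exposure_by_type, collapse_prob, fatality_rate):
          return sum(
              exposure_by_type[btype] * collapse_prob[btype] * fatality_rate[btype]
              for btype in exposure_by_type
          )

      exposure = {"adobe": 20_000, "unreinforced_masonry": 50_000, "rc_frame": 100_000}
      p_collapse = {"adobe": 0.12, "unreinforced_masonry": 0.06, "rc_frame": 0.01}
      fatality = {"adobe": 0.10, "unreinforced_masonry": 0.08, "rc_frame": 0.05}
      print(round(expected_fatalities(exposure, p_collapse, fatality)))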

  9. Localised controlled release of simvastatin from porous chitosan–gelatin scaffolds engrafted with simvastatin loaded PLGA-microparticles for bone tissue engineering application

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, Piergiorgio [Department of Mechanical and Aerospace Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin (Italy); School of Clinical Dentistry, University of Sheffield, 19 Claremont Crescent, Sheffield (United Kingdom); Nandagiri, Vijay Kumar [Department of Mechanical and Aerospace Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin (Italy); School of Pharmacy, Royal College of Surgeons in Ireland, 123, St. Stephen Green, Dublin 2 (Ireland); Daly, Jacqueline [Division of Biology, Department of Anatomy, Royal College of Surgeons in Ireland, 123, St. Stephen Green, Dublin 2 (Ireland); Chiono, Valeria; Mattu, Clara; Tonda-Turo, Chiara; Ciardelli, Gianluca [Department of Mechanical and Aerospace Engineering, Politecnico di Torino, Corso Duca degli Abruzzi 24, 10129 Turin (Italy); Ramtoola, Zebunnissa, E-mail: zramtoola@rcsi.ie [School of Pharmacy, Royal College of Surgeons in Ireland, 123, St. Stephen Green, Dublin 2 (Ireland)

    2016-02-01

    Localised controlled release of simvastatin from porous freeze-dried chitosan–gelatin (CH–G) scaffolds was investigated by incorporating simvastatin loaded poly-(DL-lactide-co-glycolide) acid (PLGA) microparticles (MSIMs) into the scaffolds. MSIMs at 10% w/w simvastatin loading were prepared using a single emulsion-solvent evaporation method. The MSIM optimal amount to be incorporated into the scaffolds was selected by analysing the effect of embedding increasing amounts of blank PLGA microparticles (BL-MPs) on the scaffold physical properties and on the in vitro cell viability using a clonal human osteoblastic cell line (hFOB). Increasing the BL-MP content from 0% to 33.3% w/w showed a significant decrease in swelling degree (from 1245 ± 56% to 570 ± 35%). Scaffold pore size and distribution changed significantly as a function of BL-MP loading. Compressive modulus of scaffolds increased with increasing BL-MP amount up to 16.6% w/w (23.0 ± 1.0 kPa). No significant difference in cell viability was observed with increasing BL-MP loading. Based on these results, a content of 16.6% w/w MSIM particles was incorporated successfully in CH–G scaffolds, showing a controlled localised release of simvastatin able to influence the hFOB cell proliferation and the osteoblastic differentiation after 11 days. - Highlights: • Simvastatin loaded PLGA microparticle engrafted porous CH–G scaffolds were produced. • The microparticle optimal amount to be incorporated into the scaffolds was studied. • Physical properties of scaffolds changed as a function of microparticle loading. • The level of simvastatin released enhanced cell proliferation and mineralisation.

  10. Localised controlled release of simvastatin from porous chitosan–gelatin scaffolds engrafted with simvastatin loaded PLGA-microparticles for bone tissue engineering application

    International Nuclear Information System (INIS)

    Gentile, Piergiorgio; Nandagiri, Vijay Kumar; Daly, Jacqueline; Chiono, Valeria; Mattu, Clara; Tonda-Turo, Chiara; Ciardelli, Gianluca; Ramtoola, Zebunnissa

    2016-01-01

    Localised controlled release of simvastatin from porous freeze-dried chitosan–gelatin (CH–G) scaffolds was investigated by incorporating simvastatin loaded poly-(DL-lactide-co-glycolide) acid (PLGA) microparticles (MSIMs) into the scaffolds. MSIMs at 10% w/w simvastatin loading were prepared using a single emulsion-solvent evaporation method. The MSIM optimal amount to be incorporated into the scaffolds was selected by analysing the effect of embedding increasing amounts of blank PLGA microparticles (BL-MPs) on the scaffold physical properties and on the in vitro cell viability using a clonal human osteoblastic cell line (hFOB). Increasing the BL-MP content from 0% to 33.3% w/w showed a significant decrease in swelling degree (from 1245 ± 56% to 570 ± 35%). Scaffold pore size and distribution changed significantly as a function of BL-MP loading. Compressive modulus of scaffolds increased with increasing BL-MP amount up to 16.6% w/w (23.0 ± 1.0 kPa). No significant difference in cell viability was observed with increasing BL-MP loading. Based on these results, a content of 16.6% w/w MSIM particles was incorporated successfully in CH–G scaffolds, showing a controlled localised release of simvastatin able to influence the hFOB cell proliferation and the osteoblastic differentiation after 11 days. - Highlights: • Simvastatin loaded PLGA microparticle engrafted porous CH–G scaffolds were produced. • The microparticle optimal amount to be incorporated into the scaffolds was studied. • Physical properties of scaffolds changed as a function of microparticle loading. • The level of simvastatin released enhanced cell proliferation and mineralisation.

  11. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    source and propagation of seismic waves. In many cases, active faults are capable of buildup and sudden release of tectonic stress. Hence, monitoring the active fault systems near epicentral regions of past earthquakes would be a necessity. In this paper, we try to detect possible anomalies in SLHF and AT during two moderate earthquakes of 6 - 6.5 M in Iran and explain the relationships between the seismic activities prior to these earthquake and active faulting in the area. Our analysis shows abnormal SLHF 5~10 days before these earthquakes. Meaningful anomalous concentrations usually occurred in the epicentral area. On the other hand, spatial distributions of these variations were in accordance with the local active faults. It is concluded that the anomalous increase in SLHF shows great potential in providing early warning of a disastrous earthquake, provided that there is a better understanding of the background noise due to the seasonal effects and climatic factors involved. Changes in near surface air temperature along nearby active faults, one or two weeks before the earthquakes, although not as significant as SLHF changes, can be considered as another earthquake indicator.

  12. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.

  13. Areas prone to slow slip events impede earthquake rupture propagation and promote afterslip

    Science.gov (United States)

    Rolandone, Frederique; Nocquet, Jean-Mathieu; Mothes, Patricia A.; Jarrin, Paul; Vallée, Martin; Cubas, Nadaya; Hernandez, Stephen; Plain, Morgan; Vaca, Sandro; Font, Yvonne

    2018-01-01

    At subduction zones, transient aseismic slip occurs either as afterslip following a large earthquake or as episodic slow slip events during the interseismic period. Afterslip and slow slip events are usually considered as distinct processes occurring on separate fault areas governed by different frictional properties. Continuous GPS (Global Positioning System) measurements following the 2016 Mw (moment magnitude) 7.8 Ecuador earthquake reveal that large and rapid afterslip developed at discrete areas of the megathrust that had previously hosted slow slip events. Regardless of whether they were locked or not before the earthquake, these areas appear to persistently release stress by aseismic slip throughout the earthquake cycle and outline the seismic rupture, an observation potentially leading to a better anticipation of future large earthquakes. PMID:29404404

  14. "HOT Faults", Fault Organization, and the Occurrence of the Largest Earthquakes

    Science.gov (United States)

    Carlson, J. M.; Hillers, G.; Archuleta, R. J.

    2006-12-01

    We apply the concept of "Highly Optimized Tolerance" (HOT) for the investigation of spatio-temporal seismicity evolution, in particular mechanisms associated with the largest earthquakes. HOT provides a framework for investigating both qualitative and quantitative features of complex feedback systems that are far from equilibrium and punctuated by rare, catastrophic events. In HOT, robustness trade-offs lead to complexity and power laws in systems that are coupled to evolving environments. HOT was originally inspired by biology and engineering, where systems are internally very highly structured, through biological evolution or deliberate design, and perform in an optimum manner despite fluctuations in their surroundings. Though faults and fault systems are not designed in ways comparable to biological and engineered structures, feedback processes are responsible in a conceptually comparable way for the development, evolution and maintenance of younger fault structures and primary slip surfaces of mature faults, respectively. Hence, in geophysical applications the "optimization" approach is perhaps more aptly replaced by "organization", reflecting the distinction between HOT and random, disorganized configurations, and highlighting the importance of structured interdependencies that evolve via feedback among and between different spatial and temporal scales. Expressed in the terminology of the HOT concept, mature faults represent a configuration optimally organized for the release of strain energy, whereas immature, more heterogeneous fault networks represent intermittent, suboptimal systems that are regularized towards structural simplicity and the ability to generate large earthquakes more easily. We discuss fault structure and associated seismic response patterns within the HOT concept, and outline fundamental differences between this novel interpretation and more orthodox viewpoints such as the criticality concept. The discussion is flanked by numerical simulations of a

  15. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
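
    A minimal numerical sketch of the effect described above, under assumed, hypothetical rates (this is not the paper's actual formulation): the chance that a new nearby event is a foreshock is treated as the foreshock-generating rate divided by the total event rate, where the total rate is the background seismicity plus the Omori-decaying aftershock rate of the earlier mainshock. As the aftershocks decay, the probability recovers toward its background value.

```python
# Sketch only: foreshock probability diluted by Omori-decaying aftershocks of a
# prior mainshock. All rates and parameters below are hypothetical illustrations.

def omori_rate(t_days, K=50.0, c=0.1, p=1.1):
    """Modified-Omori aftershock rate (events/day) t_days after the prior mainshock."""
    return K / (t_days + c) ** p

def foreshock_probability(t_days, background=0.2, foreshock_fraction=0.05):
    """P(a new event is a foreshock) when it occurs t_days after the prior mainshock."""
    total_rate = background + omori_rate(t_days)          # all candidate events/day
    return foreshock_fraction * background / total_rate   # foreshock share of the total

for t in (1, 10, 100, 1000):
    print(f"t = {t:4d} days   P(foreshock) = {foreshock_probability(t):.4f}")
```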

  16. Geotechnical hazards from large earthquakes and heavy rainfalls

    CERN Document Server

    Kazama, Motoki; Lee, Wei

    2017-01-01

    This book is a collection of papers presented at the International Workshop on Geotechnical Natural Hazards held July 12–15, 2014, in Kitakyushu, Japan. The workshop was the sixth in the series of Japan–Taiwan Joint Workshops on Geotechnical Hazards from Large Earthquakes and Heavy Rainfalls, held under the auspices of the Asian Technical Committee No. 3 on Geotechnology for Natural Hazards of the International Society for Soil Mechanics and Geotechnical Engineering. It was co-organized by the Japanese Geotechnical Society and the Taiwanese Geotechnical Society. The contents of this book focus on geotechnical and natural hazard-related issues in Asia such as earthquakes, tsunami, rainfall-induced debris flows, slope failures, and landslides. The book contains the latest information and mitigation technology on earthquake- and rainfall-induced geotechnical natural hazards. By dissemination of the latest state-of-the-art research in the area, the information contained in this book will help researchers, des...

  17. USGS response to an urban earthquake, Northridge '94

    Science.gov (United States)

    Updike, Randall G.; Brown, William M.; Johnson, Margo L.; Omdahl, Eleanor M.; Powers, Philip S.; Rhea, Susan; Tarr, Arthur C.

    1996-01-01

    The urban centers of our Nation provide our people with seemingly unlimited employment, social, and cultural opportunities as a result of the complex interactions of a diverse population embedded in a highly engineered environment. Catastrophic events in one or more of the natural earth systems which underlie or envelop the urban environment can have radical effects on the integrity and survivability of that environment. Earthquakes have for centuries been the source of cataclysmic events in cities throughout the world. Unlike many other earth processes, the effects of major earthquakes transcend all political, social, and geomorphic boundaries and can have a decided impact on cities tens to hundreds of kilometers from the epicenter. In modern cities, where buildings, transportation corridors, and lifelines are complexly interrelated, the life, economic, and social vulnerabilities in the face of a major earthquake can be particularly acute.

  18. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
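
    A compact numerical sketch of the digital approach described above, under assumed parameters (sampling rate, a 4th-order Butterworth band-pass, 5% damping, an arbitrary 0.2 g peak scaling): filter Gaussian white noise to obtain an acceleration time history, then compute its pseudo-acceleration response spectrum from the peak responses of single-degree-of-freedom oscillators. None of the numerical values are taken from the report.

```python
import numpy as np
from scipy import signal

fs, dur = 200.0, 20.0                        # sampling rate (Hz) and duration (s)
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(0)

# 1) Band-pass filtered Gaussian white noise as a synthetic acceleration history.
b, a = signal.butter(4, [0.5, 20.0], btype="bandpass", fs=fs)
accel = signal.lfilter(b, a, rng.standard_normal(t.size))
accel *= 0.2 * 9.81 / np.max(np.abs(accel))  # scale to a 0.2 g peak (arbitrary choice)

# 2) Pseudo-acceleration response spectrum: peak response of damped SDOF oscillators.
def response_spectrum(accel, fs, freqs, zeta=0.05):
    sa = []
    time = np.arange(accel.size) / fs
    for f in freqs:
        wn = 2 * np.pi * f
        # Relative motion of a SDOF oscillator: x'' + 2*zeta*wn*x' + wn^2*x = -accel(t)
        sys = signal.StateSpace([[0.0, 1.0], [-wn**2, -2 * zeta * wn]],
                                [[0.0], [-1.0]], [[1.0, 0.0]], [[0.0]])
        _, x, _ = signal.lsim(sys, accel, time)
        sa.append(wn**2 * np.max(np.abs(x)))  # pseudo-acceleration (m/s^2)
    return np.array(sa)

freqs = np.logspace(-0.3, 1.3, 30)            # ~0.5 to 20 Hz
print(response_spectrum(accel, fs, freqs)[:5])
```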

  19. Mental Health of Survivors of the 2010 Haitian Earthquake Living in the United States

    Centers for Disease Control (CDC) Podcasts

    2010-04-16

    Thousands of survivors of the 2010 Haitian Earthquake are currently living in the United States. This podcast features a brief non-disease-specific interview with Dr. Marc Safran, CDC's longest serving psychiatrist, about a few of the mental health challenges such survivors may face.  Created: 4/16/2010 by CDC Center of Attribution: Mental and Behavioral Health Team, 2010 CDC Haiti Earthquake Mission, CDC Emergency Operations Center.   Date Released: 5/6/2010.

  20. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  1. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  2. Future Developments for the Earthquake Early Warning System following the 2011 off the Pacific Coast of Tohoku Earthquake

    Science.gov (United States)

    Yamada, M.; Mori, J. J.

    2011-12-01

    The 2011 off the Pacific Coast of Tohoku Earthquake (Mw9.0) caused significant damage over a large area of northeastern Honshu. An earthquake early warning was issued to the public in the Tohoku region about 8 seconds after the first P-arrival, which is 31 seconds after the origin time. There was no 'blind zone', and warnings were received at all locations before S-wave arrivals, since the earthquake was fairly far offshore. Although the early warning message was properly reported in the Tohoku region, which was the most severely affected area, a message was not sent to the more distant Tokyo region because the intensity was underestimated. This underestimation occurred because the magnitude determined in the first few seconds was relatively small (Mj8.1), and there was no consideration of a finite fault with a long length. Another significant issue is that warnings were sometimes not properly provided for aftershocks. Immediately following the earthquake, the waveforms of some large aftershocks were contaminated by long-period surface waves from the mainshock, which made it difficult to pick P-wave arrivals. Also, correctly distinguishing and locating later aftershocks was sometimes difficult when multiple events occurred within a short period of time. The mainshock began with relatively small moment release for the first 10 s. Since the amplitude of the initial waveforms is small, most methods that use amplitudes and periods of the P-wave (e.g. Wu and Kanamori, 2005) cannot correctly determine the size of the earthquake in the first several seconds. The current JMA system uses the peak displacement amplitude for the magnitude estimation, and the magnitude saturated at about M8 1 minute after the first P-wave arrival. Magnitudes of smaller earthquakes can be correctly identified from the first few seconds of P- or S-wave arrivals, but this M9 event cannot be characterized in such a short time. The only way to correctly characterize the size of the Tohoku

  3. Finite element simulation of earthquake cycle dynamics for continental listric fault system

    Science.gov (United States)

    Wei, T.; Shen, Z. K.

    2017-12-01

    We simulate stress/strain evolution through earthquake cycles for a continental listric fault system using the finite element method. A 2-D lithosphere model is developed, with the upper crust composed of plasto-elastic materials and the lower crust/upper mantle composed of visco-elastic materials, respectively. The medium is cut by a listric fault, which is rooted in the visco-elastic lower crust at its downdip end. The system is driven laterally by constant tectonic loading. Slip on the fault is controlled by rate-state friction. We start with a simple static/dynamic friction law and drive the system through multiple earthquake cycles. Our preliminary results show that: (a) the periodicity of the earthquake cycles is strongly modulated by the static/dynamic friction, with longer periods correlated with higher static friction and lower dynamic friction; (b) the periodicity of earthquakes is a function of fault depth, with less frequent events of greater magnitudes occurring at shallower depth; and (c) rupture on the fault cannot release all the tectonic stress in the system; residual stress accumulates in the hanging-wall block at shallow depth close to the fault and has to be released either by conjugate faulting or inelastic folding. We are in the process of exploring different rheologic structures and friction laws and examining their effects on earthquake behavior and deformation patterns. The results will be applied to specific earthquakes and fault zones such as the 2008 great Wenchuan earthquake on the Longmen Shan fault system.
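
    A 1-D spring-slider analogue (an assumed toy model, not the authors' finite element setup) illustrates two of the points above: with a simple static/dynamic friction law the recurrence period is set by the static-dynamic friction contrast divided by the loading rate, and each slip drops the stress only to the dynamic level, so part of the stored stress always remains. All parameter values are arbitrary.

```python
# 1-D spring-slider stick-slip cycle with static/dynamic friction (toy model).
import numpy as np

k, v_load, dt = 1.0, 1.0, 0.01          # spring stiffness, loading rate, time step
tau_static, tau_dynamic = 10.0, 6.0     # static and dynamic friction thresholds

stress, t, events = 0.0, 0.0, []
for _ in range(int(60.0 / dt)):
    stress += k * v_load * dt           # elastic loading while the block sticks
    t += dt
    if stress >= tau_static:            # "earthquake": stress drops to the dynamic level
        events.append((round(t, 2), round(stress - tau_dynamic, 2)))
        stress = tau_dynamic            # only part of the stored stress is released

periods = np.diff([time for time, _ in events])
print("first events (time, stress drop):", events[:3])
print("recurrence period:", periods.mean() if periods.size else None)
```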

  4. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  5. Defence against earthquakes: a red thread of history

    International Nuclear Information System (INIS)

    Guidoboni, Emanuela

    2015-01-01

    This note gives a short overview, from the ancient world down to the end of the eighteenth century (before engineering began as a science, that is), of the idea of “housing safety” and earthquakes. The idea varies, but persists throughout the cultural and economic contexts of history’s changing societies, and in relation to class and lifestyle. Historical research into earthquakes in Italy from the ancient world to the twentieth century has shown how variable the idea actually is, as emerges from theoretical treatises, practical wisdom and projects drawn up in the wake of destructive events. In the seventeenth century the theoretical interpretation of earthquakes began to swing towards a mechanistic view of the Earth, affecting how the effects and propagation of earthquakes were observed. Strong earthquakes continued to occur and cause damage, and after yet another seismic disaster – Umbria 1751 – new building techniques were advocated. The attempt was to make house walls bind more solidly by special linking of the wooden structure of floors and roof beams. Following the massive seismic crisis of February-March 1783, which left central and southern Calabria in ruins, a new type of house was proposed, called 'baraccata': a wooden structure filled in with light materials. This was actually already to be found in the ancient Mediterranean basin (including Pompeii); but only at that time was it perfected, proposed by engineers and circulated as an important building innovation. At the end of the eighteenth century town planners came to the fore in the search for safe housing. They suggested new regular shapes and broad grid-plan streets with a specific view to achieving housing safety and ensuring an escape route in case of earthquake. Such rules and regulations were then abandoned or lost, proving that it is not enough to try out [it

  6. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  7. Optimal Zn-Modified Ca–Si-Based Ceramic Nanocoating with Zn Ion Release for Osteoblast Promotion and Osteoclast Inhibition in Bone Tissue Engineering

    Directory of Open Access Journals (Sweden)

    Jiangming Yu

    2017-01-01

    Full Text Available We investigated the slow release of Zn ion (Zn2+) from nanocoatings and compared the in vitro response of osteoblasts (MC3T3-E1) and proosteoclasts (RAW 264.7) cultured on Ca2ZnSi2O7 nanocoated with different Zn/Ca molar ratios on a Ti-6Al-4V (i.e., Ti) substrate to optimize cell behaviors and molecule levels. Significant morphology differences were observed among samples. By comparing with pure Ti and CaSiO3 nanocoating, the morphology of Ca2ZnSi2O7 ceramic nanocoatings was rough and contained small nanoparticles or aggregations. Slow Zn2+ release from nanocoatings was observed and Zn2+ concentration was regulated by varying the Zn/Ca ratios. The cell-response results showed Ca2ZnSi2O7 nanocoating at different Zn/Ca molar ratios for osteoblasts and osteoclasts. Compared to other nanocoatings and Ti, sample Zn/Ca (0.3) showed the highest cell viability and upregulated expression of the osteogenic differentiation genes ALP, COL-1, and OCN. Additionally, sample Zn/Ca (0.3) showed the greatest inhibition of RAW 264.7 cell growth and decreased the mRNA levels of osteoclast-related genes OAR, TRAP, and HYA1. Therefore, the optimal Zn-Ca ratio of 0.3 in Ca2ZnSi2O7 ceramic nanocoating on Ti had a dual osteoblast-promoting and osteoclast-inhibiting effect to dynamically balance osteoblasts/osteoclasts. These optimal Zn-Ca ratios are valuable for Ca2ZnSi2O7 ceramic nanocoating on Ti-coated implants for potential applications in bone tissue regeneration.

  8. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    Science.gov (United States)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    Earthquakes are among the most terrifying events of nature due to their unexpected occurrence, for which no spiritual means of protection are available. The only way of preserving life and property is applying earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during frequent earthquakes in the Aegean region. Elastic steel provided strength, while plastic lead casing absorbed minor shifts of blocks without fracturing rigid stone. Romans invented concrete and built buildings of all sizes as single, inflexible units. Masonry surrounding and decorating the concrete core of the wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than fracture limits. Roman building traditions survived the Dark Ages, and 12th-century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction as compared to masonry building. Applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii might be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt the application of well-known protective engineering methods throughout the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8-9th century Hindu masonry shrines around Yogyakarta would allow scattering of blocks during earthquakes. To prevent dilapidation an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the

  9. Slope earthquake stability

    CERN Document Server

    Changwei, Yang; Jing, Lian; Wenying, Yu; Jianjing, Zhang

    2017-01-01

    This book begins with the dynamic characteristics of the covering layer-bedrock type slope, containing monitoring data from the seismic array, shaking table tests, numerical analysis and theoretical derivation. Then it focuses on the landslide mechanism and assessment method. It also proposes a model for assessing the hazard area based on field investigations. Many questions, exercises and solutions are given. Researchers and engineers in the field of Geotechnical Engineering and Anti-seismic Engineering can benefit from it.

  10. Rapid estimation of the economic consequences of global earthquakes

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid 2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensities. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimation of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S. Geological Survey's PAGER system is

  11. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  12. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  13. The role of post-earthquake structural safety in pre-earthquake retrofit decisions: guidelines and applications

    International Nuclear Information System (INIS)

    Bazzurro, P.; Telleen, K.; Maffei, J.; Yin, J.; Cornell, C.A.

    2009-01-01

    Critical structures such as hospitals, police stations, local administrative office buildings, and critical lifeline facilities, are expected to be operational immediately after earthquakes. Any rational decision about whether these structures are strong enough to meet this goal or whether pre-emptive retrofitting is needed cannot be made without an explicit consideration of post-earthquake safety and functionality with respect to aftershocks. Advanced Seismic Assessment Guidelines offer an improvement over previous methods for seismic evaluation of buildings where post-earthquake safety and usability is a concern. This new method allows engineers to evaluate the likelihood that a structure may have restricted access or no access after an earthquake. The building performance is measured in terms of the post-earthquake occupancy classifications Green Tag, Yellow Tag, and Red Tag, defining these performance levels quantitatively, based on the structure's remaining capacity to withstand aftershocks. These color-coded placards, which constitute an established practice in the US, could be replaced by the standard results of inspections (A to E) performed by the Italian Dept. of Civil Protection after an event. The article also shows some applications of these Guidelines to buildings of the largest utility company in California, Pacific Gas and Electric Company (PGE). [it

  14. Long-Term Fault Memory: A New Time-Dependent Recurrence Model for Large Earthquake Clusters on Plate Boundaries

    Science.gov (United States)

    Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.; Campbell, M. R.

    2017-12-01

    A challenge for earthquake hazard assessment is that geologic records often show large earthquakes occurring in temporal clusters separated by periods of quiescence. For example, in Cascadia, a paleoseismic record going back 10,000 years shows four to five clusters separated by approximately 1,000 year gaps. If we are still in the cluster that began 1700 years ago, a large earthquake is likely to happen soon. If the cluster has ended, a great earthquake is less likely. For a Gaussian distribution of recurrence times, the probability of an earthquake in the next 50 years is six times larger if we are still in the most recent cluster. Earthquake hazard assessments typically employ one of two recurrence models, neither of which directly incorporate clustering. In one, earthquake probability is time-independent and modeled as Poissonian, so an earthquake is equally likely at any time. The fault has no "memory" because when a prior earthquake occurred has no bearing on when the next will occur. The other common model is a time-dependent earthquake cycle in which the probability of an earthquake increases with time until one happens, after which the probability resets to zero. Because the probability is reset after each earthquake, the fault "remembers" only the last earthquake. This approach can be used with any assumed probability density function for recurrence times. We propose an alternative, Long-Term Fault Memory (LTFM), a modified earthquake cycle model where the probability of an earthquake increases with time until one happens, after which it decreases, but not necessarily to zero. Hence the probability of the next earthquake depends on the fault's history over multiple cycles, giving "long-term memory". Physically, this reflects an earthquake releasing only part of the elastic strain stored on the fault. We use the LTFM to simulate earthquake clustering along the San Andreas Fault and Cascadia. In some portions of the simulated earthquake history, events would
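
    A toy simulation of the Long-Term Fault Memory idea described above, with hypothetical parameter values: strain accumulates at a constant rate, the per-year earthquake probability grows in proportion to the stored strain, and each event releases only a fraction of that strain, so the probability drops after an earthquake but does not reset to zero. This partial release is what produces cluster-like sequences in the synthetic history.

```python
# Toy Long-Term Fault Memory (LTFM) simulation; all parameters are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
rate, release_fraction, gain = 1.0, 0.6, 2e-4   # strain/yr, released share, hazard gain
strain, history = 0.0, []

for year in range(20000):
    strain += rate                               # interseismic strain accumulation
    if rng.random() < gain * strain:             # hazard grows with stored strain
        strain *= (1.0 - release_fraction)       # partial, not total, strain release
        history.append(year)

intervals = np.diff(history)
print(len(history), "events; mean interval", round(float(intervals.mean()), 1), "years")
```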

  15. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  16. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for N.P.P. designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  17. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)

    1997-03-01

    This paper describes the principles of determining the Design Basis Earthquake following the Examination Guide, with some examples from actual sites, including earthquake sources to be considered, earthquake response spectra and simulated seismic waves. In the appendix of this paper, furthermore, the seismic safety review for N.P.P. designed before publication of the Examination Guide is summarized with the Check Basis Earthquake. (J.P.N.)

  18. Earthquake in Japan: The IAEA mission gives its report

    International Nuclear Information System (INIS)

    Anon.

    2007-01-01

    Following the earthquake that occurred on 16 July 2007 in Japan (magnitude 6.6 on the Richter scale), an IAEA mission inspected the Kashiwazaki-Kariwa nuclear power plant at the beginning of August. The mission concluded that the safety of the installation was ensured during and after the earthquake, in spite of the fact that the earthquake exceeded the seismic level taken as reference in the design of the nuclear facility. The systems and components were in a better state than might be imagined after such an earthquake. Releases remained below the authorised thresholds. At the moment of the earthquake, three of the plant's seven reactors were running and shut down automatically. Unit 2, which was starting up, also shut down automatically. Reactors 1, 5 and 6 were shut down for maintenance. Water spilled from the spent fuel storage pool because of the earth tremors; it was collected and discharged to the sea through the release pipe without notable impact on the environment (volume 1.2 m3). About one hundred containers were overturned. Traces of iodine, chromium 51 and cobalt 60 were found in the ventilation filters (reactor 7); these elements were released into the atmosphere in very low quantities. (N.C.)

  19. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes including the 1978 Izu-Oshima-kinkai (M7.0) earthquake as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the two past decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  20. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of the earliest desires of mankind. Scientists have worked hard to predict earthquakes for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) the statistical method, and 2) the empirical method. In the first method, earthquakes are predicted using statistics and probabilities, while the second method utilizes a variety of precursors for earthquake prediction. The latter method is time consuming and more costly. However, neither method has fully satisfied expectations up to now. In this paper a new method entitled the 'Kiana Method' is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then, the time and the magnitude of a future earthquake are calculated using electrical formulas, in particular those for electrical capacitors. In this method, daily measurements of electrical resistance in an area make clear whether the area is capable of producing an earthquake in the future or not. If the result is positive, then the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  1. Seismic experience in power and industrial facilities as it relates to small magnitude earthquakes

    International Nuclear Information System (INIS)

    Swan, S.W.; Horstman, N.G.

    1987-01-01

    The data base on the performance of power and industrial facilities in small magnitude earthquakes (M = 4.0 - 5.5) is potentially very large. In California alone many earthquakes in this magnitude range occur every year, often near industrial areas. In 1986, for example, there were 76 earthquakes in northern California between Richter magnitude 4.0 and 5.5. Experience has shown that the effects of small magnitude earthquakes are seldom significant to well-engineered facilities. (The term well-engineered is here defined to include most modern industrial installations, as well as power plants and substations.) Therefore detailed investigations of small magnitude earthquakes are normally not considered worthwhile. The purpose of this paper is to review the tendency toward seismic damage of equipment installations representative of nuclear power plant safety systems. Estimates are made of the thresholds of seismic damage to certain types of equipment in terms of conventional means of measuring the damage potential of an earthquake. The objective is to define thresholds of damage that can be correlated with Richter magnitude. In this manner an earthquake magnitude might be chosen below which damage to nuclear plant safety systems is not considered credible

  2. A Virtual Tour of the 1868 Hayward Earthquake in Google EarthTM

    Science.gov (United States)

    Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.

    2007-12-01

    The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google EarthTM software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google EarthTM layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.

  3. Research on Collection of Earthquake Disaster Information from the Crowd

    Science.gov (United States)

    Nian, Z.

    2017-12-01

    In China, the assessment of earthquake disaster information is mainly based on inversion of the seismic source mechanism and pre-calculated population data models, while the actual information about an earthquake disaster is usually collected through government departments; both the accuracy and the speed of this process need to be improved. In a massive earthquake like the one in Mexico, the telecommunications infrastructure on the ground was damaged, and the quake zone was difficult to observe by satellites and aircraft in the bad weather; only a little information was sent out through another country's maritime satellite. Thus, timely and effective disaster relief was seriously affected. Now that Chinese communication satellites are in orbit, people no longer rely solely on ground telecom base stations to keep in communication with the outside world, open web pages, log in to social networking sites, release information, and transmit images and videos. This paper describes an earthquake information collection system in which the public can participate. Through popular social platforms and other information sources, the public can take part in the collection of earthquake information and supply quake-zone information, including photos, videos, etc., especially material captured by unmanned aerial vehicles (UAVs) after an earthquake; the public can use computers, portable terminals, or mobile text messages to participate in the collection. In the system, the information is divided into earthquake zone basic information, earthquake disaster reduction information, earthquake site information, post-disaster reconstruction information, etc., and is processed and put into a database. The quality of the data is analyzed using multi-source information and is controlled by local public opinion, so as to supplement the data collected by government departments in a timely way and to calibrate simulation results, which will better guide

  4. The 2013 Crete (Hellenic Arc) Earthquake Sequence

    Science.gov (United States)

    Karakostas, V. G.; Papadimitriou, E. E.; Vallianatos, F.

    2014-12-01

    The western Hellenic Arc is a well-known place of active interplate deformation, where the convergence motion vector is perpendicular to the subduction front. On 12 October 2013 this area was hit by a strong (Mw=6.7) earthquake, which occurred on a thrust fault on the coupled part of the overriding and descending plates, with the compression axis oriented in the direction of plate convergence. This was the first strong (M>6.0) event to have occurred on this segment of the descending slab, which has accommodated the largest (M8.3) known earthquake in the Mediterranean area, and to be recorded by the Hellenic Unified Seismological Network (HUSN), which has been considerably improved in the last five years. The relocated seismicity of the first two days shows activation of the upper part of the descending slab, downdip of the plate interface, forming a relatively narrow aftershock area in map view. The area less densely populated by aftershocks, which also encompasses the main shock, is considered to be the high-slip area along the downdip portion of the subducting plane. The dense concentration of intraslab aftershocks is probably due to the increase of static stress generated by the main shock. A spectacular feature of the aftershock activity concerns the lateral extension of the slipped area, which appears very sharply defined. This provides evidence on localized coupling and aseismically creeping areas, explaining the low coupling ratio in the Hellenic Arc, as it derives from comparison between relative plate motion and seismic energy release. Elucidating the issue of how far the associated large-slip zone might be extended along the plate interface during the main rupture is crucial in assessing future earthquake hazards from subduction events in the study area. This research has been co-funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project.

  5. Surface slip during large Owens Valley earthquakes

    KAUST Repository

    Haddon, E. K.; Amos, C. B.; Zielke, Olaf; Jayko, A. S.; Burgmann, R.

    2016-01-01

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from approximately 1.0 to 6.0 m and average 3.3 +/- 1.1 m (2 sigma). Vertical offsets are predominantly east-down between approximately 0.1 and 2.4 m, with a mean of 0.8 +/- 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is approximately 6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7-11 m and a net average of 4.4 +/- 1.5 m, corresponding to a geologic Mw of approximately 7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 +/- 2.0 m, 12.8 +/- 1.5 m, and 16.6 +/- 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between approximately 0.6 and 1.6 mm/yr (1 sigma) over the late Quaternary.
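
    The COPD construction described above can be sketched numerically: each offset measurement contributes a probability density function, and summing the PDFs along strike highlights displacement values shared by many landforms. In the sketch below, Gaussian PDFs stand in for the uniquely shaped cross-correlation PDFs, and the offset values and uncertainties are made-up examples, not data from the study.

```python
# Stack per-measurement offset PDFs into a cumulative offset probability distribution.
import numpy as np

def copd(measurements, x):
    """Sum of normalized Gaussian PDFs, one per (offset, sigma) measurement."""
    total = np.zeros_like(x)
    for mu, sigma in measurements:
        total += np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return total

x = np.linspace(0.0, 20.0, 2001)
measurements = [(3.2, 0.5), (3.5, 0.6), (7.0, 0.8), (12.9, 0.7), (16.5, 0.9)]  # metres
stack = copd(measurements, x)

# Local maxima of the stacked curve mark commonly recorded displacement values.
is_peak = np.r_[False, (stack[1:-1] > stack[:-2]) & (stack[1:-1] > stack[2:]), False]
print("COPD peaks near (m):", np.round(x[is_peak], 1))
```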

  6. Surface slip during large Owens Valley earthquakes

    KAUST Repository

    Haddon, E. K.

    2016-01-10

    The 1872 Owens Valley earthquake is the third largest known historical earthquake in California. Relatively sparse field data and a complex rupture trace, however, inhibited attempts to fully resolve the slip distribution and reconcile the total moment release. We present a new, comprehensive record of surface slip based on lidar and field investigation, documenting 162 new measurements of laterally and vertically displaced landforms for 1872 and prehistoric Owens Valley earthquakes. Our lidar analysis uses a newly developed analytical tool to measure fault slip based on cross-correlation of sublinear topographic features and to produce a uniquely shaped probability density function (PDF) for each measurement. Stacking PDFs along strike to form cumulative offset probability distribution plots (COPDs) highlights common values corresponding to single and multiple-event displacements. Lateral offsets for 1872 vary systematically from approximately 1.0 to 6.0 m and average 3.3 +/- 1.1 m (2 sigma). Vertical offsets are predominantly east-down between approximately 0.1 and 2.4 m, with a mean of 0.8 +/- 0.5 m. The average lateral-to-vertical ratio compiled at specific sites is approximately 6:1. Summing displacements across subparallel, overlapping rupture traces implies a maximum of 7-11 m and a net average of 4.4 +/- 1.5 m, corresponding to a geologic Mw of approximately 7.5 for the 1872 event. We attribute progressively higher-offset lateral COPD peaks at 7.1 +/- 2.0 m, 12.8 +/- 1.5 m, and 16.6 +/- 1.4 m to three earlier large surface ruptures. Evaluating cumulative displacements in context with previously dated landforms in Owens Valley suggests relatively modest rates of fault slip, averaging between approximately 0.6 and 1.6 mm/yr (1 sigma) over the late Quaternary.

  7. 9th Structural Engineering Convention 2014

    CERN Document Server

    2015-01-01

    The book presents research papers presented by academicians, researchers, and practicing structural engineers from India and abroad in the recently held Structural Engineering Convention (SEC) 2014 at Indian Institute of Technology Delhi during 22 – 24 December 2014. The book is divided into three volumes and encompasses multidisciplinary areas within structural engineering, such as earthquake engineering and structural dynamics, structural mechanics, finite element methods, structural vibration control, advanced cementitious and composite materials, bridge engineering, and soil-structure interaction. Advances in Structural Engineering is a useful reference material for structural engineering fraternity including undergraduate and postgraduate students, academicians, researchers and practicing engineers.

  8. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 yrs. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes

  9. Response and recovery lessons from the 2010-2011 earthquake sequence in Canterbury, New Zealand

    Science.gov (United States)

    Pierepiekarz, Mark; Johnston, David; Berryman, Kelvin; Hare, John; Gomberg, Joan S.; Williams, Robert A.; Weaver, Craig S.

    2014-01-01

    The impacts and opportunities that result when low-probability moderate earthquakes strike an urban area similar to many throughout the US were vividly conveyed in a one-day workshop in which social and Earth scientists, public officials, engineers, and an emergency manager shared their experiences of the earthquake sequence that struck the city of Christchurch and surrounding Canterbury region of New Zealand in 2010-2011. Without question, the earthquake sequence has had unprecedented impacts in all spheres on New Zealand society, locally to nationally--10% of the country's population was directly impacted and losses total 8-10% of their GDP. The following paragraphs present a few lessons from Christchurch.

  10. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    International Nuclear Information System (INIS)

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart

  11. Performance evaluation recommendations of nuclear power plants outdoor significant civil structures earthquake resistance. Performance evaluation examples

    International Nuclear Information System (INIS)

    2005-06-01

    The Japan Society of Civil Engineers updated its recommendations for evaluating the earthquake-resistance performance of significant outdoor civil structures at nuclear power plants in June 2005. Based on experimental and analytical considerations, analytical seismic models of soils for underground structures, the effects of vertical motions on time-history dynamic analysis, and the shear fracture of reinforced concrete under cyclic loading have been incorporated in the new recommendations. This document presents evaluation examples of the earthquake resistance and endurance performance of outdoor civil structures based on the revised recommendations. (T. Tanaka)

  12. Application of Incremental Dynamic Analysis (IDA) Method for Studying the Dynamic Behavior of Structures During Earthquakes

    OpenAIRE

    Javanpour, M.; Zarfam, P.

    2017-01-01

    Predicting the vulnerability of existing buildings to future earthquakes is one of the most essential topics in structural engineering. Modeling steel structures is a major step in determining the damage caused by an earthquake, as such structures are increasingly being used in construction. Hence, two steel structures of the same order with two types of structural systems were selected (coaxial moment frames and moment frame). In most cases, a specific structure needs to satisfy several functional l...
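
    The IDA procedure itself can be summarized in a short loop, sketched below under stated assumptions: each ground motion record is scaled to successively higher intensity levels, a nonlinear response-history analysis is run at each level, and a damage measure such as peak interstory drift is recorded until a collapse criterion is met. The function run_nonlinear_history is a hypothetical placeholder for a structural analysis engine, and every number in the example is illustrative.

```python
# Schematic Incremental Dynamic Analysis (IDA) loop with a placeholder analysis call.

def run_nonlinear_history(record, scale):
    """Placeholder: return a peak interstory drift ratio for the scaled record."""
    return 0.002 * scale * max(abs(a) for a in record)   # toy response model only

def ida_curve(record, im_levels, collapse_drift=0.10):
    curve = []
    for im in im_levels:                       # intensity measure, e.g. Sa(T1) in g
        drift = run_nonlinear_history(record, scale=im)
        curve.append((im, drift))
        if drift >= collapse_drift:            # flatline reached: stop scaling this record
            break
    return curve

record = [0.1, -0.3, 0.8, -0.5, 0.2]           # made-up acceleration samples (g)
print(ida_curve(record, im_levels=[0.1 * k for k in range(1, 11)]))
```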

  13. Factors Contributing to the Catastrophe in Mexico City During the Earthquake of September 19, 1985

    OpenAIRE

    Beck, James L.; Hall, John F.

    1986-01-01

    The extensive damage to high‐rise buildings in Mexico City during the September 19, 1985 earthquake is primarily due to the intensity of the ground shaking exceeding what was previously considered credible for the city by Mexican engineers. There were two major factors contributing to the catastrophe, resonance in the sediments of an ancient lake that once existed in the Valley of Mexico, and the long duration of shaking compared with other coastal earthquakes in the last 50 years. Both of th...

  14. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread, which manifests itself as a diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  15. Demonstration of pb-PSHA with Ras-Elhekma earthquake, Egypt

    Directory of Open Access Journals (Sweden)

    Elsayed Fergany

    2017-06-01

    Full Text Available The main goal of this work is to: (1) argue for the importance of a physically-based probabilistic seismic hazard analysis (pb-PSHA) methodology and show examples from recent events to support the argument, and (2) demonstrate the methodology with ground motion simulations of the May 28, 1998, Mw = 5.5 Ras-Elhekma earthquake, north Egypt. The boundaries for the possible rupture parameters that may have been identified prior to the 1998 Ras-Elhekma earthquake were estimated. A range of simulated ground motions for the Ras-Elhekma earthquake was "predicted" for frequencies of 0.5–25 Hz at three sites, where the large earthquake was recorded, with average epicentral distances of 220 km. The best rupture model of the 1998 Ras-Elhekma earthquake was identified by calculating the goodness of fit between observed and synthesized records at sites FYM, HAG, and KOT. We used the best rupture scenario of the 1998 earthquake to synthesize the ground motions at sites of interest where the main shock was not recorded. Based on the good fit of simulated and observed seismograms, we conclude that this methodology can provide realistic ground motions for an earthquake and is highly recommended for engineering purposes in advance of large earthquakes at non-recorded sites. We propose that there is a need for this methodology to better represent the true hazard while reducing uncertainties.

  16. A study on generation of simulated earthquake ground motion for seismic design of nuclear power plant

    International Nuclear Information System (INIS)

    Ichiki, Tadaharu; Matsumoto, Takuji; Kitada, Yoshio; Osaki, Yorihiko; Kanda, Jun; Masao, Toru.

    1985-01-01

    The aseismatic design of nuclear power generation facilities carried out in Japan at present must conform to the ''Guideline for aseismatic design examination regarding power reactor facilities'' adopted by the Atomic Energy Commission in 1978. In this guideline, the earthquake motion used for the analysis of dynamic earthquake response is to be given in the form of a magnitude, determined on the basis of the investigation of historical earthquakes and active faults around construction sites, and response spectra corresponding to the distance from epicenters. Accordingly, when the analysis of dynamic earthquake response is actually carried out, simulated earthquake motion generated in conformity with these specified response spectra is used as the input earthquake motion for the design. For the purpose of establishing techniques for generating simulated earthquake motion that are more appropriate and rational from an engineering viewpoint, this research was carried out, and the results are summarized in this paper. The techniques for generating simulated earthquake motion, the response of buildings and the response spectra of floors are described. (Kako, I.)
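
    One common way to construct such a simulated design motion, sketched below under the assumption of the usual random-phase superposition approach (the report's own procedure may differ), is to superpose sinusoids with random phases, apply a time envelope, and then iterate the amplitudes until the computed response spectrum matches the target spectrum; the iteration step is omitted here, and the spectral shape and envelope are illustrative.

```python
import numpy as np

# Minimal sketch (assumed, not the report's method): random-phase synthesis of
# an acceleration time history shaped by a build-up/decay envelope. In practice
# the amplitudes would be adjusted iteratively against the target response
# spectrum, which is not shown.

dt, duration = 0.01, 20.0
t = np.arange(0.0, duration, dt)
freqs = np.linspace(0.2, 25.0, 200)          # Hz, a typical design frequency band
amps = 1.0 / np.sqrt(freqs)                  # placeholder spectral shape
rng = np.random.default_rng(1)
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)

motion = sum(a * np.sin(2.0 * np.pi * f * t + p) for a, f, p in zip(amps, freqs, phases))
envelope = np.minimum(t / 2.0, 1.0) * np.exp(-np.maximum(t - 10.0, 0.0) / 5.0)
accel = envelope * motion

print("peak simulated acceleration (arbitrary units):", float(np.abs(accel).max()))
```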

  17. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    Science.gov (United States)

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  18. The Pocatello Valley, Idaho, earthquake

    Science.gov (United States)

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. Mountain Daylight Time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, and the focal depth was 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  19. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  20. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential earthquakes in the future. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, intruded and fractured gravels and pillow structures in shallow-lake sediments, and of pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep-lake sediments. Drawing on previous studies, earthquake-induced deformation structures were ordered according to their modes of formation and the corresponding earthquake magnitudes. In this ordering, the structure recording the lowest earthquake magnitude is loop bedding and the highest is intruded and fractured gravels in lacustrine deposits.

  1. Streamflow responses in Chile to megathrust earthquakes in the 20th and 21st centuries

    Science.gov (United States)

    Mohr, Christian; Manga, Michael; Wang, Chi-yuen; Korup, Oliver

    2016-04-01

    Both coseismic static stress and dynamic stresses associated with seismic waves may cause responses in hydrological systems. Such responses include changes in the water level, hydrochemistry and streamflow discharge. Earthquake effects on hydrological systems provide a means to study the interaction between stress changes and regional hydrology, which is otherwise rarely possible. Chile is a country of frequent and large earthquakes and thus provides abundant opportunities to study such interactions and processes. We analyze streamflow responses in Chile to several megathrust earthquakes, including the 1943 Mw 8.1 Coquimbo, 1950 Mw 8.2 Antofagasta, 1960 Mw 9.5 Valdivia, 1985 Mw 8.0 Valparaiso, 1995 Mw 8.0 Antofagasta, 2010 Mw 8.8 Maule, and the 2014 Mw 8.2 Iquique earthquakes. We use data from 716 stream gauges distributed from the Altiplano in the North to Tierra del Fuego in the South. This network covers the Andes mountain ranges, the central valley, the Coastal Mountain ranges and (mainly in the more southern parts) the Coastal flats. We combine empirical magnitude-distance relationships, machine learning tools, and process-based modeling to characterize responses. We first assess the streamflow anomalies and relate these to topographical, hydro-climatic, geological and earthquake-related (volumetric and dynamic strain) factors using various classifiers. We then apply 1D-groundwater flow modeling to selected catchments in order to test competing hypotheses for the origin of streamflow changes. We show that the co-seismic responses of streamflow mostly involved increasing discharges. We conclude that enhanced vertical permeability can explain most streamflow responses at the regional scale. The total excess water released by a single earthquake, i.e. the Maule earthquake, yielded up to 1 km3. Against the background of megathrust earthquakes frequently hitting Chile, the amount of water released by earthquakes is substantial, particularly for the arid northern

  2. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
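
    A minimal version of the short-term-average/long-term-average detector described above can be sketched as follows; the window lengths, the trigger threshold and the synthetic tweet counts are illustrative choices, not the USGS operational settings.

```python
import numpy as np

# Minimal STA/LTA sketch applied to a per-minute count of tweets containing
# the word "earthquake" (parameters and data are hypothetical).

def sta_lta(counts, n_sta=2, n_lta=60, eps=1e-9):
    counts = np.asarray(counts, dtype=float)
    ratios = np.zeros_like(counts)
    for i in range(n_lta, len(counts)):
        sta = counts[i - n_sta + 1 : i + 1].mean()   # short-term average
        lta = counts[i - n_lta + 1 : i + 1].mean()   # long-term average
        ratios[i] = sta / (lta + eps)
    return ratios

# background chatter plus a burst of tweets after a widely felt event
rng = np.random.default_rng(2)
counts = rng.poisson(3.0, 180)
counts[120:125] += np.array([40, 80, 60, 30, 15])

ratios = sta_lta(counts)
triggers = np.where(ratios > 5.0)[0]
print("detection at minutes:", triggers[:5])
```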

  3. Gambling scores for earthquake predictions and forecasts

    Science.gov (United States)

    Zhuang, Jiancang

    2010-04-01

    This paper presents a new method, namely the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular scheme of forecasts and treat each earthquake equally regardless of its magnitude, this new scoring method compensates for the risk that the forecaster has taken. Starting with a certain number of reputation points, once a forecaster makes a prediction or forecast, he is assumed to have bet some points of his reputation. The reference model, which plays the role of the house, determines how many reputation points the forecaster can gain if he succeeds, according to a fair rule, and also takes away the reputation points bet by the forecaster if he loses. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. We also calculate the upper bound of the gambling score when the true model is a renewal process, the stress release model or the ETAS model and when the reference model is the Poisson model.
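
    A simplified reading of the gambling-score idea for binary alarms is sketched below (the paper's full formulation, including the continuous point-process case, is richer): the forecaster stakes reputation points on each alarm, and the reference model's probability p0 sets the fair payoff.

```python
# Minimal sketch (a simplified reading, not the paper's exact rules): a
# successful alarm pays stake * (1 - p0) / p0 reputation points, a failed
# alarm forfeits the stake, where p0 is the reference-model probability.

def gambling_score(alarms, outcomes, p0, stake=1.0):
    """alarms, outcomes: lists of booleans per bin; p0: reference probability per bin."""
    score = 0.0
    for alarm, occurred in zip(alarms, outcomes):
        if not alarm:
            continue
        score += stake * (1.0 - p0) / p0 if occurred else -stake
    return score

# hypothetical example: 10 bins, reference model expects an event with p0 = 0.1
alarms   = [True, False, True, False, False, True, False, False, False, True]
outcomes = [True, False, False, False, False, True, False, True, False, False]
print("total reputation gained:", gambling_score(alarms, outcomes, p0=0.1))
```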

  4. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high-energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
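
    The tail-fitting step can be illustrated with a short sketch that fits a generalized Pareto distribution to exceedances of a synthetic heavy-tailed "energy" catalogue; this is a generic illustration, not the authors' procedure, and the data are simulated rather than the Chinese catalogue used in the paper.

```python
import numpy as np
from scipy import stats

# Minimal sketch (synthetic data): fit a generalized Pareto distribution to
# exceedances above a high threshold; a positive shape parameter indicates a
# heavy, Frechet-type tail with no finite right endpoint.

rng = np.random.default_rng(3)
energies = 1.0 + rng.pareto(1.1, 2000)          # toy heavy-tailed "energies"

threshold = np.quantile(energies, 0.95)
exceedances = energies[energies > threshold] - threshold

shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)
print(f"fitted tail shape xi = {shape:.2f} (xi > 0: Frechet-type tail)")
```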

  5. Release plan for Big Pete

    International Nuclear Information System (INIS)

    Edwards, T.A.

    1996-11-01

    This release plan is to provide instructions for the Radiological Control Technician (RCT) to conduct surveys for the unconditional release of ''Big Pete,'' which was used in the removal of ''Spacers'' from the N-Reactor. Prior to performing surveys on the rear end portion of ''Big Pete,'' it shall be cleaned (i.e., free of oil, grease, caked soil, heavy dust). If no contamination is found, the vehicle may be released with the permission of the area RCT Supervisor. If contamination is found by any of the surveys, contact the cognizant Radiological Engineer for decontamination instructions

  6. The fate of organic compounds in a cement-based repository: impact on the engineered barrier and the release of C-14 from the near field

    International Nuclear Information System (INIS)

    Wieland, E.; Rothardt, J.; Schlotterbeck, G.

    2015-01-01

    The degradation of organic materials is taken into account in the safety analysis for a L/ILW (low- and intermediate-level radioactive waste) repository in Switzerland with the aim of assessing possible impacts on the cement barrier. The waste forms to be disposed of in the planned L/ILW repository will contain high-molecular-weight (HMW) polymers and low-molecular-weight (LMW) monomeric organic materials. It is anticipated that these organic materials have different degradation rates and therefore different lifetimes in a repository. While the decomposition of LMW organics is expected to be fast and complete during the oxic and early anoxic states of a repository, i.e. before and shortly after repository closure, the decomposition of the HMW polymeric materials is expected to be very slow and, for some materials, to occur over the entire lifetime of the repository. The degradation of organic materials generates CO2, which gives rise to carbonation of the cement barrier. The maximum acceptable loading of organics in the near field with no detrimental effect on radionuclide immobilization can be estimated on the assumption that at most 2/3 of the total portlandite inventory of hydrated cement is allowed to convert to CaCO3 in the case of waste compartments for which the cementitious barrier should remain intact. The maximum loading is determined by the inventory of the organic material under consideration as well as the carbon content and the oxidation state of carbon of the material. Carbon-14 bound in organic compounds is considered to be an important contributor to the annual dose released from a L/ILW repository. While the 14C inventory is well known, the chemical speciation of 14C in the cementitious near field upon liberation in the course of the corrosion of activated steel is only poorly understood. Preliminary corrosion tests with non-activated steel powders show the formation of gaseous and dissolved organic carbon species, e.g. alkanes/alkenes, alcohols, aldehydes, and carboxylic acids

  7. Does knowledge signify protection? The SEISMOPOLIS centre for improvement of behavior in case of an earthquake

    Science.gov (United States)

    Dandoulaki, M.; Kourou, A.; Panoutsopoulou, M.

    2009-04-01

    It is widely accepted that earthquake education is the way to earthquake protection. Nonetheless, experience demonstrates that knowing what to do does not necessarily result in better behaviour in case of a real earthquake. A research project titled "Seismopolis" - "Pilot Integrated System for Public Familiarization with Earthquakes and Information on Earthquake Protection" - aimed at the improvement of the behaviour of people through an appropriate amalgamation of knowledge transfer and virtually experiencing an earthquake situation. Seismopolis combines well-established education means such as books and leaflets with new technologies like earthquake simulation and virtual reality. It comprises a series of 5 main spaces that the visitor passes through one by one. Space 1. Reception and introductory information. Visitors are given fundamental information on earthquakes and earthquake protection, as well as on the appropriate behaviour in case of an earthquake. Space 2. Earthquake simulation room. Visitors experience an earthquake in a room. A typical kitchen is set on a shake table area (3 m x 6 m planar triaxial shake table) and is shaken in both horizontal and vertical directions by inputting seismograms of real or virtual earthquakes. Space 3. Virtual reality room. Visitors may have the opportunity to virtually move around in the building or in the city after an earthquake disaster and take action as in a real-life situation, wearing stereoscopic glasses and using navigation tools. Space 4. Information and resources library. Visitors are offered the opportunity to learn more about earthquake protection. A series of means are available for this, some developed especially for Seismopolis (3 books, 2 CDs, a website and an interactive table game). Space 5. De-briefing area. Visitors may be given a pedagogical and psychological evaluation at the end of their visit and offered support if needed. For the evaluation of the "Seismopolis" Centre, a pilot application of the

  8. Simulation of earthquakes with cellular automata

    Directory of Open Access Journals (Sweden)

    P. G. Akishin

    1998-01-01

    Full Text Available The relation between cellular automata (CA) models of earthquakes and the Burridge–Knopoff (BK) model is studied. It is shown that the CA proposed by P. Bak and C. Tang, although they have rather realistic power spectra, do not correspond to the BK model. We present a modification of the CA which establishes the correspondence with the BK model. An analytical method of studying the evolution of the BK-like CA is proposed. By this method a functional quadratic in stress release, which can be regarded as an analog of the event energy, is constructed. The distribution of seismic events with respect to this “energy” shows rather realistic behavior, even in two dimensions. Special attention is paid to two-dimensional automata; the physical restrictions on compression and shear stiffnesses are imposed.
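
    A minimal BK-like cellular automaton in the spirit described above can be sketched as follows; the loading rule, redistribution fraction and grid size are illustrative assumptions rather than the paper's exact specification, and the stress released per avalanche stands in only loosely for the quadratic "energy" functional.

```python
import numpy as np

# Minimal sketch of a stress-redistribution cellular automaton: cells are
# loaded uniformly, topple when they exceed a threshold, and pass a fraction
# of their stress to their four neighbours (open boundaries). The "energy" of
# an event is the total stress released during the avalanche.

def run_ca(n=32, steps=2000, threshold=1.0, alpha=0.2, seed=4):
    rng = np.random.default_rng(seed)
    stress = rng.uniform(0.0, threshold, (n, n))
    events = []
    for _ in range(steps):
        stress += threshold - stress.max()          # uniform loading to the next failure
        released = 0.0
        unstable = list(zip(*np.where(stress >= threshold)))
        while unstable:
            i, j = unstable.pop()
            if stress[i, j] < threshold:
                continue
            released += stress[i, j]
            give = alpha * stress[i, j]
            stress[i, j] = 0.0
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < n and 0 <= nj < n:
                    stress[ni, nj] += give
                    if stress[ni, nj] >= threshold:
                        unstable.append((ni, nj))
        events.append(released)
    return np.array(events)

events = run_ca()
print("largest event energy:", events.max(), "median:", np.median(events))
```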

  9. 1983 Borah Peak earthquake and INEL structural performance

    International Nuclear Information System (INIS)

    Gorman, V.W.; Guenzler, R.C.

    1983-12-01

    At 8:06 a.m. Mountain Daylight Time on October 28, 1983, an earthquake registering 7.3 on the Richter magnitude scale occurred about 30 km northwest of the town of Mackay, in central Idaho. This report describes the event and associated effects and the responses of facilities at the Idaho National Engineering Laboratory (INEL), located approximately 100 km from the epicenter, to the ground motion. 21 references, 36 figures, 5 tables

  10. Earthquake protection of essential civil and industrial equipments

    International Nuclear Information System (INIS)

    Bourrier, P.; Le Breton, F.; Thevenot, A.

    1986-01-01

    This document presents the principal considerations concerning applications of seismic engineering to equipment and the differences relative to civil structures. The notion of essential equipment is then pointed out, as well as the main particularities of equipment considered as structures. Finally, this document illustrates a few pathological examples encountered after an earthquake, and presents some items of nuclear power plant equipment designed to resist an increased safety earthquake [fr]

  11. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
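
    A single-layer version of the earthquake-network construction (the paper's multiplex, time-resolved analysis is more elaborate) can be sketched with networkx; the grid-cell size and the toy catalogue of epicentres are assumptions made for illustration.

```python
import numpy as np
import networkx as nx

# Minimal single-layer sketch: map each event to a grid cell, connect the
# cells of consecutive events, and rank cells by eigenvector centrality.
# The catalogue below (longitude, latitude) is invented.

rng = np.random.default_rng(5)
catalogue = np.column_stack([rng.uniform(44.0, 54.0, 300),   # lon
                             rng.uniform(25.0, 35.0, 300)])  # lat
cell = 1.0  # cell size in degrees (assumption)

def to_node(lon, lat):
    return (int(lon // cell), int(lat // cell))

G = nx.DiGraph()
nodes = [to_node(lon, lat) for lon, lat in catalogue]
for a, b in zip(nodes[:-1], nodes[1:]):
    if a != b:                        # ignore self-loops for the centrality step
        G.add_edge(a, b)

centrality = nx.eigenvector_centrality(G.to_undirected(), max_iter=1000)
top = sorted(centrality, key=centrality.get, reverse=True)[:5]
print("most central cells:", top)
```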

  12. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extended > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  13. Permeability, storage and hydraulic diffusivity controlled by earthquakes

    Science.gov (United States)

    Brodsky, E. E.; Fulton, P. M.; Xue, L.

    2016-12-01

    Earthquakes can increase permeability in fractured rocks. In the farfield, such permeability increases are attributed to seismic waves and can last for months after the initial earthquake. Laboratory studies suggest that unclogging of fractures by the transient flow driven by seismic waves is a viable mechanism. These dynamic permeability increases may contribute to permeability enhancement in the seismic clouds accompanying hydraulic fracking. Permeability enhancement by seismic waves could potentially be engineered and the experiments suggest the process will be most effective at a preferred frequency. We have recently observed similar processes inside active fault zones after major earthquakes. A borehole observatory in the fault that generated the M9.0 2011 Tohoku earthquake reveals a sequence of temperature pulses during the secondary aftershock sequence of an M7.3 aftershock. The pulses are attributed to fluid advection by a flow through a zone of transiently increased permeability. Directly after the M7.3 earthquake, the newly damaged fault zone is highly susceptible to further permeability enhancement, but ultimately heals within a month and becomes no longer as sensitive. The observation suggests that the newly damaged fault zone is more prone to fluid pulsing than would be expected based on the long-term permeability structure. Even longer term healing is seen inside the fault zone of the 2008 M7.9 Wenchuan earthquake. The competition between damage and healing (or clogging and unclogging) results in dynamically controlled permeability, storage and hydraulic diffusivity. Recent measurements of in situ fault zone architecture at the 1-10 meter scale suggest that active fault zones often have hydraulic diffusivities near 10^-2 m^2/s. This uniformity is true even within the damage zone of the San Andreas fault where permeability and storage increases balance each other to achieve this value of diffusivity over a 400 m wide region. We speculate that fault zones

  14. Tweeting Earthquakes using TensorFlow

    Science.gov (United States)

    Casarotti, E.; Comunello, F.; Magnoni, F.

    2016-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). Currently, it reaches more than 150,000 followers. Nevertheless, since it provides only the manual revision of seismic parameters, the timing (approximately between 10 and 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites and Twitter in particular require a more rapid and "real-time" reaction. During the last 36 months, INGV tested the tweeting of the automatic detection of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e. number of seismic stations, gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).

  15. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of damage and the isoseismals of the earthquake are studied. By comparing with the standard earthquake intensity attenuation relationship, an abnormal damage distribution of the earthquake is found, and the relationship of the abnormal distribution to tectonics, site conditions and basins is analyzed. In this paper, the influence on ground motion generated by the earthquake source and the underground structures near the source is also studied. The influence of the abnormal damage distribution on seismic zonation, anti-earthquake design, earthquake prediction and earthquake emergency response is discussed.

  16. Mapping Tectonic Stress Using Earthquakes

    International Nuclear Information System (INIS)

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available - earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the earth's crust

  17. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relation between the seismic strength of the earthquake, focal depth, distance and ground acceleration is calculated. We found that most Swedish earthquakes are shallow; the strongest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site. For consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  18. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  19. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, shafts, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository

  20. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.
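
    The population scaling can be illustrated with a back-of-the-envelope sketch; the mean-population values below are rough assumptions, so the result will not reproduce the paper's fitted estimate of 8.7±3.3, but it shows how a rate proportional to population propagates into the expected count of catastrophic events.

```python
import math

# Back-of-the-envelope sketch (assumed populations, not the paper's fitted
# model): if >100,000-fatality earthquakes follow a Poisson process with rate
# proportional to world population, the expected 21st-century count scales
# with the ratio of the mean populations.

n_20th = 4                  # observed >100,000-fatality events in the 20th century
pop_20th_mean = 3.0e9       # assumed mean world population over the 20th century
pop_21st_mean = 8.5e9       # assumed mean if ~10.1 billion is reached by 2100

expected_21st = n_20th * pop_21st_mean / pop_20th_mean
print(f"expected catastrophic events in the 21st century: {expected_21st:.1f}")
print(f"Poisson standard deviation: {math.sqrt(expected_21st):.1f}")
```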

  1. The Strain Energy, Seismic Moment and Magnitudes of Large Earthquakes

    Science.gov (United States)

    Purcaru, G.

    2004-12-01

    The strain energy Est, as potential energy, released by an earthquake and the seismic moment Mo are two fundamental physical earthquake parameters. The earthquake rupture process "represents" the release of the accumulated Est. The moment Mo, first obtained in 1966 by Aki, revolutionized the quantification of earthquake size and led to the elimination of the limitations of the conventional magnitudes (originally ML, Richter, 1930) mb, Ms, m, MGR. Both Mo and Est, not in a 1-to-1 correspondence, are uniform measures of size, although Est is presently less accurate than Mo. Est is partitioned into seismic (Es), fracture (Eg) and frictional (Ef) energy, and Ef is lost as frictional heat. The available energy is $E_{st} = E_s + E_g$ (see Aki and Richards (1980) and Kostrov and Das (1988) for fundamentals on Mo and Est). Related to Mo, Est and Es, several modern magnitudes were defined under various assumptions: the moment magnitude Mw (Kanamori, 1977), strain energy magnitude ME (Purcaru and Berckhemer, 1978), tsunami magnitude Mt (Abe, 1979), mantle magnitude Mm (Okal and Talandier, 1987), seismic energy magnitude Me (Choy and Boatwright, 1995; Yanovskaya et al., 1996), and body-wave magnitude Mpw (Tsuboi et al., 1998). The available strain energy is $E_{st} = (1/2\mu)\,\Delta\sigma\,M_o$, where $\Delta\sigma$ is the average stress drop, and ME is given by $M_E = \tfrac{2}{3}\left(\log M_o + \log(\Delta\sigma/\mu) - 12.1\right)$, with $\log E_{st} = 11.8 + 1.5\,M_E$. The estimation of Est was modified to include Mo, $\Delta\sigma$ and $\mu$ of the predominant high-slip zones (asperities) to account for multiple events (Purcaru, 1997): $E_{st} = \frac{1}{2}\sum_i \frac{1}{\mu_i} M_{o,i}\,\Delta\sigma_i$, with $\sum_i M_{o,i} = M_o$. We derived the energy balance of Est, Es and Eg as $E_{st}/M_o = (1 + e(g,s))\,E_s/M_o$, where $e(g,s) = E_g/E_s$. We analyzed a set of about 90 large earthquakes and found that, depending on the goal, these magnitudes quantify the rupture process differently, thus providing complementary means of earthquake characterization. Results for some
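
    The quoted relations can be checked numerically with a short sketch in CGS units (dyn-cm for moment, erg for energy), which is how the constants 12.1 and 11.8 are conventionally written; the moment, stress drop and rigidity values below are illustrative, not taken from the paper.

```python
import math

# Minimal numerical check of the quoted relations (CGS units). The input
# values are typical assumptions, not results from the abstract.

mu = 3.0e11          # crustal shear modulus, dyn/cm^2 (typical assumption)
delta_sigma = 3.0e7  # average stress drop, dyn/cm^2 (= 3 MPa, assumption)
Mo = 1.0e27          # seismic moment, dyn-cm (illustrative)

Est = 0.5 * (delta_sigma / mu) * Mo                                    # strain energy, erg
ME = (2.0 / 3.0) * (math.log10(Mo) + math.log10(delta_sigma / mu) - 12.1)

print(f"Est = {Est:.2e} erg, ME = {ME:.2f}")
print(f"check: log10(Est) = {math.log10(Est):.2f}  vs  11.8 + 1.5*ME = {11.8 + 1.5 * ME:.2f}")
```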

  2. Disaster mitigation science for Earthquakes and Tsunamis -For resilience society against natural disasters-

    Science.gov (United States)

    Kaneda, Y.; Takahashi, N.; Hori, T.; Kawaguchi, K.; Isouchi, C.; Fujisawa, K.

    2017-12-01

    Destructive natural disasters such as earthquakes and tsunamis occur frequently around the world. For instance, the 2004 Sumatra earthquake in Indonesia, the 2008 Wenchuan earthquake in China, the 2010 Chile earthquake and the 2011 Tohoku earthquake in Japan all caused very severe damage. For the reduction and mitigation of damage from destructive natural disasters, early detection and speedy, proper evacuation are indispensable, and hardware and software developments and preparations for disaster reduction and mitigation are quite important. In Japan, DONET, a real-time monitoring system on the ocean floor, has been developed and deployed around the Nankai trough seismogenic zone in southwestern Japan, so early detection of earthquakes and tsunamis around the Nankai trough seismogenic zone is expected from DONET. The integration of real-time data with advanced simulation research will help reduce damage; however, the resilience society also requires resilience methods for the period after disasters. In particular, methods for restoration and revival are necessary after natural disasters. We propose natural disaster mitigation science, covering early detection, evacuation and restoration against destructive natural disasters; this is what we mean by the resilience society. Natural disaster mitigation science involves many research fields, such as natural science, engineering, medical treatment, social science and literature/art. Natural science, engineering and medical treatment are fundamental research fields for natural disaster mitigation, but social sciences such as sociology, geography and psychology are very important research fields for restoration after natural disasters. Finally, to realize and advance disaster mitigation science, human resource cultivation is indispensable. We have already carried out disaster mitigation science under `new disaster mitigation research project on Mega

  3. International Civil and Infrastructure Engineering Conference 2013

    CERN Document Server

    Yusoff, Marina; Ismail, Zulhabri; Amin, Norliyati; Fadzil, Mohd

    2014-01-01

    The special focus of these proceedings is to cover the areas of infrastructure engineering and sustainability management. The state-of-the-art information on infrastructure and sustainability issues in engineering covers earthquakes, bioremediation, synergistic management, timber engineering, flood management and intelligent transport systems. It provides precise information with regard to innovative research developments in construction materials and structures, in addition to a compilation of interdisciplinary findings combining nano-materials and engineering.

  4. International Civil and Infrastructure Engineering Conference 2014

    CERN Document Server

    Yusoff, Marina; Alisibramulisi, Anizahyati; Amin, Norliyati; Ismail, Zulhabri

    2015-01-01

    The special focus of these proceedings is to cover the areas of infrastructure engineering and sustainability management. The state-of-the-art information on infrastructure and sustainability issues in engineering covers earthquakes, bioremediation, synergistic management, timber engineering, flood management and intelligent transport systems. It provides precise information with regard to innovative research developments in construction materials and structures, in addition to a compilation of interdisciplinary findings combining nano-materials and engineering.

  5. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  6. The Fusion of Financial Analysis and Seismology: Statistical Methods from Financial Market Analysis Applied to Earthquake Data

    Science.gov (United States)

    Ohyanagi, S.; Dileonardo, C.

    2013-12-01

    As a natural phenomenon, earthquake occurrence is difficult to predict. Statistical analysis of earthquake data was performed using candlestick chart and Bollinger Band methods. These statistical methods, commonly used in the financial world to analyze market trends, were tested against earthquake data. Earthquakes above Mw 4.0 located off the shore of Sanriku (37.75°N ~ 41.00°N, 143.00°E ~ 144.50°E) from February 1973 to May 2013 were selected for analysis. Two specific patterns in earthquake occurrence were recognized through the analysis. One is a spreading of the candlesticks prior to the occurrence of events greater than Mw 6.0. A second pattern shows convergence of the Bollinger Band, which implies a positive or negative change in the trend of earthquakes. Both patterns match general models for the buildup and release of strain through the earthquake cycle and agree with the characteristics of the candlestick chart and Bollinger Band analysis. These results show there is a high correlation between patterns in earthquake occurrence and trend analysis by these two statistical methods. The results of this study support the appropriateness of applying these financial analysis methods to the analysis of earthquake occurrence.
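
    The Bollinger Band construction transplanted to seismicity can be sketched as a rolling mean ± 2 rolling standard deviations on a monthly event-count series; the window length and the synthetic counts are assumptions, not the authors' processing of the Sanriku catalogue.

```python
import numpy as np
import pandas as pd

# Minimal sketch (synthetic monthly counts): flag months whose event count
# pierces the rolling mean +/- 2 standard deviation band, which in the
# financial reading signals a change in trend.

rng = np.random.default_rng(6)
counts = pd.Series(rng.poisson(5.0, 120))       # monthly counts of M>=4 events (toy data)
counts.iloc[100:] += 6                          # imitate an activity increase

window = 12
mean = counts.rolling(window).mean()
std = counts.rolling(window).std()
upper, lower = mean + 2.0 * std, mean - 2.0 * std

breakouts = counts[(counts > upper) | (counts < lower)]
print("months outside the Bollinger Band:", list(breakouts.index))
```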

  7. Rupture distribution of the 1977 western Argentina earthquake

    Science.gov (United States)

    Langer, C.J.; Hartzell, S.

    1996-01-01

    Teleseismic P and SH body waves are used in a finite-fault, waveform inversion for the rupture history of the 23 November 1977 western Argentina earthquake. This double event consists of a smaller foreshock (M0 = 5.3 × 10^26 dyn-cm) followed about 20 s later by a larger main shock (M0 = 1.5 × 10^27 dyn-cm). Our analysis indicates that these two events occurred on different fault segments: with the foreshock having a strike, dip, and average rake of 345°, 45°E, and 50°, and the main shock 10°, 45°E, and 80°, respectively. The foreshock initiated at a depth of 17 km and propagated updip and to the north. The main shock initiated at the southern end of the foreshock zone at a depth of 25 to 30 km, and propagated updip and unilaterally to the south. The north-south separation of the centroids of the moment release for the foreshock and main shock is about 60 km. The apparent triggering of the main shock by the foreshock is similar to other earthquakes that have involved the failure of multiple fault segments, such as the 1992 Landers, California, earthquake. Such occurrences argue against the use of individual, mapped, surface fault or fault-segment lengths in the determination of the size and frequency of future earthquakes.

  8. Teleseismic analysis of the 1990 and 1991 earthquakes near Potenza

    Directory of Open Access Journals (Sweden)

    G. Ekstrom

    1994-06-01

    Full Text Available Analysis of the available teleseismic data for two moderate earthquakes near the town of Potenza in the Southern Apennines shows that both involve strike-slip faulting on a plane oriented approximately east-west. Only the larger, 5 May 1990, earthquake is sufficiently large for analysis by conventional teleseismic waveform inversion methods, and is seen to consist of a foreshock followed 11 seconds later by the main release of moment. The focal mechanism and seismic moment of the 26 May 1991 earthquake are determined by quantitative comparison of its 15-60 s period surface waves with those generated by the 5 May 1990 event. The focal mechanisms for the two events are found to be very similar. The 1991 earthquake has a scalar moment that is approximately 18% that of the 1990 mainshock. Comparison of higher frequency P waves for the two events, recorded at regional distance, shows that the ratio of trace amplitudes is smaller than the ratio of scalar moments, suggesting that the stress drop for the 1991 event is distinctly smaller than for the 1990 mainshock.

  9. Understanding dynamic friction through spontaneously evolving laboratory earthquakes.

    Science.gov (United States)

    Rubino, V; Rosakis, A J; Lapusta, N

    2017-06-29

    Friction plays a key role in how ruptures unzip faults in the Earth's crust and release waves that cause destructive shaking. Yet dynamic friction evolution is one of the biggest uncertainties in earthquake science. Here we report on novel measurements of evolving local friction during spontaneously developing mini-earthquakes in the laboratory, enabled by our ultrahigh-speed full-field imaging technique. The technique captures the evolution of displacements, velocities and stresses of dynamic ruptures, whose rupture speeds range from sub-Rayleigh to supershear. The observed friction has a complex evolution, featuring initial velocity strengthening followed by substantial velocity weakening. Our measurements are consistent with rate-and-state friction formulations supplemented with flash heating but not with widely used slip-weakening friction laws. This study develops a new approach for measuring the local evolution of dynamic friction and has important implications for understanding earthquake hazard, since laws governing the frictional resistance of faults are vital ingredients in physically-based predictive models of the earthquake source.

  10. Complex rupture during the 12 January 2010 Haiti earthquake

    Science.gov (United States)

    Hayes, G.P.; Briggs, R.W.; Sladen, A.; Fielding, E.J.; Prentice, C.; Hudnut, K.; Mann, P.; Taylor, F.W.; Crone, A.J.; Gold, R.; Ito, T.; Simons, M.

    2010-01-01

    Initially, the devastating Mw 7.0, 12 January 2010 Haiti earthquake seemed to involve straightforward accommodation of oblique relative motion between the Caribbean and North American plates along the Enriquillo–Plantain Garden fault zone. Here, we combine seismological observations, geologic field data and space geodetic measurements to show that, instead, the rupture process may have involved slip on multiple faults. Primary surface deformation was driven by rupture on blind thrust faults with only minor, deep, lateral slip along or near the main Enriquillo–Plantain Garden fault zone; thus the event only partially relieved centuries of accumulated left-lateral strain on a small part of the plate-boundary system. Together with the predominance of shallow off-fault thrusting, the lack of surface deformation implies that remaining shallow shear strain will be released in future surface-rupturing earthquakes on the Enriquillo–Plantain Garden fault zone, as occurred in inferred Holocene and probable historic events. We suggest that the geological signature of this earthquake – broad warping and coastal deformation rather than surface rupture along the main fault zone – will not be easily recognized by standard palaeoseismic studies. We conclude that similarly complex earthquakes in tectonic environments that accommodate both translation and convergence – such as the San Andreas fault through the Transverse Ranges of California – may be missing from the prehistoric earthquake record. © 2010 Macmillan Publishers Limited. All rights reserved.

  11. Methane release

    International Nuclear Information System (INIS)

    Seifert, M.

    1999-01-01

    The Swiss Gas Industry has carried out a systematic, technical estimate of methane release from the complete supply chain from production to consumption for the years 1992/1993. The result of this survey provided a conservative value, amounting to 0.9% of the Swiss domestic output. A continuation of the study taking into account new findings with regard to emission factors and the effect of the climate is now available, which provides a value of 0.8% for the target year of 1996. These results show that the renovation of the network has brought about lower losses in the local gas supplies, particularly for the grey cast iron pipelines. (author)

  12. Earthquake Prediction in a Big Data World

    Science.gov (United States)

    Kossobokov, V. G.

    2016-12-01

    The digital revolution that started just about 15 years ago has already surpassed a global information storage capacity of more than 5000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task; it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals have shown sufficient evidence to be used as reliable precursors of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), the claims of a high potential of the method are based on a flawed application of statistics and, therefore, are hardly suitable for communication to decision makers. Self-testing must be done before claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of Earthquake Prediction Strategies, in particular the Error Diagram introduced by G.M. Molchan in the early 1990s and the Seismic Roulette null-hypothesis as a metric of the alerted space, are evident. The set of errors, i.e. the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if reliable but not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
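
    The error-diagram bookkeeping mentioned above reduces to two numbers per forecast: the fraction of space-time covered by alarms (tau) and the fraction of target earthquakes missed (nu); random guessing lies on the diagonal nu = 1 - tau. A minimal sketch with synthetic alarms and events follows.

```python
import numpy as np

# Minimal sketch of a Molchan error-diagram point for an alarm-based forecast.
# Alarm declarations and event locations are synthetic; a skilful forecast
# plots well below the random-guess diagonal nu = 1 - tau.

def molchan_point(alarm_mask, event_bins):
    """alarm_mask: boolean array over space-time bins; event_bins: bin indices of target events."""
    tau = alarm_mask.mean()
    missed = sum(1 for b in event_bins if not alarm_mask[b])
    nu = missed / len(event_bins)
    return tau, nu

rng = np.random.default_rng(7)
alarm_mask = rng.random(1000) < 0.2                        # alarms cover ~20% of bins
event_bins = rng.choice(np.where(alarm_mask)[0], 8)        # a forecast with some skill (toy)
event_bins = np.concatenate([event_bins, rng.integers(0, 1000, 2)])  # plus 2 events anywhere

tau, nu = molchan_point(alarm_mask, event_bins)
print(f"tau = {tau:.2f}, nu = {nu:.2f}, random guessing would give nu ~ {1 - tau:.2f}")
```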

  13. ELER software - a new tool for urban earthquake loss assessment

    Science.gov (United States)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    Rapid loss estimation after potentially damaging earthquakes is critical for effective emergency response and public information. A methodology and software package, ELER-Earthquake Loss Estimation Routine, for rapid estimation of earthquake shaking and losses throughout the Euro-Mediterranean region was developed under the Joint Research Activity-3 (JRA3) of the EC FP6 Project entitled "Network of Research Infrastructures for European Seismology-NERIES". Recently, a new version (v2.0) of ELER software has been released. The multi-level methodology developed is capable of incorporating regional variability and uncertainty originating from ground motion predictions, fault finiteness, site modifications, inventory of physical and social elements subjected to earthquake hazard and the associated vulnerability relationships. Although primarily intended for quasi real-time estimation of earthquake shaking and losses, the routine is also equally capable of incorporating scenario-based earthquake loss assessments. This paper introduces the urban earthquake loss assessment module (Level 2) of the ELER software which makes use of the most detailed inventory databases of physical and social elements at risk in combination with the analytical vulnerability relationships and building damage-related casualty vulnerability models for the estimation of building damage and casualty distributions, respectively. Spectral capacity-based loss assessment methodology and its vital components are presented. The analysis methods of the Level 2 module, i.e. Capacity Spectrum Method (ATC-40, 1996), Modified Acceleration-Displacement Response Spectrum Method (FEMA 440, 2005), Reduction Factor Method (Fajfar, 2000) and Coefficient Method (ASCE 41-06, 2006), are applied to the selected building types for validation and verification purposes. The damage estimates are compared to the results obtained from the other studies available in the literature, i.e. SELENA v4.0 (Molina et al., 2008) and

  14. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb, fractures in walls, floors, basal platforms and tableros, toppling of columns, and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 %g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period centered in the vicinity of the Chixoy-Polochic and Motagua fault zones could have produced the contemporaneous earthquake damage to the above sites. As a consequence, this earthquake may have accelerated the

  15. Correlation between Earthquakes and AE Monitoring of Historical Buildings in Seismic Areas

    Directory of Open Access Journals (Sweden)

    Giuseppe Lacidogna

    2015-12-01

    Full Text Available In this contribution a new method for evaluating seismic risk in regional areas based on the acoustic emission (AE) technique is proposed. Most earthquakes have precursors, i.e., phenomena of changes in the Earth’s physical-chemical properties that take place prior to an earthquake. Acoustic emissions in materials and earthquakes in the Earth’s crust, despite the fact that they take place on very different scales, are very similar phenomena; both are caused by a release of elastic energy from a source located in a medium. For the AE monitoring, two important constructions of the Italian cultural heritage are considered: the chapel of the “Sacred Mountain of Varallo” and the “Asinelli Tower” of Bologna. They were monitored during earthquake sequences in their respective areas. By using the Grassberger-Procaccia algorithm, a statistical method of analysis was developed that detects AEs as earthquake precursors or aftershocks. Under certain conditions it was observed that AEs precede earthquakes. These considerations reinforce the idea that AE monitoring can be considered an effective tool for earthquake risk evaluation.
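
    The Grassberger-Procaccia analysis mentioned above is built on the correlation integral C(r), the fraction of point pairs separated by less than r, whose power-law scaling with r gives the correlation dimension. The sketch below shows only that core computation on synthetic points; how the AE and earthquake series are embedded as point sets, and the space-time correlation analysis of the paper, are not reproduced here.

```python
import numpy as np

def correlation_integral(points, radii):
    """Grassberger-Procaccia correlation integral C(r) for a set of points."""
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]          # pairwise differences
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(pts), k=1)               # each pair counted once
    pair_d = dist[iu]
    return np.array([(pair_d < r).sum() / pair_d.size for r in radii])

# correlation dimension ~ slope of log C(r) vs log r over the scaling range
rng = np.random.default_rng(0)
pts = rng.random((500, 2))                            # synthetic 2-D point set
radii = np.logspace(-2, -0.5, 10)
C = correlation_integral(pts, radii)
D, _ = np.polyfit(np.log(radii), np.log(C), 1)
print(f"estimated correlation dimension: {D:.2f}")    # close to 2 for uniform 2-D data
```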

  16. Injection-induced moment release can also be aseismic

    Science.gov (United States)

    McGarr, Arthur; Barbour, Andrew J.

    2018-01-01

    The cumulative seismic moment is a robust measure of the earthquake response to fluid injection for injection volumes ranging from 3100 to about 12 million m3. Over this range, the moment release is limited to twice the product of the shear modulus and the volume of injected fluid. This relation also applies at the much smaller injection volumes of the field experiment in France reported by Guglielmi, et al. (2015) and laboratory experiments to simulate hydraulic fracturing described by Goodfellow, et al. (2015). In both of these studies, the relevant moment release for comparison with the fluid injection was aseismic and consistent with the scaling that applies to the much larger volumes associated with injection-induced earthquakes with magnitudes extending up to 5.8. Neither the micro-earthquakes, at the site in France, nor the acoustic emission in the laboratory samples contributed significantly to the deformation due to fluid injection.
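
    The bound stated above, cumulative moment no greater than twice the shear modulus times the injected volume, converts directly into a maximum magnitude for a given injection. The short sketch below applies it to the two volumes quoted in the abstract; the shear modulus is an assumed typical crustal value, not a number from the paper.

```python
import math

def max_seismic_moment(delta_volume_m3, shear_modulus_pa=3.0e10):
    """Upper bound stated in the abstract: M0 <= 2 * G * dV (moment in N*m)."""
    return 2.0 * shear_modulus_pa * delta_volume_m3

def moment_magnitude(m0_newton_meters):
    """Standard moment magnitude from seismic moment in N*m."""
    return (math.log10(m0_newton_meters) - 9.1) / 1.5

for dv in (3.1e3, 1.2e7):          # injected volumes quoted in the abstract (m^3)
    m0 = max_seismic_moment(dv)
    print(f"dV = {dv:.1e} m^3 -> M0 <= {m0:.2e} N*m (Mw <= {moment_magnitude(m0):.1f})")
```

    With these assumed inputs, the largest quoted volume gives an upper bound near Mw 5.8, consistent with the maximum induced magnitude mentioned in the abstract.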

  17. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  18. New streams and springs after the 2014 Mw6.0 South Napa earthquake.

    Science.gov (United States)

    Wang, Chi-Yuen; Manga, Michael

    2015-07-09

    Many streams and springs, which were dry or nearly dry before the 2014 Mw6.0 South Napa earthquake, started to flow after the earthquake. A United States Geological Survey stream gauge also registered a coseismic increase in discharge. Public interest was heightened by a state of extreme drought in California. Since the new flows were not contaminated by pre-existing surface water, their composition allowed unambiguous identification of their origin. Following the earthquake we repeatedly surveyed the new flows, collecting data to test hypotheses about their origin. We show that the new flows originated from groundwater in nearby mountains released by the earthquake. The estimated total amount of new water is ∼10^6 m^3, about 1/40 of the annual water use in the Napa-Sonoma area. Our model also makes a testable prediction of a post-seismic decrease of seismic velocity in the shallow crust of the affected region.

  19. Geodetic constraints on afterslip characteristics following the March 9, 2011, Sanriku-oki earthquake, Japan

    Science.gov (United States)

    Ohta, Yusaku; Hino, Ryota; Inazu, Daisuke; Ohzono, Mako; Ito, Yoshihiro; Mishina, Masaaki; Iinuma, Takeshi; Nakajima, Junichi; Osada, Yukihito; Suzuki, Kensuke; Fujimoto, Hiromi; Tachibana, Kenji; Demachi, Tomotsugu; Miura, Satoshi

    2012-08-01

    A magnitude 7.3 foreshock occurred at the subducting Pacific plate interface on March 9, 2011, 51 h before the magnitude 9.0 Tohoku earthquake off the Pacific coast of Japan. We propose a coseismic and postseismic afterslip model of the magnitude 7.3 event based on a global positioning system network and ocean bottom pressure gauge sites. The estimated coseismic slip and afterslip areas show complementary spatial distributions; the afterslip distribution is located up-dip of the coseismic slip of the foreshock and northward of the hypocenter of the Tohoku earthquake. The slip amount for the afterslip is roughly consistent with that determined by repeating earthquake analysis carried out in a previous study. The estimated moment release for the afterslip reached magnitude 6.8, even within the short time period of 51 h. A volumetric strainmeter time series also suggests that this event advanced with a rapid decay time constant compared with other typical large earthquakes.

  20. Probabilistic risk assessment of earthquakes at the Rocky Flats Plant and subsequent upgrade to reduce risk

    International Nuclear Information System (INIS)

    Day, S.A.

    1989-01-01

    An analysis to determine the risk associated with earthquakes at the Rocky Flats Plant was performed. Seismic analyses and structural evaluations were used to postulate building and equipment damage and radiological releases to the environment from various magnitudes of earthquakes. Dispersion modeling and dose assessment to the public were then calculated. The frequency of occurrence of various magnitudes of earthquakes was determined from the Department of Energy Natural Phenomena Hazards Modeling Project. Risk to the public was probabilistically assessed for each magnitude of earthquake and for overall seismic risk. Based on the results of this Probabilistic Risk Assessment and a cost/benefit analysis, seismic upgrades are being implemented for several plutonium-handling facilities for the purpose of risk reduction.

  1. Rapid acceleration leads to rapid weakening in earthquake-like laboratory experiments

    Science.gov (United States)

    Chang, Jefferson C.; Lockner, David A.; Reches, Z.

    2012-01-01

    After nucleation, a large earthquake propagates as an expanding rupture front along a fault. This front activates countless fault patches that slip by consuming energy stored in Earth’s crust. We simulated the slip of a fault patch by rapidly loading an experimental fault with energy stored in a spinning flywheel. The spontaneous evolution of strength, acceleration, and velocity indicates that our experiments are proxies of fault-patch behavior during earthquakes of moment magnitude (Mw) = 4 to 8. We show that seismically determined earthquake parameters (e.g., displacement, velocity, magnitude, or fracture energy) can be used to estimate the intensity of the energy release during an earthquake. Our experiments further indicate that high acceleration imposed by the earthquake’s rupture front quickens dynamic weakening by intense wear of the fault zone.

  2. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is apparently changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose efforts at earthquake prediction.
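
    The invariance test described above can be reproduced in outline: compute the distribution of first-return times T for events above a magnitude threshold, randomly rearrange the catalog, and compare the two distributions. The sketch below is one minimal way to do this on a synthetic catalog, using a two-sample Kolmogorov-Smirnov comparison; the Letter's actual binning of P_M(T) and its catalog are not reproduced.

```python
import numpy as np
from scipy.stats import ks_2samp

def return_times(times, mags, m_thresh):
    """Inter-event (first-return) times T for events with magnitude >= m_thresh."""
    t = np.sort(np.asarray(times)[np.asarray(mags) >= m_thresh])
    return np.diff(t)

def shuffle_test(times, mags, m_thresh, seed=0):
    """Compare P_M(T) of the catalog with that of a randomly rearranged catalog."""
    rng = np.random.default_rng(seed)
    observed = return_times(times, mags, m_thresh)
    rearranged = return_times(times, rng.permutation(mags), m_thresh)
    return ks_2samp(observed, rearranged)      # small p-value -> distributions differ

# synthetic demo: clustered occurrence times with magnitudes attached at random,
# so here the rearranged and observed distributions should agree
rng = np.random.default_rng(1)
times = np.cumsum(rng.exponential(scale=rng.choice([0.1, 5.0], size=5000)))
mags = rng.uniform(2.0, 6.0, size=5000)
print(shuffle_test(times, mags, m_thresh=4.0))
```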

  3. Precursory earthquakes of the 1943 eruption of Paricutin volcano, Michoacan, Mexico

    Science.gov (United States)

    Yokoyama, I.; de la Cruz-Reyna, S.

    1990-12-01

    Paricutin volcano is a monogenetic volcano whose birth and growth were observed by modern volcanological techniques. At the time of its birth in 1943, the seismic activity in central Mexico was mainly recorded by the Wiechert seismographs at the Tacubaya seismic station in Mexico City, about 320 km east of the volcano area. In this paper we aim to find any characteristics of precursory earthquakes of the monogenetic eruption, although there are limits to the available information, such as imprecise hypocenter locations and a lack of data for earthquakes with magnitudes under 3.0. The available data show that the first precursory earthquake occurred on January 7, 1943, with a magnitude of 4.4. Subsequently, 21 earthquakes ranging from 3.2 to 4.5 in magnitude occurred before the outbreak of the eruption on February 20. The (S - P) durations of the precursory earthquakes do not show any systematic changes within the observational errors. The hypocenters were rather shallow and did not migrate. The precursory earthquakes had a characteristic tectonic signature, which was retained through the whole period of activity. However, the spectra of the P-waves of the Paricutin earthquakes show minor differences from those of tectonic earthquakes. This fact helped in the identification of Paricutin earthquakes. Except for the first shock, the maximum earthquake magnitudes show an increasing tendency with time towards the outbreak. The total seismic energy released by the precursory earthquakes amounted to 2 × 10^19 ergs. Considering that, statistically, there is a threshold of cumulative seismic energy release (10^17-10^18 ergs) by precursory earthquakes in polygenetic volcanoes erupting after long quiescence, the above cumulative energy is exceptionally large. This suggests that a monogenetic volcano may need much more energy to clear the way of magma passage to the earth surface than a polygenetic one. The magma ascent before the outbreak of Paricutin volcano is interpretable by a model
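
    A cumulative energy of the order quoted above can be reproduced by summing event energies obtained from the standard Gutenberg-Richter energy-magnitude relation, log10 E[ergs] = 11.8 + 1.5 M (an assumption here; the paper's exact relation is not restated in the abstract). The magnitude list below is a hypothetical stand-in for the 22 precursory shocks.

```python
import numpy as np

def seismic_energy_ergs(magnitude):
    """Gutenberg-Richter energy-magnitude relation: log10 E[ergs] = 11.8 + 1.5 M."""
    return 10.0 ** (11.8 + 1.5 * np.asarray(magnitude))

# hypothetical magnitudes standing in for the first M 4.4 event plus 21 shocks of M 3.2-4.5
mags = np.concatenate(([4.4], np.linspace(3.2, 4.5, 21)))
total = seismic_energy_ergs(mags).sum()
print(f"cumulative precursory energy ~ {total:.1e} ergs")   # order 10^19 ergs
```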

  4. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake and its huge tsunami devastated East Japan on 11 March 2011, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither short-term nor long-term. Seismologists were shocked because such an event was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake. In fact, throughout several decades of the National Earthquake Prediction Project, not even a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important issue, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best way to pursue short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they defiantly changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which meant obtaining "more funding for no prediction research". The public were not, and are not, informed about this change. Obviously earthquake prediction would be possible only when reliable precursory phenomena are caught, and we have insisted that this would most likely be achieved through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect for us, although its epicenter was far offshore, out of the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, from ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at χ. In the case that Seismic Electric Signals
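
    The order parameter κ1 defined above can be computed directly from an ordered list of event magnitudes: χ_k = k/N is the natural time of the k-th event and p_k its normalised energy release. The sketch below assumes the usual proportionality E ∝ 10^(1.5 M) for the relative energies; the magnitude sequence is hypothetical.

```python
import numpy as np

def natural_time_kappa1(magnitudes):
    """kappa_1: variance of natural time chi_k = k/N weighted by the
    normalised energy release p_k of each event in order of occurrence."""
    m = np.asarray(magnitudes, dtype=float)
    energy = 10.0 ** (1.5 * m)                 # relative seismic energy of each event
    p = energy / energy.sum()                  # normalised energy release
    chi = np.arange(1, len(m) + 1) / len(m)    # natural time of each event
    return np.sum(p * chi**2) - np.sum(p * chi) ** 2

# demo with a hypothetical magnitude sequence in order of occurrence
print(natural_time_kappa1([3.1, 3.4, 2.9, 4.0, 3.2, 3.8, 3.0, 4.4]))
```

    In the natural time literature, κ1 values approaching about 0.070 are reported as a signature of the approach to criticality.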

  5. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are in the process of being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote public education about earthquakes. Producing the monographs, developed in ARC INFO and working in UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: (1) its historical-cultural context (the first destructive seismic event after the unification of Italy); (2) its features (a volcanic earthquake); and (3) the socioeconomic consequences caused at such an important seaside resort.

  6. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.
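
    A peaks-over-threshold fit is a common way to examine a Pareto-type tail of the kind described above: a positive shape parameter of the generalized Pareto distribution corresponds to the Fréchet (heavy-tailed) domain of attraction. The sketch below uses synthetic data and illustrates only the fitting machinery, not the paper's compound Poisson derivation.

```python
import numpy as np
from scipy.stats import genpareto

# synthetic heavy-tailed "energies" standing in for sample exceedances of large events
rng = np.random.default_rng(0)
energies = rng.pareto(a=1.2, size=400) + 1.0

threshold = np.quantile(energies, 0.90)             # keep the largest 10% of the sample
exceedances = energies[energies > threshold] - threshold

# fit a generalized Pareto distribution to the exceedances over the threshold;
# shape > 0 indicates a Pareto-type (Frechet-domain) tail
shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
print(f"GPD shape = {shape:.2f}, scale = {scale:.2f}")
```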

  7. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than characteristics of the experimental apparatus. The large size of the experimental apparatus, high fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separates this study from traditional acoustic emission analyses and allows these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.

  8. Awareness and understanding of earthquake hazards at school

    Science.gov (United States)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that a poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities we performed during the last years are presented here. We describe our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to children, and we also illustrate some teaching interventions for high school students. During the past years we lectured classes, led laboratory and field activities, and organized summer internships for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab with inexpensive tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  9. [Medical rescue of China National Earthquake Disaster Emergency Search and Rescue Team in Lushan earthquake].

    Science.gov (United States)

    Liu, Ya-hua; Yang, Hui-ning; Liu, Hui-liang; Wang, Fan; Hu, Li-bin; Zheng, Jing-chen

    2013-05-01

    To summarize and analyze the medical mission of the China National Earthquake Disaster Emergency Search and Rescue Team (CNESAR) in the Lushan earthquake, and to promote the effectiveness of medical rescue incorporated with search and rescue. Retrospective analysis of the medical work data of CNESAR from April 21 to April 27, 2013, during the Lushan earthquake rescue, including medical staff dispatch and the wounded cases treated. The medical corps was composed of 22 members, including 2 administrators, 11 doctors [covering emergency medicine, orthopedics (joints and limbs, spinal), obstetrics and gynecology, gastroenterology, cardiology, ophthalmology, anesthesiology, medical rescue, health epidemic prevention, and clinical laboratory, 11 specialties in all], 1 ultrasound technician, 5 nurses, 1 pharmacist, 1 medical instrument engineer and 1 office worker for publicity. Two members held psychological consultant qualifications. The medical work was carried out in seven aspects: medical care assurance for the CNESAR members, first-aid cooperation with search and rescue on site, clinical work in the refugees' camp, medical rounds for scattered villagers, evacuation of the wounded, mental intervention, and sanitary and anti-epidemic work. The medical work covered 24 small towns, and the medical staff established 3 medical clinics at Taiping Town and Shuangshi Town of Lushan County and at Baoxing County. Medical rescue, mental intervention for the old and children, and sanitary and anti-epidemic work were performed at the above sites. The medical corps successfully evacuated 2 severely wounded patients and treated over a thousand wounded. Most of the injuries were soft tissue injuries, external injuries, respiratory tract infections, diarrhea, and heat stroke. Compared with the rescue action in the 2008 Wenchuan earthquake, the aggregation and departure of the rescue team in the Lushan earthquake, the traffic control order in the disaster area, the self-aid and buddy aid

  10. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  11. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    Science.gov (United States)

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
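
    Mode frequencies like those quoted above come from picking peaks of Fourier amplitude (or power) spectra of the recorded motions. A minimal sketch of that step is shown below on a synthetic two-mode record; the sampling rate, segment length, and peak-picking details are assumptions, not those of the Factor building network.

```python
import numpy as np
from scipy.signal import welch, find_peaks

def modal_frequencies(acc, fs, n_modes=2, fmin=0.1):
    """Estimate the strongest mode frequencies from an acceleration record."""
    freqs, psd = welch(acc, fs=fs, nperseg=int(200 * fs))   # ~200 s averaging segments
    psd[freqs < fmin] = 0.0                                  # suppress the very-low-frequency end
    peaks, props = find_peaks(psd, prominence=0.0)
    best = peaks[np.argsort(props["prominences"])[::-1][:n_modes]]
    return np.sort(freqs[best])

# synthetic roof record sampled at 100 Hz with modes near 0.57 Hz and 1.8 Hz plus noise
fs = 100.0
t = np.arange(0, 600, 1 / fs)
rng = np.random.default_rng(0)
acc = np.sin(2 * np.pi * 0.57 * t) + 0.4 * np.sin(2 * np.pi * 1.8 * t) \
      + 0.2 * rng.standard_normal(t.size)
print(modal_frequencies(acc, fs))        # approximately [0.57, 1.8] Hz
```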

  12. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    Science.gov (United States)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation in the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements in the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against a recorded evacuation process from a school classroom in the real-life 2013 Ya'an earthquake in China. The occupant casualties in the building under earthquakes are evaluated by coupling the simulation of the building collapse process by the finite element method, the occupant evacuation simulation, and the casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.
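
    A cellular automaton treats the floor plan as a grid of cells that are either empty or occupied, with occupants advancing toward an exit at each time step. The refined model of the paper (multiple exits, behavioral rules, coupling to the collapse simulation) is far richer; the sketch below shows only the minimal 1-D idea, with a hypothetical corridor occupancy.

```python
def evacuate(corridor, exit_index=0):
    """Minimal 1-D cellular-automaton evacuation: 1 = occupant, 0 = empty cell.
    Each step, the occupant at the exit leaves and the others shuffle one cell
    toward the exit where the neighboring cell is free. Returns steps to empty."""
    cells = list(corridor)
    steps = 0
    while any(cells):
        if cells[exit_index] == 1:                 # occupant adjacent to the exit leaves
            cells[exit_index] = 0
        for i in range(exit_index + 1, len(cells)):  # sequential update toward the exit
            if cells[i] == 1 and cells[i - 1] == 0:
                cells[i - 1], cells[i] = 1, 0
        steps += 1
    return steps

print(evacuate([0, 1, 1, 0, 1, 1, 1]))   # number of steps needed to clear the corridor
```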

  13. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  14. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  15. Earthquake recurrence and magnitude and seismic deformation of the northwestern Okhotsk plate, northeast Russia

    Science.gov (United States)

    Hindle, D.; Mackey, K.

    2011-02-01

    Recorded seismicity from the northwestern Okhotsk plate, northeast Asia, is currently insufficient to account for the predicted slip rates along its boundaries due to plate tectonics. However, the magnitude-frequency relationship for earthquakes from the region suggests that larger earthquakes are possible in the future and that events of ˜Mw 7.5 which should occur every ˜100-350 years would account for almost all the slip of the plate along its boundaries due to Eurasia-North America convergence. We use models for seismic slip distribution along the bounding faults of Okhotsk to conclude that relatively little aseismic strain release is occurring and that larger future earthquakes are likely in the region. Our models broadly support the idea of a single Okhotsk plate, with the large majority of tectonic strain released along its boundaries.
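
    The budget argument above can be made concrete by converting magnitude to seismic moment and moment to average slip over a fault area (M0 = μ A u). The numbers below (shear modulus, rupture length and width) are illustrative assumptions, not the paper's fault model; they only show how an ~Mw 7.5 event every 100-350 years translates into millimetres per year of boundary slip.

```python
def moment_from_mw(mw):
    """Seismic moment in N*m from moment magnitude."""
    return 10.0 ** (1.5 * mw + 9.1)

def slip_per_event_mm(mw, fault_length_km, fault_width_km, mu_pa=3.0e10):
    """Average slip (mm) of one earthquake over the given rupture area."""
    area_m2 = fault_length_km * 1e3 * fault_width_km * 1e3
    return moment_from_mw(mw) / (mu_pa * area_m2) * 1e3

# hypothetical plate-boundary segment, chosen only to illustrate the arithmetic
slip_mm = slip_per_event_mm(mw=7.5, fault_length_km=200.0, fault_width_km=20.0)
for recurrence_yr in (100.0, 350.0):
    rate = slip_mm / recurrence_yr
    print(f"Mw 7.5 every {recurrence_yr:.0f} yr -> ~{rate:.1f} mm/yr of boundary slip")
```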

  16. Geodetic characteristic of the postseismic deformation following the interplate large earthquake along the Japan Trench (Invited)

    Science.gov (United States)

    Ohta, Y.; Hino, R.; Ariyoshi, K.; Matsuzawa, T.; Mishina, M.; Sato, T.; Inazu, D.; Ito, Y.; Tachibana, K.; Demachi, T.; Miura, S.

    2013-12-01

    On March 9, 2011 at 2:45 (UTC), an M7.3 interplate earthquake (hereafter the foreshock) occurred ~45 km northeast of the epicenter of the M9.0 2011 Tohoku earthquake. This foreshock preceded the 2011 Tohoku earthquake by 51 hours. Ohta et al. (2012, GRL) estimated the co- and postseismic afterslip distribution based on a dense GPS network and ocean bottom pressure gauge sites. They found that the afterslip distribution was mainly concentrated in the up-dip extension of the coseismic slip. The coseismic slip and afterslip distributions of the foreshock were also located in the slip deficit region (between 20-40 m slip) of the coseismic slip of the M9.0 mainshock. The slip amount for the afterslip is roughly consistent with that determined by the repeating earthquake analysis carried out in a previous study (Kato et al., 2012, Science). The estimated moment release for the afterslip reached magnitude 6.8, even within a short time period of 51 hours. They also pointed out that a volumetric strainmeter time series suggests that this event advanced with a rapid decay time constant (4.8 h) compared with other typical large earthquakes. The decay time constant of the afterslip may reflect the frictional properties of the plate interface, especially the effective normal stress controlled by fluid. To verify the short decay time constant of the foreshock, we investigated the postseismic deformation characteristics following the 1989 and 1992 Sanriku-Oki earthquakes (M7.1 and M6.9), the 2003 and 2005 Miyagi-Oki earthquakes (M6.8 and M7.2), and the 2008 Fukushima-Oki earthquake (M6.9). We used a four-component extensometer at Miyako (39.59N, 141.98E) on the Sanriku coast for the 1989 and 1992 events. For the 2003, 2005 and 2008 events, we used volumetric strainmeters at Kinka-zan (38.27N, 141.58E) and Enoshima (38.27N, 141.60E). To extract the characteristics of the postseismic deformation, we fitted a logarithmic function. The estimated decay time constants for each earthquake had almost similar range (1
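
    Decay time constants like those quoted above come from fitting a logarithmic relaxation curve to the postseismic strain records. A minimal version of such a fit, on synthetic data with a 4.8-hour constant, is sketched below; the functional form u(t) = a + b ln(1 + t/τ) is a common choice for afterslip and is assumed here rather than taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def afterslip_log(t_hours, a, b, tau):
    """Logarithmic postseismic relaxation: u(t) = a + b * ln(1 + t / tau)."""
    return a + b * np.log(1.0 + t_hours / tau)

# synthetic strain time series with a 4.8-hour decay constant plus noise
rng = np.random.default_rng(0)
t = np.linspace(0.1, 51.0, 300)                        # hours after the foreshock
obs = afterslip_log(t, 0.0, 1.0, 4.8) + 0.02 * rng.standard_normal(t.size)

popt, pcov = curve_fit(afterslip_log, t, obs, p0=(0.0, 1.0, 1.0))
print(f"estimated decay constant tau ~ {popt[2]:.1f} h")   # close to 4.8 h
```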

  17. Urgent Safety Measures in Japan after Great East Japan Earthquake

    International Nuclear Information System (INIS)

    Taniura, Wataru; Otani, Hiroyasu

    2012-01-01

    Due to the tsunami triggered by the Great East Japan Earthquake, the operating and refueling reactor facilities at the Fukushima Dai-ichi and Dai-ni Nuclear Power Plants gave rise to a nuclear hazard. Given this, Japanese electric power companies voluntarily began to compile various urgent measures against tsunami. The Nuclear and Industrial Safety Agency (NISA) then ordered the licensees to put the voluntarily compiled urgent safety measures into practice, in order to ensure the effectiveness of the means for recovering cooling functions and to keep any release of radioactive substances to the possible minimum, even if a huge tsunami following a severe earthquake hits a nuclear power plant. The following describes the state and the effect of the urgent safety measures implemented for 44 reactors under operation and 1 reactor under construction in Japan, and also describes the measures to be implemented by the licensees of reactor operation in the future.

  18. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  19. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  20. A global building inventory for earthquake loss estimation and risk management

    Science.gov (United States)

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.

  1. Earthquake disaster simulation of civil infrastructures from tall buildings to urban areas

    CERN Document Server

    Lu, Xinzheng

    2017-01-01

    Based on more than 12 years of systematic investigation of earthquake disaster simulation of civil infrastructures, this book covers the major research outcomes, including a number of novel computational models, high performance computing methods and realistic visualization techniques for tall buildings and urban areas, with particular emphasis on collapse prevention and mitigation in extreme earthquakes, earthquake loss evaluation and seismic resilience. Typical engineering applications to several of the tallest buildings in the world (e.g., the 632 m tall Shanghai Tower and the 528 m tall Z15 Tower) and selected large cities in China (the Beijing Central Business District, Xi'an City, Taiyuan City and Tangshan City) are also introduced to demonstrate the advantages of the proposed computational models and techniques. The high-fidelity computational model developed in this book has proven to be the only feasible option to date for earthquake-induced collapse simulation of supertall buildings that are higher than 50...

  2. Earthquake focal mechanism forecasting in Italy for PSHA purposes

    Science.gov (United States)

    Roselli, Pamela; Marzocchi, Warner; Mariucci, Maria Teresa; Montone, Paola

    2018-01-01

    In this paper, we put forward a procedure that aims to forecast the focal mechanisms of future earthquakes. One of the primary uses of such forecasts is in probabilistic seismic hazard analysis (PSHA); in fact, aiming at reducing the epistemic uncertainty, most of the newer ground motion prediction equations consider, besides the seismicity rates, the forecast focal mechanism of the next large earthquakes as input data. The data set used for this purpose consists of focal mechanisms taken from the latest stress map release for Italy, containing 392 well-constrained solutions of events from 1908 to 2015 with Mw ≥ 4 and depths from 0 down to 40 km. The data set includes polarity focal mechanism solutions up to 1975 (23 events), whereas for 1976-2015 it takes into account only the Centroid Moment Tensor (CMT)-like earthquake focal solutions, for data homogeneity. The forecasting model is rooted in the Total Weighted Moment Tensor concept, which weights the information of past focal mechanisms evenly distributed in space according to their distance from the spatial cells and their magnitude. Specifically, for each cell of a regular 0.1° × 0.1° spatial grid, the model estimates the probability of observing a normal, reverse, or strike-slip fault plane solution for the next large earthquakes, the expected moment tensor and the related maximum horizontal stress orientation. These results will be available for the new PSHA model for Italy under development. Finally, to evaluate the reliability of the forecasts, we test them against an independent data set that consists of some of the strongest earthquakes with Mw ≥ 3.9 that occurred during 2016 in different Italian tectonic provinces.

  3. How complete is the ISC-GEM Global Earthquake Catalog?

    Science.gov (United States)

    Michael, Andrew J.

    2014-01-01

    The International Seismological Centre, in collaboration with the Global Earthquake Model effort, has released a new global earthquake catalog covering the time period from 1900 through the end of 2009. In order to use this catalog for global earthquake studies, I determined the magnitude of completeness (Mc) as a function of time by dividing the earthquakes shallower than 60 km into 7 time periods, based on major changes in catalog processing and data availability, and applying 4 objective methods to determine Mc, with uncertainties determined by non-parametric bootstrapping. Deeper events were divided into 2 time periods. Due to differences between the 4 methods, the final Mc was determined subjectively by examining the features that each method focused on in both the cumulative and binned magnitude-frequency distributions. The time periods and Mc values for shallow events are: 1900-1917, Mc=7.7; 1918-1939, Mc=7.0; 1940-1954, Mc=6.8; 1955-1963, Mc=6.5; 1964-1975, Mc=6.0; 1976-2003, Mc=5.8; and 2004-2009, Mc=5.7. Using these Mc values for the longest time periods over which they are valid (e.g. 1918-2009, 1940-2009, …), the shallow data fit a Gutenberg-Richter distribution with b=1.05 and a=8.3, within 1 standard deviation, with no declustering. The exception is for time periods that include 1900-1917, in which there are only 33 events with M ≥ Mc, and for those few data b=2.15±0.46. That result calls for further investigation of this time period, ideally with a larger number of earthquakes. For deep events, the results are Mc=7.1 for 1900-1963, although the early data are problematic, and Mc=5.7 for 1964-2009. For that latter time period, b=0.99 and a=7.3.
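
    Once Mc is fixed for a time period, a b-value like those quoted above can be obtained with the standard Aki/Utsu maximum-likelihood estimator (with a half-bin correction when magnitudes are binned). The sketch below applies it to a synthetic catalog; it is not the paper's exact procedure, which combined four Mc methods and bootstrapped the uncertainties.

```python
import numpy as np

def gutenberg_richter_fit(mags, mc, dm=0.1):
    """Maximum-likelihood b-value (Aki/Utsu, half-bin correction for bin width dm)
    and a matching a-value for events with M >= mc."""
    m = np.asarray(mags, dtype=float)
    m = m[m >= mc]
    b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
    a = np.log10(m.size) + b * mc              # so that log10 N(>= mc) = a - b * mc
    return a, b, m.size

# synthetic, unbinned catalogue drawn from a GR law with b = 1.05 above Mc = 5.7
rng = np.random.default_rng(0)
mags = 5.7 + rng.exponential(scale=np.log10(np.e) / 1.05, size=5000)
a, b, n = gutenberg_richter_fit(mags, mc=5.7, dm=0.0)   # dm=0: magnitudes not binned here
print(f"n = {n}, b = {b:.2f}, a = {a:.2f}")              # b close to 1.05
```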

  4. The Need for More Earthquake Science in Southeast Asia

    Science.gov (United States)

    Sieh, K.

    2015-12-01

    Many regions within SE Asia have as great a density of active seismic structures as does the western US - Sumatra, Myanmar, Bangladesh, New Guinea and the Philippines come first to mind. Much of Earth's release of seismic energy in the current millennium has, in fact, come from these regions, with great losses of life and livelihoods. Unfortunately, the scientific progress upon which seismic-risk reduction in SE Asia ultimately depends has been and continues to be slow. Last year at AGU, for example, I counted 57 talks about the M6 Napa earthquake. In contrast, I can't recall hearing any talk on a SE Asian M6 earthquake at any venue in the past many years. In fact, even M7+ earthquakes often go unstudied. Not uncommonly, the region's earthquake scientists face high financial and political impediments to conducting earthquake research. Their slow speed in the development of scientific knowledge doesn't bode well for speedy progress in the science of seismic hazards, the sine qua non for substantially reducing seismic risk. There are two basic necessities for the region to evolve significantly from the current state of affairs. Both involve the development of regional infrastructure: 1) Data: Robust and accessible geophysical monitoring systems would need to be installed, maintained and utilized by the region's earth scientists and their results shared internationally. Concomitantly, geological mapping (sensu lato) would need to be undertaken. 2) People: The training, employment, and enduring support of a new, young, international corps of earth scientists would need to accelerate markedly. The United States could play an important role in achieving the goal of significant seismic risk reduction in the most seismically active countries of SE Asia by taking the lead in establishing a coalition to robustly fund a multi-decadal program that supports scientists and their research institutions to work alongside local expertise.

  5. Coseismic and postseismic deformation associated with the 2016 Mw 7.8 Kaikoura earthquake, New Zealand: fault movement investigation and seismic hazard analysis

    Science.gov (United States)

    Jiang, Zhongshan; Huang, Dingfa; Yuan, Linguo; Hassan, Abubakr; Zhang, Lupeng; Yang, Zhongrong

    2018-04-01

    The 2016 moment magnitude (Mw) 7.8 Kaikoura earthquake demonstrated that multiple fault segments can undergo rupture during a single seismic event. Here, we employ Global Positioning System (GPS) observations and geodetic modeling methods to create detailed images of coseismic slip and postseismic afterslip associated with the Kaikoura earthquake. Our optimal geodetic coseismic model suggests that rupture not only occurred on shallow crustal faults but also, to some extent, at the Hikurangi subduction interface. The GPS-inverted moment release during the earthquake is equivalent to a Mw 7.9 event. The near-field postseismic deformation is mainly derived from right-lateral strike-slip motions on shallow crustal faults. The afterslip not only extended significantly northeastward on the Needles fault but also appeared at the plate interface, slowly releasing energy over the past 6 months, equivalent to a Mw 7.3 earthquake. Coulomb stress changes induced by the coseismic deformation exhibit complex patterns and diversity at different depths, undoubtedly reflecting the multi-fault rupture complexity of this earthquake. The Coulomb stress change can reach several MPa during coseismic deformation, which can explain the trigger mechanisms of the afterslip in the two high-slip regions and of the majority of aftershocks. Based on the deformation characteristics of the Kaikoura earthquake, interseismic plate coupling, and historical earthquakes, we conclude that Wellington is under a higher seismic threat after the earthquake, and great attention should be paid to potential large earthquake disasters in the near future.
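
    The Coulomb stress changes mentioned above follow the usual Coulomb failure criterion: a receiver fault is brought closer to failure when the combination of shear-stress change and unclamping is positive, ΔCFS = Δτ + μ′Δσn (Δσn positive in unclamping). The sketch below shows only that bookkeeping; the effective friction coefficient and stress values are illustrative assumptions, and resolving the full stress tensor onto receiver fault geometries is not shown.

```python
def coulomb_stress_change(d_shear_mpa, d_normal_mpa, mu_eff=0.4):
    """Coulomb failure stress change: dCFS = d_tau + mu' * d_sigma_n,
    with d_tau the shear-stress change resolved in the slip direction and
    d_sigma_n the normal-stress change (positive = unclamping)."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# hypothetical stress changes resolved on a receiver fault (values illustrative only);
# a positive result (here +1.0 MPa) promotes failure, a negative one inhibits it
print(coulomb_stress_change(d_shear_mpa=0.8, d_normal_mpa=0.5))
```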

  6. Earthquake: Game-based learning for 21st century STEM education

    Science.gov (United States)

    Perkins, Abigail Christine

    To play is to learn. A lack of empirical research within game-based learning literature, however, has hindered educational stakeholders to make informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game-design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications, implementing results to improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n = 6), who played twice. To seek evidence documenting support for my knowledge claim, I analyzed videotapes of students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. Players in both student-groups improved mostly in critical thinking, having

  7. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    Full Text Available A lot of people die because of earthquakes every year. Therefore it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are examined in order to guide new research aimed at finding novel prediction methods.

  8. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Energy Technology Data Exchange (ETDEWEB)

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  9. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    Full Text Available The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools to perform general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  10. Crowdsourcing earthquake damage assessment using remote sensing imagery

    Directory of Open Access Journals (Sweden)

    Stuart Gill

    2011-06-01

    Full Text Available This paper describes the evolution of recent work on using crowdsourced analysis of remote sensing imagery, particularly high-resolution aerial imagery, to provide rapid, reliable assessments of damage caused by earthquakes and potentially other disasters. The initial effort examined online imagery taken after the 2008 Wenchuan, China, earthquake. A more recent response to the 2010 Haiti earthquake led to the formation of an international consortium: the Global Earth Observation Catastrophe Assessment Network (GEO-CAN. The success of GEO-CAN in contributing to the official damage assessments made by the Government of Haiti, the United Nations, and the World Bank led to further development of a web-based interface. A current initiative in Christchurch, New Zealand, is underway where remote sensing experts are analyzing satellite imagery, geotechnical engineers are marking liquefaction areas, and structural engineers are identifying building damage. The current site includes online training to improve the accuracy of the assessments and make it possible for even novice users to contribute to the crowdsourced solution. The paper discusses lessons learned from these initiatives and presents a way forward for using crowdsourced remote sensing as a tool for rapid assessment of damage caused by natural disasters around the world.

  11. Distinguishing megathrust from intraplate earthquakes using lacustrine turbidites (Laguna Lo Encañado, Central Chile)

    Science.gov (United States)

    Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco

    2017-04-01

    triggered by megathrust earthquakes. These findings are an important step forward in the interpretation of lacustrine turbidites in subduction settings, and will eventually improve hazard assessments based on such paleoseismic records in the study area, and in other subduction zones. References Howarth et al., 2014. Lake sediments record high intensity shaking that provides insight into the location and rupture length of large earthquakes on the Alpine Fault, New Zealand. Earth and Planetary Science Letters 403, 340-351. Lomnitz, 1960. A study of the Maipo Valley earthquakes of September 4, 1958, Second World Conference on Earthquake Engineering, Tokyo and Kyoto, Japan, pp. 501-520. Sepulveda et al., 2008. New Findings on the 1958 Las Melosas Earthquake Sequence, Central Chile: Implications for Seismic Hazard Related to Shallow Crustal Earthquakes in Subduction Zones. Journal of Earthquake Engineering 12, 432-455. Van Daele et al., 2015. A comparison of the sedimentary records of the 1960 and 2010 great Chilean earthquakes in 17 lakes: Implications for quantitative lacustrine palaeoseismology. Sedimentology 62, 1466-1496.

  12. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake ( ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake ( ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake ( ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake ( ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake ( ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  13. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  14. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  15. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    Science.gov (United States)

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  16. Radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Planinic, J.; Radolic, V.; Vukovic, B.

    2004-01-01

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined.

  17. Radon as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Planinic, J. E-mail: planinic@pedos.hr; Radolic, V.; Vukovic, B

    2004-09-11

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined.
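
    The multiple-regression correction for meteorological effects described in both records can be sketched as an ordinary least-squares fit; the series below are entirely synthetic and the coefficients are illustrative, not those derived in the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 365                                               # one year of daily values (hypothetical)
        pressure = 1013 + 8 * rng.standard_normal(n)          # hPa
        rainfall = rng.exponential(2.0, n)                    # mm
        temperature = 12 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.standard_normal(n)  # deg C
        radon = (30 - 0.5 * (pressure - 1013) + 1.2 * rainfall
                 + 0.8 * temperature + 3 * rng.standard_normal(n))    # kBq/m^3, synthetic

        # Fit radon = c0 + c1*pressure + c2*rainfall + c3*temperature by least squares
        X = np.column_stack([np.ones(n), pressure, rainfall, temperature])
        coef, *_ = np.linalg.lstsq(X, radon, rcond=None)
        residual = radon - X @ coef       # departures from the meteorological model = candidate anomalies
        print("regression coefficients:", np.round(coef, 2))
        print("2-sigma anomaly threshold:", round(2 * residual.std(), 2), "kBq/m^3")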

  18. Earthquake location in island arcs

    Science.gov (United States)

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high
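
    The location problem discussed here can be illustrated, in highly simplified form, by a grid search over candidate epicentres in a homogeneous half-space. This is a toy sketch of the basic inverse problem only, not the joint hypocenter determination or 3-D ray-tracing methods used in the study; all station coordinates and velocities below are hypothetical.

        import numpy as np

        # Hypothetical station coordinates (x, y in km), true epicentre, origin time and P velocity
        stations = np.array([[0.0, 0.0], [50.0, 10.0], [20.0, 60.0], [-30.0, 40.0]])
        true_xy, t0, v_p = np.array([15.0, 25.0]), 0.0, 6.0            # km, s, km/s
        obs_t = t0 + np.linalg.norm(stations - true_xy, axis=1) / v_p  # "observed" P arrivals

        # Grid search: minimise the RMS travel-time residual; origin time is solved for at each node
        xs = ys = np.linspace(-50.0, 80.0, 261)
        best_rms, best_xy = np.inf, None
        for x in xs:
            for y in ys:
                pred = np.linalg.norm(stations - np.array([x, y]), axis=1) / v_p
                t0_est = np.mean(obs_t - pred)
                rms = np.sqrt(np.mean((obs_t - pred - t0_est) ** 2))
                if rms < best_rms:
                    best_rms, best_xy = rms, (x, y)
        print("located epicentre:", best_xy, " RMS residual (s):", round(best_rms, 4))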

  19. Subduction zone earthquake probably triggered submarine hydrocarbon seepage offshore Pakistan

    Science.gov (United States)

    Fischer, David; Mogollón, José M.; Strasser, Michael; Pape, Thomas; Bohrmann, Gerhard; Fekete, Noemi; Spiess, Volkhard; Kasten, Sabine

    2014-05-01

    Seepage of methane-dominated hydrocarbons is heterogeneous in space and time, and trigger mechanisms of episodic seep events are not well constrained. It is generally found that free hydrocarbon gas entering the local gas hydrate stability field in marine sediments is sequestered in gas hydrates. In this manner, gas hydrates can act as a buffer for carbon transport from the sediment into the ocean. However, the efficiency of gas hydrate-bearing sediments for retaining hydrocarbons may be corrupted: Hypothesized mechanisms include critical gas/fluid pressures beneath gas hydrate-bearing sediments, implying that these are susceptible to mechanical failure and subsequent gas release. Although gas hydrates often occur in seismically active regions, e.g., subduction zones, the role of earthquakes as potential triggers of hydrocarbon transport through gas hydrate-bearing sediments has hardly been explored. Based on a recent publication (Fischer et al., 2013), we present geochemical and transport/reaction-modelling data suggesting a substantial increase in upward gas flux and hydrocarbon emission into the water column following a major earthquake that occurred near the study sites in 1945. Calculating the formation time of authigenic barite enrichments identified in two sediment cores obtained from an anticlinal structure called "Nascent Ridge", we find they formed 38-91 years before sampling, which corresponds well to the time elapsed since the earthquake (62 years). Furthermore, applying a numerical model, we show that the local sulfate/methane transition zone shifted upward by several meters due to the increased methane flux and simulated sulfate profiles very closely match measured ones in a comparable time frame of 50-70 years. We thus propose a causal relation between the earthquake and the amplified gas flux and present reflection seismic data supporting our hypothesis that co-seismic ground shaking induced mechanical fracturing of gas hydrate-bearing sediments

  20. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Science.gov (United States)

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  1. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    Science.gov (United States)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  2. Earthquake Energy Distribution along the Earth Surface and Radius

    International Nuclear Information System (INIS)

    Varga, P.; Krumm, F.; Riguzzi, F.; Doglioni, C.; Suele, B.; Wang, K.; Panza, G.F.

    2010-07-01

    The global earthquake catalog of seismic events with Mw ≥ 7.0, for the time interval from 1950 to 2007, shows that the depth distribution of earthquake energy release is not uniform. About 90% of the total earthquake energy budget is dissipated in the first ∼30 km, whereas most of the residual budget is radiated near the lower boundary of the transition zone (410-660 km), above the upper-lower mantle boundary. The upper border of the transition zone, at around 410 km depth, is not marked by significant seismic energy release. This points to a non-dominant role of the slabs in the energy budget of plate tectonics. Earthquake number and energy release, although not well correlated with each other, both decrease toward the polar areas when analysed with respect to latitude. Moreover, the radiated energy has the highest peak close to
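
    The depth-integrated energy budget quoted above rests on converting catalogue magnitudes to radiated energy. A commonly used relation for this (assumed here for illustration, not quoted from the record) is the Gutenberg-Richter energy-magnitude formula

        \log_{10} E_s = 1.5\,M_s + 4.8 \qquad (E_s\ \text{in joules}),

    so each unit of magnitude corresponds to a factor of about 32 in radiated energy; for example, Ms = 7.0 gives Es ≈ 2 × 10^15 J, while Ms = 8.0 gives roughly 6 × 10^16 J.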