WorldWideScience

Sample records for operating basis earthquake

  1. Determination of Design Basis Earthquake ground motion

    International Nuclear Information System (INIS)

    Kato, Muneaki

    1997-01-01

    This paper describes the principles for determining the Design Basis Earthquake following the Examination Guide, together with examples from actual sites covering the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. In the appendix, the seismic safety review of nuclear power plants designed before publication of the Examination Guide is summarized in terms of the Check Basis Earthquake. (J.P.N.)

  2. Determination of Design Basis Earthquake ground motion

    Energy Technology Data Exchange (ETDEWEB)

    Kato, Muneaki [Japan Atomic Power Co., Tokyo (Japan)]

    1997-03-01

    This paper describes the principles for determining the Design Basis Earthquake following the Examination Guide, together with examples from actual sites covering the earthquake sources to be considered, the earthquake response spectrum, and simulated seismic waves. In the appendix, the seismic safety review of nuclear power plants designed before publication of the Examination Guide is summarized in terms of the Check Basis Earthquake. (J.P.N.)

  3. Seismic methodology in determining basis earthquake for nuclear installation

    International Nuclear Information System (INIS)

    Ameli Zamani, Sh.

    2008-01-01

    Design basis earthquake ground motions for nuclear installations should be determined so as to assure the design purpose of reactor safety: that reactors are built and operated to pose no undue risk to public health and safety from earthquakes and other hazards. Regarding the seismic hazard at a site, large numbers of earthquake ground motions can be predicted by considering the possible variability of source, path, and site parameters. Seismic safety design using all predicted ground motions is, however, practically impossible. In determining design basis earthquake ground motions it is therefore important to represent the influence of the large numbers of earthquake ground motions derived from ground motion prediction methods for the surrounding seismic sources. Comparing current design basis earthquake ground motion determination with modern earthquake ground motion estimation, the development of a risk-informed design basis earthquake ground motion methodology is discussed to provide insight into the ongoing modernization of the Examination Guide for Seismic Design of NPPs

  4. Data base pertinent to earthquake design basis

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1988-01-01

    Mitigation of earthquake risk from impending strong earthquakes is possible provided the hazard can be assessed and translated into appropriate design inputs. This requires defining the seismic risk problem, isolating the risk factors, and quantifying risk in terms of physical parameters suitable for application in design. Like all other geological phenomena, past earthquakes hold the key to the understanding of future ones. Quantification of seismic risk at a site calls for investigating the earthquake aspects of the site region and building a data base. The scope of such investigations is illustrated in Figures 1 and 2. A more detailed definition of the earthquake problem in engineering design is given elsewhere (Sharma, 1987). The present document discusses the earthquake data base required to support a seismic risk evaluation programme in the context of the existing state of the art. (author). 8 tables, 10 figs., 54 refs

  5. Japan Catastrophic Earthquake and Tsunami in Fukushima Daiichi NPP; Is it Beyond Design Basis Accident or a Design Deficiency and Operator Unawareness?

    International Nuclear Information System (INIS)

    Gaafar, M.A.; Refeat, R.M.; EL-Kady, A.A.

    2012-01-01

    On March 11, 2011, a catastrophic earthquake and tsunami struck the northeast coast of Japan. This catastrophe fully or partially damaged the six units of the Fukushima Daiichi Nuclear Power Plant. Questions were raised in the aftermath as to whether this was a beyond design basis accident caused by a severe natural event or a failure by the Japanese authorities to plan for such an accident. There are many indications that the utility operating the Fukushima Daiichi NPP, the Tokyo Electric Power Company (TEPCO), did not pay enough attention to numerous facts about the incompatibility of the site and several design defects in the plant units. In fact, there are three other NPP sites near the Fukushima Daiichi Plant (about 30 to 60 km from Fukushima Daiichi NPP), with different site characteristics, which survived the same catastrophic earthquake and tsunami and were automatically brought to a safe shutdown state. These plant sites are the Fukushima Daini Plant (4 units), the Onagawa Plant (3 units), and the Tokai Daini (II) Plant (one unit). In this paper, the post-accident integrity of the Fukushima Daiichi plant is examined. Some facts about the site and design concerns which could have implications for the accident are discussed. The response of the Japanese authorities is outlined and some remarks about their actions are underlined. The impacts of this disaster on the nuclear power program worldwide are also discussed.

  6. On the plant operators performance during earthquake

    International Nuclear Information System (INIS)

    Kitada, Y.; Yoshimura, S.; Abe, M.; Niwa, H.; Yoneda, T.; Matsunaga, M.; Suzuki, T.

    1994-01-01

    There is little data on which to judge the performance of plant operators during and after strong earthquakes. In order to obtain such data and enhance the reliability of plant operation, a Japanese utility and a power plant manufacturer carried out a vibration test using a shaking table. The purpose of the test was to investigate operator performance, i.e., the quickness and correctness of switch handling and panel meter read-out. The movement of chairs during an earthquake was also of interest, because if the chairs moved significantly or turned over during a strong earthquake, some arresting mechanism would be required for them. Although there were differences between the simulated earthquake motions used and actual earthquakes, mainly due to the specifications of the shaking table, the earthquake motions had almost no influence on the operators' capability (performance) in operating the simulated console and the personal computers

  7. The earthquake problem in engineering design: generating earthquake design basis information

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1987-01-01

    Designing earthquake-resistant structures requires certain design inputs specific to the seismotectonic status of the region in which a critical facility is to be located. Generating these inputs requires collecting earthquake-related information using present-day techniques in seismology and geology, and processing the collected information to arrive at a consolidated picture of the seismotectonics of the region. The earthquake problem in engineering design is outlined in the context of the seismic design of nuclear power plants vis-à-vis current state-of-the-art techniques. The extent to which the accepted procedures for assessing seismic risk in the region and generating the design inputs have been adhered to determines to a great extent the safety of the structures against future earthquakes. The document is a step towards developing an approach for generating these inputs, which form the earthquake design basis. (author)

  8. On operator diagnosis aid in severe earthquakes

    International Nuclear Information System (INIS)

    Lee, S.H.; Okrent, D.

    1988-01-01

    During a severe earthquake, any component, system, or structure may fail; the plant may be driven into a very complex situation in which instrumentation and control systems may also fail and provide operators with unreliable information about the process parameters crucial to plant safety. What can operators do when faced with such complexity? Even though the likelihood of such a severe earthquake may be very low, its consequences may be serious if mitigative measures are not thought out and implemented in advance. The objective of the present study relates to measures to protect the plant from severe damage due to large earthquakes, namely, the improvement of operator capability to respond to seismic damage through the use of Emergency Procedure Guidelines (EPGs). The fact that the symptoms presented to operators may be unreliable in severe earthquakes endangers the validity of the actions in the EPGs. The purpose of this study is to design a tool through which the weaknesses of the EPGs may be identified in advance and, if possible, lessons may be drawn from the exercise results so that the EPGs may be improved to accommodate the complexity as far as possible. In other words, the present study intends to provide a tool which may simulate available signals, including false ones, such that EPGs may be examined and operator actions may be studied. It is hoped to develop knowledge needed to complement the currently available knowledge. The final product of this study shall be a program which provides users the rationale for how it reaches its conclusions, such that users may improve their knowledge, as well as a program whose knowledge may be updated via user interfacing

  9. Design basis earthquakes for critical industrial facilities and their characteristics, and the Southern Hyogo prefecture earthquake, 17 January 1995

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki

    1998-12-01

    This paper deals with how to establish the concept of the design basis earthquake (DBE) for critical industrial facilities such as nuclear power plants in consideration of disasters such as the Southern Hyogo prefecture earthquake, the so-called Kobe earthquake of 1995. The author once discussed various DBEs at the 7th World Conference on Earthquake Engineering. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the values of accelerations of a structure obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument during the Southern Hyogo prefecture earthquake of 1995 exceeded the previous assumption of the author, even though the results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third earthquake level, S_s, in addition to the S_1 and S_2 of previous DBEs.

  10. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    Science.gov (United States)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries, namely the Indo-Australian and Eurasian Plates on the west and the Philippine Plate on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang, and other parts of Sumatera, Indonesia. In order to assess the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used to deal with future earthquakes whose specific nature is unknown. This paper aimed to develop capacity-demand response spectra subject to the Design Basis Earthquake (DBE) and the Maximum Considered Earthquake (MCE) in order to assess the performance of a precast beam-column joint. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive when subjected to earthquake excitation with a surface-wave magnitude of more than 5.5 (Type 1 spectra). This means that the beam-column joint designed using the current code of practice (BS8110) would be severely damaged when subjected to high earthquake excitation. The capacity-demand response spectrum analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5 (Type 1 spectra).
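
    As a rough illustration of how the demand side of such an assessment can be generated, the sketch below evaluates a Eurocode 8 Type 1 elastic response spectrum anchored to the PGA of 0.22g quoted above; the ground type C parameters and 5% damping are illustrative assumptions, not values taken from the paper.

      # Hedged sketch: Eurocode 8 (EN 1998-1) Type 1 elastic response spectrum.
      # PGA = 0.22 g is quoted in the abstract; ground type C (S, TB, TC, TD) and
      # 5% damping (eta = 1.0) are illustrative assumptions only.

      def ec8_type1_spectrum(T, ag=0.22, S=1.15, TB=0.20, TC=0.60, TD=2.0, eta=1.0):
          """Spectral acceleration Se(T) in g for the EC8 Type 1 elastic spectrum."""
          if T <= TB:
              return ag * S * (1.0 + T / TB * (eta * 2.5 - 1.0))
          if T <= TC:
              return ag * S * eta * 2.5
          if T <= TD:
              return ag * S * eta * 2.5 * TC / T
          return ag * S * eta * 2.5 * TC * TD / T ** 2

      periods = [0.05 * i for i in range(1, 81)]            # 0.05 s to 4.0 s
      demand = [ec8_type1_spectrum(T) for T in periods]     # demand spectrum ordinates
      print(max(demand))                                    # plateau ~ 0.22 * 1.15 * 2.5 = 0.63 g

    Comparing ordinates of this kind against the capacity curve of the joint is the essence of the capacity-demand check described in the abstract.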

  11. Facts learnt from the Hanshin-Awaji disaster and consideration on design basis earthquake

    International Nuclear Information System (INIS)

    Shibata, Heki

    1997-01-01

    This paper will deal with how to establish the concept of the design basis earthquake for critical industrial facilities such as nuclear power plants in consideration of the disasters induced by the 1995 Hyogoken-Nanbu Earthquake (Southern Hyogo-prefecture Earthquake-1995), the so-called Kobe earthquake. The author once discussed various DBEs at the 7th WCEE. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the values of accelerations of a structure obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument during the Southern Hyogo-pref. Earthquake-1995 exceeded the previous assumption of the author, even though the evaluation results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third earthquake level, S_s, in addition to the previous DBEs S_1 and S_2. (author)

  12. Facts learnt from the Hanshin-Awaji disaster and consideration on design basis earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Shibata, Heki [Yokohama National Univ. (Japan). Faculty of Engineering]

    1997-03-01

    This paper will deal with how to establish the concept of the design basis earthquake for critical industrial facilities such as nuclear power plants in consideration of the disasters induced by the 1995 Hyogoken-Nanbu Earthquake (Southern Hyogo-prefecture Earthquake-1995), the so-called Kobe earthquake. The author once discussed various DBEs at the 7th WCEE. At that time, the author assumed that the strongest effective PGA would be 0.7 G, and compared the values of accelerations of a structure obtained by various codes in Japan and other countries. The maximum PGA observed by an instrument during the Southern Hyogo-pref. Earthquake-1995 exceeded the previous assumption of the author, even though the evaluation results of the previous paper had been pessimistic. Based on the experience of the Kobe event, the author points out the necessity of a third earthquake level, S_s, in addition to the previous DBEs S_1 and S_2. (author)

  13. Basis of valve operator selection for SMART

    International Nuclear Information System (INIS)

    Kang, H. S.; Lee, D. J.; See, J. K.; Park, C. K.; Choi, B. S.

    2000-05-01

    SMART, an integral reactor with enhanced safety and operability, is under development for the use of nuclear energy. The valve operators for the SMART system were selected through a data survey and a technical review of potential valve fabrication vendors, and this selection will support the establishment and optimization of the basic system design of SMART. In order to establish and optimize the basic system design of SMART, the basis for selecting the valve operator type was provided from the basic design requirements. The basis of valve operator selection for SMART will be used as basic technical data for the SMART basic and detailed design and as fundamental material for the development of new reactors in the future

  14. Basis of valve operator selection for SMART

    Energy Technology Data Exchange (ETDEWEB)

    Kang, H. S.; Lee, D. J.; See, J. K.; Park, C. K.; Choi, B. S.

    2000-05-01

    SMART, an integral reactor with enhanced safety and operability, is under development for the use of nuclear energy. The valve operators for the SMART system were selected through a data survey and a technical review of potential valve fabrication vendors, and this selection will support the establishment and optimization of the basic system design of SMART. In order to establish and optimize the basic system design of SMART, the basis for selecting the valve operator type was provided from the basic design requirements. The basis of valve operator selection for SMART will be used as basic technical data for the SMART basic and detailed design and as fundamental material for the development of new reactors in the future.

  15. Ground motion following selection of SRS design basis earthquake and associated deterministic approach

    International Nuclear Information System (INIS)

    1991-03-01

    This report summarizes the results of a deterministic assessment of earthquake ground motions at the Savannah River Site (SRS). The purpose of this study is to assist the Environmental Sciences Section of the Savannah River Laboratory in reevaluating the design basis earthquake (DBE) ground motion at SRS using approaches defined in Appendix A to 10 CFR Part 100. This work is in support of the Seismic Engineering Section's Seismic Qualification Program for reactor restart

  16. Development of Probabilistic Design Basis Earthquake (DBE) Parameters for Moderate and High Hazard Facilities at INEEL

    International Nuclear Information System (INIS)

    Payne, S. M.; Gorman, V. W.; Jensen, S. A.; Nitzel, M. E.; Russell, M. J.; Smith, R. P.

    2000-01-01

    Design Basis Earthquake (DBE) horizontal and vertical response spectra are developed for moderate and high hazard facilities or Performance Categories (PC) 3 and 4, respectively, at the Idaho National Engineering and Environmental Laboratory (INEEL). The probabilistic DBE response spectra will replace the deterministic DBE response spectra currently in the U.S. Department of Energy Idaho Operations Office (DOE-ID) Architectural Engineering Standards that govern seismic design criteria for several facility areas at the INEEL. Probabilistic DBE response spectra are recommended to DOE Naval Reactors for use at the Naval Reactor Facility at INEEL. The site-specific Uniform Hazard Spectra (UHS) developed by URS Greiner Woodward Clyde Federal Services are used as the basis for developing the DBE response spectra. In 1999, the UHS for all INEEL facility areas were recomputed using more appropriate attenuation relationships for the Basin and Range province. The revised UHS have lower ground motions than those produced in the 1996 INEEL site-wide probabilistic ground motion study. The DBE response spectra were developed by incorporating smoothed broadened regions of the peak accelerations, velocities, and displacements defined by the site-specific UHS. Portions of the DBE response spectra were adjusted to ensure conservatism for the structural design process
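
    The broadening step described above can be sketched, very roughly, as enveloping each uniform hazard spectrum (UHS) ordinate over a band of neighboring periods; the ±15% band and the toy spectral values below are assumptions for illustration and do not reproduce the actual INEEL smoothing of the velocity and displacement regions.

      # Hedged sketch of peak broadening of a uniform hazard spectrum (UHS).
      # The +/-15% period band and the toy UHS ordinates are illustrative
      # assumptions; the documented procedure also smooths the velocity and
      # displacement regions, which is not reproduced here.

      def broaden(periods, sa, band=0.15):
          """Envelope each spectral ordinate over a +/-band window of periods."""
          out = []
          for T in periods:
              lo, hi = T * (1.0 - band), T * (1.0 + band)
              out.append(max(s for t, s in zip(periods, sa) if lo <= t <= hi))
          return out

      periods = [0.02 * i for i in range(5, 101)]                               # 0.10 s to 2.00 s
      uhs_sa = [0.6 / (1.0 + ((T - 0.25) / 0.1) ** 2) + 0.1 for T in periods]   # toy peaked UHS, in g
      dbe_sa = broaden(periods, uhs_sa)                                         # broadened design ordinates
      print(round(uhs_sa[11], 3), round(dbe_sa[11], 3))                         # ordinate near T = 0.32 s, before and after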

  17. Minimization of Basis Risk in Parametric Earthquake Cat Bonds

    Science.gov (United States)

    Franco, G.

    2009-12-01

    A catastrophe -cat- bond is an instrument used by insurance and reinsurance companies, by governments or by groups of nations to cede catastrophic risk to the financial markets, which are capable of supplying cover for highly destructive events, surpassing the typical capacity of traditional reinsurance contracts. Parametric cat bonds, a specific type of cat bonds, use trigger mechanisms or indices that depend on physical event parameters published by respected third parties in order to determine whether a part or the entire bond principal is to be paid for a certain event. First generation cat bonds, or cat-in-a-box bonds, display a trigger mechanism that consists of a set of geographic zones in which certain conditions need to be met by an earthquake’s magnitude and depth in order to trigger payment of the bond principal. Second generation cat bonds use an index formulation that typically consists of a sum of products of a set of weights by a polynomial function of the ground motion variables reported by a geographically distributed seismic network. These instruments are especially appealing to developing countries with incipient insurance industries wishing to cede catastrophic losses to the financial markets because the payment trigger mechanism is transparent and does not involve the parties ceding or accepting the risk, significantly reducing moral hazard. In order to be successful in the market, however, parametric cat bonds have typically been required to specify relatively simple trigger conditions. The consequence of such simplifications is the increase of basis risk. This risk represents the possibility that the trigger mechanism fails to accurately capture the actual losses of a catastrophic event, namely that it does not trigger for a highly destructive event or vice versa, that a payment of the bond principal is caused by an event that produced insignificant losses. The first case disfavors the sponsor who was seeking cover for its losses while the
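
    The second-generation index described above can be sketched as a weighted sum of a polynomial function of the reported ground motions, with the principal paid out once the index crosses a threshold; the station weights, exponent, and attachment/exhaustion points below are hypothetical, chosen only to show the structure.

      # Hedged sketch of a second-generation parametric cat-bond trigger:
      # index = sum_i w_i * PGA_i**exponent, with a linear payout of the principal
      # between an attachment and an exhaustion point. All numbers are hypothetical.

      def trigger_index(pga_by_station, weights, exponent=2.0):
          """Weighted sum of a polynomial function of reported peak ground accelerations."""
          return sum(weights[s] * pga_by_station[s] ** exponent for s in pga_by_station)

      def payout_fraction(index, attachment=0.5, exhaustion=1.5):
          """Fraction of principal paid: 0 below attachment, 1 above exhaustion, linear between."""
          if index <= attachment:
              return 0.0
          if index >= exhaustion:
              return 1.0
          return (index - attachment) / (exhaustion - attachment)

      reported_pga = {"STA1": 0.35, "STA2": 0.60, "STA3": 0.90}   # g, hypothetical event
      weights = {"STA1": 1.0, "STA2": 1.5, "STA3": 2.0}           # hypothetical exposure weights
      idx = trigger_index(reported_pga, weights)
      print(idx, payout_fraction(idx))

    Basis risk is then the mismatch between this index-driven payout and the sponsor's actual loss, which is what the work described above seeks to minimize.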

  18. Varenna workshop report. Operational earthquake forecasting and decision making

    Directory of Open Access Journals (Sweden)

    Warner Marzocchi

    2015-09-01

    A workshop on operational earthquake forecasting and decision making was convened in Varenna, Italy, on June 8-11, 2014, under the sponsorship of the EU FP7 REAKT (Strategies and tools for Real-time EArthquake risK reducTion) project, the Seismic Hazard Center at the Istituto Nazionale di Geofisica e Vulcanologia (INGV), and the Southern California Earthquake Center (SCEC). The main goal was to survey the interdisciplinary issues of operational earthquake forecasting (OEF), including the problems that OEF raises for decision making and risk communication. The workshop was attended by 64 researchers from universities, research centers, and governmental institutions in 11 countries. Participants and the workshop agenda are listed in the appendix. The workshop comprised six topical sessions structured around three main themes: the science of operational earthquake forecasting, decision making in a low-probability environment, and communicating hazard and risk. Each topic was introduced by a moderator and surveyed by a few invited speakers, who were then empaneled for an open discussion. The presentations were followed by poster sessions. During a wrap-up session on the last day, the reporters for each topical session summarized the main points that they had gleaned from the talks and open discussions. This report attempts to distill this workshop record into a brief overview of the workshop themes and to describe the range of opinions expressed during the discussions.

  19. Operational Earthquake Forecasting: Proposed Guidelines for Implementation (Invited)

    Science.gov (United States)

    Jordan, T. H.

    2010-12-01

    The goal of operational earthquake forecasting (OEF) is to provide the public with authoritative information about how seismic hazards are changing with time. During periods of high seismic activity, short-term earthquake forecasts based on empirical statistical models can attain nominal probability gains in excess of 100 relative to the long-term forecasts used in probabilistic seismic hazard analysis (PSHA). Prospective experiments are underway by the Collaboratory for the Study of Earthquake Predictability (CSEP) to evaluate the reliability and skill of these seismicity-based forecasts in a variety of tectonic environments. How such information should be used for civil protection is by no means clear, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing formal procedures for OEF in this sort of “low-probability environment.” Nevertheless, the need to move more quickly towards OEF has been underscored by recent experiences, such as the 2009 L’Aquila earthquake sequence and other seismic crises in which an anxious public has been confused by informal, inconsistent earthquake forecasts. Whether scientists like it or not, rising public expectations for real-time information, accelerated by the use of social media, will require civil protection agencies to develop sources of authoritative information about the short-term earthquake probabilities. In this presentation, I will discuss guidelines for the implementation of OEF informed by my experience on the California Earthquake Prediction Evaluation Council, convened by CalEMA, and the International Commission on Earthquake Forecasting, convened by the Italian government following the L’Aquila disaster. (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and
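
    The "low-probability environment" can be made concrete with one line of arithmetic: multiplying an assumed long-term weekly probability by a hundredfold gain still yields only a percent-level short-term probability. The baseline value below is an assumed round number, not one taken from the abstract.

      # Hedged arithmetic sketch: a probability gain of 100 applied to an assumed
      # long-term weekly probability of 1e-4 still gives only a 1% short-term probability.
      baseline_weekly_prob = 1e-4                             # assumed long-term (PSHA-like) weekly probability
      nominal_gain = 100                                      # short-term gain during an active sequence
      print(f"{baseline_weekly_prob * nominal_gain:.0%}")     # -> 1%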

  20. The Doctrinal Basis for Medical Stability Operations

    Science.gov (United States)

    2010-01-01

    lead actor, preferably a HN agency, but sometimes the military must take the lead in medical stability operations when overwhelming violence prevents... Assessment tasks include hospital administration, communications, obstetrics, pediatrics, emergency room, operating room, nursing procedures, and medical supply

  1. FB Line Basis for Interim Operation

    International Nuclear Information System (INIS)

    Shedrow, B.

    1998-01-01

    The safety analysis of the FB-Line Facility indicates that the operation of FB-Line to support the current mission does not present undue risk to the facility and co-located workers, general public, or the environment

  2. Earthquakes

    Science.gov (United States)

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  3. Design basis for the NRC Operations Center

    Energy Technology Data Exchange (ETDEWEB)

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project.

  4. Design basis for the NRC Operations Center

    International Nuclear Information System (INIS)

    Lindell, M.K.; Wise, J.A.; Griffin, B.N.; Desrosiers, A.E.; Meitzler, W.D.

    1983-05-01

    This report documents the development of a design for a new NRC Operations Center (NRCOC). The project was conducted in two phases: organizational analysis and facility design. In order to control the amount of traffic, congestion and noise within the facility, it is recommended that information flow in the new NRCOC be accomplished by means of an electronic Status Information Management System. Functional requirements and a conceptual design for this system are described. An idealized architectural design and a detailed design program are presented that provide the appropriate amount of space for operations, equipment and circulation within team areas. The overall layout provides controlled access to the facility and, through the use of a zoning concept, provides each team within the NRCOC the appropriate balance of ready access and privacy determined from the organizational analyses conducted during the initial phase of the project

  5. Solid waste retrieval. Phase 1, Operational basis

    International Nuclear Information System (INIS)

    Johnson, D.M.

    1994-01-01

    This document describes the operational requirements, procedures, and options for executing the retrieval of the waste containers placed in buried storage in Burial Ground 218W-4C, Trench 04, as TRU waste or suspect TRU waste under the activity levels defining this waste that were in effect at the time of placement. Trench 04 in Burial Ground 218W-4C is dedicated entirely to storage of retrievable TRU waste containers or retrievable suspect TRU waste containers and has not been used for any other purpose

  6. Solid waste retrieval. Phase 1, Operational basis

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, D.M.

    1994-09-30

    This document describes the operational requirements, procedures, and options for executing the retrieval of the waste containers placed in buried storage in Burial Ground 218W-4C, Trench 04, as TRU waste or suspect TRU waste under the activity levels defining this waste that were in effect at the time of placement. Trench 04 in Burial Ground 218W-4C is dedicated entirely to storage of retrievable TRU waste containers or retrievable suspect TRU waste containers and has not been used for any other purpose.

  7. Lessons of L'Aquila for Operational Earthquake Forecasting

    Science.gov (United States)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms

  8. PFP total operating efficiency calculation and basis of estimate

    International Nuclear Information System (INIS)

    SINCLAIR, J.C.

    1999-01-01

    The purpose of the Plutonium Finishing Plant (PFP) Total Operating Efficiency Calculation and Basis of Estimate document is to provide the calculated value and basis of estimate for the Total Operating Efficiency (TOE) for the material stabilization operations to be conducted in 234-52 Building. This information will be used to support both the planning and execution of the Plutonium Finishing Plant (PFP) Stabilization and Deactivation Project's (hereafter called the Project) resource-loaded, integrated schedule

  9. Circuit breaker operation and potential failure modes during an earthquake

    International Nuclear Information System (INIS)

    Lambert, H.E.; Budnitz, R.J.

    1987-01-01

    This study addresses the effect of a strong-motion earthquake on circuit breaker operation. It focuses on the loss of offsite power (LOSP) transient caused by a strong-motion earthquake at the Zion Nuclear Power Plant. This paper also describes the operator action necessary to prevent core melt if the circuit breaker failure modes described below occur simultaneously on three 4.16 kV buses. Numerous circuit breakers important to plant safety, such as circuit breakers to diesel generators and engineered safety systems (ESS), must open and/or close during this transient while strong motion is occurring. Potential seismically induced circuit-breaker failure modes were uncovered while the study was conducted. These failure modes include: circuit breaker fails to close; circuit breaker trips inadvertently; circuit breaker fails to reclose after trip. The causes of these failure modes include: relay chatter causes the circuit breaker to trip; relay chatter causes anti-pumping relays to seal in, which prevents automatic closure of circuit breakers; load sequencer failures. The incorporation of these failure modes as well as other instrumentation and control failures into a limited-scope seismic probabilistic risk assessment is also discussed in this paper

  10. 1/f and the Earthquake Problem: Scaling constraints that facilitate operational earthquake forecasting

    Science.gov (United States)

    yoder, M. R.; Rundle, J. B.; Turcotte, D. L.

    2012-12-01

    The difficulty of forecasting earthquakes can fundamentally be attributed to the self-similar, or "1/f", nature of seismic sequences. Specifically, the rate of occurrence of earthquakes is inversely proportional to their magnitude m, or more accurately to their scalar moment M. With respect to this "1/f problem," it can be argued that catalog selection (or equivalently, determining catalog constraints) constitutes the most significant challenge to seismicity-based earthquake forecasting. Here, we address and introduce a potential solution to this most daunting problem. Specifically, we introduce a framework to constrain, or partition, an earthquake catalog (a study region) in order to resolve local seismicity. In particular, we combine Gutenberg-Richter (GR), rupture length, and Omori scaling with various empirical measurements to relate the size (spatial and temporal extents) of a study area (or bins within a study area) to the local earthquake magnitude potential - the magnitude of earthquake the region is expected to experience. From this, we introduce a new type of time-dependent hazard map for which the tuning parameter space is nearly fully constrained. In a similar fashion, by combining various scaling relations and also by incorporating finite extents (rupture length, area, and duration) as constraints, we develop a method to estimate the Omori (temporal) and spatial aftershock decay parameters as a function of the parent earthquake's magnitude m. From this formulation, we develop an ETAS-type model that overcomes many point-source limitations of contemporary ETAS. These models demonstrate promise with respect to earthquake forecasting applications. Moreover, the methods employed suggest a general framework whereby earthquake and other complex-system, 1/f-type, problems can be constrained from scaling relations and finite extents. Figure caption: record-breaking hazard map of southern California, 2012-08-06; "warm" colors indicate local acceleration (elevated hazard
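
    The scaling ingredients named above can be written down in a few lines: a Gutenberg-Richter relation for the rate of events above a magnitude, an Omori law for aftershock decay, and a magnitude-rupture length relation. The a- and b-values, Omori constants, and rupture-length coefficients below are generic illustrative assumptions, not the values fitted in this work.

      # Hedged sketch of the scaling relations mentioned in the abstract.
      # All numerical constants are generic illustrative assumptions.

      def gr_annual_rate(m, a=4.0, b=1.0):
          """Gutenberg-Richter: expected annual number of events with magnitude >= m."""
          return 10.0 ** (a - b * m)

      def omori_rate(t_days, K=50.0, c=0.05, p=1.1):
          """Modified Omori law: aftershock rate (events/day) at t_days after the mainshock."""
          return K / (t_days + c) ** p

      def rupture_length_km(m):
          """Generic magnitude-rupture length scaling, log10(L) ~ 0.5*m - 1.85 (illustrative)."""
          return 10.0 ** (0.5 * m - 1.85)

      # Expected annual number of M>=6 events, aftershock rate one week after a mainshock,
      # and a rough rupture length for an M6 event:
      print(gr_annual_rate(6.0), omori_rate(7.0), rupture_length_km(6.0))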

  11. Design basis programs and improvements in plant operation

    International Nuclear Information System (INIS)

    Metcalf, M.F.

    1991-01-01

    Public Service Electric and Gas (PSE and G) Company operates three commercial nuclear power plants in southern New Jersey. The three plants are of different designs and vintages (two pressurized water reactors licensed in 1976 and 1980 and one boiling water reactor licensed in 1986). As the industry recognized the need to develop design basis programs, PSE and G also realized the need after a voluntary 52-day shutdown of one unit because of electrical design basis problems. In its drive to be a premier electric utility, PSE and G has been aggressively active in developing design basis documents (DBDs) with supporting projects and refined uses to obtain the expected value and see the return on investment. Progress on Salem is nearly 75% complete, while Hope Creek is 20% complete. To date, PSE and G has experienced success in the use of DBDs in areas such as development of plant modifications, development of the reliability-centered maintenance program, procedure upgrades, improved document retrieval, resolution of regulatory issues, and training. The paper examines the design basis development process, supporting projects, and expected improvements in plant operations as a result of these efforts

  12. Indian Ocean Earthquake and Tsunami: Humanitarian Assistance and Relief Operations

    National Research Council Canada - National Science Library

    Margesson, Rhoda

    2005-01-01

    On December 26, 2004, a magnitude 9.0 undersea earthquake off the west coast of northern Sumatra, Indonesia, unleashed a tsunami that affected more than 12 countries throughout south and southeast Asia and stretched as far...

  13. Conjugate schema and basis representation of crossover and mutation operators.

    Science.gov (United States)

    Kazadi, S T

    1998-01-01

    In genetic search algorithms and optimization routines, the representation of the mutation and crossover operators is typically defaulted to the canonical basis. We show that this choice can influence the usefulness of the search algorithm. We then pose the question of how to find a basis for which the search algorithm is most useful. The conjugate schema is introduced as a general mathematical construct and is shown to separate a function into smaller-dimensional functions whose sum is the original function. It is shown that the conjugate schema, when used on a test suite of functions, improves the performance of the search algorithm on 10 out of 12 of these functions. Finally, a rigorous but abbreviated mathematical derivation is given in the appendices.
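
    A minimal sketch of the underlying idea, assuming a real-valued genome: perform crossover on the coordinates of the parents in an alternative orthonormal basis and map the child back to the canonical representation. The random orthogonal basis below is only a stand-in; the conjugate schema construction instead seeks a basis in which the objective separates into lower-dimensional pieces.

      # Hedged sketch: one-point crossover applied in a non-canonical basis.
      # A random orthogonal matrix stands in for the basis; the paper's conjugate
      # schema construction chooses the basis so that the objective separates.

      import numpy as np

      rng = np.random.default_rng(0)

      def random_orthogonal(n):
          q, _ = np.linalg.qr(rng.normal(size=(n, n)))
          return q

      def crossover_in_basis(parent_a, parent_b, basis):
          """One-point crossover on the parents' coordinates in `basis`, mapped back."""
          a = basis.T @ parent_a             # coordinates in the alternative basis
          b = basis.T @ parent_b
          cut = rng.integers(1, len(a))      # crossover point
          child = np.concatenate([a[:cut], b[cut:]])
          return basis @ child               # child expressed in the canonical basis

      B = random_orthogonal(6)
      p1, p2 = rng.normal(size=6), rng.normal(size=6)
      print(crossover_in_basis(p1, p2, B))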

  14. A spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3‐ETAS): Toward an operational earthquake forecast

    Science.gov (United States)

    Field, Edward; Milner, Kevin R.; Hardebeck, Jeanne L.; Page, Morgan T.; van der Elst, Nicholas; Jordan, Thomas H.; Michael, Andrew J.; Shaw, Bruce E.; Werner, Maximillan J.

    2017-01-01

    We, the ongoing Working Group on California Earthquake Probabilities, present a spatiotemporal clustering model for the Third Uniform California Earthquake Rupture Forecast (UCERF3), with the goal being to represent aftershocks, induced seismicity, and otherwise triggered events as a potential basis for operational earthquake forecasting (OEF). Specifically, we add an epidemic‐type aftershock sequence (ETAS) component to the previously published time‐independent and long‐term time‐dependent forecasts. This combined model, referred to as UCERF3‐ETAS, collectively represents a relaxation of segmentation assumptions, the inclusion of multifault ruptures, an elastic‐rebound model for fault‐based ruptures, and a state‐of‐the‐art spatiotemporal clustering component. It also represents an attempt to merge fault‐based forecasts with statistical seismology models, such that information on fault proximity, activity rate, and time since last event are considered in OEF. We describe several unanticipated challenges that were encountered, including a need for elastic rebound and characteristic magnitude–frequency distributions (MFDs) on faults, both of which are required to get realistic triggering behavior. UCERF3‐ETAS produces synthetic catalogs of M≥2.5 events, conditioned on any prior M≥2.5 events that are input to the model. We evaluate results with respect to both long‐term (1000 year) simulations as well as for 10‐year time periods following a variety of hypothetical scenario mainshocks. Although the results are very plausible, they are not always consistent with the simple notion that triggering probabilities should be greater if a mainshock is located near a fault. Important factors include whether the MFD near faults includes a significant characteristic earthquake component, as well as whether large triggered events can nucleate from within the rupture zone of the mainshock. Because UCERF3‐ETAS has many sources of uncertainty, as
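
    The clustering component added here is of the epidemic-type aftershock sequence (ETAS) family; a generic temporal ETAS conditional intensity, with illustrative parameter values that are not those calibrated for UCERF3-ETAS, looks like the following sketch.

      # Hedged sketch of a generic temporal ETAS conditional intensity.
      # Parameter values are illustrative and are not the UCERF3-ETAS calibration.

      def etas_rate(t, catalog, mu=0.2, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=2.5):
          """lambda(t) = mu + sum_i K * 10**(alpha*(m_i - m0)) / (t - t_i + c)**p over past events."""
          rate = mu
          for t_i, m_i in catalog:
              if t_i < t:
                  rate += K * 10.0 ** (alpha * (m_i - m0)) / (t - t_i + c) ** p
          return rate

      past_events = [(0.0, 6.1), (2.0, 4.3)]     # (time in days, magnitude), hypothetical inputs
      print(etas_rate(3.0, past_events))         # expected rate of M>=2.5 events per day at t = 3 days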

  15. A subleading power operator basis for the scalar quark current

    Science.gov (United States)

    Chang, Cyuan-Han; Stewart, Iain W.; Vita, Gherardo

    2018-04-01

    Factorization theorems play a crucial role in our understanding of the strong interaction. For collider processes they are typically formulated at leading power, and much less is known about power corrections in the $\lambda \ll 1$ expansion. Here we present a complete basis of power-suppressed operators for a scalar quark current at $\mathcal{O}(\lambda^2)$ in the amplitude-level power expansion in the Soft Collinear Effective Theory, demonstrating that helicity selection rules significantly simplify the construction. This basis applies for the production of any color-singlet scalar in $q\bar{q}$ annihilation (such as $b\bar{b} \to H$). We also classify all operators which contribute to the cross section at $\mathcal{O}(\lambda^2)$ and perform matching calculations to determine their tree-level Wilson coefficients. These results can be exploited to study power corrections in both resummed and fixed-order perturbation theory, and for analyzing the factorization properties of gauge theory amplitudes and cross sections at subleading power.
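
    Schematically, and only as a sketch of the power-counting structure described above (generic operator labels, not the paper's notation), the hard-scattering Lagrangian and the cross section expand as

      \mathcal{L}_{\text{hard}} = \sum_{n \ge 0} \sum_i C_i^{(n)} \, O_i^{(n)}, \qquad O_i^{(n)} \sim \lambda^{n},

      \mathrm{d}\sigma \sim \big|\mathcal{A}^{(0)} + \mathcal{A}^{(1)} + \mathcal{A}^{(2)} + \cdots\big|^2
        = |\mathcal{A}^{(0)}|^2
        + 2\,\mathrm{Re}\,\mathcal{A}^{(0)*}\mathcal{A}^{(1)}
        + \Big( 2\,\mathrm{Re}\,\mathcal{A}^{(0)*}\mathcal{A}^{(2)} + |\mathcal{A}^{(1)}|^2 \Big)
        + \mathcal{O}(\lambda^3),

    so the $\mathcal{O}(\lambda^2)$ operators cataloged here feed the cross section both through interference with the leading-power amplitude and through squares of the $\mathcal{O}(\lambda)$ amplitudes.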

  16. Original earthquake design basis in light of recent seismic hazard studies

    International Nuclear Information System (INIS)

    Petrovski, D.

    1993-01-01

    To establish the framework within which efforts have been made in the eastern countries to construct earthquake-resistant nuclear power plants, a review of the development and application of the seismic zoning map of the USSR is given. The normative values of seismic intensity and acceleration are discussed from the standpoint of recent probabilistic seismic hazard studies. To that end, the methodology of probabilistic seismic hazard analysis is presented briefly in this paper. (author)

  17. [Operating room during natural disaster: lessons from the 2011 Tohoku earthquake].

    Science.gov (United States)

    Fukuda, Ikuo; Hashimoto, Hiroshi; Suzuki, Yasuyuki; Satomi, Susumu; Unno, Michiaki; Ohuchi, Noriaki; Nakaji, Shigeyuki

    2012-03-01

    The objective of this study is to clarify the damage to operating rooms caused by the 2011 Tohoku Earthquake. To survey structural and non-structural damage in operating theaters, we sent questionnaires to 155 acute care hospitals in the Tohoku area. Questionnaires were returned by 105 hospitals (70.3%). A total of 280 patients were undergoing operations of some kind during the earthquake, and severe seismic tremor greater than JMA Seismic Intensity 6 hit 49 hospitals. Operating room staff experienced life-threatening tremor in 41 hospitals. Blackouts occurred, but the emergency electric supply unit worked immediately in 81 out of 90 hospitals; however, the emergency power plant did not work in 9 hospitals. During the earthquake, materials fell from shelves in 44 hospitals and medical instruments fell down in 14 hospitals. In 5 hospitals, collapse of an operating room wall or ceiling made it impossible to maintain a sterile operative field. Damage to electric power and water supply, together with damage to logistics, made it difficult for many operating rooms to perform routine surgery for several days. The 2011 Tohoku earthquake affected medical supply over a wide area of the Tohoku district and caused dysfunction of operating rooms. Supply-chain management of medical goods should be reconsidered to prepare for severe natural disasters.

  18. Large Earthquakes at the Ibero-Maghrebian Region: Basis for an EEWS

    Science.gov (United States)

    Buforn, Elisa; Udías, Agustín; Pro, Carmen

    2015-09-01

    Large earthquakes (Mw > 6, Imax > VIII) occur at the Ibero-Maghrebian region, extending from a point (12ºW) southwest of Cape St. Vincent to Tunisia, with different characteristics depending on their location, which cause considerable damage and casualties. Seismic activity at this region is associated with the boundary between the lithospheric plates of Eurasia and Africa, which extends from the Azores Islands to Tunisia. The boundary at Cape St. Vincent, which has a clear oceanic nature in the westernmost part, experiences a transition from an oceanic to a continental boundary, with the interaction of the southern border of the Iberian Peninsula, the northern border of Africa, and the Alboran basin between them, corresponding to a wide area of deformation. Further to the east, the plate boundary recovers its oceanic nature following the northern coast of Algeria and Tunisia. The region has been divided into four zones with different seismic characteristics. From west to east, large earthquake occurrence, focal depth, total seismic moment tensor, and average seismic slip velocities for each zone along the region show the differences in seismic release of deformation. This must be taken into account in developing an EEWS for the region.

  19. Basis for Interim Operation for Fuel Supply Shutdown Facility

    International Nuclear Information System (INIS)

    BENECKE, M.W.

    2003-01-01

    This document establishes the Basis for Interim Operation (BIO) for the Fuel Supply Shutdown Facility (FSS) as managed by the 300 Area Deactivation Project (300 ADP) organization in accordance with the requirements of the Project Hanford Management Contract (PHMC) procedure HNF-PRO-700, "Safety Analysis and Technical Safety Requirements". A hazard classification (Benecke 2003a) has been prepared for the facility in accordance with DOE-STD-1027-92, resulting in the assignment of Hazard Category 3 for FSS Facility buildings that store N Reactor fuel materials (303-B, 3712, and 3716). All others are designated Industrial buildings. It is concluded that the risks associated with the current and planned operational mode of the FSS Facility (uranium storage, uranium repackaging and shipment, cleanup, and transition activities, etc.) are acceptable. The potential radiological dose and toxicological consequences for a range of credible uranium storage building events have been analyzed using Hanford accepted methods. Risk Class designations are summarized for representative events in Table 1.6-1. Mitigation was not considered for any event except the random fire event, which exceeds predicted consequences based on the existing source and combustible loading because of an inadvertent increase in combustible loading. For that event, a housekeeping program to manage transient combustibles is credited to reduce the probability. An additional administrative control is established to protect assumptions regarding the source term by limiting inventories of fuel and combustible materials. Another is established to maintain the criticality safety program. Additional defense-in-depth controls are established to perform fire protection system testing, inspection, and maintenance to ensure predicted availability of those systems, and to maintain the radiological control program. It is also concluded that because an accidental nuclear criticality is not credible based on the low uranium enrichment

  20. Relay chatter and operator response after a large earthquake: An improved PRA methodology with case studies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.; Hill, E.E.

    1987-08-01

    The purpose of this project has been to develop and demonstrate improvements in the PRA methodology used for analyzing earthquake-induced accidents at nuclear power reactors. Specifically, the project addresses methodological weaknesses in the PRA systems analysis used for studying post-earthquake relay chatter and for quantifying human response under high stress. An improved PRA methodology for relay-chatter analysis is developed, and its use is demonstrated through analysis of the Zion-1 and LaSalle-2 reactors as case studies. This demonstration analysis is intended to show that the methodology can be applied in actual cases, and the numerical values of core-damage frequency are not realistic. The analysis relies on SSMRP-based methodologies and data bases. For both Zion-1 and LaSalle-2, assuming that loss of offsite power (LOSP) occurs after a large earthquake and that there are no operator recovery actions, the analysis finds very many combinations (Boolean minimal cut sets) involving chatter of three or four relays and/or pressure switch contacts. The analysis finds that the number of min-cut-set combinations is so large that there is a very high likelihood (of the order of unity) that at least one combination will occur after earthquake-caused LOSP. This conclusion depends in detail on the fragility curves and response assumptions used for chatter. Core-damage frequencies are calculated, but they are probably pessimistic because assuming zero credit for operator recovery is pessimistic. The project has also developed an improved PRA methodology for quantifying operator error under high-stress conditions such as after a large earthquake. Single-operator and multiple-operator error rates are developed, and a case study involving an 8-step procedure (establishing feed-and-bleed in a PWR after an earthquake-initiated accident) is used to demonstrate the methodology
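
    The finding that at least one cut-set combination is almost certain to occur follows from simple rare-event arithmetic over a very large number of minimal cut sets; the per-contact chatter probabilities and the cut-set count in the sketch below are hypothetical, not values from the Zion or LaSalle analyses.

      # Hedged sketch: probability that at least one minimal cut set (a specific
      # combination of chattering relay/switch contacts) occurs, assuming independence.
      # The per-contact probabilities and the number of cut sets are hypothetical.

      def cut_set_prob(contact_probs):
          """Probability that every contact in one cut set chatters."""
          p = 1.0
          for q in contact_probs:
              p *= q
          return p

      def prob_at_least_one(cut_sets):
          """1 - product over cut sets of (1 - P(cut set)), assuming independence."""
          p_none = 1.0
          for cs in cut_sets:
              p_none *= 1.0 - cut_set_prob(cs)
          return 1.0 - p_none

      # 5000 hypothetical cut sets, each requiring three contacts with a 10% chatter probability:
      cut_sets = [[0.10, 0.10, 0.10]] * 5000
      print(prob_at_least_one(cut_sets))     # ~0.99, i.e., of the order of unity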

  1. Operational Efficiency And Customer Satisfaction of Restaurants: Basis For Business Operation Enhancement

    Directory of Open Access Journals (Sweden)

    Annie Gay Barlan-Espino

    2017-02-01

    Restaurants’ primary objective is to provide comfort and satisfaction to guests without compromising the operational efficiency of the business. This research aimed to determine the operational efficiency and customer satisfaction of restaurants as a basis for business operation enhancement: specifically, to determine the operational efficiency of the restaurants in terms of kitchen operations and dining operations, and the level of customer satisfaction of the restaurant business in terms of Product, Policies, People, Processes, and Proactivity, as well as the problems encountered by the restaurants in their operation and customer service. A descriptive research design was used, with managers and customers as respondents of the study. It was concluded that the majority of the restaurants have been operating for more than a year with a sufficient number of employees and enough seating capacity to accommodate a large volume of customers. The restaurants are efficient in the aspects of kitchen and dining operations and sometimes encounter problems. Customers are satisfied in terms of the 5 P’s. It was found that there is no significant difference in the operational efficiency of the restaurants when grouped according to profile variables. An action plan for continuous business operation enhancement of operational efficiency and customer satisfaction was proposed.

  2. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    International Nuclear Information System (INIS)

    DODD, E.N.

    1999-01-01

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project has removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, the environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities are begun. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified; this determination considers the remaining material inventories, the form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, no operational events are identified that would result in significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2

  3. Plutonium uranium extraction (PUREX) end state basis for interim operation (BIO) for surveillance and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    DODD, E.N.

    1999-05-12

    This Basis for Interim Operation (BIO) was developed for the PUREX end state condition following completion of the deactivation project. The deactivation project has removed or stabilized the hazardous materials within the facility structure and equipment to reduce the hazards posed by the facility during the surveillance and maintenance (S and M) period, and to reduce the costs associated with the S and M. This document serves as the authorization basis for the PUREX facility, excluding the storage tunnels, railroad cut, and associated tracks, for the deactivated end state condition during the S and M period. The storage tunnels, and associated systems and areas, are addressed in WHC-SD-HS-SAR-001, Rev. 1, PUREX Final Safety Analysis Report. During S and M, the mission of the facility is to maintain the conditions and equipment in a manner that ensures the safety of the workers, the environment, and the public. The S and M phase will continue until the final decontamination and decommissioning (D and D) project and activities are begun. Based on the methodology of DOE-STD-1027-92, Hazards Categorization and Accident Analysis Techniques for Compliance with DOE Order 5480.23, Nuclear Safety Analysis Reports, the final facility hazards category is identified; this determination considers the remaining material inventories, the form and distribution of the material, and the energies present to initiate events of concern. Given the current facility configuration, conditions, and authorized S and M activities, no operational events are identified that would result in significant hazard to any of the target receptor groups (e.g., workers, public, environment). The only accident scenarios identified with consequences to the onsite co-located workers were based on external natural phenomena, specifically an earthquake. The dose consequences of these events are within the current risk evaluation guidelines and are consistent with the expectations for a hazards category 2

  4. Operational Status of PF-Ring and PF-AR after the Earthquake

    International Nuclear Information System (INIS)

    Honda, T; Asaoka, S; Haga, K; Harada, K; Honda, Y; Izawa, M; Kamiya, Y; Kobayashi, Y; Miyajima, T; Miyauchi, H; Nagahashi, S; Nakamura, N; Nogami, T; Obina, T; Ozaki, T; Sagehashi, H; Sakai, H; Sakanaka, S; Sasaki, H; Sato, Y

    2013-01-01

    In 2011, the two SR sources of KEK, the PF-ring and PF-AR, needed to change their operation schedule because of the unprecedented earthquake on March 11. Though the injector linac and the storage rings suffered serious damage, temporary recovery was accomplished quickly and trial operation started in May. Regular user operation could be resumed in October 2011. In the restoration work after the earthquake, some old vacuum components were removed from the PF-ring. This work fortunately had the effect of settling the quadrupole-mode longitudinal instability. For top-up injection of the PF-ring, a pulsed sextupole magnet has been used since 2011 instead of the conventional kicker magnets. The hybrid-fill mode has become available in place of the single-bunch mode. Recently, 10-Hz orbit switching for the tandem circularly polarized undulators has been developed for user operation.

  5. Operational Earthquake Forecasting and Decision-Making in a Low-Probability Environment

    Science.gov (United States)

    Jordan, T. H.; the International Commission on Earthquake Forecasting for Civil Protection

    2011-12-01

    Operational earthquake forecasting (OEF) is the dissemination of authoritative information about the time dependence of seismic hazards to help communities prepare for potentially destructive earthquakes. Most previous work on the public utility of OEF has anticipated that forecasts would deliver high probabilities of large earthquakes; i.e., deterministic predictions with low error rates (false alarms and failures-to-predict) would be possible. This expectation has not been realized. An alternative to deterministic prediction is probabilistic forecasting based on empirical statistical models of aftershock triggering and seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains in excess of 100 relative to long-term forecasts. The utility of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing OEF in this sort of "low-probability environment." The need to move more quickly has been underscored by recent seismic crises, such as the 2009 L'Aquila earthquake sequence, in which an anxious public was confused by informal and inaccurate earthquake predictions. After the L'Aquila earthquake, the Italian Department of Civil Protection appointed an International Commission on Earthquake Forecasting (ICEF), which I chaired, to recommend guidelines for OEF utilization. Our report (Ann. Geophys., 54, 4, 2011; doi: 10.4401/ag-5350) concludes: (a) Public sources of information on short-term probabilities should be authoritative, scientific, open, and timely, and need to convey epistemic uncertainties. (b) Earthquake probabilities should be based on operationally qualified, regularly updated forecasting systems. (c) All operational models should be evaluated
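
    A rough sense of the "low-probability environment" can be obtained from a small back-of-the-envelope sketch. The numbers below (a 2%-per-year background rate, a one-week window, a probability gain of 100) are illustrative placeholders and not values from the ICEF report; only the structure of the calculation, a gain applied to a small background probability, follows the abstract.

    ```python
    import math

    # Illustrative placeholders, not values from the ICEF report.
    annual_rate = 0.02   # long-term rate of a large earthquake (events per year)
    window_days = 7.0    # forecasting interval in days
    gain = 100.0         # short-term probability gain from clustering models

    # Background (long-term) probability of at least one event in the window,
    # assuming Poisson occurrence.
    p_long = 1.0 - math.exp(-annual_rate * window_days / 365.25)

    # Short-term forecast probability after applying the gain, capped at 1.
    p_short = min(1.0, gain * p_long)

    print(f"background weekly probability:   {p_long:.4%}")   # about 0.04%
    print(f"short-term forecast probability: {p_short:.2%}")  # a few percent
    ```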

  6. A basis for sound management-plant-operator interface

    International Nuclear Information System (INIS)

    Oak, H.D.

    1983-01-01

    Sound management-plant-operator interface is based on the application of suitable quality assurance principles. Quality assurance can aptly be termed "putting priority and emphasis where such are due". Accordingly, the application of suitable quality assurance principles achieves the all-important combination of both safety and production. Neither of these is mutually exclusive of the other, and both together establish the prime foundation for long-term nuclear power plant operation. This paper presents the application of suitable quality assurance principles to the management, the plant, the operators, and the interface between them. (author)

  7. Pakistan Earthquake Relief Operations: Leveraging Humanitarian Missions for Strategic Success

    Science.gov (United States)

    2010-12-01

    On Christmas morning 2005, at Saint Patrick's Catholic Church in Auckland, New Zealand, a priest stepped up to... economically difficult to sustain. However, the HA/DR campaign in Pakistan, Operation Lifeline, provides a useful model of how humanitarian... The two field hospitals became symbols of the American-Pakistani military partnership and an asymmetric advantage for the United States as

  8. Design basis for the operational modelling of the atmospheric dispersion

    International Nuclear Information System (INIS)

    Doury, A.

    1987-10-01

    Based on the latest practices at the Institut de Protection et de Surete Nucleaire of the Commissariat a l'Energie Atomique (CEA), we first present the basic elements of a simple and adequate modelling method for assessing hypothetical atmospheric pollution from transient or continuous discharges with any given kinetics, under various weather conditions which are not necessarily stationary or uniform and which may include little or no wind. Discharges are treated as sequences of instantaneous successive puffs. The dispersion parameters, deduced from experiments or observations, are functions of the transfer time and cover all time and space scales. The restrictions of use are indicated, especially concerning heavy gases. Finally, simple formulas are proposed for concentrations and depositions so that the orders of magnitude can be estimated rapidly with almost no computation.
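
    The puff-superposition scheme summarized above can be sketched in a few lines. Everything numerical below (the power-law sigma coefficients, wind speed, release height, and puff spacing) is a placeholder rather than the Doury parameterization; only the structure, instantaneous Gaussian puffs whose spreads grow with transfer time, summed over successive release times and reflected at the ground, follows the abstract.

    ```python
    import numpy as np

    # Placeholder power-law dispersion coefficients (not the tabulated values).
    A_H, K_H = 0.5, 0.9   # horizontal spread sigma_h(tau) = A_H * tau**K_H  (m)
    A_Z, K_Z = 0.3, 0.8   # vertical spread   sigma_z(tau) = A_Z * tau**K_Z  (m)

    def puff_concentration(q, x, y, z, t, u=2.0, h=10.0):
        """Air concentration at (x, y, z) and time t from one instantaneous puff
        of inventory q released at (0, 0, h) at t = 0, with ground reflection."""
        if t <= 0.0:
            return 0.0
        sh, sz = A_H * t ** K_H, A_Z * t ** K_Z
        norm = q / ((2.0 * np.pi) ** 1.5 * sh ** 2 * sz)
        horiz = np.exp(-((x - u * t) ** 2 + y ** 2) / (2.0 * sh ** 2))
        vert = (np.exp(-(z - h) ** 2 / (2.0 * sz ** 2))
                + np.exp(-(z + h) ** 2 / (2.0 * sz ** 2)))
        return norm * horiz * vert

    def release_concentration(rate, duration, x, y, z, t, dt=60.0):
        """Treat a continuous release as a train of instantaneous puffs emitted
        every dt seconds and superpose their contributions at observation time t."""
        return sum(puff_concentration(rate * dt, x, y, z, t - t_rel)
                   for t_rel in np.arange(0.0, min(duration, t), dt))

    # Example: 1 h release of 1e9 Bq/s, sampled 2 km downwind at ground level.
    print(release_concentration(1e9, 3600.0, 2000.0, 0.0, 0.0, t=1800.0))
    ```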

  9. Design basis for the operational modelling of the atmospheric dispersion

    International Nuclear Information System (INIS)

    Doury, A.

    1987-11-01

    Based on the latest practices at the Institut de Protection et de Surete Nucleaire of the Commissariat a l'Energie Atomique (CEA), we first present the basic elements of a simple and adequate modelling method for assessing hypothetical atmospheric pollution from transient or continuous discharges with any given kinetics, under various weather conditions which are not necessarily stationary or uniform and which may include little or no wind. Discharges are treated as sequences of instantaneous successive puffs. The dispersion parameters, deduced from experiments or observations, are functions of the transfer time and cover all time and space scales. The restrictions of use are indicated, especially concerning heavy gases. Finally, simple formulas are proposed for concentrations and depositions so that the orders of magnitude can be estimated rapidly with almost no computation.

  10. Performance of wire-type Rn detectors operated with gas gain in ambient air in view of its possible application to early earthquake predictions

    CERN Document Server

    Charpak, Georges; Breuil, P; Nappi, E; Martinengo, P; Peskov, V

    2010-01-01

    We describe a detector of alpha particles based on wire-type counters (single-wire and multiwire) operating in ambient air at high gas gains (100-1000). The main advantages of these detectors are: low cost, robustness and ability to operate in humid air. The minimum detectable activity achieved with the multiwire detector for an integration time of 1 min is 140 Bq per m3, which is comparable to that featured by commercial devices. Owing to such features the detector is suited for massive application, for example for continuous monitoring of Rn or Po contaminations or, as discussed in the paper, its use in a network of Rn counters in areas affected by earthquakes in order to verify, on a solid statistical basis, the envisaged correlation between the sudden Rn appearance and a forthcoming earthquake.

  11. Human reliability assessment on the basis of operating experience

    International Nuclear Information System (INIS)

    Straeter, O.

    1997-01-01

    For development of the methodology, available models for qualitative assessment of human errors (e.g. by Swain, Hacker, Rasmussen) and a variety of known systematic approaches for quantitative assessment of inadequate human action (e.g. THERP, ASEP, HCR, SLIM) were taken as a basis to establish a job specification, which in turn was used for developing a method for acquisition, characterisation and evaluation of errors. This method encompasses the two processes of event analysis and event evaluation: The first step comprises analysis of events by analysis of information describing the conditions and scenarios of relevance to the inadequate human action examined. In addition to the description of process sequences, information is taken into account on possible conditions that may bring about failure. As an assessment of human reliability requires manifold approaches to evaluation, a connectionistic procedure was developed for evaluation of the compiled events, drawing on various approaches from the domain of artificial intelligence (AI). This procedure yields both qualitative and quantitative information through a homogeneous approach. (orig./GL)

  12. TECHNICAL BASIS FOR VENTILATION REQUIREMENTS IN TANK FARMS OPERATING SPECIFICATIONS DOCUMENTS

    Energy Technology Data Exchange (ETDEWEB)

    BERGLIN, E J

    2003-06-23

    This report provides the technical basis for the high-efficiency particulate air (HEPA) filtration requirements of Hanford tank farm ventilation systems (sometimes known as heating, ventilation, and air conditioning [HVAC] systems) to support limits defined in Process Engineering Operating Specification Documents (OSDs). This technical basis includes a review of the older technical bases and provides clarifications, as necessary, to technical basis limit revisions or justifications. This document provides an updated technical basis for tank farm ventilation systems related to the OSDs for double-shell tanks (DSTs), single-shell tanks (SSTs), double-contained receiver tanks (DCRTs), catch tanks, and various other miscellaneous facilities.

  13. Probabilistic model to forecast earthquakes in the Zemmouri (Algeria) seismoactive area on the basis of moment magnitude scale distribution functions

    Science.gov (United States)

    Baddari, Kamel; Makdeche, Said; Bellalem, Fouzi

    2013-02-01

    Based on the moment magnitude scale, a probabilistic model was developed to predict the occurrences of strong earthquakes in the seismoactive area of Zemmouri, Algeria. Firstly, the distributions of earthquake magnitudes M_i were described using the distribution function F_0(m), which adjusts the magnitudes considered as independent random variables. Secondly, the obtained result, i.e., the distribution function F_0(m) of the variables M_i, was used to deduce the distribution functions G(x) and H(y) of the variables Y_i = log M_{0,i} and Z_i = M_{0,i}, where the sequences (Y_i) and (Z_i) are independent. Thirdly, forecasts of the moments of future earthquakes in the studied area are given.
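
    The chain of distribution functions described above, F_0(m) for magnitudes, G(y) for Y = log10 M_0 and H(z) for the seismic moment Z = M_0, can be illustrated with a minimal sketch. The exponential (Gutenberg-Richter) form assumed for F_0(m) and the Hanks-Kanamori conversion log10 M_0 = 1.5 M + 9.1 (M_0 in N m) are illustrative assumptions; the paper fits its own distribution to the Zemmouri catalogue.

    ```python
    import numpy as np

    def fit_gr_beta(magnitudes, m_min):
        """Maximum-likelihood estimate of the exponential (Gutenberg-Richter)
        parameter beta for magnitudes at or above the completeness level m_min."""
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= m_min]
        return 1.0 / (m.mean() - m_min)

    def magnitude_cdf(m, beta, m_min):
        """F_0(m): exponential magnitude distribution truncated at m_min."""
        return 1.0 - np.exp(-beta * (np.asarray(m, dtype=float) - m_min))

    def log_moment_cdf(y, beta, m_min):
        """G(y) for Y = log10 M_0, using log10 M_0 = 1.5 M + 9.1 (M_0 in N*m),
        i.e. G(y) = F_0((y - 9.1) / 1.5)."""
        return magnitude_cdf((np.asarray(y, dtype=float) - 9.1) / 1.5, beta, m_min)

    def moment_cdf(z, beta, m_min):
        """H(z) for the seismic moment Z = M_0: H(z) = G(log10 z)."""
        return log_moment_cdf(np.log10(np.asarray(z, dtype=float)), beta, m_min)

    # Example with a small synthetic catalogue of magnitudes above m_min = 4.0.
    beta = fit_gr_beta([4.1, 4.3, 4.8, 5.2, 4.5, 6.0, 4.2], m_min=4.0)
    print(moment_cdf(1.0e18, beta, m_min=4.0))   # P(M_0 <= 1e18 N*m)
    ```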

  14. Ground motion for the design basis earthquake at the Savannah River Site, South Carolina based on a deterministic approach

    International Nuclear Information System (INIS)

    Youngs, R.R.; Coppersmith, K.J.; Silva, W.J.; Stephenson, D.E.

    1991-01-01

    Ground motion assessments are presented for evaluation of the seismic safety of K-Reactor at the Savannah River Site. Two earthquake sources were identified as the most significant to seismic hazard at the site: an M 7.5 earthquake occurring at Charleston, South Carolina, and an M 5 event occurring in the site vicinity. These events control the low-frequency and high-frequency portions of the spectrum, respectively. Three major issues were identified in the assessment of ground motions for the Savannah River site: specification of the appropriate stress drop for the Charleston source earthquake, specification of the appropriate levels of soil damping at large depths for site response analyses, and the appropriateness of western US recordings for specification of ground motions in the eastern US.

  15. Flammable gas deflagration consequence calculations for the tank waste remediation system basis for interim operation

    Energy Technology Data Exchange (ETDEWEB)

    Van Vleet, R.J., Westinghouse Hanford

    1996-08-13

    This paper calculates the radiological dose consequences and the toxic exposures for deflagration accidents at various Tank Waste Remediation System facilities. These will be used in support of the Tank Waste Remediation System Basis for Interim Operation. The attached SD documents the originator's analysis only. It shall not be used as the final or sole document for effecting changes to an authorization basis or safety basis for a facility or activity.

  16. Basis for Interim Operation for the K-Reactor in Cold Standby

    Energy Technology Data Exchange (ETDEWEB)

    Shedrow, B.

    1998-10-19

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993).

  17. Basis for Interim Operation for the K-Reactor in Cold Standby

    International Nuclear Information System (INIS)

    Shedrow, B.

    1998-01-01

    The Basis for Interim Operation (BIO) document for K Reactor in Cold Standby and the L- and P-Reactor Disassembly Basins was prepared in accordance with the draft DOE standard for BIO preparation (dated October 26, 1993)

  18. Multiparameter monitoring of short-term earthquake precursors and its physical basis. Implementation in the Kamchatka region

    Directory of Open Access Journals (Sweden)

    Pulinets Sergey

    2016-01-01

    We apply an experimental approach to multiparameter monitoring of short-term earthquake precursors whose reliability was confirmed by the Lithosphere-Atmosphere-Ionosphere Coupling (LAIC) model created recently [1]. A key element of the model is the process of ion-induced nucleation (IIN) and formation of cluster ions occurring as a result of the ionization of the near-surface air layer by radon emanating from the Earth's crust within the earthquake preparation zone. This process is similar to the formation of droplet embryos for cloud formation under the action of galactic cosmic rays. The consequence of this process is the generation of a number of precursors that can be divided into two groups: (a) thermal and meteorological, and (b) electromagnetic and ionospheric. We demonstrate elements of prospective monitoring of some strong earthquakes in the Kamchatka region and statistical results for the chemical potential correction parameter for more than 10 years of observations for earthquakes with M ≥ 6. As a further experimental step, data from the monitoring of Kamchatka volcanoes are also presented.

  19. An operator basis for the Standard Model with an added scalar singlet

    Energy Technology Data Exchange (ETDEWEB)

    Gripaios, Ben [Cavendish Laboratory, J.J. Thomson Avenue, Cambridge (United Kingdom); Sutherland, Dave [Cavendish Laboratory, J.J. Thomson Avenue, Cambridge (United Kingdom); Kavli Institute for Theoretical Physics, UCSB Kohn Hall, Santa Barbara CA (United States)

    2016-08-17

    Motivated by the possible di-gamma resonance at 750 GeV, we present a basis of effective operators for the Standard Model plus a scalar singlet at dimensions 5, 6, and 7. We point out that an earlier list at dimensions 5 and 6 contains two redundant operators at dimension 5.

  20. Viability of Event Management Business in Batangas City, Philippine: Basis for Business Operation Initiatives

    OpenAIRE

    Jeninah Christia D. Borbon

    2016-01-01

    The research study on Viability of Event Management Business in Batangas City: Basis for Business Operation Initiatives aimed to assess the viability of this type of business using Thompson’s (2005) Dimension of Business Viability as its tool in order to create business operation initiatives. It provided a good framework for defining success factors in entrepreneurial operation initiatives in a specific business type – event management. This study utilized event organizers based i...

  1. Comparative Analysis of Emergency Response Operations: Haiti Earthquake in January 2010 and Pakistan’s Flood in 2010

    Science.gov (United States)

    2011-09-01

    Keywords: Earthquake, Pakistan, Flood, Emergency Response Operations, International Community, HA/DR, United Nations, FRC, NDMA (National Disaster Management Authority), ICT.

  2. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake

    Science.gov (United States)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The Turkish Catastrophe Insurance Pool (TCIP) system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and with approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering the building losses incurred in Istanbul in the occurrence of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. By such a model the losses will not be indemnified but will be directly calculated on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing

  3. A probabilistic risk assessment of the LLNL Plutonium facility's evaluation basis fire operational accident

    International Nuclear Information System (INIS)

    Brumburgh, G.

    1994-01-01

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility

  4. Preventive maintenance basis: Volume 1 -- Air-operated valves. Final report

    International Nuclear Information System (INIS)

    Worledge, D.; Hinchcliffe, G.

    1997-07-01

    US nuclear plants are implementing preventive maintenance (PM) tasks with little documented basis beyond fundamental vendor information to support the tasks or their intervals. The Preventive Maintenance Basis project provides utilities with the technical basis for PM tasks and task intervals associated with 40 specific components such as valves, electric motors, pumps, and HVAC equipment. This report provides an overview of the PM Basis project and describes use of the PM Basis database. This document provides a program of PM tasks suitable for application to Air Operated Valves (AOV's) in nuclear power plants. The PM tasks that are recommended provide a cost-effective way to intercept the causes and mechanisms that lead to degradation and failure. They can be used, in conjunction with material from other sources, to develop a complete PM program or to improve an existing program. Users of this information will be utility managers, supervisors, craft technicians, and training instructors responsible for developing, optimizing, or fine-tuning PM programs

  5. Preventive maintenance basis: Volume 16 -- Power operated relief valves, solenoid actuated. Final report

    International Nuclear Information System (INIS)

    Worledge, D.; Hinchcliffe, G.

    1997-07-01

    US nuclear plants are implementing preventive maintenance (PM) tasks with little documented basis beyond fundamental vendor information to support the tasks or their intervals. The Preventive Maintenance Basis project provides utilities with the technical basis for PM tasks and task intervals associated with 40 specific components such as valves, electric motors, pumps, and HVAC equipment. This report provides an overview of the PM Basis project and describes use of the PM Basis database. This volume 16 of the report provides a program of PM tasks suitable for application to power operated relief valves (PORV's) that are solenoid actuated. The PM tasks that are recommended provide a cost-effective way to intercept the causes and mechanisms that lead to degradation and failure. They can be used, in conjunction with material from other sources, to develop a complete PM program or to improve an existing program. Users of this information will be utility managers, supervisors, craft technicians, and training instructors responsible for developing, optimizing, or fine-tuning PM programs

  6. A probabilistic risk assessment of the LLNL Plutonium Facility's evaluation basis fire operational accident. Revision 1

    International Nuclear Information System (INIS)

    Brumburgh, G.P.

    1995-01-01

    The Lawrence Livermore National Laboratory (LLNL) Plutonium Facility conducts numerous programmatic activities involving plutonium, including device fabrication, development of improved and/or unique fabrication techniques, metallurgy research, and laser isotope separation. A Safety Analysis Report (SAR) for the Building 332 Plutonium Facility was completed in July 1994 to address operational safety and acceptable risk to employees, the public, government property, and the environment. This paper outlines the PRA analysis of the Evaluation Basis Fire (EBF) operational accident. The EBF postulates the worst-case programmatic impact event for the Plutonium Facility

  7. An evaluation of an operating BWR piping system damping during earthquake by applying auto regressive analysis

    International Nuclear Information System (INIS)

    Kitada, Y.; Makiguchi, M.; Komori, A.; Ichiki, T.

    1985-01-01

    The records of three earthquakes which had induced significant earthquake responses in the piping system were obtained with the earthquake observation system. In the present paper, first, the eigenvalue analysis results for the natural vibrations of the piping system, based on the piping support (boundary) conditions, are described and, second, the frequency and damping factor evaluation results for each vibrational mode are described. In the present study, the Auto Regressive (AR) analysis method is used in the evaluation of natural frequencies and damping factors. The AR analysis applied here has the capability of evaluating natural frequencies and damping factors directly from earthquake records observed on a piping system, without any information on the input motions to the system. (orig./HP)
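
    A common way to implement the AR-based evaluation mentioned above is sketched here: fit an autoregressive model to a measured response record by least squares and convert the discrete-time poles to natural frequencies and damping ratios. The model order, pre-filtering, and the screening of spurious computational poles used in the actual study are not reproduced, so this is a minimal illustration rather than the authors' procedure.

    ```python
    import numpy as np

    def ar_modal_id(y, dt, order=20):
        """Estimate natural frequencies (Hz) and damping ratios from a response
        record y sampled at interval dt, by fitting an AR(order) model with
        least squares and mapping its poles to continuous-time modal parameters."""
        y = np.asarray(y, dtype=float) - np.mean(y)
        # Linear predictor: y[n] = a_1*y[n-1] + ... + a_p*y[n-p] + e[n]
        X = np.column_stack([y[order - k - 1:len(y) - k - 1] for k in range(order)])
        a, *_ = np.linalg.lstsq(X, y[order:], rcond=None)
        # Poles are the roots of z**p - a_1*z**(p-1) - ... - a_p
        poles = np.roots(np.concatenate(([1.0], -a)))
        s = np.log(poles[np.abs(poles) > 1e-12]) / dt    # map to continuous time
        s = s[np.imag(s) > 0]                            # one of each conjugate pair
        freqs = np.abs(s) / (2.0 * np.pi)                # natural frequencies (Hz)
        zeta = -np.real(s) / np.abs(s)                   # damping ratios
        idx = np.argsort(freqs)
        return freqs[idx], zeta[idx]

    # Example: a 5 Hz mode with 2% damping plus noise, sampled at 100 Hz.
    dt, t = 0.01, np.arange(0.0, 20.0, 0.01)
    y = np.exp(-0.02 * 2 * np.pi * 5 * t) * np.sin(2 * np.pi * 5 * t)
    y += 0.01 * np.random.default_rng(0).standard_normal(t.size)
    print(ar_modal_id(y, dt))
    ```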

  8. Computing single step operators of logic programming in radial basis function neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong [School of Mathematical Sciences, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia)

    2014-07-10

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (T_p: I→I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to get to the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.
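
    The single-step (immediate consequence) operator itself is simple to state before any neural encoding: T_P maps an interpretation I to the set of clause heads whose bodies are true in I, and iterating it from the empty interpretation reaches its least fixed point for a definite program. The sketch below shows this for a small propositional program; the training-set generation, the radial basis function network, and the particle swarm training described in the abstract are not reproduced here.

    ```python
    def tp_operator(program):
        """Single-step operator T_P of a propositional definite logic program,
        given as a list of (head, [body_atoms]) clauses. An interpretation is a
        frozenset of the atoms mapped to true."""
        def tp(interpretation):
            return frozenset(head for head, body in program
                             if all(atom in interpretation for atom in body))
        return tp

    def least_fixed_point(tp, start=frozenset()):
        """Iterate T_P until it stops changing; for a definite program this
        yields the least Herbrand model (the steady state mentioned above)."""
        current = start
        while True:
            nxt = tp(current)
            if nxt == current:
                return current
            current = nxt

    # Example program:  a.   b :- a.   c :- a, b.   d :- e.
    program = [("a", []), ("b", ["a"]), ("c", ["a", "b"]), ("d", ["e"])]
    print(least_fixed_point(tp_operator(program)))   # frozenset({'a', 'b', 'c'})
    ```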

  9. Computing single step operators of logic programming in radial basis function neural networks

    Science.gov (United States)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-07-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (T_p: I→I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to get to the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  10. Computing single step operators of logic programming in radial basis function neural networks

    International Nuclear Information System (INIS)

    Hamadneh, Nawaf; Sathasivam, Saratha; Choon, Ong Hong

    2014-01-01

    Logic programming is the process that leads from an original formulation of a computing problem to executable programs. A normal logic program consists of a finite set of clauses. A valuation I of a logic program is a mapping from ground atoms to false or true. The single step operator of any logic program is defined as a function (T_p: I→I). Logic programming is well-suited to building artificial intelligence systems. In this study, we established a new technique to compute the single step operators of logic programming in radial basis function neural networks. To do that, we proposed a new technique to generate the training data sets of single step operators. The training data sets are used to build the neural networks. We used the recurrent radial basis function neural networks to get to the steady state (the fixed point of the operators). To improve the performance of the neural networks, we used the particle swarm optimization algorithm to train the networks.

  11. The Mixed Waste Management Facility. Design basis integrated operations plan (Title I design)

    International Nuclear Information System (INIS)

    1994-12-01

    The Mixed Waste Management Facility (MWMF) will be a fully integrated, pilot-scale facility for the demonstration of low-level, organic-matrix mixed waste treatment technologies. It will provide the bridge from bench-scale demonstrated technologies to the deployment and operation of full-scale treatment facilities. The MWMF is a key element in reducing the risk in deployment of effective and environmentally acceptable treatment processes for organic mixed-waste streams. The MWMF will provide the engineering test data, formal evaluation, and operating experience that will be required for these demonstration systems to become accepted by the EPA and deployable in waste treatment facilities. The deployment will also demonstrate how to approach the permitting process with the regulatory agencies and how to operate and maintain the processes in a safe manner. This document describes, at a high level, how the facility will be designed and operated to achieve this mission. It frequently refers the reader to additional documentation that provides more detail in specific areas. Effective evaluation of a technology consists of a variety of informal and formal demonstrations involving individual technology systems or subsystems, integrated technology system combinations, or complete integrated treatment trains. Informal demonstrations will typically be used to gather general operating information and to establish a basis for development of formal demonstration plans. Formal demonstrations consist of a specific series of tests that are used to rigorously demonstrate the operation or performance of a specific system configuration.

  12. Chapter 8: Plasma operation and control [Progress in the ITER Physics Basis (PIPB)

    International Nuclear Information System (INIS)

    Gribov, Y.; Humphreys, D.; Kajiwara, K.; Lazarus, E.A.; Lister, J.B.; Ozeki, T.; Portone, A.; Shimada, M.; Sips, A.C.C.; Wesley, J.C.

    2007-01-01

    The ITER plasma control system has the same functional scope as the control systems in present tokamaks. These are plasma operation scenario sequencing, plasma basic control (magnetic and kinetic), plasma advanced control (control of RWMs, NTMs, ELMs, error fields, etc.) and plasma fast shutdown. This chapter considers only plasma initiation and plasma basic control. It describes the progress achieved in these areas in tokamak experiments since the ITER Physics Basis (1999 Nucl. Fusion 39 2577) was written, and the results of the assessment of ITER's ability to provide plasma initiation and basic control. This assessment was done for the present ITER design (15 MA machine) at a more detailed level than for the 1998 ITER design (21 MA machine) described in the ITER Physics Basis (1999 Nucl. Fusion 39 2577). The experiments on plasma initiation performed in DIII-D and JT-60U, as well as the theoretical studies performed for ITER, have demonstrated that, within specified assumptions on the plasma confinement and the impurity influx, ITER can produce plasma initiation in a low toroidal electric field (0.3 V/m), if it is assisted by about 2 MW of ECRF heating. The plasma basic control includes control of the plasma current, position and shape (the plasma magnetic control), as well as control of other plasma global parameters or their profiles (the plasma performance control). The magnetic control is based on more reliable and simpler models of the control objects than those available at present for the plasma kinetic control. Moreover, the real-time diagnostics used for the magnetic control are in many cases more precise than those used for the kinetic control. For these reasons, the plasma magnetic control has been developed for modern tokamaks and assessed for ITER better than the kinetic control. However, significant progress has been achieved in plasma performance control during the last few years. Although the physics basis of plasma operation

  13. The investigation of the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of the injuries due to occupational accidents.

    Science.gov (United States)

    Hekimoglu, Yavuz; Dursun, Recep; Karadas, Sevdegul; Asirdizer, Mahmut

    2015-10-01

    The purpose of this study is to identify the impacts of major disasters, on the basis of the Van earthquake (October 23, 2011, Turkey), on the profile of the injuries due to occupational accidents. In this study, we evaluated 245 patients of occupational accidents who were admitted to emergency services of Van city hospitals in the 1-year periods including pre-earthquake and post-earthquake. We determined that there was a 63.4% (P accidents in the post-earthquake period compared to the pre-earthquake period. Also, injuries due to occupational accidents increased 211% (P accidents. In this study, the impact of disasters such as earthquakes on accidents at work was evaluated in a way we have not seen in the literature. This study emphasizes that governments should establish regulations and processes relating to post-disaster work before a disaster occurs, taking into account factors that may increase work-related accidents. Copyright © 2015 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  14. Ecological Equivalence Assessment Methods: What Trade-Offs between Operationality, Scientific Basis and Comprehensiveness?

    Science.gov (United States)

    Bezombes, Lucie; Gaucherand, Stéphanie; Kerbiriou, Christian; Reinert, Marie-Eve; Spiegelberger, Thomas

    2017-08-01

    In many countries, biodiversity compensation is required to counterbalance negative impacts of development projects on biodiversity by carrying out ecological measures, called offsets when the goal is to reach "no net loss" of biodiversity. One main issue is to ensure that offset gains are equivalent to impact-related losses. Ecological equivalence is assessed with equivalence assessment methods that take into account a range of key considerations that we summarize as ecological, spatial, temporal, and uncertainty-related. When equivalence assessment methods take all of these considerations into account, we call them "comprehensive". Equivalence assessment methods should also aim to be science-based and operational, which is challenging. Many equivalence assessment methods have been developed worldwide but none is fully satisfying. In the present study, we examine 13 equivalence assessment methods in order to identify (i) their general structure and (ii) the synergies and trade-offs between equivalence assessment method characteristics related to operationality, scientific basis, and comprehensiveness (called "challenges" in this paper). We evaluate each equivalence assessment method on the basis of 12 criteria describing the level of achievement of each challenge. We observe that all equivalence assessment methods share a general structure, with possible improvements in the choice of target biodiversity, the indicators used, the integration of landscape context, and the multipliers reflecting time lags and uncertainties. We show that no equivalence assessment method combines all challenges perfectly. There are trade-offs between and within the challenges: operationality tends to be favored while scientific bases are integrated heterogeneously in equivalence assessment method development. One way of improving the combination of challenges would be the use of offset-dedicated databases providing scientific feedback on previous offset measures.

  15. Viability of Event Management Business in Batangas City, Philippine: Basis for Business Operation Initiatives

    Directory of Open Access Journals (Sweden)

    Jeninah Christia D. Borbon

    2016-11-01

    The research study on Viability of Event Management Business in Batangas City: Basis for Business Operation Initiatives aimed to assess the viability of this type of business using Thompson's (2005) Dimension of Business Viability as its tool in order to create business operation initiatives. It provided a good framework for defining success factors in entrepreneurial operation initiatives in a specific business type, event management. This study utilized event organizers based in Batangas, a popular southern province which is also a popular destination for many types of events. Findings showed that the event management business in Batangas City is generally a personal-event type of business whose years of operation range from one to three, mostly linked to church or reception venues and usually offering on-the-day coordination. In the assessment of its perceived viability, it was found that this type of business is moderately viable in terms of market, technical, business model, management model, economic and financial, and exit strategy dimensions. Among all the dimensions tested, only market, management model, economic and financial, and exit strategy showed a significant relationship with the profile variables of the event management business. Among the problems encountered, those that got the highest rate were demanding clients, overbooking of reservations/exceeding numbers of guests, and failure to meet spectators' and/or competitors' expectations. The recommended business operation initiatives were based on the weaknesses discovered using Thompson's Dimension of Business Viability Model.

  16. Representation of discrete Steklov-Poincare operator arising in domain decomposition methods in wavelet basis

    Energy Technology Data Exchange (ETDEWEB)

    Jemcov, A.; Matovic, M.D. [Queen's Univ., Kingston, Ontario (Canada)]

    1996-12-31

    This paper examines the sparse representation and preconditioning of a discrete Steklov-Poincare operator which arises in domain decomposition methods. A non-overlapping domain decomposition method is applied to a second-order self-adjoint elliptic operator (Poisson equation), with homogeneous boundary conditions, as a model problem. It is shown that the discrete Steklov-Poincare operator allows a sparse representation with a bounded condition number in a wavelet basis if the transformation is followed by thresholding and rescaling. These two steps combined enable the effective use of Krylov subspace methods as an iterative solution procedure for the system of linear equations. Finding the solution of an interface problem in domain decomposition methods, known as a Schur complement problem, has been shown to be equivalent to the discrete form of the Steklov-Poincare operator. A common way to obtain the Schur complement matrix is to order the matrix of the discrete differential operator into subdomain node groups and then block-eliminate the interface nodes. The result is a dense matrix which corresponds to the interface problem. This is equivalent to reducing the original problem to several smaller differential problems and one boundary integral equation problem for the subdomain interface.
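
    The block elimination mentioned above can be written down directly for a small dense matrix. This is only a toy illustration of how the interface (Schur complement / Steklov-Poincare) operator arises; the point of the paper is precisely to avoid forming this dense matrix explicitly by working in a wavelet basis with thresholding and rescaling, which the sketch does not attempt.

    ```python
    import numpy as np

    def schur_complement(A, interior, interface):
        """Interface operator S = A_GG - A_GI * inv(A_II) * A_IG obtained by
        block-eliminating the interior unknowns of a discretized elliptic
        operator A, given index lists for interior and interface nodes."""
        A = np.asarray(A, dtype=float)
        A_II = A[np.ix_(interior, interior)]
        A_IG = A[np.ix_(interior, interface)]
        A_GI = A[np.ix_(interface, interior)]
        A_GG = A[np.ix_(interface, interface)]
        return A_GG - A_GI @ np.linalg.solve(A_II, A_IG)

    # Tiny example: 1D Poisson matrix on 5 unknowns, with the middle node as the
    # interface separating two subdomains.
    n = 5
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    interior, interface = [0, 1, 3, 4], [2]
    print(schur_complement(A, interior, interface))   # dense 1x1 interface operator
    ```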

  17. Perturbation theory of low-dimensional quantum liquids. I. The pseudoparticle-operator basis

    International Nuclear Information System (INIS)

    Carmelo, J.M.P.; Castro Neto, A.H.; Campbell, D.K.

    1994-01-01

    We introduce an operator algebra for the description of the low-energy physics of one-dimensional, integrable, multicomponent quantum liquids. Considering the particular case of the Hubbard chain in a magnetic field and chemical potential, we show that at low energy its Bethe-ansatz solution can be interpreted in terms of a pseudoparticle-operator algebra. Our algebraic approach provides a concise interpretation of, and justification for, several recent studies of low-energy excitations and transport which have been based on detailed analyses of specific Bethe-ansatz eigenfunctions and eigenenergies. A central point is that the exact ground state of the interacting many-electron problem is the noninteracting pseudoparticle ground state. Furthermore, in the pseudoparticle basis, the quantum problem becomes perturbative, i.e., the two-pseudoparticle forward-scattering vertices and amplitudes do not diverge, and one can define a many-pseudoparticle perturbation theory. We write the general quantum-liquid Hamiltonian in the pseudoparticle basis and show that the pseudoparticle-perturbation theory leads, in a natural way, to the generalized Landau-liquid approach

  18. A prototype operational earthquake loss model for California based on UCERF3-ETAS – A first look at valuation

    Science.gov (United States)

    Field, Edward; Porter, Keith; Milner, Kevin

    2017-01-01

    We present a prototype operational loss model based on UCERF3-ETAS, which is the third Uniform California Earthquake Rupture Forecast with an Epidemic Type Aftershock Sequence (ETAS) component. As such, UCERF3-ETAS represents the first earthquake forecast to relax fault segmentation assumptions and to include multi-fault ruptures, elastic-rebound, and spatiotemporal clustering, all of which seem important for generating realistic and useful aftershock statistics. UCERF3-ETAS is nevertheless an approximation of the system, however, so usefulness will vary and potential value needs to be ascertained in the context of each application. We examine this question with respect to statewide loss estimates, exemplifying how risk can be elevated by orders of magnitude due to triggered events following various scenario earthquakes. Two important considerations are the probability gains, relative to loss likelihoods in the absence of main shocks, and the rapid decay of gains with time. Significant uncertainties and model limitations remain, so we hope this paper will inspire similar analyses with respect to other risk metrics to help ascertain whether operationalization of UCERF3-ETAS would be worth the considerable resources required.

  19. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    International Nuclear Information System (INIS)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs

  20. An approach to estimating radiological risk of offsite release from a design basis earthquake for the Process Experimental Pilot Plant (PREPP)

    Energy Technology Data Exchange (ETDEWEB)

    Lucero, V.; Meale, B.M.; Reny, D.A.; Brown, A.N.

    1990-09-01

    In compliance with Department of Energy (DOE) Order 6430.1A, a seismic analysis was performed on DOE's Process Experimental Pilot Plant (PREPP), a facility for processing low-level and transuranic (TRU) waste. Because no hazard curves were available for the Idaho National Engineering Laboratory (INEL), DOE guidelines were used to estimate the frequency for the specified design-basis earthquake (DBE). A dynamic structural analysis of the building was performed, using the DBE parameters, followed by a probabilistic risk assessment (PRA). For the PRA, a functional organization of the facility equipment was effected so that top events for a representative event tree model could be determined. Building response spectra (calculated from the structural analysis), in conjunction with generic fragility data, were used to generate fragility curves for the PREPP equipment. Using these curves, failure probabilities for each top event were calculated. These probabilities were integrated into the event tree model, and accident sequences and respective probabilities were calculated through quantification. By combining the sequence failure probabilities with a transport analysis of the estimated airborne source term from a DBE, onsite and offsite consequences were calculated. The results of the comprehensive analysis substantiated the ability of the PREPP facility to withstand a DBE with negligible consequence (i.e., estimated release was within personnel and environmental dose guidelines). 57 refs., 19 figs., 20 tabs.

  1. Our response to the earthquake at Onagawa Nuclear Power Station

    International Nuclear Information System (INIS)

    Hirakawa, Tomoshi

    2008-01-01

    When the Miyagi Offshore earthquake occurred on August 16, 2005, all three units at the Onagawa NPS were shut down automatically on the 'Strong Seismic Acceleration' signal. Our inspection after the earthquake confirmed there was no damage to the equipment of the nuclear power plants, but the analysis of the response spectrum observed at the bedrock showed the earthquake had exceeded the 'design-basis earthquake' at certain periods, so we carried out a review of the seismic safety of plant facilities. In the review, the ground motions of the Miyagi Offshore Earthquake, which is predicted to occur in the near future, were reexamined based on the observation data, and then 'The Ground Motion for Safety Check', surpassing the supposed ground motion of the largest earthquake, was established. The seismic safety of the plant facilities important for safety was assured. At present, units No. 1 to No. 3 at Onagawa NPS have returned to normal operation. (author)

  2. Real Time Earthquake Information System in Japan

    Science.gov (United States)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Years later, a system for the prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that the governmental organizations concerned can decide whether they need to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs. 1) Prompt report of occurrence of a large earthquake and major seismic intensities caused by the earthquake, in about two minutes after the earthquake occurrence. 2) Tsunami forecast in around three minutes. 3) Information on expected arrival times and maximum heights of tsunami waves in around five minutes. 4) Information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example, the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks, via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  3. Diagnostics of PF-1000 facility operation and plasma concentration on the basis of spectral measurements

    International Nuclear Information System (INIS)

    Skladnik-Sadowska, E.; Malinowski, K.; Sadowski, M.J.; Scholz, M.; Tsarenko, A.V.

    2005-01-01

    The paper concerns the monitoring of the operation of high-current pulse discharges and the determination of the plasma concentration within the dense magnetized plasma column by means of optical spectroscopy methods. In experiments performed within the large PF-1000 facility, which is operated at IPPLM in Warsaw, particular attention was paid to the possibility of determining the correctness of the operational mode. In order to measure the visible radiation (VR) emitted from the collapsing current sheath and the dense pinch region, use was made of the MECHELLE R 900 optical spectrometer, which was equipped with a CCD measuring head. The spectral measurements were performed at an angle of about 65° to the symmetry axis of the PF electrode system, through an optical window and a special collimator coupled with a quartz optical cable. The observed VR emission originated from a part of the inner- and outer-electrode surfaces, the collapsing current-sheath layer and a portion of the dense plasma pinch region (located at a distance of 40-50 mm from the electrode ends). Considerable differences were found in the optical spectra recorded for so-called good shots and for cases of some failures. In the case of a breakdown (damage) of the main insulator, various Al lines were observed, which originated from the eroded insulator material. Under so-called bad vacuum conditions various C lines were recorded, and with an uncontrolled air leakage into the experimental chamber numerous N lines appeared. The appearance of these characteristic spectral lines made it possible to determine whether the operation of the PF-1000 facility was correct or incorrect. The paper also reports estimates of plasma concentration values, which have been obtained on the basis of a quantitative analysis of the Stark broadening of selected spectral lines. (author)

  4. 29 CFR 780.407 - System must be nonprofit or operated on a share-crop basis.

    Science.gov (United States)

    2010-07-01

    § 780.407 System must be nonprofit or operated on a share-crop basis (Requirements Under Section 13(b)(12), the Irrigation Exemption). ... on facilities of any irrigation system unless the ditches, canals, reservoirs, or waterways in...

  5. A chemical basis for the partitioning of radionuclides in incinerator operation

    International Nuclear Information System (INIS)

    Burger, L.L.

    1995-01-01

    Incineration as a method of treating radioactive or mixed waste is attractive because of volume reduction, but may result in high concentrations of some hazardous components. For safety reasons during operation, and because of the environmental impact of the plant, it is important to know how these materials partition between the furnace slag, the fly ash, and the stack emission. The chemistry of about 50 elements is discussed and, through consideration of high temperature thermodynamic equilibria, an attempt is made to provide a basis for predicting how various radionuclides and heavy metals behave in a typical incinerator. The chemistry of the individual elements is first considered and a prediction of the most stable chemical species in the typical incinerator atmosphere is made. The treatment emphasizes volatility and the parameters considered are temperature, acidity, oxygen, sulfur, and halogen content, and the presence of several other key non-radioactive elements. A computer model is used to calculate equilibrium concentrations of many species in several systems at temperatures ranging from 500 to 1600 K. It is suggested that deliberate addition of various feed chemicals can have a major impact on the fate of many radionuclides and heavy metals. Several problems concerning limitations and application of the data are considered

  6. The power of simplification: Operator interface with the AP1000® during design-basis and beyond design-basis events

    Energy Technology Data Exchange (ETDEWEB)

    Williams, M. G.; Mouser, M. R.; Simon, J. B. [Westinghouse Electric Company, 1000 Westinghouse Drive, Cranberry Township, PA 16066 (United States)

    2012-07-01

    The AP1000® plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design basis accident and finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs) are used to mitigate design basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been

  7. The power of simplification: Operator interface with the AP1000® during design-basis and beyond design-basis events

    International Nuclear Information System (INIS)

    Williams, M. G.; Mouser, M. R.; Simon, J. B.

    2012-01-01

    The AP1000® plant is an 1100-MWe pressurized water reactor with passive safety features and extensive plant simplifications that enhance construction, operation, maintenance, safety and cost. The passive safety features are designed to function without safety-grade support systems such as component cooling water, service water, compressed air or HVAC. The AP1000 passive safety features achieve and maintain safe shutdown in case of a design-basis accident for 72 hours without need for operator action, meeting the expectations provided in the European Utility Requirements and the Utility Requirement Document for passive plants. Limited operator actions may be required to maintain safe conditions in the spent fuel pool (SFP) via passive means. This safety approach therefore minimizes the reliance on operator action for accident mitigation, and this paper examines the operator interaction with the Human-System Interface (HSI) as the severity of an accident increases from an anticipated transient to a design basis accident and finally, to a beyond-design-basis event. The AP1000 Control Room design provides an extremely effective environment for addressing the first 72 hours of design-basis events and transients, providing ease of information dissemination and minimal reliance upon operator actions. Symptom-based procedures including Emergency Operating Procedures (EOPs), Abnormal Operating Procedures (AOPs) and Alarm Response Procedures (ARPs) are used to mitigate design basis transients and accidents. Use of the Computerized Procedure System (CPS) aids the operators during mitigation of the event. The CPS provides cues and direction to the operators as the event progresses. If the event becomes progressively worse or lasts longer than 72 hours, and depending upon the nature of failures that may have occurred, minimal operator actions may be required outside of the control room in areas that have been designed to be accessible using components that have been designed

  8. Report on Disaster Medical Operations with Acupuncture/Massage Therapy after the Great East Japan Earthquake

    Directory of Open Access Journals (Sweden)

    Shin Takayama

    2012-01-01

    Full Text Available The Great East Japan Earthquake and the consequent tsunami inflicted immense damage over a wide area of eastern Japan. The Department of Traditional Asian Medicine, Tohoku University, started providing medical assistance to the disaster-stricken regions, mainly employing traditional Asian therapies. We visited seven evacuation centers in Miyagi and Fukushima Prefectures and provided acupuncture/massage therapy. While massage therapy was performed manually, filiform needles and press tack needles were used to administer acupuncture. In total, 553 people were treated (mean age, 54.0 years; 206 men, 347 women). Assessment by interview showed that the most common complaint was shoulder/back stiffness. The rate of therapy satisfaction was 92.3%. Many people answered that they experienced not only physical but also psychological relief. At the time of the disaster, acupuncture/massage therapy, which has both mental and physical soothing effects, may be a therapeutic approach that can be effectively used in combination with Western medical practices.

  9. SENSITIVITY ANALYSIS OF ORDERED WEIGHTED AVERAGING OPERATOR IN EARTHQUAKE VULNERABILITY ASSESSMENT

    Directory of Open Access Journals (Sweden)

    M. Moradi

    2013-09-01

    Full Text Available The main objective of this research is to find the extent to which the minimal variability Ordered Weighted Averaging (OWA) model of seismic vulnerability assessment is sensitive to variation of the optimism degree. There are a variety of models proposed for seismic vulnerability assessment. In order to examine the efficiency of seismic vulnerability assessment models, the stability of the results can be analysed. Seismic vulnerability assessment is done to estimate the probable losses in a future earthquake. Multi-Criteria Decision Making (MCDM) methods have been applied by a number of researchers to estimate the human, physical and financial losses in urban areas. The study area of this research is the Tehran Metropolitan Area (TMA), which has more than eight million inhabitants. In addition, this paper assumes that the North Tehran Fault (NTF) is activated and causes an earthquake in the TMA. 1996 census data are used to extract the attribute values for six effective criteria in seismic vulnerability assessment. The results demonstrate that the minimal variability OWA model of Seismic Loss Estimation (SLE) is more stable where the aggregated seismic vulnerability degree has a lower value. Moreover, minimal variability OWA is very sensitive to the optimism degree in northern areas of Tehran. A number of statistical units in southern areas of the city also indicate considerable sensitivity to the optimism degree due to numerous non-standard buildings. In addition, the change of seismic vulnerability degree caused by variation of the optimism degree does not exceed 25% of the original value, which means that the overall accuracy of the model is acceptable.
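
    For orientation, the OWA aggregation and the "optimism degree" (orness) referred to above can be sketched as follows. This is a minimal illustration using Yager's standard OWA definitions; the criteria values and weight vectors are hypothetical and are not the minimal-variability weights derived in the paper.

      def owa(values, weights):
          """Ordered Weighted Averaging: weights apply to values sorted in descending order."""
          ordered = sorted(values, reverse=True)
          return sum(w * v for w, v in zip(weights, ordered))

      def orness(weights):
          """Degree of optimism of an OWA weight vector, in [0, 1]."""
          n = len(weights)
          return sum((n - i - 1) * w for i, w in enumerate(weights)) / (n - 1)

      # Hypothetical normalized vulnerability criteria for one statistical unit
      criteria = [0.8, 0.4, 0.6, 0.3, 0.7, 0.5]
      # Hypothetical weight vectors: optimistic (emphasis on high values) vs. pessimistic
      w_optimistic = [0.4, 0.3, 0.15, 0.1, 0.04, 0.01]
      w_pessimistic = list(reversed(w_optimistic))

      for name, w in [("optimistic", w_optimistic), ("pessimistic", w_pessimistic)]:
          print(f"{name}: orness={orness(w):.2f}, vulnerability={owa(criteria, w):.3f}")

    Comparing the aggregated vulnerability under different orness values is, in essence, the sensitivity test the paper performs for every statistical unit.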

  10. Proposal on data collection for an international earthquake experience data

    International Nuclear Information System (INIS)

    Masopust, R.

    2001-01-01

    Earthquake experience data have been recognized as an efficient basis for verification of the seismic adequacy of equipment installed in NPPs. This paper is meant to initiate a database setup in order to use the seismic experience to establish the generic seismic resistance of NPP equipment, applicable in particular to the Middle and East European countries. Such an earthquake experience database should then be compared to the already existing and well-known SQUG-GIP database. Setting up such an operational earthquake database will require a substantial amount of effort. It must be understood that this goal can be achieved only through long-term, sustained activities and coordinated cooperation of various institutions. (author)

  11. The matrix elements of the potential energy operator between the Sp(2,R) basis generating functions. Near-magic nuclei

    International Nuclear Information System (INIS)

    Filippov, G.F.; Ovcharenko, V.I.; Teryoshin, Yu.V.

    1980-01-01

    For near-magic nuclei, the matrix elements of the central exchange nucleon-nucleon interaction potential energy operator between the generating functions of the total Sp(2,R) basis are obtained. The basis states are highest weight vectors of the Sp(2,R) irreducible representations and of the SO(3) irreducible representations and, in addition, have a definite O(A-1) symmetry. The Sp(2,R) basis generating matrix elements essentially simplify the problem of calculating the spectrum of collective excitations of the atomic nucleus over an intrinsic function of definite O(A-1) symmetry

  12. From Tornadoes to Earthquakes: Forecast Verification for Binary Events Applied to the 1999 Chi-Chi, Taiwan, Earthquake

    Directory of Open Access Journals (Sweden)

    Chien-Chih Chen

    2006-01-01

    Full Text Available Forecast verification procedures for statistical events with binary outcomes typically rely on the use of contingency tables and Relative Operating Characteristic (ROC diagrams. Originally developed for the statistical evaluation of tornado forecasts on a county-by-county basis, these methods can be adapted to the evaluation of competing earthquake forecasts. Here we apply these methods retrospectively to two forecasts for the M 7.3 1999 Chi-Chi, Taiwan, earthquake. We show that a previously proposed forecast method that is based on evaluating changes in seismic intensity on a regional basis is superior to a forecast based only on the magnitude of seismic intensity in the same region. Our results confirm earlier suggestions that the earthquake preparation process for events such as the Chi-Chi earthquake involves anomalous activation or quiescence, and that signatures of these processes can be detected in seismicity data using appropriate methods.
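
    As a brief aside on the contingency-table scores on which this kind of binary forecast verification rests, the sketch below uses the standard definitions of hit rate and false alarm rate; the counts are hypothetical and are not the Chi-Chi results.

      def binary_scores(hits, false_alarms, misses, correct_negatives):
          """Standard 2x2 contingency-table scores used in forecast verification."""
          hit_rate = hits / (hits + misses)                      # probability of detection
          false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
          return hit_rate, false_alarm_rate

      # Hypothetical counts over a grid of forecast cells
      h, f = binary_scores(hits=12, false_alarms=30, misses=4, correct_negatives=154)
      print(f"hit rate H = {h:.2f}, false alarm rate F = {f:.2f}")
      # One alarm threshold gives one (F, H) point; sweeping the threshold traces the
      # ROC curve, and a curve lying above the H = F diagonal indicates forecast skill.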

  13. Mathematical basis for the process of model simulation of drilling operations

    Energy Technology Data Exchange (ETDEWEB)

    Lipovetskiy, G M; Lebedinskiy, G L

    1979-01-01

    The authors describe the application of a method for the model simulation of drilling operations and for the solution of problems concerned with the planning and management of such operations. An approach to the simulation process is described for the case when the drilling operations are part of a larger system. An algorithm is provided for calculating complex events.

  14. Depository of ampoule ionizing radiation sources on the basis of stand complex Baikal-I. Operation experience and application perspectives

    International Nuclear Information System (INIS)

    Ganzha, V.V.; Boltovskij, S.A.; Kolbaenkov, A.N.; Meshin, M.M.; Nasonov, S.G.; Pivovarov, O.S.; Storozhenko, A.S.; Yakovlev, V.V.

    2001-01-01

    The depository of ampoule sources of ionizing radiation (ASIR) on the basis of the Baikal-I stand complex was founded and put into operation in 1995. It is intended for prolonged storage of spent ASIRs from different institutions of Kazakhstan. To date, more than 10,000 spent ASIRs with activity of more than 2,000 Ci have been received and placed in storage

  15. Earthquakes and Schools

    Science.gov (United States)

    National Clearinghouse for Educational Facilities, 2008

    2008-01-01

    Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…

  16. Duration and predictors of emergency surgical operations - basis for medical management of mass casualty incidents

    Directory of Open Access Journals (Sweden)

    Huber-Wagner S

    2009-12-01

    Full Text Available Background: Hospitals have a critically important role in the management of mass casualty incidents (MCIs), yet there is little information to assist emergency planners. A significantly limiting factor of a hospital's capability to treat those affected is its surgical capacity. We therefore intended to provide data about the duration and predictors of life-saving operations. Methods: The data of 20,815 predominantly blunt trauma patients recorded in the Trauma Registry of the German Trauma Society were retrospectively analyzed to calculate the duration of life-saving operations as well as their predictors. Inclusion criteria were an ISS ≥ 16 and the performance of relevant ICPM-coded procedures within 6 h of admission. Results: From 1,228 patients fulfilling the inclusion criteria, 1,793 operations could be identified as life-saving operations. Acute injuries to the abdomen accounted for 54.1%, followed by head injuries (26.3%), pelvic injuries (11.5%), thoracic injuries (5.0%) and major amputations (3.1%). The mean cut-to-suture time was 130 min (IQR 65-165 min). Logistic regression revealed 8 variables associated with an emergency operation: AIS of abdomen ≥ 3 (OR 4.00), ISS ≥ 35 (OR 2.94), hemoglobin level ≤ 8 mg/dL (OR 1.40), pulse rate on hospital admission 120/min (OR 1.39), blood pressure on hospital admission. Conclusions: The mean operation time of 130 min calculated for emergency life-saving surgical operations provides a realistic guideline for the prospective treatment capacity, which can be estimated and projected into an actual incident admission capacity. Knowledge of predictive factors for life-saving emergency operations helps to identify those patients that need the most urgent operative treatment in case of a blunt MCI.
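
    To illustrate how a mean cut-to-suture time of this kind can be projected into an incident admission capacity, a hedged back-of-the-envelope sketch follows; the operating-room count, turnover time and planning window are illustrative assumptions, not figures from the study.

      # Rough surgical-capacity estimate from a mean life-saving operation time.
      # All facility parameters below are illustrative assumptions.
      MEAN_OP_TIME_MIN = 130      # mean cut-to-suture time reported in the abstract
      TURNOVER_MIN = 30           # assumed room turnover between cases
      N_EMERGENCY_ROOMS = 4       # assumed operating rooms reserved for MCI casualties
      WINDOW_HOURS = 12           # assumed planning window after the incident

      cases_per_room = (WINDOW_HOURS * 60) // (MEAN_OP_TIME_MIN + TURNOVER_MIN)
      total_capacity = cases_per_room * N_EMERGENCY_ROOMS
      print(f"~{cases_per_room} life-saving operations per room, "
            f"~{total_capacity} in total within {WINDOW_HOURS} h")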

  17. Symplectic Group Representation of the Two-Mode Squeezing Operator in the Coherent State Basis

    Science.gov (United States)

    Fan, Hong-Yi; Chen, Jun-Hua

    2003-11-01

    We find that the coherent state projection operator representation of the two-mode squeezing operator constitutes a faithful group representation of the symplectic group, which is a remarkable property of the coherent state. As a consequence, the resultant effect of successively applying two-mode squeezing operators is equivalent to a single squeezing in the two-mode Fock space. Generalization of this property to the 2n-mode case is also discussed. The project supported by National Natural Science Foundation of China under Grant No. 10575057
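
    For readers unfamiliar with the operator in question, a conventional textbook form of the two-mode squeezing operator and its composition law for equal squeezing phases is sketched below; this is not taken from the paper itself, and phase or sign conventions may differ.

      S_2(\zeta) \;=\; \exp\!\left(\zeta\, a^{\dagger} b^{\dagger} \;-\; \zeta^{*}\, a b\right), \qquad \zeta = r\, e^{i\theta},

      S_2(r_1)\, S_2(r_2) \;=\; S_2(r_1 + r_2) \quad \text{(equal phases } \theta\text{)}.

    The composition property for equal phases follows immediately because both factors are generated by the same operator; the paper's result extends the group-composition statement to the full symplectic structure via the coherent state projection operator representation.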

  18. Batangas Heavy Fabrication Yard Multi-Purpose Cooperative: Basis for Business Operation

    Directory of Open Access Journals (Sweden)

    JENNIFER D. MASICAT

    2014-08-01

    Full Text Available This research aimed to determine the proposed business initiatives to enhance the operation of the Batangas Heavy Fabrication Yard Multi-Purpose Cooperative for its long-term survival and growth. More specifically, it addressed the following objectives: to describe the profile of the respondents in terms of their age, gender, type of membership and share capital; to assess the business operation of the cooperative in the aspects of its management, marketing, finances, facilities and technology and delivery of services; to identify the problems encountered by the cooperative in its business operation; to determine the significant relationship between the profile of the respondents and their assessment of its business operation; and to propose an action plan for the business operation of the BHFY Multi-Purpose Cooperative. The researcher used the descriptive correlational design to obtain information concerning the current status of the BHFY-MPC cooperative and to describe what exists with respect to the variables or conditions in the situation. Based on the results, the majority of the members are aged 51 to 55 years old, hold the regular type of membership and have share capital ranging from 51,001 to 100,000. The findings of the study show that the BHFY Multi-Purpose Cooperative performs well in terms of its management, marketing, finances, facilities and technology and delivery of services. There are problems seldom encountered in the operation of the cooperative, but the cooperative never encountered problems like overinvestment, ineffective leadership of the management team and board of directors, inadequate sources of funds, income of the cooperative affected by negative issues, or mismanagement of funds by the officers. Also, the type of membership influences the members' assessment of the delivery of services; moreover, age contributes to the assessment of the business operation in terms of management and delivery of

  19. Research to Operations: From Point Positions, Earthquake and Tsunami Modeling to GNSS-augmented Tsunami Early Warning

    Science.gov (United States)

    Stough, T.; Green, D. S.

    2017-12-01

    This collaborative research to operations demonstration brings together the data and algorithms from NASA research, technology, and applications-funded projects to deliver relevant data streams, algorithms, predictive models, and visualization tools to the NOAA National Tsunami Warning Center (NTWC) and Pacific Tsunami Warning Center (PTWC). Using real-time GNSS data and models in an operational environment, we will test and evaluate an augmented capability for tsunami early warning. Each of the three research groups collects data from a selected network of real-time GNSS stations, exchanges data consisting of independently processed 1 Hz station displacements, and merges the output into a single, more accurate and reliable set. The resulting merged data stream is delivered from three redundant locations to the TWCs with a latency of 5-10 seconds. Data from a number of seismogeodetic stations with collocated GPS and accelerometer instruments are processed for displacements and seismic velocities and also delivered. Algorithms for locating and determining the magnitude of earthquakes as well as algorithms that compute the source function of a potential tsunami using this new data stream are included in the demonstration. The delivered data, algorithms, models and tools are hosted on NOAA-operated machines at both warning centers, and, once tested, the results will be evaluated for utility in improving the speed and accuracy of tsunami warnings. This collaboration has the potential to dramatically improve the speed and accuracy of the TWCs' local tsunami information over the current seismometer-only based methods. In our first year of this work, we have established and deployed an architecture for data movement and algorithm installation at the TWCs. We are addressing data quality issues and porting algorithms into the TWCs' operating environment. Our initial module deliveries will focus on estimating moment magnitude (Mw) from Peak Ground Displacement (PGD), within 2
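
    As a sketch of the kind of PGD-based magnitude estimate referred to above: the functional form follows published GNSS peak-ground-displacement scaling relations, but the coefficients, displacements and distances below are illustrative placeholders, not the values or algorithms used by the project.

      import math

      # Illustrative PGD scaling law: log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km).
      # Coefficients are placeholders for demonstration only.
      A, B, C = -5.0, 1.2, -0.18

      def mw_from_pgd(pgd_cm, dist_km):
          """Invert the scaling law for one station (closed form in Mw)."""
          return (math.log10(pgd_cm) - A) / (B + C * math.log10(dist_km))

      # Hypothetical peak ground displacements (cm) and hypocentral distances (km)
      stations = [(35.0, 80.0), (18.0, 140.0), (9.0, 220.0)]
      estimates = [mw_from_pgd(p, r) for p, r in stations]
      print("per-station Mw:", [round(m, 2) for m in estimates])
      print("network Mw estimate:", round(sum(estimates) / len(estimates), 2))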

  20. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; John Forester; David Gertman; Jeffrey Joe; Heather Medema; Julius Persensky; April Whaley

    2013-04-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for concepts of operations (ConOps) in advanced small modular reactor (aSMR) designs. In support of this objective, three important research areas were included: operating principles of multi-modular plants, functional allocation models and strategies that would affect the development of new, non-traditional concepts of operations, and the requirements for human performance, based upon work domain analysis and current regulatory requirements. As part of the approach for this report, we outline potential functions, including the theoretical and operational foundations for the development of a new functional allocation model and the identification of specific regulatory requirements that will influence the development of future concepts of operations. The report also highlights changes in research strategy prompted by confirmation of the importance of applying the work domain analysis methodology to a reference aSMR design. It is described how this methodology will enrich the findings from this phase of the project in the subsequent phases and help in the identification of metrics and focused studies for the determination of human performance criteria that can be used to support the design process.

  1. Identification of the seismogenic source of the 1875 Cucuta earthquake on the basis of a combination of neotectonic, paleoseismologic and historic seismicity studies

    Science.gov (United States)

    Rodríguez, Luz; Diederix, Hans; Torres, Eliana; Audemard, Franck; Hernández, Catalina; Singer, André; Bohórquez, Olga; Yepez, Santiago

    2018-03-01

    An interesting variety of field evidence that collectively covers the three branches of Earthquake Geology: Neotectonics, Paleoseismology and Historical seismicity, has been collected in the border area between Venezuela and Colombia, near the town of San José de Cúcuta, as part of a study aimed at establishing the seismic source of the great Cucuta Earthquake, which occurred on May 18th, 1875, and caused heavy losses of life and destruction on both sides of the border, between the Department of Norte de Santander in Colombia and Táchira state in Venezuela. This region is affected by the activity of several cross-border fault systems that converge in the zone of the so-called Pamplona Indenter. Among these seismic sources, the potential candidates for this destructive seismic event of 1875 are those related to the Boconó Fault System of the northwestern foothills of the Mérida Andes, and in particular its most northwestern expression, the Aguas Calientes Fault System, as suggested by previous research carried out by FUNVISIS for the Venezuelan oil industry in the late 80s. In order to confirm whether or not this was the system responsible for the earthquake, the following studies were carried out: 1) In Neotectonics, a detailed binational surface mapping of the active faults of this system was carried out. This system consists of three branches referred to in this paper as the North, Central and South branch, respectively; 2) In Paleoseismology, two trenches were excavated. The first trench was excavated across the South branch and the second one across the North branch, which confirmed fault activity during the Holocene epoch; 3) In historical seismicity, the direct coseismic surface effects that occurred in the epicentral area of the earthquake were assessed. All the evidence collected and integrated in these three lines of research made it possible to conclude that the Central branch of the Aguas Calientes fault system is the most likely candidate to have

  2. Draft Function Allocation Framework and Preliminary Technical Basis for Advanced SMR Concepts of Operations

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States); Forester, John [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gertman, David [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Medema, Heather [Idaho National Lab. (INL), Idaho Falls, ID (United States); Persensky, Julius [Idaho National Lab. (INL), Idaho Falls, ID (United States); Whaley, April [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-08-01

    This report presents preliminary research results from the investigation into the development of new models and guidance for Concepts of Operations in advanced small modular reactor (AdvSMR) designs. AdvSMRs are nuclear power plants (NPPs), but unlike conventional large NPPs that are constructed on site, AdvSMR systems and components will be fabricated in a factory and then assembled on site. AdvSMRs will also use advanced digital instrumentation and control systems, and make greater use of automation. Some AdvSMR designs also propose to be operated in a multi-unit configuration with a single central control room as a way to be more cost-competitive with existing NPPs. These differences from conventional NPPs not only pose technical and operational challenges, but they will undoubtedly also have regulatory compliance implications, especially with respect to staffing requirements and safety standards.

  3. Development of Emergency Operating Strategies for Beyond Design Basis External Events (BDBEEs) in Korean WH Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Duk-Joo; Lee, Seung-Chan; Sung, Je-Joong; Ha, Sang-Jun [KHNP CRI, Daejeon (Korea, Republic of); Hong, Soon-Joon; Hwang, Su-Hyun; Lee, Byung-Chul; Park, Kang-Min [FNC Tech. Co., Yongin (Korea, Republic of)

    2015-10-15

    Westinghouse developed and integrated emergency operating procedures into a set of FLEX Support Guidelines (FSGs). This paper explains how Korean WH (Westinghouse) type nuclear power plants develop emergency operating strategies for an ELAP (extended loss of all AC power), which include guidelines to use permanent and portable equipment as necessary to prevent core damage until AC power is restored from a reliable alternate source. The Korean emergency operating response strategies were developed to cope with an ELAP such as the Fukushima event. The strategies include guidelines to prevent fuel damage using the FLEX equipment. Operators should take actions to prepare the FLEX equipment within the license-basis SBO coping time. The loss of all AC power has been analyzed to identify the behavior of the major NSSS process variables using the RELAP computer code. The accident analysis showed that the plant does not suffer fuel damage within 72 hours after an ELAP if operators take actions to cool the RCS by opening the SG ADV in the 5 gpm seal leak case. In this scenario, because an ELAP is in progress and no AC power can be used, the operator should operate the FLEX equipment in order to actuate active equipment using the EOP for SBO response. This strategy will prevent entry into the SAMG because these actions result in core cooling and keep the core exit temperature below 650 °C. Korean emergency operating guidelines (EOGs) will be developed using these strategies for response to BDBEEs.

  4. Rosetta: an operator basis translator for standard model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Falkowski, Adam [Laboratoire de Physique Théorique, Bat. 210, Université Paris-Sud, 91405, Orsay (France); Fuks, Benjamin [Département Recherches Subatomiques, Institut Pluridisciplinaire Hubert Curien, Université de Strasbourg/CNRS-IN2P3, 23 rue du Loess, 67037, Strasbourg (France); Mawatari, Kentarou [Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel, and International Solvay Institutes, Pleinlaan 2, 1050, Brussels (Belgium); Mimasu, Ken, E-mail: k.mimasu@sussex.ac.uk [Department of Physics and Astronomy, University of Sussex, BN1 9QH, Brighton (United Kingdom); Riva, Francesco [CERN, Theory Division, 1211, Geneva (Switzerland); Sanz, Verónica [Department of Physics and Astronomy, University of Sussex, BN1 9QH, Brighton (United Kingdom)

    2015-12-10

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the Lagrangians which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular basis choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed.

  5. Rosetta: an operator basis translator for standard model effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Falkowski, Adam [Universite Paris-Sud, Laboratoire de Physique Theorique, Bat. 210, Orsay (France); Fuks, Benjamin [Universite de Strasbourg/CNRS-IN2P3, Departement Recherches Subatomiques, Institut Pluridisciplinaire Hubert Curien, Strasbourg (France); Mawatari, Kentarou [Theoretische Natuurkunde and IIHE/ELEM, Vrije Universiteit Brussel, and International Solvay Institutes, Brussels (Belgium); Mimasu, Ken; Sanz, Veronica [University of Sussex, Department of Physics and Astronomy, Brighton (United Kingdom); Riva, Francesco [CERN, Theory Division, Geneva (Switzerland)

    2015-12-15

    We introduce Rosetta, a program allowing for the translation between different bases of effective field theory operators. We present the main functions of the program and provide an example of usage. One of the Lagrangians which Rosetta can translate into has been implemented into FeynRules, which allows Rosetta to be interfaced into various high-energy physics programs such as Monte Carlo event generators. In addition to popular basis choices, such as the Warsaw and Strongly Interacting Light Higgs bases already implemented in the program, we also detail how to add new operator bases into the Rosetta package. In this way, phenomenological studies using an effective field theory framework can be straightforwardly performed. (orig.)

  6. Multi-station basis for Polar Cap (PC) indices: ensuring credibility and operational reliability

    Science.gov (United States)

    Stauning, Peter

    2018-02-01

    The Polar Cap (PC) indices, PCN (North) and PCS (South), are based on polar geomagnetic observations from Qaanaaq (Thule) and Vostok, respectively, processed to measure the transpolar plasma convection that may seriously affect space weather conditions. To establish reliable space weather forecasts based on PC indices, and also to ensure credibility of their use for scientific analyses of solar wind-magnetosphere interactions, additional sources of data for the PC indices are investigated. In the search for alternative index sources, objective quality criteria are established here to be used for the selection among potential candidates. These criteria are applied to existing PC index series to establish a quality scale. In the Canadian region, the data from the Resolute Bay magnetometer are shown to provide alternative PCN indices of adequate quality. In Antarctica, the data from the Concordia Dome-C observatory are shown to provide a basis for alternative PCS indices. In examples to document the usefulness of these alternative index sources it is shown that PCN indices in a real-time version based on magnetometer data from Resolute Bay could have given 6 h of early warning, of which the last 2 h were "red alert", up to the onset of the strong substorm event on 13 March 1989 that caused the power outage in Quebec. The alternative PCS indices based on data from Dome-C have helped to disclose that presently available Vostok-based PCS index values are corrupted throughout most of 2011.

  7. Guideline for operator in beyond design basis events for AHWR

    International Nuclear Information System (INIS)

    Kumar, Mithilesh; Mukhopadhyay, D.; Lele, H.G.; Vaze, K.K.

    2011-01-01

    Enhanced defence-in-depth is incorporated in the proposed Advanced Heavy Water Reactor (AHWR) as a part of its fundamental safety approach, to ensure that the levels of protection in defence-in-depth are more independent from each other than in existing installations. Safety is enhanced by incorporating into the design an increased emphasis on inherently safe characteristics and passive systems. It is ensured that the risk from radiation exposures to workers, the public and the environment during construction/commissioning, operation and decommissioning shall be comparable to that of other industrial facilities used for similar purposes. This implies that there will be no need for relocation or evacuation measures outside the plant site, apart from those generic emergency measures developed for any industrial facility. It has been demonstrated by analyses that there is no core damage for PIEs with frequencies of more than 10^-10/year. However, some scenarios in the residual risk domain are considered to demonstrate that the dose at the plant boundary is within the prescribed acceptable limit. It is also possible to arrest core damage progression at various stages of event progression, by incorporating certain operating procedures, without any release. This paper discusses analyses of such a low-frequency event with multiple failures under the category of 'Decrease in MHT inventory', where plant-related symptoms like channel exit temperature, channel component temperatures, moderator level etc. are quantified with respect to time. The operator guideline has been given for cases like loss of coolant without the Emergency Core Cooling System (ECCS) and loss of moderator heat sink. It has been observed that a 3.0 kg/s mass flow rate is adequate to capture the rising trend of the clad surface temperature. (author)

  8. Lifetime Management Programs as a basis for the long term operation of nuclear installations

    Energy Technology Data Exchange (ETDEWEB)

    López González, Manuel; Lobato Galeote, Carlos, E-mail: mlopezg@idom.com, E-mail: carlos.lobato@idom.com [IDOM - Consulting, Engineering & Architecture SAU, Madrid (Spain)

    2017-07-01

    From the licensing standpoint there are several approaches worldwide to obtain an authorization to operate an NPP beyond its design life. According to the License Renewal Application (LRA) approach, followed in the United States of America and other countries, plants need to develop a Lifetime Management Program (LTMP) with which to manage the potential aging processes (corrosion, erosion, erosion-corrosion, radiation and thermally induced embrittlement, fatigue, corrosion fatigue, creep, binding and wear) associated with the Structures, Systems and Components. An LTMP is composed of several tasks, which together represent a technical challenge for a nuclear installation. (author)

  9. Electricity storages - optimised operation based on spot market prices; Stromspeicher. Optimierte Fahrweise auf Basis der Spotmarktpreise

    Energy Technology Data Exchange (ETDEWEB)

    Bernhard, Dominik; Roon, Serafin von [FfE Forschungsstelle fuer Energiewirtschaft e.V., Muenchen (Germany)

    2010-06-15

    With its integrated energy and climate package the last federal government set itself ambitious goals for the improvement of energy efficiency and growth of renewable energy production. These goals were confirmed by the new government in its coalition agreement. However, they can only be realised if the supply of electricity from fluctuating renewable sources can be made to coincide with electricity demand. Electricity storages are therefore an indispensable component of the future energy supply system. This article studies the optimised operation of an electricity storage based on spot market prices and the influence of wind power production up to the year 2020.
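
    As a simplified illustration of price-driven storage dispatch of the kind studied in the article, the sketch below applies a deliberately simple threshold heuristic; the prices, storage size, efficiency and thresholds are hypothetical, and the article's own optimisation is more elaborate.

      # Charge in cheap hours, discharge in expensive hours, within capacity limits.
      prices = [42, 38, 35, 33, 36, 45, 60, 72, 65, 55, 48, 44]   # EUR/MWh, hypothetical
      capacity_mwh, power_mw, efficiency = 4.0, 1.0, 0.8
      low, high = 40, 58            # assumed charge / discharge price thresholds

      soc, revenue = 0.0, 0.0
      for p in prices:
          if p <= low and soc < capacity_mwh:          # buy energy while cheap
              e = min(power_mw, capacity_mwh - soc)
              soc += e
              revenue -= p * e
          elif p >= high and soc > 0:                  # sell, with round-trip losses
              e = min(power_mw, soc)
              soc -= e
              revenue += p * e * efficiency
      print(f"end state of charge: {soc:.1f} MWh, gross margin: {revenue:.0f} EUR")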

  10. Lifetime Management Programs as a basis for the long term operation of nuclear installations

    International Nuclear Information System (INIS)

    López González, Manuel; Lobato Galeote, Carlos

    2017-01-01

    From the licensing standpoint there are several approaches worldwide to obtain an authorization to operate an NPP beyond its design life. According to the License Renewal Application (LRA) approach, followed in the United States of America and other countries, plants need to develop a Lifetime Management Program (LTMP) with which to manage the potential aging processes (corrosion, erosion, erosion-corrosion, radiation and thermally induced embrittlement, fatigue, corrosion fatigue, creep, binding and wear) associated with the Structures, Systems and Components. An LTMP is composed of several tasks, which together represent a technical challenge for a nuclear installation. (author)

  11. STATEMENT OF CASH FLOWS - A MEASURE OF OPERATIONAL PERFORMANCE ON AN ACCRUAL BASIS

    Directory of Open Access Journals (Sweden)

    GHEORGHE LEPADATU

    2011-04-01

    Full Text Available The statement of cash flows presents useful information about changes in the company's financial position, allowing an assessment of the enterprise's ability to generate future cash flows and cash equivalents in operating, investing and financing activities and of their appropriate use. The treasury of an economic entity can be considered its strong point. The final outcome will depend on the manner in which the respective entity manages its money and financial flows. Treasury is also an essential and major constraint of the financial management of the enterprise. Treasury embodies the results of operations and the extent to which financial balance is achieved. An entity that ends the year with a profit does not always have positive cash (cash at bank and on hand). This is because of the gap between the recording in the accounts of revenue and expenditure and the receipts and payments as they fall due; that gap can be decisive for the fate of the enterprise. This is a major consequence of the accrual basis. Therefore, efficient management of an economic entity comprises both the management of asset flows (revenues/expenses) and cash management, i.e. the flows of receipts and payments. The statistical evidence shows that most failures are due to weaknesses in treasury management.

  12. The use of economic criteria in providing a basis for safe reactor operation

    International Nuclear Information System (INIS)

    Graham, J.

    1989-01-01

    Probabilistic criteria based upon an acceptable measure of protection for owner investment can complete the range of design probabilistic criteria between those set by acceptable public safety and those set by acceptable reliability in plant operation. Criteria which address the protection of owner investment have the benefit of lowering risk in adjacent risk regions by providing greater reliability in operation as well as less risk to the safety of the public and the environment. Such investment protection criteria are currently being used to extend plant life but they could also be used very beneficially as part of the initial design process. In this paper trial criteria are suggested which address the risk of extended plant shutdown with the resultant necessity to purchase replacement power, and the risk of replacement of expensive plant components. Additional financial assessment is required to ensure that there is a proper correlation between acceptable measures of owner-investment protection and the levels of probabilistic defence suggested, but the trial criteria proposed can be used as important practical design criteria

  13. Guidelines for nuclear plant response to an earthquake

    International Nuclear Information System (INIS)

    1989-12-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. The objectives of the earthquake response procedures are to determine (1) the immediate effects of an earthquake on the physical condition of the nuclear power plant, (2) if shutdown of the plant is appropriate based on the observed damage to the plant or because the OBE has been exceeded, and (3) the readiness of the plant to resume operation following shutdown due to an earthquake. Readiness of a nuclear power plant to restart is determined on the basis of visual inspections of nuclear plant equipment and structures, and the successful completion of surveillance tests which demonstrate that the limiting conditions for operation as defined in the plant Technical Specifications are met. The guidelines are based on information obtained from a review of earthquake response procedures from numerous US and foreign nuclear power plants, interviews with nuclear plant operations personnel, and a review of reports of damage to industrial equipment and structures in actual earthquakes. 7 refs., 4 figs., 4 tabs
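
    As an illustration of the kind of OBE-exceedance screening such procedures describe, the sketch below compares a recorded 5%-damped response spectrum with the OBE design spectrum over a frequency band and also computes a simple cumulative absolute velocity (CAV). The frequency band, CAV threshold and data are illustrative assumptions; actual plant procedures use their own criteria, standardized CAV definitions and walkdown requirements.

      import numpy as np

      def obe_exceeded(freq_hz, recorded_sa_g, obe_sa_g, accel_g, dt,
                       band=(2.0, 10.0), cav_limit_gs=0.16):
          """Simplified OBE-exceedance screen (illustrative thresholds only)."""
          freq_hz = np.asarray(freq_hz)
          in_band = (freq_hz >= band[0]) & (freq_hz <= band[1])
          spectral_exceeded = np.any(np.asarray(recorded_sa_g)[in_band] >
                                     np.asarray(obe_sa_g)[in_band])
          cav = np.sum(np.abs(accel_g)) * dt          # simple (unstandardized) CAV, g-s
          return spectral_exceeded and cav > cav_limit_gs, cav

      # Hypothetical recorded spectrum, OBE design spectrum and acceleration record
      f = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
      rec = np.array([0.10, 0.22, 0.35, 0.28, 0.15])
      obe = np.array([0.12, 0.25, 0.30, 0.30, 0.20])
      t = np.arange(0, 20, 0.01)
      acc = 0.05 * np.sin(2 * np.pi * 3 * t) * np.exp(-0.1 * t)
      exceeded, cav = obe_exceeded(f, rec, obe, acc, dt=0.01)
      print(f"OBE exceeded: {exceeded}, CAV = {cav:.3f} g-s")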

  14. Regulatory standpoints on the design-basis capability of safety-related motor-operated valves(MOVs) and power-operated gate valves(POGVs)

    International Nuclear Information System (INIS)

    Kim, W. T.; Kum, O. H.

    1999-01-01

    The weakness in the design-basis capability of Motor-Operated Valves (MOVs) and the susceptibility to pressure locking and thermal binding phenomena of Power-Operated Gate Valves (POGVs) have been major concerns to be resolved in the nuclear community at home and abroad since the Three Mile Island accident occurred in the USA in 1979. Through detailed analysis of operating experience and regulatory activities, some MOVs and POGVs have been found to be unreliable in performing their safety functions when they are required to do so under certain conditions, especially under design-basis accident conditions. Further, it is well understood that these safety problems may not be identified by typical valve in-service testing (IST). The USNRC has published three Generic Letters, GL 89-10, GL 95-07, and GL 96-05, requiring nuclear plant licensees to take appropriate actions to resolve the problems mentioned above. The Korean nuclear regulatory body made public an administrative measure called 'Regulatory recommendation to verify safety functions of the safety-related MOVs and POGVs' on June 13, 1997, and in this administrative measure the Korean utility is asked to submit written documents showing how it assures the design-basis capability of these valves. The following are among the major concerns being considered from a regulatory standpoint: program scope and implementation priority, dynamic tests under differential pressure conditions, accuracy of diagnostic equipment, torque switch setting and torque bypass percentage, weak link analysis, motor actuator sizing, corrective actions taken to resolve pressure locking and thermal binding susceptibility, and a periodic verification program for the valves once design-basis capability has been verified

  15. Earthquake prediction

    International Nuclear Information System (INIS)

    Ward, P.L.

    1978-01-01

    The state of the art of earthquake prediction is summarized, the possible responses to such prediction are examined, and some needs in the present prediction program and in research related to use of this new technology are reviewed. Three basic aspects of earthquake prediction are discussed: location of the areas where large earthquakes are most likely to occur, observation within these areas of measurable changes (earthquake precursors) and determination of the area and time over which the earthquake will occur, and development of models of the earthquake source in order to interpret the precursors reliably. 6 figures

  16. Radiation safety culture for developing countries: Basis for a minimum operational radiation protection programme

    International Nuclear Information System (INIS)

    Rozental, J. J.

    1997-01-01

    The purpose of this document is to present a methodology for an integrated strategy aimed at establishing an adequate radiation safety infrastructure for developing countries without a major power reactor programme. Its implementation will allow these countries, about 50% of the IAEA's Member States, to improve marginal radiation safety, especially those that are recipients of technical assistance and do not meet the minimum radiation safety requirements of the IAEA's Basic Safety Standards for radiation protection. Progress in the implementation of safety regulations depends on the priority given by the government and its understanding of, and conviction about, the basic requirements for protection against the risks associated with exposure to ionizing radiation. There is no doubt that the reasons for deficient source control and dose limitation are related to the lack of an appropriate legal and regulatory framework, in particular the establishment of adequate legislation; a minimum legal infrastructure; a minimum operational radiation safety programme; and alternatives for a Point of Optimum Contact, to avoid overlap and conflict, that is, a 'Memorandum of Understanding' among Regulatory Authorities in the Country dealing with similar types of licensing and inspection

  17. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    Energy Technology Data Exchange (ETDEWEB)

    COVEY, L.I.

    2000-11-28

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF in a safe manner and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, at which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides, 90Sr and 137Cs, from wastes generated during the chemical processing of defense fuel on the Hanford Site thus ensuring isolation of hazardous radioisotopes from the environment. The construction of WESF started in 1971 and was completed in 1973. Some of the 137Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned except for the seven powder and pellet Type W overpacks already stored at WESF.

  18. Waste Encapsulation and Storage Facility (WESF) Basis for Interim Operation (BIO)

    International Nuclear Information System (INIS)

    COVEY, L.I.

    2000-01-01

    The Waste Encapsulation and Storage Facility (WESF) is located in the 200 East Area adjacent to B Plant on the Hanford Site north of Richland, Washington. The current WESF mission is to receive and store the cesium and strontium capsules that were manufactured at WESF in a safe manner and in compliance with all applicable rules and regulations. The scope of WESF operations is currently limited to receipt, inspection, decontamination, storage, and surveillance of capsules in addition to facility maintenance activities. The capsules are expected to be stored at WESF until the year 2017, at which time they will have been transferred for ultimate disposition. The WESF facility was designed and constructed to process, encapsulate, and store the extracted long-lived radionuclides, 90 Sr and 137 Cs, from wastes generated during the chemical processing of defense fuel on the Hanford Site thus ensuring isolation of hazardous radioisotopes from the environment. The construction of WESF started in 1971 and was completed in 1973. Some of the 137 Cs capsules were leased by private irradiators or transferred to other programs. All leased capsules have been returned to WESF. Capsules transferred to other programs will not be returned except for the seven powder and pellet Type W overpacks already stored at WESF

  19. A molecular basis for the self-incompatibility system operating in Brassica sp.

    Directory of Open Access Journals (Sweden)

    H. G. Dickinson

    2014-01-01

    Full Text Available Molecules contained in the sporophytically-derived coating of the pollen grain and in the superficial pellicle of the stigmatic papillae control the self-incompatibility response of the breeding system of Brassica. The stigmatic pellicle consists of a lipidic matrix in which float a mosaic of proteins, many of which can rapidly be renewed from pools in the papillar cytoplasm. A fraction of these proteins are involved in facilitating the passage of water to the pollen whilst another, possibly a glycoprotein, suppresses this activity in an incompatible mating. The pollen coating must also contain two sets of active molecules, one for identifying the stigmatic recognition molecules, and another for effecting the changes that take place to the coat itself on compatible pollination. In essence, the self-incompatibility mechanism appears to operate through the control of water flow from the papilla to the grain. Even when incompatible grains manage to germinate by obtaining atmospheric water, their proteins will often stimulate a reaction in the stigmatic papilla once the cuticle has been penetrated.

  20. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    International Nuclear Information System (INIS)

    Ross, Kyle; Cardoni, Jeffrey N.; Wilson, Chisom Shawn; Morrow, Charles; Osborn, Douglas; Gauntt, Randall O.

    2015-01-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation cooling (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing. The model is expected to be especially valuable in sizing equipment needed in the testing. An additional intent is to use the model in understanding more fully how RCIC apparently managed to operate far removed from its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system or full experimental configuration of which an RCIC system is part. The model reasonably represents an RCIC system today, especially given design operating conditions, but lacks specifics that are likely important in representing the off-design conditions an RCIC system might experience in an emergency situation such as a loss of all electrical power. A known specific lacking in the system model, for example, is the efficiency at which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of an RCIC turbine. To address this specific, the second avenue is being pursued wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will thus complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suited to supporting planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  1. Modeling of the Reactor Core Isolation Cooling Response to Beyond Design Basis Operations - Interim Report

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Cardoni, Jeffrey N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wilson, Chisom Shawn [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Morrow, Charles [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Osborn, Douglas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gauntt, Randall O. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Efforts are being pursued to develop and qualify a system-level model of a reactor core isolation cooling (RCIC) steam-turbine-driven pump. The model is being developed with the intent of employing it to inform the design of experimental configurations for full-scale RCIC testing. The model is expected to be especially valuable in sizing equipment needed in the testing. An additional intent is to use the model in understanding more fully how RCIC apparently managed to operate far removed from its design envelope in the Fukushima Daiichi Unit 2 accident. RCIC modeling is proceeding along two avenues that are expected to complement each other well. The first avenue is the continued development of the system-level RCIC model that will serve in simulating a full reactor system or full experimental configuration of which an RCIC system is part. The model reasonably represents an RCIC system today, especially given design operating conditions, but lacks specifics that are likely important in representing the off-design conditions an RCIC system might experience in an emergency situation such as a loss of all electrical power. A known specific lacking in the system model, for example, is the efficiency at which a flashing slug of water (as opposed to a concentrated jet of steam) could propel the rotating drive wheel of an RCIC turbine. To address this specific, the second avenue is being pursued wherein computational fluid dynamics (CFD) analyses of such a jet are being carried out. The results of the CFD analyses will thus complement and inform the system modeling. The system modeling will, in turn, complement the CFD analysis by providing the system information needed to impose appropriate boundary conditions on the CFD simulations. The system model will be used to inform the selection of configurations and equipment best suited to supporting planned RCIC experimental testing. Preliminary investigations with the RCIC model indicate that liquid water ingestion by the turbine

  2. Long Term Operation R and D to Investigate the Technical Basis for Life Extension and License Renewal Decisions

    International Nuclear Information System (INIS)

    Gaertner, John

    2012-01-01

    Establishing an improved technical basis for long term operation of existing plants is a nuclear industry priority. The Electric Power Research Institute (EPRI) has responded with a comprehensive Long Term Operation (LTO) Program addressing this need for existing nuclear power plants world-wide. The program supports both the business decisions necessary to achieve high performance operation and the licensing requirements for operation beyond 60 years. The program selects its R and D priorities in a structured and objective way, with much industry input, to provide useful results for decisions in the 2014 to 2019 time frame. The program is highly collaborative with the U.S. Department of Energy (DOE) and with EPRI-member utilities. The R and D portfolio includes materials aging (metals, concrete, and cables), modernization of information and control technology, enhanced safety analysis, advanced fuel design, demonstration plant activities, life cycle management, and identification of aging management program needs for subsequent license renewal. The program has focused stakeholders world-wide on the technical issues of long term operation, and it is on track to provide practical results for life extension and license renewal decisions. (author)

  3. Novel theory of the human brain: information-commutation basis of architecture and principles of operation

    Directory of Open Access Journals (Sweden)

    Bryukhovetskiy AS

    2015-02-01

    Full Text Available Abstract: Based on the methodology of the informational approach and research of the genome, proteome, and complete transcriptome profiles of different cells in the nervous tissue of the human brain, the author proposes a new theory of information-commutation organization and architecture of the human brain, which is an alternative to the conventional systemic connective morphofunctional paradigm of the brain framework. Informational principles of brain operation are defined: the modular principle, holographic principle, principle of systematicity of vertical commutative connection and complexity of horizontal commutative connection, regulatory principle, relay principle, modulation principle, "illumination" principle, principle of personalized memory and intellect, and principle of low energy consumption. The author demonstrates that the cortex functions only as a switchboard and router of information, while information is processed outside the nervous tissue of the brain in the intermeningeal space. The main structural element of information-commutation in the brain is not the neuron, but information-commutation modules that are subdivided into receiver modules, transmitter modules, and subscriber modules, forming a vertical architecture of nervous tissue in the brain as information lines and information channels, and a horizontal architecture as central, intermediate, and peripheral information-commutation platforms. Information in information-commutation modules is transferred by means of the carriers that are characteristic of the specific information level, from inductome to genome, transcriptome, proteome, metabolome, secretome, and magnetome

  4. Basis and algorithms applied in modern neutron flux monitoring equipment for WWER. Some results of its operation

    International Nuclear Information System (INIS)

    Alpatov, A. M.; Kamyshan, A. N.; Louzhnov, A. M.; Sokolov, I. V.

    2007-01-01

    The report presents the principle of operation and a description of the equipment complex for monitoring, control and protection by power, period, reactivity and local parameters of the core of a WWER-type reactor. The use of programmable computing means in the NFME made it possible, on the basis of signals from working-range ex-core neutron detectors distributed over the IC channel height, to realize operative, non-inertial monitoring of the mean axial power distribution shape in the core and of its main characteristics (axial offset and axial non-uniformity coefficient). In addition, using information on the position of the control banks and on the coolant temperature in the reactor vessel downcomer, power-correction equipment was designed that eliminates the influence of the above factors on the resulting signal and significantly increases the accuracy of power monitoring. (Authors)

  5. Compressed modes for variational problems in mathematical physics and compactly supported multiresolution basis for the Laplace operator

    Science.gov (United States)

    Ozolins, Vidvuds; Lai, Rongjie; Caflisch, Russel; Osher, Stanley

    2014-03-01

    We will describe a general formalism for obtaining spatially localized (``sparse'') solutions to a class of problems in mathematical physics, which can be recast as variational optimization problems, such as the important case of Schrödinger's equation in quantum mechanics. Sparsity is achieved by adding an L1 regularization term to the variational principle, which is shown to yield solutions with compact support (``compressed modes''). Linear combinations of these modes approximate the eigenvalue spectrum and eigenfunctions in a systematically improvable manner, and the localization properties of compressed modes make them an attractive choice for use with efficient numerical algorithms that scale linearly with the problem size. In addition, we introduce an L1 regularized variational framework for developing a spatially localized basis, compressed plane waves (CPWs), that spans the eigenspace of a differential operator, for instance, the Laplace operator. Our approach generalizes the concept of plane waves to an orthogonal real-space basis with multiresolution capabilities. Supported by NSF Award DMR-1106024 (VO), DOE Contract No. DE-FG02-05ER25710 (RC) and ONR Grant No. N00014-11-1-719 (SO).
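
    For concreteness, the L1-regularized variational principle described above can be written schematically as follows; this is a sketch following the compressed-modes formulation of Ozolins and co-workers, and the normalization and sign conventions may differ from those used in the talk.

      E \;=\; \min_{\{\psi_j\}} \sum_{j=1}^{N} \left( \frac{1}{\mu}\, \|\psi_j\|_{1} \;+\; \langle \psi_j \,|\, \hat{H} \,|\, \psi_j \rangle \right)
      \quad \text{subject to} \quad \langle \psi_j \,|\, \psi_k \rangle = \delta_{jk},

    where the parameter $\mu > 0$ controls the trade-off between energy accuracy and the spatial localization (support size) of the compressed modes $\psi_j$; the CPW construction replaces $\hat{H}$ by the Laplace operator.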

  6. Detection of ionospheric perturbations associated with Japanese earthquakes on the basis of reception of LF transmitter signals on the satellite DEMETER

    Directory of Open Access Journals (Sweden)

    F. Muto

    2008-02-01

    Full Text Available A lot of electromagnetic phenomena associated with earthquakes (EQs) have recently been reported. Among these, the ground-based reception of subionospheric waves from VLF/LF transmitters is recognized as a promising tool to investigate the ionospheric perturbations associated with EQs. This paper deals with the corresponding whistler-mode signals in the upper ionosphere from those VLF/LF transmitters, which are the counterpart of the subionospheric signals. The whistler-mode VLF/LF transmitter signals are detected on board the French satellite DEMETER, launched on 29 June 2004. We have chosen several large Japanese EQs including the Miyagi-oki EQ (16 August 2005; M=7.2, depth=36 km), and the target transmitter is a Japanese LF transmitter (JJY) whose transmitter frequency is 40 kHz. Due to the large longitudinal separation of each satellite orbit (2500 km), we have to adopt a statistical analysis over a rather long period (such as 3 weeks or one month) to have a reliable data set. By analyzing the spatial distribution of JJY signal intensity (in the form of the signal-to-noise ratio, SNR) during a period of 4 months including the Miyagi-oki EQ, we have found significant changes in the intensity; generally the SNR is significantly depleted before the EQ, which is considered to be a precursory ionospheric signature of the EQ. This abnormal effect is reasonably explained in terms of either (1) enhanced absorption of whistler-mode LF signals in the lower ionosphere due to the lowering of the lower ionosphere, or (2) nonlinear wave-wave scattering. Finally, this analysis suggests an important role of satellite observation in the study of lithosphere-atmosphere-ionosphere coupling.

  7. The installation and operation of the seismic instrumentation in Korean NPPs

    International Nuclear Information System (INIS)

    Lee, Kye Hyun; Baek, Yong Lak; Chung, Yun Suk

    1994-01-01

    Including the 7 October 1978 Hongsung earthquake, many earthquakes have occurred in our country. The Korean peninsula is no longer a safety zone against earthquakes, and there is a possibility of earthquake damage. Therefore, it is essential to verify the safety of safety-related facilities in the event of an earthquake. If an earthquake occurs, seismic instrumentation provides information on the vibratory ground motion and the resultant vibratory responses of representative safety-related structures and equipment so that an evaluation can be made immediately as to whether or not the design response spectra have been exceeded. In this paper, general descriptions of the seismic instrumentation installed in domestic NPPs are discussed, including instrument type and location, the Operating Basis Earthquake (OBE) exceedance criteria, and the processing and evaluation of earthquake response data; items to be studied for further enhancement of post-earthquake evaluation techniques are also presented

  8. Evaluation of economic and technical efficiency of diesel engines operation on the basis of volume combustion rate

    Directory of Open Access Journals (Sweden)

    І. О. Берестовой

    2016-11-01

    The article deals with a new approach to the evaluation of the complex efficiency of diesel engines. Traditionally, cylinder capacity, rotation frequency, mean effective pressure in the cylinder, piston stroke, mean piston velocity, specific fuel consumption and other indices are used as generalizing criteria characterizing diesel engine efficiency, but they do not reflect the interrelation between an engine's complex efficiency and the set of economic, mass-dimensional, operational and ecological efficiency indicators. The approach applied in the article makes it possible to reveal and modify the existing methods of solving the problem of improving diesel engine efficiency with due regard to the interrelation of the parameters characterizing the efficiency of their operation. Statistical analyses were carried out, on the basis of which an assumption regarding the existence of an interrelation between specific fuel consumption and the analyzed engine parameters was made. Processing of statistical data for various analyzed functions of diesel engines helped to derive a function illustrating the link between the volume combustion rate, piston area and nominal theoretical specific fuel consumption. The interrelation between the volume combustion rate, nominal parameters of diesel operation and efficiency indices, obtained by processing statistical data of more than 500 models of diesels of different series, was evaluated, its main feature being a mathematical trend. The analysis of the obtained function makes it possible to establish an interrelation between the economic efficiency of a diesel, whose main index is specific fuel consumption, and the volume combustion rate and design peculiarities

  9. Historical earthquake research in Austria

    Science.gov (United States)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently carried out case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  10. Extending the application range of a fuel performance code from normal operating to design basis accident conditions

    International Nuclear Information System (INIS)

    Van Uffelen, P.; Gyori, C.; Schubert, A.; Laar, J. van de; Hozer, Z.; Spykman, G.

    2008-01-01

    Two types of fuel performance codes are generally being applied, corresponding to the normal operating conditions and the design basis accident conditions, respectively. In order to simplify the code management and the interface between the codes, and to take advantage of the hardware progress it is favourable to generate a code that can cope with both conditions. In the first part of the present paper, we discuss the needs for creating such a code. The second part of the paper describes an example of model developments carried out by various members of the TRANSURANUS user group for coping with a loss of coolant accident (LOCA). In the third part, the validation of the extended fuel performance code is presented for LOCA conditions, whereas the last section summarises the present status and indicates needs for further developments to enable the code to deal with reactivity initiated accident (RIA) events

  11. Robust and Stable Disturbance Observer of Servo System for Low Speed Operation Using the Radial Basis Function Network

    DEFF Research Database (Denmark)

    Lee, Kyo-Beum; Blaabjerg, Frede

    2005-01-01

    A new scheme to estimate the moment of inertia in a servo motor drive system at very low speed is proposed in this paper. The speed estimation scheme in most servo drive systems for low-speed operation is sensitive to variations of machine parameters, especially the moment of inertia. To estimate the motor inertia value, an observer using the Radial Basis Function Network (RBFN) is applied. A control law for stabilizing the system and adaptive laws for updating both the weights in the RBFN and a bounding constant are established so that the whole closed-loop system is stable in the sense of Lyapunov. The effectiveness of the proposed inertia estimation is verified by simulations and experiments. It is concluded that the speed control performance in the low-speed region is improved with the proposed disturbance observer using RBFN.
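
    The abstract does not give the estimator equations; as a generic sketch only (not necessarily the authors' exact formulation), a Gaussian RBFN approximates the inertia-dependent disturbance term and its weights are adapted from the speed tracking error:

        \[
          \hat{d}(x) = \sum_{i=1}^{n} w_i \, \exp\!\left( -\frac{\lVert x - c_i \rVert^2}{2\sigma_i^2} \right),
          \qquad
          \dot{w}_i = -\gamma \, e \, \exp\!\left( -\frac{\lVert x - c_i \rVert^2}{2\sigma_i^2} \right),
        \]

    where \(x\) is the measured state (e.g. speed and current), \(c_i\) and \(\sigma_i\) are the centres and widths of the basis functions, \(e\) is the speed tracking error, and \(\gamma > 0\) is an adaptation gain chosen, together with the bounding term mentioned above, so that a Lyapunov function of the closed loop is non-increasing.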

  12. New photonic devices for ultrafast pulse processing operating on the basis of the diffraction-dispersion analogy

    Energy Technology Data Exchange (ETDEWEB)

    Torres-Company, Victor; Minguez-Vega, Gladys; Climent, Vicent; Lancis, Jesus [GROC-UJI, Departament de Fisica, Universitat Jaume I, 12080 Castello (Spain); Andres, Pedro [Departament d'Optica, Universitat de Valencia, 46100 Burjassot (Spain)], E-mail: lancis@fca.uji.es

    2008-11-01

    The space-time analogy is a well-known topic within wave optics that brings together results from beam diffraction and pulse dispersion. On this basis, and taking some classical concepts in optics as a starting point, several photonic devices have been proposed during the last few years with applications in rapidly evolving fields such as ultrafast (femtosecond) optics or RF and microwave signal processing. In this contribution, we briefly review the above ideas, with particular emphasis on the generation of trains of ultrafast pulses from periodic modulation of the phase of a CW laser source. This is the temporal analogue of Fresnel diffraction by a pure phase grating. Finally, we extend the analogy to the partially coherent case, which enables us to design an original technique for wavelength-to-time mapping of the spectrum of a temporally stationary source. Results of laboratory experiments concerning the generation of user-defined radio-frequency waveforms and the filtering of microwave signals are shown. The devices are operated with low-cost incoherent sources.
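
    The diffraction-dispersion analogy underlying these devices can be summarized, in the simplified scalar form that is standard in the literature (up to sign conventions; this formulation is not taken from the paper itself), by the correspondence between paraxial diffraction and second-order dispersion:

        \[
          \frac{\partial E}{\partial z} = \frac{i}{2k} \frac{\partial^2 E}{\partial x^2}
          \quad \longleftrightarrow \quad
          \frac{\partial A}{\partial z} = -\frac{i \beta_2}{2} \frac{\partial^2 A}{\partial T^2},
        \]

    so that the transverse coordinate \(x\) maps onto the retarded time \(T\) and \(1/k\) onto the group-velocity-dispersion coefficient \(-\beta_2\); temporal analogues of Fresnel diffraction, such as pulse-train generation from a periodic phase modulation, follow directly from this mapping.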

  13. On elastic limit margins for earthquake design

    International Nuclear Information System (INIS)

    Buchhardt, F.; Matthees, W.; Magiera, G.

    1987-01-01

    In the Federal Republic of Germany, KTA rule 2201, the basis for the design of nuclear power plants against seismic events, is now under discussion for revision. One of the main demands for modifying KTA rule 2201 is to abandon the existing design philosophy, i.e. design against an operating basis earthquake (AEB) as well as against a safe shutdown earthquake (SEB). Under the present rule the 'lower' earthquake (AEB) can become design-predominant, since for AEB and SEB different types of load cases are to be superimposed with different safety factors. The scope of this study is to quantify by parametric analyses so-called 'elastic bearing capacity limit margins' for seismic events; different seismic input criteria - conventional as well as recently proposed ones - are taken into account to investigate the influence of eventual modifications in the seismic design philosophy. In this way a relation between AEB and SEB has to be defined so that SEB remains predominant for the design while AEB still leads to elastic behaviour. The study covers all German site conditions

  14. Indoor radon and earthquake

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time, on the basis of the experience of the Spitak earthquake (Armenia, December 1988), it is found that an earthquake causes intensive and prolonged radon splashes which, rapidly dispersing in the open near-ground atmosphere, appear in contrast in enclosed premises (dwellings, schools, kindergartens) even if these are at a considerable distance from the earthquake epicenter, and this multiplies the radiation influence on the population. The interval of splashes covers the period from the first fore-shock to the last after-shock, i.e. several months. The area affected by radiation is larger than Armenia's territory. The scale of this impact on the population is 12 times higher than the number of people injured in Spitak, Leninakan and other settlements (toll of injured - 25 000 people; radiation-induced diseases - over 300 000 people). The influence of radiation directly correlates with the earthquake force. Such a conclusion is underpinned by indoor radon monitoring data for Yerevan (120 km from the epicenter) since 1987 - 5450 measurements - and by multivariate analysis identifying cause-and-effect linkages between the geodynamics of indoor radon under stable conditions of the Earth's crust, the behavior of radon in different geological media during earthquakes, the levels of room radon concentrations and the effective equivalent dose, the impact of the radiation dose on health, and statistical data on public health provided by the Ministry of Health. The following hitherto unexplained facts can be considered as consequences of the prolonged radiation influence on the human organism: the long-lasting state of apathy and indifference typical of the population of Armenia for more than a year after the earthquake, the prevalence of malignant cancer forms in the disaster zones, dominated by lung cancer, and so on. All urban territories of seismically active regions are exposed to the threat of natural earthquake-provoked radiation influence

  15. Analog earthquakes

    International Nuclear Information System (INIS)

    Hofmann, R.B.

    1995-01-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository

  16. Developing an Agent-Based Simulation System for Post-Earthquake Operations in Uncertainty Conditions: A Proposed Method for Collaboration among Agents

    Directory of Open Access Journals (Sweden)

    Navid Hooshangi

    2018-01-01

    Agent-based modeling is a promising approach for developing simulation tools for natural hazards in different areas, such as urban search and rescue (USAR) operations. The present study aimed to develop a dynamic agent-based simulation model for post-earthquake USAR operations using a geospatial information system and multi-agent systems (GIS and MASs, respectively). We also propose an approach for dynamic task allocation and establishing collaboration among agents based on the contract net protocol (CNP) and the interval-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS), which consider uncertainty in natural hazards information during agents’ decision-making. The decision-making weights were calculated by the analytic hierarchy process (AHP). In order to implement the system, the earthquake environment was simulated and the damage to buildings and the number of injuries were calculated for Tehran’s District 3: 23%, 37%, 24% and 16% of buildings were in the slight, moderate, extensive and complete vulnerability classes, respectively. The number of injured persons was calculated to be 17,238. Numerical results in 27 scenarios showed that the proposed method is more accurate than the CNP method in terms of USAR operational time (at least a 13% decrease) and the number of human fatalities (at least a 9% decrease). In the interval uncertainty analysis of our proposed simulated system, the lower and upper bounds of uncertain responses are evaluated. The overall results showed that considering uncertainty in task allocation can be highly advantageous in a disaster environment. Such systems can be used to manage and prepare for natural hazards.
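
    As a minimal, hedged sketch of the crisp (non-interval) TOPSIS ranking step mentioned in the abstract (the function, criteria and example numbers below are illustrative assumptions, not taken from the paper), candidate agents or tasks could be scored as follows:

        import numpy as np

        def topsis(decision_matrix, weights, benefit):
            """Rank alternatives with classical TOPSIS.
            decision_matrix: (alternatives x criteria) array
            weights: criterion weights (e.g. from AHP), summing to 1
            benefit: True for criteria to maximize, False to minimize
            """
            X = np.asarray(decision_matrix, dtype=float)
            w = np.asarray(weights, dtype=float)
            benefit = np.asarray(benefit, dtype=bool)
            # Vector-normalize each criterion column, then apply the weights
            V = w * X / np.linalg.norm(X, axis=0)
            # Ideal and anti-ideal points depend on the criterion direction
            ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
            anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
            d_plus = np.linalg.norm(V - ideal, axis=1)
            d_minus = np.linalg.norm(V - anti, axis=1)
            # Relative closeness to the ideal solution; higher is better
            return d_minus / (d_plus + d_minus)

        # Illustrative example: three rescue teams scored on travel time (minimize),
        # capability (maximize) and number of reachable victims (maximize).
        scores = topsis([[2.0, 7, 30], [5.0, 9, 45], [1.5, 4, 20]],
                        weights=[0.5, 0.2, 0.3],
                        benefit=[False, True, True])
        print(scores.argsort()[::-1])  # team indices, best first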

  17. Water supply facility damage and water resource operation at disaster base hospitals in miyagi prefecture in the wake of the Great East Japan Earthquake.

    Science.gov (United States)

    Matsumura, Takashi; Osaki, Shizuka; Kudo, Daisuke; Furukawa, Hajime; Nakagawa, Atsuhiro; Abe, Yoshiko; Yamanouchi, Satoshi; Egawa, Shinichi; Tominaga, Teiji; Kushimoto, Shigeki

    2015-04-01

    The aim of this study was to shed light on damage to water supply facilities and the state of water resource operation at disaster base hospitals in Miyagi Prefecture (Japan) in the wake of the Great East Japan Earthquake (2011), in order to identify issues concerning the operational continuity of hospitals in the event of a disaster. In addition to interview and written questionnaire surveys to 14 disaster base hospitals in Miyagi Prefecture, a number of key elements relating to the damage done to water supply facilities and the operation of water resources were identified from the chronological record of events following the Great East Japan Earthquake. Nine of the 14 hospitals experienced cuts to their water supplies, with a median value of three days (range=one to 20 days) for service recovery time. The hospitals that could utilize well water during the time that water supply was interrupted were able to obtain water in quantities similar to their normal volumes. Hospitals that could not use well water during the period of interruption, and hospitals whose water supply facilities were damaged, experienced significant disruption to dialysis, sterilization equipment, meal services, sanitation, and outpatient care services, though the extent of disruption varied considerably among hospitals. None of the hospitals had determined the amount of water used for different purposes during normal service or formulated a plan for allocation of limited water in the event of a disaster. The present survey showed that it is possible to minimize the disruption and reduction of hospital functions in the event of a disaster by proper maintenance of water supply facilities and by ensuring alternative water resources, such as well water. It is also clear that it is desirable to conclude water supply agreements and formulate strategic water allocation plans in preparation for the eventuality of a long-term interruption to water services.

  18. Sensing the earthquake

    Science.gov (United States)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what are the principles underlying that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim to convert earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience into a multisensory exploration of the earthquake phenomenon, through the stimulation of the hearing, eyesight and perception of the movements (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  19. Evaluation of earthquake vibration on aseismic design of nuclear power plant judging from recent earthquakes

    International Nuclear Information System (INIS)

    Dan, Kazuo

    2006-01-01

    The Regulatory Guide for Aseismic Design of Nuclear Reactor Facilities was revised on 19 September 2006. Six factors for the evaluation of earthquake vibration are considered on the basis of recent earthquakes. They are 1) evaluation of earthquake vibration by a method using a fault model, 2) investigation and approval of active faults, 3) direct-hit earthquakes, 4) assumption of a short active fault as the hypocentral fault, 5) locality of the earthquake and the earthquake vibration, and 6) remaining risk. The guiding principle of the revision required a new evaluation method of earthquake vibration using a fault model and an evaluation of the probability of earthquake vibration. The remaining risk means that facilities and people are endangered when an earthquake stronger than the design basis occurs; accordingly, the scatter has to be considered in the evaluation of earthquake vibration. The earthquake belt and strong vibration pulse of the 1995 Hyogo-Nanbu earthquake, the relation between the length of the surface earthquake fault and the hypocentral fault, and the distribution of seismic intensity of the 1993 off-Kushiro earthquake are shown. (S.Y.)

  20. Rapid estimation of the economic consequences of global earthquakes

    Science.gov (United States)

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, which is a multiplying factor to account for the disparity between wealth and/or economic assets to the annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. This report describes the development of a country or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently-operating PAGER system. PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of earthquake) to implement on a global basis
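
    The exposure-times-loss-ratio logic described above can be sketched as follows; the intensity bins, population counts, GDP figure, exposure correction factor and lognormal parameters are placeholders for illustration, not PAGER's calibrated values:

        import math

        def economic_loss_estimate(pop_by_intensity, gdp_per_capita,
                                   exposure_correction, theta, beta):
            """Approximate loss as a sum over shaking-intensity bins of
            (economic exposure) x (lognormal loss ratio)."""
            total = 0.0
            for intensity, population in pop_by_intensity.items():
                # Proxy economic exposure: per-capita GDP x exposed population,
                # scaled to approximate total assets rather than annual output.
                exposure = gdp_per_capita * population * exposure_correction
                # Loss ratio modeled as a lognormal CDF of shaking intensity.
                loss_ratio = 0.5 * (1.0 + math.erf(
                    math.log(intensity / theta) / (beta * math.sqrt(2.0))))
                total += exposure * loss_ratio
            return total

        # Hypothetical exposure at shaking intensity levels 6 to 9.
        loss = economic_loss_estimate(
            {6: 2_000_000, 7: 500_000, 8: 120_000, 9: 10_000},
            gdp_per_capita=5000.0, exposure_correction=3.0,
            theta=9.0, beta=0.3)
        print("estimated loss: %.1f billion USD" % (loss / 1e9))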

  1. Development of assessment system for tank earthquake-proof design (ASTEP code) installing automatic operation and knowledge database

    International Nuclear Information System (INIS)

    Maekawa, Akira; Suzuki, Michiaki; Fujii, Yuzo

    2004-01-01

    In a nuclear power station, the seismic-proof design of the various tanks classified as auxiliary installations is required to follow the technical guideline for the seismic-proof design of nuclear power stations, referred to below as JEAC4601 for short. This guideline uses a simple mechanical multi-mass model, but its rather complicated evaluation method requires designers to have knowledge and experience and consumes both time and labor. To resolve those difficulties, the Assessment System for Tank Earthquake-Proof Design, called ASTEP for short, has been developed and equipped with automated processing and a knowledge database. For this system, the targeted types of tank are a vertical cylindrical tank that has four supports or a skirt support, a horizontal cylindrical tank that has two saddle supports, and a vertical cylindrical tank or water storage tank with a flat bottom. The system integrates all the tools related to seismic-proof design evaluation and is equipped with step-by-step menus following the order of the flowchart, which enables designers to use them easily. In addition, it has an input aid that enables users to enter data with ease and a tool that automatically calculates input parameters. The system therefore reduces the workload related to seismic-proof design evaluation dramatically and does not require much knowledge and experience in this field. Furthermore, the system organizes past statements and technical documents related to seismic-proof design as a knowledge database, so users can obtain output identical to manual calculation results. Comparing the output of the ASTEP code with the manual calculation results for a typical tank that requires government approval of its design evaluation document, the error was less than one percent, so the validity of the system was confirmed. The system has gained favorable comments during its trial run, beyond our expectation. (author)

  2. Connecting slow earthquakes to huge earthquakes

    OpenAIRE

    Obara, Kazushige; Kato, Aitaro

    2016-01-01

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of th...

  3. OBJECTIFICATION OF ERGONOMIC ASSESSMENT OF THE PILOT OPERATION ON THE BASIS OF “PHOTOSHOP – TECHNOLOGY”

    Directory of Open Access Journals (Sweden)

    Volodymyr Gorbunov

    2013-12-01

    On the basis of the “Photoshop” information technology, a means of objective ergonomic evaluation of the pilot's professional activity in piloting the aircraft has been developed. The procedural characteristics, peculiarities and objective criteria for whether the pilot's work with aviation equipment is ergonomically acceptable have been determined, as well as the procedure for revealing the ergonomic shortcomings in the arrangement of the working place that decrease flight safety from the standpoint of the human factor

  4. Approach to evaluation and prediction of lifetime characteristics of NPP valve on the basis of operation data

    International Nuclear Information System (INIS)

    Emelyanov, V.; Kamyshnikov, O.; Dovgalyuk, V.; Plying, B.

    1994-01-01

    The report contains a brief description of the main activity stages for testing, evaluation and prediction of reliability factors (including characteristics and factors of longevity) for NPP operating equipment. Valves equipped with electric drives installed in the level control system of the steam generator of a WWER-1000 reactor are taken as an example. The main emphasis is placed on the classification of failures which occurred during operation, on detection of the prevailing ageing mechanisms and on assessment of operational reliability factors and methods of their testing, assessment and prediction. The principles of selecting product ageing parameters are briefly described, as well as the mathematical methods used for quantitative assessment of product reliability factors from operation data. The report includes considerations on the procedure for operational evaluation, testing and prediction of complex unique equipment based on tracking the path of state vectors; the components of these vectors are the probabilities that the defining parameters to be tested, which characterize the operability of the set of components, remain within the boundaries assumed in the design and operation documentation. 9 refs, 4 figs

  5. Selection of operating parameters on the basis of hydrodynamics in centrifugal partition chromatography for the purification of nybomycin derivatives.

    Science.gov (United States)

    Adelmann, S; Baldhoff, T; Koepcke, B; Schembecker, G

    2013-01-25

    The selection of solvent systems in centrifugal partition chromatography (CPC) is the most critical point in setting up a separation. Therefore, much research has been done on this topic in the last decades. However, the selection of suitable operating parameters (mobile phase flow rate, rotational speed and mode of operation) with respect to hydrodynamics and the pressure drop limit in CPC is still mainly driven by the experience of the chromatographer. In this work we used hydrodynamic analysis for the prediction of the most suitable operating parameters. After selection of different solvent systems with respect to partition coefficients for the target compound, the hydrodynamics were visualized. Based on flow pattern and retention, the operating parameters were selected for the purification runs of nybomycin derivatives that were carried out with a 200 ml FCPC® rotor. The results have proven that the selection of optimized operating parameters by analysis of the hydrodynamics alone is possible. As the hydrodynamics are predictable from the physical properties of the solvent system, the optimized operating parameters can be estimated, too. Additionally, we found that dispersion and especially retention are improved if the less viscous phase is mobile. Crown Copyright © 2012. Published by Elsevier B.V. All rights reserved.

  6. Earthquake Facts

    Science.gov (United States)

    ... North Dakota, and Wisconsin. The core of the earth was the first internal structural element to be identified. In 1906 R.D. Oldham discovered it from his studies of earthquake records. The inner core is solid, and the outer core is liquid and so does not transmit ...

  7. Understanding Earthquakes

    Science.gov (United States)

    Davis, Amanda; Gray, Ron

    2018-01-01

    December 26, 2004 was one of the deadliest days in modern history, when a 9.3 magnitude earthquake--the third largest ever recorded--struck off the coast of Sumatra in Indonesia (National Centers for Environmental Information 2014). The massive quake lasted at least 10 minutes and devastated the Indian Ocean. The quake displaced an estimated…

  8. Physics of Earthquake Rupture Propagation

    Science.gov (United States)

    Xu, Shiqing; Fukuyama, Eiichi; Sagy, Amir; Doan, Mai-Linh

    2018-05-01

    A comprehensive understanding of earthquake rupture propagation requires the study of not only the sudden release of elastic strain energy during co-seismic slip, but also of other processes that operate at a variety of spatiotemporal scales. For example, the accumulation of the elastic strain energy usually takes decades to hundreds of years, and rupture propagation and termination modify the bulk properties of the surrounding medium that can influence the behavior of future earthquakes. To share recent findings in the multiscale investigation of earthquake rupture propagation, we held a session entitled "Physics of Earthquake Rupture Propagation" during the 2016 American Geophysical Union (AGU) Fall Meeting in San Francisco. The session included 46 poster and 32 oral presentations, reporting observations of natural earthquakes, numerical and experimental simulations of earthquake ruptures, and studies of earthquake fault friction. These presentations and discussions during and after the session suggested a need to document more formally the research findings, particularly new observations and views different from conventional ones, complexities in fault zone properties and loading conditions, the diversity of fault slip modes and their interactions, the evaluation of observational and model uncertainties, and comparison between empirical and physics-based models. Therefore, we organize this Special Issue (SI) of Tectonophysics under the same title as our AGU session, hoping to inspire future investigations. Eighteen articles (marked with "this issue") are included in this SI and grouped into the following six categories.

  9. 26 CFR 1.832-6 - Policyholders of mutual fire or flood insurance companies operating on the basis of premium...

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 8 2010-04-01 2010-04-01 false Policyholders of mutual fire or flood insurance... Insurance Companies § 1.832-6 Policyholders of mutual fire or flood insurance companies operating on the... taxpayer insured by a mutual fire or flood insurance company under a policy for which the premium deposit...

  10. Design basis knowledge management for the newcomer countries. Relying on the owner/operator as a knowledgeable customer

    International Nuclear Information System (INIS)

    Lepouze, Benoît

    2013-01-01

    Becoming a knowledgeable customer is the first step to manage knowledge:
    • Vendors, consulting firms and TSOs can assist the future operator, but it will remain the sole owner of the decisions;
    • The future owner/operator has to become a knowledgeable customer: know what to ask for, know how to ask for it, know how to check if it got what it asked for;
    • Where should knowledge management belong (management? HRD? procurement?) and is it important?
    What it means for DBKM (example):
    • The owner/operator (the licensee) is responsible in front of the safety agency: it should answer its questions at every stage of the program;
    • It will often turn back to its vendor/suppliers, especially for detailed design questions;
    • But that means it has to know what to ask for and to check the result before talking to the regulator;
    • That also means it has to make sure knowledge is managed throughout the life of the program

  11. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    Science.gov (United States)

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  12. On the improvement of the response capability of the control room operator in a pressurized water reactor nuclear power plant in a severe earthquake through the use of emergency response guidelines

    International Nuclear Information System (INIS)

    Lee, S.

    1989-01-01

    Recent probabilistic risk assessment studies indicate that potential accidents initiated by large earthquakes are among the major contributors to public risk from nuclear power plants. During a severe earthquake, the symptoms presented to operators may be unreliable and may endanger the validity of actions in emergency response guidelines (ERGs). The objective of the present study is to improve the operator's capability of responding to seismic damage through the use of ERGs. The methods used are to deterministically identify the possible weaknesses of ERGs, given a severe earthquake, and to probabilistically evaluate those identified weaknesses. Several cases are postulated. Each of them contains system failures with or without indicator failures and leads the core to meltdown conditions if the operator follows the ERGs strictly without any deviation. The likelihood of each case is estimated. A LISP program is developed to estimate the plant seismic risk, with which the relative risk contribution of each postulated case is estimated. As a result, ten cases are postulated and possible remedies for each case are discussed. The likelihood of each case is estimated to be not negligible. The identified indicator failures should be considered in future refinements of the ERGs. The development of an expert system to provide remedial procedures should be considered after a more thorough study in which many more cases are postulated

  13. Connecting slow earthquakes to huge earthquakes.

    Science.gov (United States)

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  14. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Gunther, W.; Martinez-Guridi, G.

    2010-01-01

    New and advanced reactors will use integrated digital instrumentation and control (I and C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I and C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I and C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I and C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I and C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I and C conditions as part of the design process and the HSI features and functions that support operators to monitor I and C performance and manage I and C degradations when they occur. In addition, we identified topics for future research.

  15. The Effects of Degraded Digital Instrumentation and Control Systems on Human-system Interfaces and Operator Performance: HFE Review Guidance and Technical Basis

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; W. Gunther; G. Martinez-Guridi

    2010-02-26

    New and advanced reactors will use integrated digital instrumentation and control (I&C) systems to support operators in their monitoring and control functions. Even though digital systems are typically highly reliable, their potential for degradation or failure could significantly affect operator performance and, consequently, impact plant safety. The U.S. Nuclear Regulatory Commission (NRC) supported this research project to investigate the effects of degraded I&C systems on human performance and plant operations. The objective was to develop human factors engineering (HFE) review guidance addressing the detection and management of degraded digital I&C conditions by plant operators. We reviewed pertinent standards and guidelines, empirical studies, and plant operating experience. In addition, we conducted an evaluation of the potential effects of selected failure modes of the digital feedwater system on human-system interfaces (HSIs) and operator performance. The results indicated that I&C degradations are prevalent in plants employing digital systems and the overall effects on plant behavior can be significant, such as causing a reactor trip or causing equipment to operate unexpectedly. I&C degradations can impact the HSIs used by operators to monitor and control the plant. For example, sensor degradations can make displays difficult to interpret and can sometimes mislead operators by making it appear that a process disturbance has occurred. We used the information obtained as the technical basis upon which to develop HFE review guidance. The guidance addresses the treatment of degraded I&C conditions as part of the design process and the HSI features and functions that support operators to monitor I&C performance and manage I&C degradations when they occur. In addition, we identified topics for future research.

  16. Earthquake accelerations estimation for construction calculating with different responsibility degrees

    International Nuclear Information System (INIS)

    Dolgaya, A.A.; Uzdin, A.M.; Indeykin, A.V.

    1993-01-01

    The object of the investigation is the design amplitude of accelerograms, which is used in the evaluation of the seismic stability of critical structures, first and foremost NPS. The amplitude level is established depending on the degree of responsibility of the structure and on the prevailing period of earthquake action at the construction site. The investigation procedure is based on a statistical analysis of 310 earthquakes. At the first stage of statistical data processing we established the correlation dependence of both the mathematical expectation and the root-mean-square deviation of the peak acceleration of an earthquake on its prevailing period. At the second stage the most suitable law of acceleration distribution about the mean was chosen. To determine the parameters of this distribution, we specified the maximum conceivable acceleration, which must not be exceeded. The other parameters of the distribution were determined from the statistical data. At the third stage the dependences of the design amplitude on the prevailing period of the seismic effect were established for different structures and equipment. The obtained data made it possible to recommend levels of the safe-shutdown (SSB) and operating basis (OBE) earthquakes for objects of various responsibility categories when designing NPS. (author)
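
    The abstract does not give the regression form or the distribution law; as an illustrative sketch only (the functional form, the catalogue values and the exceedance level below are assumptions), the three-stage procedure could be mimicked by fitting the mean and scatter of log peak acceleration against the prevailing period and reading a truncated design amplitude from the fit:

        import numpy as np

        # Stage 1: hypothetical catalogue of (prevailing period [s], peak acceleration [g])
        periods = np.array([0.1, 0.2, 0.3, 0.5, 0.8, 1.2])
        peak_acc = np.array([0.45, 0.38, 0.30, 0.22, 0.15, 0.10])

        # Fit the mean of log-acceleration as a linear function of log-period;
        # the scatter of the residuals plays the role of the RMS deviation.
        X = np.vstack([np.ones_like(periods), np.log(periods)]).T
        coef, *_ = np.linalg.lstsq(X, np.log(peak_acc), rcond=None)
        sigma = (np.log(peak_acc) - X @ coef).std(ddof=1)

        # Stages 2-3: design amplitude for a given prevailing period and number of
        # standard deviations k (responsibility category), truncated at a maximum
        # conceivable acceleration a_max.
        def design_amplitude(period, k, a_max):
            mean_log = coef[0] + coef[1] * np.log(period)
            return min(float(np.exp(mean_log + k * sigma)), a_max)

        print(design_amplitude(period=0.4, k=2.0, a_max=0.6))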

  17. A novel method for flow pattern identification in unstable operational conditions using gamma ray and radial basis function

    International Nuclear Information System (INIS)

    Roshani, G.H.; Nazemi, E.; Roshani, M.M.

    2017-01-01

    Changes of fluid properties (especially density) strongly affect the performance of a radiation-based multiphase flow meter and can cause errors in recognizing the flow pattern and determining the void fraction. In this work, we proposed a methodology based on a combination of multi-beam gamma ray attenuation and dual-modality densitometry techniques using an RBF neural network in order to recognize the flow regime and determine the void fraction in gas-liquid two-phase flows independent of liquid phase changes. The proposed system consists of one 137Cs source, two transmission detectors and one scattering detector. The registered counts in the two transmission detectors were used as the inputs of one primary Radial Basis Function (RBF) neural network for recognizing the flow regime independent of the liquid phase density. Then, after flow regime identification, three RBF neural networks were utilized for determining the void fraction independent of the liquid phase density. The registered counts in the scattering detector and the first transmission detector were used as the inputs of these three RBF neural networks. Using this simple methodology, all the flow patterns were correctly recognized and the void fraction was predicted independent of the liquid phase density with a mean relative error (MRE) of less than 3.28%. - Highlights: • Flow regime and void fraction were determined in two phase flows independent of the liquid phase density changes. • An experimental structure was set up and the required data was obtained. • 3 detectors and one gamma source were used in detection geometry. • RBF networks were utilized for flow regime and void fraction determination.
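
    A minimal sketch of a Gaussian RBF network of the kind used above, mapping the counts registered in the two transmission detectors to a flow-regime class, is given below; the centres, width, training data and least-squares training step are illustrative assumptions, not the authors' setup:

        import numpy as np

        def rbf_features(X, centers, width):
            """Gaussian radial basis functions evaluated for each input row."""
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2.0 * width ** 2))

        # Hypothetical training data: normalized counts in the two transmission
        # detectors, labeled with three flow regimes (0=annular, 1=stratified, 2=bubbly).
        X_train = np.array([[0.90, 0.80], [0.85, 0.75], [0.50, 0.60],
                            [0.45, 0.55], [0.20, 0.30], [0.25, 0.35]])
        y_train = np.array([0, 0, 1, 1, 2, 2])

        centers, width = X_train, 0.2          # one centre per training sample
        Phi = rbf_features(X_train, centers, width)

        # Train the output weights by least squares against one-hot regime labels.
        W, *_ = np.linalg.lstsq(Phi, np.eye(3)[y_train], rcond=None)

        def predict_regime(counts):
            phi = rbf_features(np.atleast_2d(counts), centers, width)
            return int(np.argmax(phi @ W, axis=1)[0])

        print(predict_regime([0.48, 0.58]))    # should land in the regime-1 class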

  18. The limits of earthquake early warning: Timeliness of ground motion estimates

    OpenAIRE

    Minson, Sarah E.; Meier, Men-Andrin; Baltay, Annemarie S.; Hanks, Thomas C.; Cochran, Elizabeth S.

    2018-01-01

    The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions aroun...

  19. Defeating Earthquakes

    Science.gov (United States)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the Century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti, quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  20. Improving the efficiency of heat supply systems on the basis of plants operating on organic Rankine cycle

    Science.gov (United States)

    Solomin, I. N.; Daminov, A. Z.; Sadykov, R. A.

    2017-11-01

    Results of experimental and analytical studies of the plant's main element, the turbomachine (turbo-expander) operating on the organic Rankine cycle, were obtained for facilities of small-scale power generation heat supply systems. Combined mathematical modeling and experimental studies showed that the best working medium for the turbomachines of these plants is Freon R245fa, which has the most suitable calorimetric properties for use in the cycle. A mathematical model of the gas flow in the turbomachine was developed. The main engineering dependences for calculating the optimal design parameters of the turbomachine were obtained. The engineering problems of providing the minimum axial size of the turbomachine impeller were solved and the main design elements were unified.

  1. An examination of qualitative plant modelling as a basis for knowledge-based operator aids in nuclear power stations

    International Nuclear Information System (INIS)

    Herbert, M.; Williams, G.

    1986-01-01

    New qualitative techniques for representing the behaviour of physical systems have recently been developed. These allow a qualitative representation to be formally derived from a quantitative plant model. One such technique, Incremental Qualitative Analysis, is based on manipulating qualitative differential equations, called confluences, using sign algebra. This is described and its potential for reducing the amount of information presented to the reactor operator is discussed. In order to illustrate the technique, a specific example relating to the influence of failures associated with a pressurized water reactor pressuriser is presented. It is shown that, although failures cannot necessarily be diagnosed unambiguously, the number of possible failures inferred is low. Techniques for discriminating between these possible failures are discussed. (author)
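
    As an illustration of the sign algebra that confluences are built on (the example confluence and variable names are invented for illustration; the paper's Incremental Qualitative Analysis formulation may differ in detail), qualitative values can be combined as follows:

        # Qualitative signs: '-', '0', '+', and '?' for "unknown/ambiguous".
        def q_add(a, b):
            """Qualitative sum: opposing non-zero signs give an ambiguous result."""
            if a == '0':
                return b
            if b == '0':
                return a
            return a if a == b else '?'

        # Illustrative confluence: the qualitative change in pressuriser pressure
        # follows the qualitative changes in coolant temperature and level.
        def d_pressure(d_temperature, d_level):
            return q_add(d_temperature, d_level)

        print(d_pressure('+', '0'))   # temperature rising, level steady  -> '+'
        print(d_pressure('+', '-'))   # opposing influences               -> '?'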

  2. Earthquake Early Warning Systems

    OpenAIRE

    Pei-Yang Lin

    2011-01-01

    Because of Taiwan’s unique geographical environment, earthquake disasters occur frequently in Taiwan. The Central Weather Bureau collated earthquake data from between 1901 and 2006 (Central Weather Bureau, 2007) and found that 97 earthquakes had occurred, of which, 52 resulted in casualties. The 921 Chichi Earthquake had the most profound impact. Because earthquakes have instant destructive power and current scientific technologies cannot provide precise early warnings in advance, earthquake ...

  3. Radiation protection aspects gained from the operation of FBTR. Basis for approach and criteria for future LMFBRs

    International Nuclear Information System (INIS)

    Meenakshisundaram, V.; Jose, M. T.

    2008-01-01

    Health Physics experience gained from the operation of the Fast Breeder Test Reactor over more than twenty years is outlined. The topics include area monitoring, stack monitoring, annual discharge of activity released vis-a-vis technical specification limits, personnel monitoring including man-rem expenditure, waste disposal, etc. Basic aspects of the Radiation and Air Activity Monitoring System (RAAMS), meant to monitor and record the radiation and air activity levels at various controlled areas in the FBTR complex, are given. The installation, calibration and usefulness of special monitors unique to LMFBRs, such as gas flow ion chambers in the Clad Rupture Detection (CRD) argon circuit for detection of gaseous fission products, fume activity monitors in the ventilation ducts to indicate sodium leaks/fires, sodium aerosol detection monitors in the primary double envelope sampling line and gas activity monitors, are highlighted. Radiologically significant incidents, such as a minor sodium leak in the primary purification system in 2002, and special operations are reported. The experience gained during successful handling, treatment and disposal of active primary sodium and decontamination of active sodium-bearing components following the steam-nitrogen process is brought out. Towards controlling external exposures to occupational workers during maintenance work, the salient features of the study conducted to assess the deposition of radioactive corrosion and activation products and dose rates in the primary sodium pipelines and various components of FBTR, which are housed in B-cells, are highlighted. The environmental aspects of LMFBRs are also briefly outlined. The lessons learnt from the experience gained, such as lowering of the alarm limit for particulate activity monitors to enable detection of a primary sodium leak within the reactor containment building, identification of the deposition of 54Mn in the interiors of the primary sodium lines as a major contributor to the external dose component, the

  4. Earthquake evaluation of a substation network

    International Nuclear Information System (INIS)

    Matsuda, E.N.; Savage, W.U.; Williams, K.K.; Laguens, G.C.

    1991-01-01

    The impact of the occurrence of a large, damaging earthquake on a regional electric power system is a function of the geographical distribution of strong shaking, the vulnerability of various types of electric equipment located within the affected region, and operational resources available to maintain or restore electric system functionality. Experience from numerous worldwide earthquake occurrences has shown that seismic damage to high-voltage substation equipment is typically the reason for post-earthquake loss of electric service. In this paper, the authors develop and apply a methodology to analyze earthquake impacts on Pacific Gas and Electric Company's (PG and E's) high-voltage electric substation network in central and northern California. The authors' objectives are to identify and prioritize ways to reduce the potential impact of future earthquakes on our electric system, refine PG and E's earthquake preparedness and response plans to be more realistic, and optimize seismic criteria for future equipment purchases for the electric system

  5. Earthquake damage to underground facilities

    International Nuclear Information System (INIS)

    Pratt, H.R.; Hustrulid, W.A.; Stephenson, D.E.

    1978-11-01

    The potential seismic risk for an underground nuclear waste repository will be one of the considerations in evaluating its ultimate location. However, the risk to subsurface facilities cannot be judged by applying intensity ratings derived from the surface effects of an earthquake. A literature review and analysis were performed to document the damage and non-damage due to earthquakes to underground facilities. Damage from earthquakes to tunnels, mines, and wells and damage (rock bursts) from mining operations were investigated. Damage from documented nuclear events was also included in the study where applicable. There are very few data on damage in the subsurface due to earthquakes. This fact itself attests to the lessened effect of earthquakes in the subsurface, because mines exist in areas where strong earthquakes have done extensive surface damage. More damage is reported in shallow tunnels near the surface than in deep mines. In mines and tunnels, large displacements occur primarily along pre-existing faults and fractures or at the surface entrance to these facilities. Data indicate that vertical structures such as wells and shafts are less susceptible to damage than surface facilities. More analysis is required before seismic criteria can be formulated for the siting of a nuclear waste repository

  6. Consideration for standard earthquake vibration (1). The Niigataken Chuetsu-oki Earthquake in 2007

    International Nuclear Information System (INIS)

    Ishibashi, Katsuhiko

    2007-01-01

    An outline of the new guideline for the quakeproof design standard of nuclear power plants and of the standard earthquake vibration is given. The improvement points of the new guideline are discussed on the basis of the Kashiwazaki-Kariwa Nuclear Power Plant incidents, and the fundamental limits of the new guideline are pointed out. The positioning of the quakeproof design standard of nuclear power plants, JEAG4601 of the Japan Electric Association, the new guideline, the standard earthquake vibration of the new guideline, the Niigataken Chuetsu-oki Earthquake in 2007 and the damage to the Kashiwazaki-Kariwa Nuclear Power Plant are discussed. The safety criteria of the safety review system, organization, standards and guidelines should be improved on the basis of this earthquake and the nuclear plant accident. The common understanding that 'a nuclear power plant is not constructed in an area where a large earthquake is expected' has to be realized. The precondition is that nuclear power plants should not cause damage to anything. (S.Y.)

  7. Successful operation of biogas plants. Data acquisition as a basis of successful optimization measures; Erfolgreicher Betrieb von Biogasanlagen. Datenerfassung als Grundlage erfolgreicher Optimierungsmassnahmen

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-09-19

    At the 2nd Bayreuth expert meeting on biomass, held on 6 June 2012 in Bayreuth (Federal Republic of Germany), the following lectures were held: (1) Presentation of the activities in the bioenergy sector of the Landwirtschaftliche Lehranstalt Bayreuth (Rainer Prischenk); (2) State of the art of utilizing biogas in Oberfranken from the view of FVB e.V. (Wolfgang Holland Goetz); (3) Optimization of plant operation by means of an intelligent control (Christian Seier); (4) Process optimization by means of identification of biogas losses and evaluation of the load and emission behaviour of gas engines (Wolfgang Schreier); (5) Data acquisition and implementation of optimization measures from the point of view of an environmental verifier (Thorsten Grantner); (6) Economic analysis and optimization by means of the LfL program BZA Biogas (Josef Winkler); (7) Detailed data acquisition as a necessary basis of process optimization (Timo Herfter); (8) Case examples of the biological support of biogas plants and their correct evaluation (Birgit Pfeifer); (9) Systematic acquisition of operational data as a basis for increasing efficiency, using the Praxisforschungsbiogasanlage of the University of Hohenheim (Hans-Joachim Naegele); (10) Practical report: the biogas plant Sochenberg on the way towards 100% utilization of energy (Uli Bader).

  8. Actions at Kashiwazaki Kariwa Nuclear Power Station after the Niigataken Chuetsu-oki earthquake

    International Nuclear Information System (INIS)

    Orita, Shuichi

    2009-01-01

    'The Niigataken Chuetsu-oki Earthquake in 2007' occurred on July 16, 2007, and seismic motions beyond those of the design basis earthquake were recorded at the Kashiwazaki Kariwa nuclear power station located near the epicenter. After the earthquake, inspections and seismic response analyses have been performed to grasp the seismically induced impacts on structures, systems and components (SSCs). In addition, re-definition of the design basis earthquake, upgrading, and measures against disasters have also been conducted. (author)

  9. A smartphone application for earthquakes that matter!

    Science.gov (United States)

    Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert

    2014-05-01

    Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some of them having been downloaded more than 1 million times! The advantages are obvious: wherever they are, users can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information reaches you automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? What are the earthquakes that really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses scrap the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter the most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre), with the financial support of the Fondation MAIF, aims at providing suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected

  10. Operation and profits of energy boards. A study of the basis of municipal business activities and the equitableness of the profits of municipal energy boards

    International Nuclear Information System (INIS)

    Karhu, V.; Nissinen, T.; Valkama, P.

    1999-01-01

    The objective of the empirical part of the study (Chapter 6) is to evaluate the equitableness of the profits on capital invested of the 16 municipal energy boards selected for this study and, at the same time, to create a general evaluation basis for equity decisions made by the authorities case by case. In this part of the study, answers are sought to the following questions: (1) how has the economic situation of the energy boards studied been developing recently, based on various economic parameters? (2) have there been differences in the returns and profitability of energy boards operating as public utilities and energy boards operating in company form? (3) what kind of price level have the energy boards studied maintained in relation to the national averages in this field? (4) is a city in a weaker economic position more tempted to require higher profits on capital invested than a city with a sound economic basis? (5) how high a profit on capital invested can be considered reasonable for the whole energy board and particularly for a network business holding a monopoly? The structure of the study is as follows. Chapter 2 contains a brief description of the energy boards selected for this study and of the economic situation of the cities owning them. The theoretical part of the study is included in Chapter 3, 'Municipal Self-Government and Business'. It analyses in some depth the terminology of municipal business, the norm basis, the steering of actions, the restructuring of companies into business profit centres and privatisation, as well as the application of the Act on Restrictions on Competition from the standpoint of municipal self-government. Chapter 4 deals with the establishment of energy board activity, the legal basis, and the criteria for pricing electricity, network services and district heat. Chapter 5 examines the Act on Restrictions on Competition as a regulator of the energy board activities. After this, there are the presentations of the research results of the

  11. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    Science.gov (United States)

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  12. Magnitudes and frequencies of earthquakes in relation to seismic risk

    International Nuclear Information System (INIS)

    Sharma, R.D.

    1989-01-01

    Estimating the frequencies of occurrence of earthquakes of different magnitudes on a regional basis is an important task in estimating seismic risk at a construction site. Analysis of global earthquake data provides an insight into the magnitude-frequency relationship in a statistical manner. It turns out that, whereas a linear relationship between the logarithm of earthquake occurrence rates and the corresponding earthquake magnitudes fits well in the magnitude range between 5 and 7, a second-degree polynomial in M, the earthquake magnitude, provides a better description of the frequencies of earthquakes over a much wider range of magnitudes. It may be possible to adapt such magnitude-frequency relations to regions for which adequate earthquake data are not available, in order to carry out seismic risk calculations. (author). 32 refs., 8 tabs., 7 figs
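
    The contrast drawn above is between the familiar linear (Gutenberg-Richter) form, log10 N = a - bM, and a second-degree polynomial in M fitted over a wider magnitude range. The sketch below shows one way the two fits could be compared; the magnitude bins and counts are invented for illustration and are not taken from the paper.

```python
import numpy as np

# Illustrative annual counts of earthquakes at or above each magnitude
# (hypothetical values, not from the paper or any real catalogue).
mags = np.arange(4.0, 8.1, 0.5)
counts = np.array([1200.0, 420.0, 150.0, 52.0, 17.0, 5.5, 1.6, 0.4, 0.08])
log_n = np.log10(counts)

# Linear (Gutenberg-Richter) fit: log10 N = a + b*M  (b comes out negative)
b_lin, a_lin = np.polyfit(mags, log_n, 1)

# Second-degree polynomial fit: log10 N = c0 + c1*M + c2*M^2
c2, c1, c0 = np.polyfit(mags, log_n, 2)

# Compare residuals of the two descriptions
res_lin = log_n - np.polyval([b_lin, a_lin], mags)
res_qua = log_n - np.polyval([c2, c1, c0], mags)
print(f"linear:    log10 N = {a_lin:.2f} + ({b_lin:.2f})*M,  RMS residual {np.sqrt(np.mean(res_lin**2)):.3f}")
print(f"quadratic: log10 N = {c0:.2f} + ({c1:.2f})*M + ({c2:.3f})*M^2,  RMS residual {np.sqrt(np.mean(res_qua**2)):.3f}")
```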

  13. Guidelines for earthquake ground motion definition for the eastern United States

    International Nuclear Information System (INIS)

    Gwaltney, R.C.; Aramayo, G.A.; Williams, R.T.

    1985-01-01

    Guidelines for the determination of earthquake ground-motion definition for the eastern United States are established in this paper. Both far-field and near-field guidelines are given. The guidelines were based on an extensive review of the current procedures for specifying ground motion in the United States. Both empirical and theoretical procedures were used in establishing the guidelines because of the low seismicity in the eastern United States. Only a few large-to-great (M > 7.5) earthquakes have occurred in this region, no evidence of tectonic surface rupture related to historic or Holocene earthquakes has been found, and no currently active plate boundaries of any kind are known in the region. Very little instrumented data have been gathered in the East. Theoretical procedures are proposed so that in regions with almost no data a reasonable level of seismic ground motion activity can be assumed. The guidelines are to be used to develop the Safe Shutdown Earthquake, SSE. A new procedure for establishing the Operating Basis Earthquake, OBE, is proposed, in particular for the eastern United States. The OBE would be developed using a probabilistic assessment of the geological conditions and the recurrence of seismic events at a site. These guidelines should be useful in the development of seismic design requirements for future reactors. 17 refs., 2 figs., 1 tab

  15. Spatial Evaluation and Verification of Earthquake Simulators

    Science.gov (United States)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of the fault systems on which earthquakes occur. In addition, the physics of friction and of elastic interactions between fault elements is included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and, following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed M > 6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
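
    The smoothing method described above spreads each simulated event's contribution over the whole test region with a rate that decays as a power law of epicentral distance. A minimal sketch of that idea follows; the grid, decay exponent, normalization, and event list are assumptions made for illustration, not values from the paper.

```python
import numpy as np

# Hypothetical test region discretized into grid cells (degrees).
lons = np.arange(-124.0, -114.0, 0.5)
lats = np.arange(32.0, 42.0, 0.5)
glon, glat = np.meshgrid(lons, lats)

# Simulated earthquakes confined to modeled faults: (lon, lat, rate contribution).
sim_events = [(-121.7, 37.4, 1.0), (-118.1, 34.2, 0.7), (-120.3, 35.9, 0.5)]

def power_law_rate_map(events, q=1.5, d0=5.0):
    """Spread each event's rate over the grid with a ~(d + d0)**-q decay (d in km)."""
    rate = np.zeros_like(glon)
    for lon, lat, w in events:
        # rough degrees-to-km conversion; adequate for an illustration
        dx = (glon - lon) * 111.0 * np.cos(np.radians(lat))
        dy = (glat - lat) * 111.0
        dist = np.hypot(dx, dy)
        kernel = (dist + d0) ** (-q)
        rate += w * kernel / kernel.sum()   # normalize so each event keeps its total weight
    return rate

rate_map = power_law_rate_map(sim_events)
print("total smoothed rate:", rate_map.sum())
```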

  16. California Earthquake Clearinghouse Crisis Information-Sharing Strategy in Support of Situational Awareness, Understanding Interdependencies of Critical Infrastructure, Regional Resilience, Preparedness, Risk Assessment/mitigation, Decision-Making and Everyday Operational Needs

    Science.gov (United States)

    Rosinski, A.; Morentz, J.; Beilin, P.

    2017-12-01

    The principal function of the California Earthquake Clearinghouse is to provide State and Federal disaster response managers, and the scientific and engineering communities, with prompt information on ground failure, structural damage, and other consequences of significant seismic events such as earthquakes and tsunamis. The overarching problem highlighted in discussions with Clearinghouse partners is the confusion and frustration of many Operational Area representatives, and of some regional utilities throughout the state, about which software applications they should be using and maintaining to meet State, Federal, and Local requirements, for what purposes, and how to deal with the limitations of these applications. This problem is getting in the way of meaningful progress on developing multi-application interoperability and the necessary supporting cross-sector information-sharing procedures, as well as dialogue on the essential common operational information that entities need to share for different all-hazards missions and the related operational activities associated with continuity, security, and resilience. The XchangeCore-based system the Clearinghouse is evolving helps deal with this problem and does not compound it by introducing yet another end-user application: there is no end-user interface with which one views XchangeCore; all viewing of data provided through XchangeCore occurs in existing, third-party operational applications. The Clearinghouse efforts with XchangeCore are compatible with FEMA, which is currently using XchangeCore-provided data for regional and National Business Emergency Operations Center (the source of business information sharing during emergencies) response. Also important, and worth emphasizing, is that information-sharing is not just for response, but for preparedness, risk assessment/mitigation decision-making, and everyday operational needs for situational awareness. In other words, the benefits of the Clearinghouse

  17. Schema building profiles among elementary school students in solving problems related to operations of addition to fractions on the basis of mathematic abilities

    Science.gov (United States)

    Gembong, S.; Suwarsono, S. T.; Prabowo

    2018-03-01

    Schema in the current study refers to a set of action, process, object and other schemas already possessed, which together build an individual's way of thinking to solve a given problem. The current study aims to investigate the schemas built by elementary school students in solving problems related to operations of addition on fractions. The analyses of the schema building were done qualitatively on the basis of the analytical framework of the APOS theory (Action, Process, Object, and Schema). Findings show the following for students of high and middle ability. In the action stage, students were able to add two fractions by drawing a picture or by a procedural method. In the process stage, they could add two and three fractions. In the object stage, they could explain the steps of adding two fractions and change a fraction into an addition of fractions. In the last stage, schema, they could add fractions by relating them to another schema they already possessed, i.e. the least common multiple, as illustrated below. Students of high and middle mathematical ability thus showed that their schema building in solving problems related to operations of addition on fractions worked in line with the framework of the APOS theory. Those of low mathematical ability, however, showed that their schemas at each stage did not work properly.
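
    As an illustration of the least-common-multiple schema mentioned above, adding two fractions can proceed as follows (the particular fractions are chosen arbitrarily):

```latex
\frac{1}{4} + \frac{1}{6}
  = \frac{3}{12} + \frac{2}{12}  % both fractions rewritten over the LCM of 4 and 6, which is 12
  = \frac{5}{12}
```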

  18. Instruction system upon occurrence of earthquakes

    International Nuclear Information System (INIS)

    Inagaki, Masakatsu; Morikawa, Matsuo; Suzuki, Satoshi; Fukushi, Naomi.

    1987-01-01

    Purpose: To enable rapid re-starting of a nuclear reactor after an earthquake by informing operators of various properties of the earthquake encountered and properly displaying the state of damage in comparison with the design standard values of the facilities. Constitution: Even in a case where the maximum accelerations of the earthquake motions encountered exceed the design standard values, the equipment may still remain intact, depending on the wave components of the seismic motion and the vibration properties inherent to the equipment. Taking note of this fact, the instruction device comprises a system that indicates the relationship between the seismic waveforms of the earthquake encountered and the scram setting values, a system for comparing the floor response spectrum of the seismic waveforms of the encountered earthquake with the design floor response spectrum used in the design of the equipment, and a system for indicating those pieces of equipment requiring inspection after the earthquake. Accordingly, it is possible to improve operability upon scram of a nuclear power plant undergoing an earthquake and to improve power saving and safety by clearly defining the portions to be inspected after the earthquake. (Kawakami, Y.)
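
    The second indicating system above amounts to comparing the floor response spectrum computed from the recorded motion with the design floor response spectrum used for equipment qualification, and flagging equipment whose natural frequencies fall where the recorded spectrum exceeds the design one. A minimal sketch of that comparison is given below; the frequency grid and both spectra are invented for illustration.

```python
import numpy as np

# Frequencies (Hz) at which both spectra are defined (illustrative values).
freqs = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 33.0])

# Design floor response spectrum and spectrum computed from the recorded motion (g).
design_frs   = np.array([0.8, 1.2, 1.5, 1.0, 0.6, 0.4])
recorded_frs = np.array([0.5, 1.3, 1.1, 0.7, 0.5, 0.3])

# Flag frequency bands where the recorded spectrum exceeds the design spectrum;
# equipment with natural frequencies in those bands would be marked for inspection.
exceeded = recorded_frs > design_frs
for f, rec, des, flag in zip(freqs, recorded_frs, design_frs, exceeded):
    status = "INSPECT" if flag else "ok"
    print(f"{f:5.1f} Hz  recorded {rec:.2f} g  design {des:.2f} g  -> {status}")
```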

  19. Design basis ground motion (Ss) required on new regulatory guide

    International Nuclear Information System (INIS)

    Kamae, Katsuhiro

    2013-01-01

    The new regulatory guide was enforced on July 8. This article introduces how the design basis ground motion (Ss) for the seismic design of nuclear power reactor facilities was revised in the new guide. Ss is formulated as two types of earthquake ground motion: ground motion from site-specific earthquake sources, and ground motion with no specific source location. The latter is to be revised based on recently observed near-source ground motions. (author)

  20. Toward real-time regional earthquake simulation of Taiwan earthquakes

    Science.gov (United States)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  1. Earthquakes: hydrogeochemical precursors

    Science.gov (United States)

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  2. Ground water and earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Ts' ai, T H

    1977-11-01

    Chinese folk wisdom has long seen a relationship between ground water and earthquakes. Before an earthquake there is often an unusual change in the ground water level and volume of flow. Changes in the amount of particulate matter in ground water as well as changes in color, bubbling, gas emission, and noises and geysers are also often observed before earthquakes. Analysis of these features can help predict earthquakes. Other factors unrelated to earthquakes can cause some of these changes, too. As a first step it is necessary to find sites which are sensitive to changes in ground stress to be used as sensor points for predicting earthquakes. The necessary features are described. Recording of seismic waves of earthquake aftershocks is also an important part of earthquake predictions.

  3. POST Earthquake Debris Management — AN Overview

    Science.gov (United States)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as of the waste generated during reconstruction works, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should adopt a geotechnical engineering approach, as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to the operation's execution, is highlighted through the key issues concerning the handling of the construction

  5. Ionospheric earthquake precursors

    International Nuclear Information System (INIS)

    Bulachenko, A.L.; Oraevskij, V.N.; Pokhotelov, O.A.; Sorokin, V.N.; Strakhov, V.N.; Chmyrev, V.M.

    1996-01-01

    Results of experimental studies of ionospheric earthquake precursors, of program development on processes in the earthquake focus, and of the physical mechanisms of formation of various types of precursors are considered. The composition of an experimental space-based system for monitoring earthquake precursors is determined. 36 refs., 5 figs

  6. Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquakes, as natural disasters, are among the fundamental problems faced by many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training regarding earthquakes received in primary schools is considered…

  7. Self-organization comprehensive real-time state evaluation model for oil pump unit on the basis of operating condition classification and recognition

    Science.gov (United States)

    Liang, Wei; Yu, Xuchao; Zhang, Laibin; Lu, Wenqing

    2018-05-01

    In an oil transmission station, the operating condition (OC) of an oil pump unit sometimes switches, which leads to changes in the operating parameters. If the switching of OCs is not taken into consideration when performing a state evaluation of the pump unit, the accuracy of the evaluation is strongly affected. Hence, in this paper, a self-organization Comprehensive Real-Time State Evaluation Model (self-organization CRTSEM) is proposed based on OC classification and recognition. The underlying model, CRTSEM, is first built by combining the advantages of the Gaussian Mixture Model (GMM) and the Fuzzy Comprehensive Evaluation Model (FCEM): independent state models are established for every state characteristic parameter according to its distribution type (i.e. Gaussian distribution or logistic regression distribution), while the Analytic Hierarchy Process (AHP) is utilized to calculate the weights of the state characteristic parameters. The OC classification is then determined by the types of oil delivery tasks, and CRTSEMs for the different standard OCs are built to constitute the CRTSEM matrix. On the other side, OC recognition is realized by a self-organization model established on the basis of a Back Propagation (BP) model. After the self-organization CRTSEM is obtained through integration, real-time monitoring data can be input for OC recognition. In the end, the current state of the pump unit can be evaluated using the appropriate CRTSEM. The case study demonstrates that the proposed self-organization CRTSEM can provide reasonable and accurate state evaluation results for the pump unit. Besides, the assumption that the switching of OCs influences the results of the state evaluation is also verified.
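
    The fuzzy comprehensive evaluation step described above combines per-parameter state memberships with AHP-derived weights into a single grade vector. A minimal sketch of that weighted combination follows; the parameter names, membership values, grade labels, and weights are all hypothetical.

```python
import numpy as np

# State membership of each characteristic parameter in the grades
# (normal, attention, abnormal); rows are parameters, values are hypothetical.
membership = np.array([
    [0.70, 0.20, 0.10],   # outlet pressure
    [0.50, 0.30, 0.20],   # bearing temperature
    [0.80, 0.15, 0.05],   # vibration amplitude
])

# AHP-derived weights of the parameters (must sum to 1).
weights = np.array([0.40, 0.35, 0.25])

# Fuzzy comprehensive evaluation: weighted combination of the memberships.
grade_vector = weights @ membership
grades = ["normal", "attention", "abnormal"]
print(dict(zip(grades, grade_vector.round(3))))
print("evaluated state:", grades[int(np.argmax(grade_vector))])
```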

  8. Operational wave now- and forecast in the German Bight as a basis for the assessment of wave-induced hydrodynamic loads on coastal dikes

    Science.gov (United States)

    Dreier, Norman; Fröhle, Peter

    2017-12-01

    Knowledge of the wave-induced hydrodynamic loads on coastal dikes, including their temporal and spatial resolution on the dike in combination with actual water levels, is of crucial importance for any risk-based early warning system. As a basis for the assessment of the wave-induced hydrodynamic loads, an operational wave nowcast and forecast system is set up that consists of i) available field measurements from the federal and local authorities and ii) data from numerical simulation of waves in the German Bight using the SWAN wave model. In this study, results of the hindcast of deep-water wave conditions during the winter storm of 5-6 December 2013 (German name 'Xaver') are shown and compared with available measurements. Moreover, field measurements of wave run-up from the local authorities at a sea dike on the German North Sea island of Pellworm are presented and compared against wave run-up calculated using the EurOtop (2016) approach.

  9. Safety Basis Report

    International Nuclear Information System (INIS)

    R.J. Garrett

    2002-01-01

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities

  10. Safety Basis Report

    Energy Technology Data Exchange (ETDEWEB)

    R.J. Garrett

    2002-01-14

    As part of the internal Integrated Safety Management Assessment verification process, it was determined that there was a lack of documentation that summarizes the safety basis of the current Yucca Mountain Project (YMP) site characterization activities. It was noted that a safety basis would make it possible to establish a technically justifiable graded approach to the implementation of the requirements identified in the Standards/Requirements Identification Document. The Standards/Requirements Identification Documents commit a facility to compliance with specific requirements and, together with the hazard baseline documentation, provide a technical basis for ensuring that the public and workers are protected. This Safety Basis Report has been developed to establish and document the safety basis of the current site characterization activities, establish and document the hazard baseline, and provide the technical basis for identifying structures, systems, and components (SSCs) that perform functions necessary to protect the public, the worker, and the environment from hazards unique to the YMP site characterization activities. This technical basis for identifying SSCs serves as a grading process for the implementation of programs such as Conduct of Operations (DOE Order 5480.19) and the Suspect/Counterfeit Items Program. In addition, this report provides a consolidated summary of the hazards analyses processes developed to support the design, construction, and operation of the YMP site characterization facilities and, therefore, provides a tool for evaluating the safety impacts of changes to the design and operation of the YMP site characterization activities.

  11. Crowdsourced earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  12. Earthquake forecasting and warning

    Energy Technology Data Exchange (ETDEWEB)

    Rikitake, T.

    1983-01-01

    This review briefly describes two other books on the same subject either written or partially written by Rikitake. In this book, the status of earthquake prediction efforts in Japan, China, the Soviet Union, and the United States is updated. An overview of some of the organizational, legal, and societal aspects of earthquake prediction in these countries is presented, and scientific findings on precursory phenomena are included. A summary of the circumstances surrounding the 1975 Haicheng earthquake, the 1976 Tangshan earthquake, and the 1976 Songpan-Pingwu earthquake (all of magnitude ≥ 7.0) in China, and the 1978 Izu-Oshima earthquake in Japan, is presented. This book fails to comprehensively summarize recent advances in earthquake prediction research.

  13. Earthquake Warning Performance in Vallejo for the South Napa Earthquake

    Science.gov (United States)

    Wurman, G.; Price, M.

    2014-12-01

    In 2002 and 2003, Seismic Warning Systems, Inc. installed first-generation QuakeGuard™ earthquake warning devices at all eight fire stations in Vallejo, CA. These devices are designed to detect the P-wave of an earthquake and initiate predetermined protective actions if the impending shaking is estimated at approximately Modified Mercalli Intensity V or greater. At the Vallejo fire stations the devices were set up to sound an audio alert over the public address system and to command the equipment bay doors to open. In August 2014, after more than 11 years of operating in the fire stations with no false alarms, the five units that were still in use triggered correctly on the MW 6.0 South Napa earthquake, less than 16 km away. The audio alert sounded in all five stations, providing fire fighters with 1.5 to 2.5 seconds of warning before the arrival of the S-wave, and the equipment bay doors opened in three of the stations. In one station the doors were disconnected from the QuakeGuard device, and another station lost power before the doors opened completely. These problems highlight just a small portion of the complexity associated with realizing actionable earthquake warnings. The issues experienced in this earthquake have already been addressed in subsequent QuakeGuard product generations, with downstream connection monitoring and backup power for critical systems. The fact that the fire fighters in Vallejo were afforded even two seconds of warning at these epicentral distances results from the design of the QuakeGuard devices, which focuses on rapid false-positive rejection and ground motion estimates. We discuss the performance of the ground motion estimation algorithms, with an emphasis on the accuracy and timeliness of the estimates at close epicentral distances.
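
    The 1.5 to 2.5 s of warning reported above is consistent with the S-minus-P travel-time difference at short epicentral distance, minus the time needed to detect the P wave and trigger the actions. A back-of-the-envelope sketch follows; the wave speeds and the detection delay are assumed typical values, not parameters of the QuakeGuard system.

```python
# Rough S-P warning-time estimate for a shallow local earthquake.
vp = 6.0                # assumed P-wave speed, km/s
vs = 3.5                # assumed S-wave speed, km/s
detection_delay = 0.3   # assumed seconds to detect the P wave and trigger actions

for dist_km in (10.0, 16.0, 25.0):
    t_p = dist_km / vp
    t_s = dist_km / vs
    warning = t_s - t_p - detection_delay
    print(f"{dist_km:5.1f} km: S-P = {t_s - t_p:.2f} s, usable warning ~ {max(warning, 0.0):.2f} s")
```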

  14. Earthquake risk assessment of Alexandria, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Gaber, Hanan; Ibrahim, Hamza

    2015-01-01

    Throughout historical and recent times, Alexandria has suffered great damage due to earthquakes from both near- and far-field sources. Sometimes the sources of such damage are not well known. During the twentieth century, the city was shaken by several earthquakes generated from inland dislocations (e.g., 29 Apr. 1974, 12 Oct. 1992, and 28 Dec. 1999) and from the African continental margin (e.g., 12 Sept. 1955 and 28 May 1998). Therefore, this study estimates the earthquake ground shaking and the consequent impacts in Alexandria on the basis of two earthquake scenarios. The simulation results show that Alexandria is affected by both earthquake scenarios in relatively the same manner, although the number of casualties in the first scenario (inland dislocation) is twice as large as in the second one (African continental margin). Running the first scenario, an estimated 2.27 % of Alexandria's total constructions (12.9 million, 2006 Census) would be affected, with injuries to 0.19 % and deaths of 0.01 % of the total population (4.1 million, 2006 Census). The earthquake risk profile reveals that three districts (Al-Montazah, Al-Amriya, and Shark) lie at high seismic risk, two districts (Gharb and Wasat) at moderate risk, and two districts (Al-Gomrok and Burg El-Arab) at a low seismic risk level. Moreover, the building damage estimates show that Al-Montazah is the most vulnerable district, as 73 % of the expected damage is concentrated there. The analysis shows that the Alexandria urban area faces high risk. Informal areas and deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated earthquake risks (building damage) are concentrated in the most densely populated districts (Al-Montazah, Al-Amriya, and Shark). Moreover, about 75 % of casualties are in the same districts.

  15. Encyclopedia of earthquake engineering

    CERN Document Server

    Kougioumtzoglou, Ioannis; Patelli, Edoardo; Au, Siu-Kui

    2015-01-01

    The Encyclopedia of Earthquake Engineering is designed to be the authoritative and comprehensive reference covering all major aspects of the science of earthquake engineering, specifically focusing on the interaction between earthquakes and infrastructure. The encyclopedia comprises approximately 265 contributions. Since earthquake engineering deals with the interaction between earthquake disturbances and the built infrastructure, the emphasis is on the basic design processes important to both non-specialists and engineers, so that readers become suitably well-informed without needing to deal with the details of specialist understanding. The content of this encyclopedia informs technically inclined readers about the ways in which earthquakes can affect our infrastructure and how engineers would go about designing against, mitigating and remediating these effects. The coverage ranges from buildings, foundations, underground construction, lifelines and bridges, to roads, embankments and slopes. The encycl...

  16. Earthquake at 40 feet

    Science.gov (United States)

    Miller, G. J.

    1976-01-01

    The earthquake that struck the island of Guam on November 1, 1975, at 11:17 a.m. had many unique aspects, not the least of which was the experience of an earthquake of 6.25 Richter magnitude while at 40 feet. My wife Bonnie, a fellow diver, Greg Guzman, and I were diving at Gabgab Beach in the outer harbor of Apra Harbor, engaged in underwater photography, when the earthquake struck.

  17. Earthquakes and economic growth

    OpenAIRE

    Fisker, Peter Simonsen

    2012-01-01

    This study explores the economic consequences of earthquakes. In particular, it is investigated how exposure to earthquakes affects economic growth both across and within countries. The key result of the empirical analysis is that while there are no observable effects at the country level, earthquake exposure significantly decreases 5-year economic growth at the local level. Areas at lower stages of economic development suffer harder in terms of economic growth than richer areas. In addition,...

  18. Accessing northern California earthquake data via Internet

    Science.gov (United States)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, and is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through Internet since mid-1992.The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN) operated by the USGS and the Berkeley Digital Seismic Network (BDSN) operated by the Seismographic Station at the University of California, Berkeley.

  19. Improvement of the electromagnetic situation in networks of operational current at nuclear power plant for the purpose of ensuring reliability of monitoring systems, control, RPA and communications, realized on the basis of digital equipment

    International Nuclear Information System (INIS)

    Fomenko, O.V.; Moloshnaya, E.S.; Ul'yanova, Yu.E.

    2014-01-01

    The authors propose a technique for designing a portable grounding circuit for monitoring and control systems and for relay protection and automation (RPA) systems built on digital equipment in operational direct-current networks, in order to increase the reliability of their functioning under the influence of strong interference and to improve the electromagnetic environment.

  20. A suite of exercises for verifying dynamic earthquake rupture codes

    Science.gov (United States)

    Harris, Ruth A.; Barall, Michael; Aagaard, Brad T.; Ma, Shuo; Roten, Daniel; Olsen, Kim B.; Duan, Benchun; Liu, Dunyu; Luo, Bin; Bai, Kangchen; Ampuero, Jean-Paul; Kaneko, Yoshihiro; Gabriel, Alice-Agnes; Duru, Kenneth; Ulrich, Thomas; Wollherr, Stephanie; Shi, Zheqiang; Dunham, Eric; Bydlon, Sam; Zhang, Zhenguo; Chen, Xiaofei; Somala, Surendra N.; Pelties, Christian; Tago, Josue; Cruz-Atienza, Victor Manuel; Kozdon, Jeremy; Daub, Eric; Aslam, Khurram; Kase, Yuko; Withers, Kyle; Dalguer, Luis

    2018-01-01

    We describe a set of benchmark exercises that are designed to test if computer codes that simulate dynamic earthquake rupture are working as intended. These types of computer codes are often used to understand how earthquakes operate, and they produce simulation results that include earthquake size, amounts of fault slip, and the patterns of ground shaking and crustal deformation. The benchmark exercises examine a range of features that scientists incorporate in their dynamic earthquake rupture simulations. These include implementations of simple or complex fault geometry, off‐fault rock response to an earthquake, stress conditions, and a variety of formulations for fault friction. Many of the benchmarks were designed to investigate scientific problems at the forefronts of earthquake physics and strong ground motions research. The exercises are freely available on our website for use by the scientific community.

  1. Investigating landslides caused by earthquakes - A historical review

    Science.gov (United States)

    Keefer, D.K.

    2002-01-01

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing "retrospective" analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and the relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  2. OMG Earthquake! Can Twitter improve earthquake response?

    Science.gov (United States)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
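
    The detection idea sketched above rests on the rate of earthquake-related tweets jumping from a background of less than one per hour to on the order of 150 per minute within a minute of the shaking. A toy version of such a rate-spike detector is given below; the window length, threshold factor, and per-minute counts are invented for illustration and do not describe the USGS system.

```python
from collections import deque

def make_spike_detector(background_per_min=0.02, factor=50, window_min=5):
    """Return a callable that flags a spike when the short-term tweet rate
    exceeds `factor` times the assumed background rate."""
    recent = deque(maxlen=window_min)
    def update(tweets_this_minute):
        recent.append(tweets_this_minute)
        short_term_rate = sum(recent) / len(recent)
        return short_term_rate > factor * background_per_min
    return update

detector = make_spike_detector()
minute_counts = [0, 0, 1, 0, 148, 210, 175]   # hypothetical per-minute "earthquake" tweet counts
for minute, count in enumerate(minute_counts):
    if detector(count):
        print(f"possible felt earthquake detected at minute {minute} ({count} tweets)")
```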

  3. Children's emotional experience two years after an earthquake: An exploration of knowledge of earthquakes and associated emotions.

    Science.gov (United States)

    Raccanello, Daniela; Burro, Roberto; Hall, Rob

    2017-01-01

    We explored whether and how the exposure to a natural disaster such as the 2012 Emilia Romagna earthquake affected the development of children's emotional competence in terms of understanding, regulating, and expressing emotions, after two years, when compared with a control group not exposed to the earthquake. We also examined the role of class level and gender. The sample included two groups of children (n = 127) attending primary school: The experimental group (n = 65) experienced the 2012 Emilia Romagna earthquake, while the control group (n = 62) did not. The data collection took place two years after the earthquake, when children were seven or ten-year-olds. Beyond assessing the children's understanding of emotions and regulating abilities with standardized instruments, we employed semi-structured interviews to explore their knowledge of earthquakes and associated emotions, and a structured task on the intensity of some target emotions. We applied Generalized Linear Mixed Models. Exposure to the earthquake did not influence the understanding and regulation of emotions. The understanding of emotions varied according to class level and gender. Knowledge of earthquakes, emotional language, and emotions associated with earthquakes were, respectively, more complex, frequent, and intense for children who had experienced the earthquake, and at increasing ages. Our data extend the generalizability of theoretical models on children's psychological functioning following disasters, such as the dose-response model and the organizational-developmental model for child resilience, and provide further knowledge on children's emotional resources related to natural disasters, as a basis for planning educational prevention programs.

  4. Anthropogenic seismicity rates and operational parameters at the Salton Sea Geothermal Field.

    Science.gov (United States)

    Brodsky, Emily E; Lajoie, Lia J

    2013-08-02

    Geothermal power is a growing energy source; however, efforts to increase production are tempered by concern over induced earthquakes. Although increased seismicity commonly accompanies geothermal production, induced earthquake rate cannot currently be forecast on the basis of fluid injection volumes or any other operational parameters. We show that at the Salton Sea Geothermal Field, the total volume of fluid extracted or injected tracks the long-term evolution of seismicity. After correcting for the aftershock rate, the net fluid volume (extracted-injected) provides the best correlation with seismicity in recent years. We model the background earthquake rate with a linear combination of injection and net production rates that allows us to track the secular development of the field as the number of earthquakes per fluid volume injected decreases over time.
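
    The model described above expresses the background (aftershock-corrected) earthquake rate as a linear combination of the injection rate and the net produced fluid volume. A minimal least-squares sketch of that idea follows; the monthly volumes and earthquake counts are synthetic numbers chosen for illustration, not the Salton Sea data.

```python
import numpy as np

# Synthetic monthly operational data (arbitrary volume units) and earthquake counts.
injection  = np.array([2.0, 2.1, 2.3, 2.2, 2.5, 2.6, 2.4, 2.7])
production = np.array([2.6, 2.8, 3.0, 2.9, 3.3, 3.5, 3.2, 3.6])
quakes     = np.array([14, 16, 19, 18, 23, 26, 22, 27])   # aftershock-corrected counts

net = production - injection   # net fluid volume (extracted minus injected)

# Background rate ~ c0 + c1 * injection + c2 * net volume
design = np.column_stack([np.ones_like(injection), injection, net])
coeffs, *_ = np.linalg.lstsq(design, quakes, rcond=None)
predicted = design @ coeffs

print("fitted coefficients (intercept, injection, net):", np.round(coeffs, 2))
print("predicted vs observed:", list(zip(np.round(predicted, 1), quakes)))
```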

  5. Use of earthquake experience data

    International Nuclear Information System (INIS)

    Eder, S.J.; Eli, M.W.

    1991-01-01

    At many of the older existing US Department of Energy (DOE) facilities, the need has arisen for evaluation guidelines for natural phenomena hazard assessment. The effect of a design basis earthquake at most of these facilities is one of the main concerns. Earthquake experience data can provide a basis for the needed seismic evaluation guidelines, resulting in an efficient screening evaluation methodology for several of the items that are in the scope of the DOE facility reviews. The experience-based screening evaluation methodology, when properly established and implemented by trained engineers, has proven to result in sufficient safety margins and focuses on real concerns via facility walkdowns, usually at costs much less than the alternative options of analysis and testing. This paper summarizes a program that is being put into place to establish uniform seismic evaluation guidelines and criteria for evaluation of existing DOE facilities. The intent of the program is to maximize use of past experience, in conjunction with a walkdown screening evaluation process

  6. Bam Earthquake in Iran

    CERN Multimedia

    2004-01-01

    Following their request for help from members of international organisations, the permanent Mission of the Islamic Republic of Iran has given the following bank account number, where you can donate money to help the victims of the Bam earthquake. Re: Bam earthquake 235 - UBS 311264.35L Bubenberg Platz 3001 BERN

  7. Tradable Earthquake Certificates

    NARCIS (Netherlands)

    Woerdman, Edwin; Dulleman, Minne

    2018-01-01

    This article presents a market-based idea to compensate for earthquake damage caused by the extraction of natural gas and applies it to the case of Groningen in the Netherlands. Earthquake certificates give homeowners a right to yearly compensation for both property damage and degradation of living

  8. Prediction of strong earthquake motions on rock surface using evolutionary process models

    International Nuclear Information System (INIS)

    Kameda, H.; Sugito, M.

    1984-01-01

    Stochastic process models are developed for the prediction of strong earthquake motions for engineering design purposes. Earthquake motions with nonstationary frequency content are modeled using the concept of evolutionary processes. Discussion is focused on earthquake motions on bedrock, which are important for the construction of nuclear power plants in seismic regions. On this basis, two earthquake motion prediction models are developed: one (EMP-IB Model) for prediction with given magnitude and epicentral distance, and the other (EMP-IIB Model) to account for successive fault ruptures and the site location relative to the fault for great earthquakes. (Author)
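
    An evolutionary-process model of the kind described above can be caricatured as band-limited noise modulated by a time-varying intensity envelope (a fuller model would also let the frequency content evolve with time). The sketch below illustrates only the envelope part; the envelope shape and parameters are assumptions for illustration, not those of the EMP-IB or EMP-IIB models.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, duration = 0.01, 20.0                 # time step and total duration, s
t = np.arange(0.0, duration, dt)

# Stationary "parent" process: white noise (a fuller model would filter it
# to a target spectrum that may itself change with time).
noise = rng.standard_normal(t.size)

# Deterministic intensity envelope: quick build-up, strong phase, exponential decay.
envelope = (t / 2.0) ** 2 * np.exp(-0.6 * t)
envelope /= envelope.max()

accel = envelope * noise                  # nonstationary (evolutionary) sample motion
print("peak of synthetic motion:", np.abs(accel).max())
```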

  9. Historic Eastern Canadian earthquakes

    International Nuclear Information System (INIS)

    Asmis, G.J.K.; Atchinson, R.J.

    1981-01-01

    Nuclear power plants licensed in Canada have been designed to resist earthquakes: not all plants, however, have been explicitly designed to the same level of earthquake-induced forces. Understanding the nature of strong ground motion near the source of the earthquake is still very tentative. This paper reviews historical and scientific accounts of the three strongest earthquakes - St. Lawrence (1925), Temiskaming (1935), Cornwall (1944) - that have occurred in Canada in 'modern' times, field studies of near-field strong ground motion records and their resultant damage or non-damage to industrial facilities, and numerical modelling of earthquake sources and resultant wave propagation to produce accelerograms consistent with the above historical record and field studies. It is concluded that for future construction of NPPs near-field strong motion must be explicitly considered in design

  10. Comment on "Rethinking first-principles electron transport theories with projection operators: The problems caused by partitioning the basis set" [J. Chem. Phys. 139, 114104 (2013)]

    DEFF Research Database (Denmark)

    Brandbyge, Mads

    2014-01-01

    , different from what would be obtained by using an orthogonal basis, and dividing surfaces defined in real-space. We argue that this assumption is not required to be fulfilled to get exact results. We show how the current/transmission calculated by the standard Greens function method is independent...

  11. Civil-Military Relations in Domestic Support Operations. The California National Guard in Los Angeles 1992 Riots and Northridge Earthquake of 1994

    National Research Council Canada - National Science Library

    Khomchenko, Sergey

    1997-01-01

    .... Furthermore, it argues that civil- military relations in domestic support operations (DSO) are a very important factor to consider when new democracies try to build an effective system of emergency management...

  12. Modified-Fibonacci-Dual-Lucas method for earthquake prediction

    Science.gov (United States)

    Boucouvalas, A. C.; Gkasios, M.; Tselikas, N. T.; Drakatos, G.

    2015-06-01

    The FDL method makes use of Fibonacci, Dual and Lucas numbers and has shown considerable success in predicting earthquake events locally as well as globally. Predicting the location of the epicenter of an earthquake is one difficult challenge; the others are the timing and magnitude. One technique for predicting the onset of earthquakes is the use of cycles and the discovery of periodicity, and the reported FDL method belongs to this category. The basis of the reported FDL method is the creation of FDL future dates based on the onset date of significant earthquakes, the assumption being that each past earthquake discontinuity can be thought of as a generating source of an FDL time series. The connection between past earthquakes and future earthquakes based on FDL numbers has also been reported with sample earthquakes since 1900. Using clustering methods it has been shown that significant earthquakes tend to occur near planetary trigger dates (Moon conjunct or opposite the Sun, or Moon conjunct or opposite the North or South Nodes). In order to test the improvement of the method we used all +8R earthquakes recorded since 1900 (86 earthquakes from USGS data). We have developed the FDL numbers for each of those seeds, and examined the earthquake hit rates (for a window of 3, i.e. +-1 day of the target date) and for <6.5R. The successes are counted for each one of the 86 earthquake seeds and we compare the MFDL method with the FDL method. In every case we find improvement when the starting seed date is the planetary trigger date prior to the earthquake. We observe no improvement only when a planetary trigger coincided with the earthquake date, in which case the FDL method coincides with the MFDL. Based on the MFDL method we present a prediction method capable of predicting global events or localized earthquakes, and we discuss the accuracy of the method as far as its prediction and location components are concerned. We show example calendar-style predictions for global events as well as for the Greek region using

  13. Electrostatically actuated resonant switches for earthquake detection

    KAUST Repository

    Ramini, Abdallah H.

    2013-04-01

    The modeling and design of electrostatically actuated resonant switches (EARS) for earthquake and seismic applications are presented. The basic concept is to operate an electrically actuated resonator close to instability bands of frequency, where it is forced to collapse (pull-in) if operated within these bands. By careful tuning, the resonator can be made to enter the instability zone upon detection of the earthquake signal, thereby pulling in as a switch. Such a switching action can be exploited for useful functions, such as shutting off gas pipelines in the case of earthquakes, or to activate a network of sensors for seismic activity recording in health monitoring applications. By placing a resonator on a printed circuit board (PCB) with a natural frequency close to that of the earthquake's frequency, we show a significant improvement in the detection limit of the EARS, lowering it considerably to less than 60% of that of the EARS by itself without the PCB. © 2013 IEEE.

  14. Earthquakes, November-December 1977

    Science.gov (United States)

    Person, W.J.

    1978-01-01

    Two major earthquakes occurred in the last 2 months of the year. A magnitude 7.0 earthquake struck San Juan Province, Argentina, on November 23, causing fatalities and damage. The second major earthquake was a magnitude 7.0 in the Bonin Islands region, an unpopulated area. On December 19, Iran experienced a destructive earthquake, which killed over 500.

  15. Earthquakes, September-October 1986

    Science.gov (United States)

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  16. Earthquake hazard assessment and small earthquakes

    International Nuclear Information System (INIS)

    Reiter, L.

    1987-01-01

    The significance of small earthquakes and their treatment in nuclear power plant seismic hazard assessment is an issue which has received increased attention over the past few years. In probabilistic studies, sensitivity studies showed that the choice of the lower bound magnitude used in hazard calculations can have a larger than expected effect on the calculated hazard. Of particular interest is the fact that some of the difference in seismic hazard calculations between the Lawrence Livermore National Laboratory (LLNL) and Electric Power Research Institute (EPRI) studies can be attributed to this choice. The LLNL study assumed a lower bound magnitude of 3.75 while the EPRI study assumed a lower bound magnitude of 5.0. The magnitudes used were assumed to be body wave magnitudes or their equivalents. In deterministic studies recent ground motion recordings of small to moderate earthquakes at or near nuclear power plants have shown that the high frequencies of design response spectra may be exceeded. These exceedances became important issues in the licensing of the Summer and Perry nuclear power plants. At various times in the past particular concerns have been raised with respect to the hazard and damage potential of small to moderate earthquakes occurring at very shallow depths. In this paper a closer look is taken at these issues. Emphasis is given to the impact of lower bound magnitude on probabilistic hazard calculations and the historical record of damage from small to moderate earthquakes. Limited recommendations are made as to how these issues should be viewed

  17. Investigating Landslides Caused by Earthquakes A Historical Review

    Science.gov (United States)

    Keefer, David K.

    Post-earthquake field investigations of landslide occurrence have provided a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides. This paper traces the historical development of knowledge derived from these investigations. Before 1783, historical accounts of the occurrence of landslides in earthquakes are typically so incomplete and vague that conclusions based on these accounts are of limited usefulness. For example, the number of landslides triggered by a given event is almost always greatly underestimated. The first formal, scientific post-earthquake investigation that included systematic documentation of the landslides was undertaken in the Calabria region of Italy after the 1783 earthquake swarm. From then until the mid-twentieth century, the best information on earthquake-induced landslides came from a succession of post-earthquake investigations largely carried out by formal commissions that undertook extensive ground-based field studies. Beginning in the mid-twentieth century, when the use of aerial photography became widespread, comprehensive inventories of landslide occurrence have been made for several earthquakes in the United States, Peru, Guatemala, Italy, El Salvador, Japan, and Taiwan. Techniques have also been developed for performing ``retrospective'' analyses years or decades after an earthquake that attempt to reconstruct the distribution of landslides triggered by the event. The additional use of Geographic Information System (GIS) processing and digital mapping since about 1989 has greatly facilitated the level of analysis that can be applied to mapped distributions of landslides. Beginning in 1984, syntheses of worldwide and national data on earthquake-induced landslides have defined their general characteristics and relations between their occurrence and various geologic and seismic parameters. However, the number of comprehensive post-earthquake studies of landslides is still

  18. The Challenge of Centennial Earthquakes to Improve Modern Earthquake Engineering

    International Nuclear Information System (INIS)

    Saragoni, G. Rodolfo

    2008-01-01

    The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus the Messina-Reggio Calabria earthquake of 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of some buildings still standing today, which survived centennial earthquakes, represents a challenge to better understand the limitations of the earthquake design methods in use. Only the Valparaiso 1906 earthquake, of the three centennial earthquakes considered, has been repeated, as the Central Chile 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is made in the neighborhood of Valparaiso harbor. In this study the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since for the 1985 earthquake accelerograms were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. Therefore, it is recommended to use more suitable instrumental parameters in the future, such as the destructiveness potential factor, to describe earthquake demand

  19. Sun, Moon and Earthquakes

    Science.gov (United States)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km x 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of the Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]. The left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, EQ count 376). The right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All the 376 events including the main event faithfully follow the straight-line curve.

  20. Tidal controls on earthquake size-frequency statistics

    Science.gov (United States)

    Ide, S.; Yabe, S.; Tanaka, Y.

    2016-12-01

    The possibility that tidal stresses can trigger earthquakes is a long-standing issue in seismology. Except in some special cases, a causal relationship between seismicity and the phase of tidal stress has been rejected on the basis of studies using many small events. However, recently discovered deep tectonic tremors are highly sensitive to tidal stress levels, with the relationship being governed by a nonlinear law according to which the tremor rate increases exponentially with increasing stress; thus, slow deformation (and the probability of earthquakes) may be enhanced during periods of large tidal stress. Here, we show the influence of tidal stress on seismicity by calculating histories of tidal shear stress during the 2-week period before earthquakes. Very large earthquakes tend to occur near the time of maximum tidal stress, but this tendency is not obvious for small earthquakes. Rather, we found that tidal stress controls the earthquake size-frequency statistics; i.e., the fraction of large events increases (i.e. the b-value of the Gutenberg-Richter relation decreases) as the tidal shear stress increases. This correlation is apparent in data from the global catalog and in relatively homogeneous regional catalogues of earthquakes in Japan. The relationship is also reasonable, considering the well-known relationship between stress and the b-value. Our findings indicate that the probability of a tiny rock failure expanding to a gigantic rupture increases with increasing tidal stress levels. This finding has clear implications for probabilistic earthquake forecasting.
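
    As an illustration of the kind of analysis behind such a result, the sketch below estimates the Gutenberg-Richter b-value with Aki's maximum-likelihood formula for events grouped by the tidal shear stress at their origin times. The catalogue, stress values and magnitude threshold are synthetic assumptions, not the data used in the study.

# Illustrative sketch (not the authors' code): estimate the Gutenberg-Richter
# b-value with Aki's maximum-likelihood formula for events grouped by the
# tidal shear stress at their origin times. Catalogue values are synthetic.
import numpy as np

def b_value(magnitudes, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood b-value with magnitude-binning correction."""
    m = np.asarray(magnitudes)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - (m_min - dm / 2.0))

rng = np.random.default_rng(1)
n = 5000
tidal_stress = rng.normal(0.0, 1.0, n)          # synthetic tidal shear stress (kPa)
# Synthetic magnitudes whose b-value decreases slightly with tidal stress
b_true = 1.0 - 0.05 * tidal_stress
mags = 4.0 + rng.exponential(1.0 / (b_true * np.log(10)))

for label, mask in [("low stress", tidal_stress < 0), ("high stress", tidal_stress >= 0)]:
    print(f"{label}: b = {b_value(mags[mask], m_min=4.0):.2f}")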

  1. Post-Earthquake Reconstruction — in Context of Housing

    Science.gov (United States)

    Sarkar, Raju

    Comprehensive rescue and relief operations are always launched without loss of time, with active participation of the Army, governmental agencies, donor agencies, NGOs, and other voluntary organizations, after each natural disaster. Several natural disasters occur throughout the world every year, and one of them is the earthquake. More than any other natural catastrophe, an earthquake represents the undoing of our most basic preconception of the earth as a source of stability, and the first distressing consequence of an earthquake is the collapse of our dwelling units. Earthquakes have affected buildings since people began constructing them. So after each earthquake a housing reconstruction program is essential, since housing is referred to as shelter satisfying one of the so-called basic needs, next to food and clothing. It is a well-known fact that resettlement (after an earthquake) is often accompanied by the creation of ghettos and ensuing problems in the provision of infrastructure and employment. In fact a housing project after the Bhuj earthquake in Gujarat, India, illustrates all the negative aspects of resettlement in the context of reconstruction. The main theme of this paper is to consider a few issues associated with post-earthquake reconstruction in the context of housing, all of which are significant to communities that have had to rebuild after catastrophe or that will face such a need in the future. A few of them are as follows: (1) Why is rebuilding time consuming? (2) What are the causes of failure in post-earthquake resettlement? (3) How can holistic planning after an earthquake be carried out? (4) What criteria should sustainable building materials satisfy? (5) What are the criteria for success in post-earthquake resettlement? (6) How can mitigation in post-earthquake housing be achieved using appropriate repair, restoration, and strengthening concepts?

  2. Operating experience and systems analysis at Trillo NPP: A program intended for systematic review of plant safety systems to assess design basis requirements compliance

    International Nuclear Information System (INIS)

    Vega, R. de la

    1996-01-01

    The program was defined to apply to all plant safety systems and/or systems included in the plant Technical Specifications. The goal of the program was to ensure, by systematic design, construction, and commissioning review, the adequacy of safety systems, structures and components to fulfill their safety functions. In addition, the program required that a complete, unambiguous and systematic design basis definition be established. Finally, a complete documentary review of the plant design was to result from the program execution

  3. Technologies for improving current and future light water reactor operation and maintenance: Development on the basis of O and M experiences - the WANO perspective

    International Nuclear Information System (INIS)

    Chang, M.J.

    2000-01-01

    The World Association of Nuclear Operators (WANO) has played a role in promoting safety and reliability in the nuclear industry since the Chernobyl accident. Four programmes, or so-called cornerstones - operating experience, peer review, professional and technical development, and technical support and exchange - have significantly improved nuclear operating performance in the past years. A WANO biennial general meeting was recently held in Victoria, Canada, which disclosed clear achievements: higher unit capability factors, fewer unplanned automatic scrams per 7000 hours critical, lower collective radiation exposure and a lower industrial safety accident rate. More practically, exchange visits to learn good practices and measures in operation, maintenance and management have shown benefits among WANO members. Recurring events can be minimized when members learn lessons from significant operating experience reports and significant event reports, and from the events posted on the WANO Web site. In particular, the plant managers' meetings that the Tokyo Centre hosts have created an environment which allows plant management to exchange ideas through such a face-to-face channel. WANO aims at nuclear safety and reliability. Economics and public acceptance are regarded as pillars supporting WANO's mission as well. (author)

  4. Cooperative earthquake research between the United States and the People's Republic of China

    Energy Technology Data Exchange (ETDEWEB)

    Russ, D.P.; Johnson, L.E.

    1986-01-01

    This paper describes cooperative research by scientists of the US and the People's Republic of China (PRC) which has resulted in important new findings concerning the fundamental characteristics of earthquakes and new insight into mitigating earthquake hazards. There have been over 35 projects cooperatively sponsored by the Earthquake Studies Protocol in the past 5 years. The projects are organized into seven annexes, including investigations in earthquake prediction, intraplate faults and earthquakes, earthquake engineering and hazards investigation, deep crustal structure, rock mechanics, seismology, and data exchange. Operational earthquake prediction experiments are currently being developed at two primary sites: western Yunnan Province near the town of Xiaguan, where there are several active faults, and the northeast China plain, where the devastating 1976 Tangshan earthquake occurred.

  5. Incident simulation at the power plant simulator - on the interpretation of operator actions and their cognitive causes as basis for ergonomic recommendations

    International Nuclear Information System (INIS)

    Becker, G.

    1985-01-01

    We first carried out a pilot investigation, the aims of which were twofold: the development of suitable methods, together with evidence that these methods bring us nearer to the aims of the plan as a whole; and the deduction of initial ideas for ergonomic improvements, since this BMFT program is application-oriented rather than a basic research program. To the best of our knowledge, our investigation is the only one in which experienced nuclear power plant operators (operations staff) were observed during their handling of an unknown failure situation on a full-scope simulator. During this investigation, the currently most promising methods for the analysis of strategies for the organization and solution of problems were applied; these comprise analysis of the interaction (particularly verbal) between the operators of a team, with simultaneous video recording. (orig./GL) [de

  6. Earthquake Ground Motion Selection

    Science.gov (United States)

    2012-05-01

    Nonlinear analyses of soils, structures, and soil-structure systems offer the potential for more accurate characterization of geotechnical and structural response under strong earthquake shaking. The increasing use of advanced performance-based desig...

  7. 1988 Spitak Earthquake Database

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The 1988 Spitak Earthquake database is an extensive collection of geophysical and geological data, maps, charts, images and descriptive text pertaining to the...

  8. Technologies for improving current and future light water reactor operation and maintenance: Development on the basis of experience. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    2000-09-01

    Application of efficient technologies for improving operation and maintenance of nuclear power plants is an important element for assuring their economic competitiveness with other means of generating electricity. The competitive environment, which nuclear power plant operators face in many countries as a result of de-regulation of the electricity market, imposes cost pressures that must be met while at the same time satisfying stringent safety requirements. Further, as currently operating plants age, proper management includes development and application of better technologies for inspection, maintenance and repair. For future plants, the opportunity exists during the design phase to incorporate design features for performing efficient inspection, maintenance and repairs. Despite the prevailing low prices of fossil fuels, the generation costs of nuclear electricity continue to be competitive with electricity generation costs from fossil-fuelled plants for base load generation in several countries. For nuclear power, the capital investment component of electricity generation cost is relatively high, while the nuclear fuel cycle cost is - and is expected to remain - relatively low. The prices of fossil fuels are fairly low today but are likely to increase over the long term because the resource is limited. Moreover, governments may introduce incentives to reduce the use of fossil fuels in order to protect the environment. In many countries, nuclear utilities are experiencing increased competition with other sources of electricity production due to deregulation of the electricity market, and nuclear plant operators can no longer pass along the generation costs to consumers through regulated electricity rates. This competitive environment has significant implications for plant operations to achieve efficient use of all resources, and to effectively manage plant activities including outages and maintenance. Over the past several years, steady improvements have been

  9. Technologies for improving current and future light water reactor operation and maintenance: Development on the basis of experience. Proceedings of a technical committee meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-09-01

    Application of efficient technologies for improving operation and maintenance of nuclear power plants is an important element for assuring their economic competitiveness with other means of generating electricity. The competitive environment, which nuclear power plant operators face in many countries as a result of de-regulation of the electricity market, imposes cost pressures that must be met while at the same time satisfying stringent safety requirements. Further, as currently operating plants age, proper management includes development and application of better technologies for inspection, maintenance and repair. For future plants, the opportunity exists during the design phase to incorporate design features for performing efficient inspection, maintenance and repairs. Despite the prevailing low prices of fossil fuels, the generation costs of nuclear electricity continue to be competitive with electricity generation costs from fossil-fuelled plants for base load generation in several countries. For nuclear power, the capital investment component of electricity generation cost is relatively high, while the nuclear fuel cycle cost is - and is expected to remain - relatively low. The prices of fossil fuels are fairly low today but are likely to increase over the long term because the resource is limited. Moreover, governments may introduce incentives to reduce the use of fossil fuels in order to protect the environment. In many countries, nuclear utilities are experiencing increased competition with other sources of electricity production due to deregulation of the electricity market, and nuclear plant operators can no longer pass along the generation costs to consumers through regulated electricity rates. This competitive environment has significant implications for plant operations to achieve efficient use of all resources, and to effectively manage plant activities including outages and maintenance. Over the past several years, steady improvements have been

  10. Electromagnetic Manifestation of Earthquakes

    OpenAIRE

    Uvarov Vladimir

    2017-01-01

    In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  11. Electromagnetic Manifestation of Earthquakes

    Directory of Open Access Journals (Sweden)

    Uvarov Vladimir

    2017-01-01

    Full Text Available In a joint analysis of the results of recording the electrical component of the natural electromagnetic field of the Earth and the catalog of earthquakes in Kamchatka in 2013, unipolar pulses of constant amplitude associated with earthquakes were identified, whose activity is closely correlated with the energy of the electromagnetic field. For the explanation, a hypothesis about the cooperative character of these impulses is proposed.

  12. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    Science.gov (United States)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the world's second most visited earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquakes' effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. At the beginning, the collection was purely intended to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on earthquakes' effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services to attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated in our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses on our website follow the propagation of the generated seismic waves, and that eyewitnesses can therefore be considered as ground motion sensors. Flashsourcing discriminates felt

  13. Metrics for comparing dynamic earthquake rupture simulations

    Science.gov (United States)

    Barall, Michael; Harris, Ruth A.

    2014-01-01

    Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near-fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes' results can be directly compared. One approach for checking whether dynamic rupture computer codes are working satisfactorily is to compare each code's results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
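
    The paper's specific metrics are not reproduced here, but the sketch below shows one simple metric of the same flavour: a normalized RMS misfit between two simulated on-fault time series sampled at the same receiver. The time series are synthetic placeholders.

# Minimal sketch of one possible comparison metric (not the specific metrics
# of the paper above): a normalized RMS misfit between two simulated on-fault
# time series sampled at the same receiver. The series here are synthetic.
import numpy as np

def normalized_rms_misfit(a, b):
    """RMS difference normalized by the RMS of the reference series a."""
    a, b = np.asarray(a), np.asarray(b)
    return np.sqrt(np.mean((a - b) ** 2)) / np.sqrt(np.mean(a ** 2))

t = np.linspace(0.0, 10.0, 1001)
slip_rate_code1 = np.exp(-(t - 3.0) ** 2)            # reference simulation
slip_rate_code2 = np.exp(-(t - 3.05) ** 2) * 1.02    # slightly shifted/scaled result

print(f"normalized RMS misfit: {normalized_rms_misfit(slip_rate_code1, slip_rate_code2):.3f}")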

  14. Charles Darwin's earthquake reports

    Science.gov (United States)

    Galiev, Shamil

    2010-05-01

    As it is the 200th anniversary of Darwin's birth, 2009 has also been marked as 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. Land was waved, lifted, and cracked, volcanoes awoke and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after the event. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of '...the rending of strata, at a point not very deep below the surface of the earth...' and '...when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the tectonic plate theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: '...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872; 636-639 and 462, 42-43; 87-89). About 200 years ago Darwin gave oneself airs by the

  15. Nowcasting Earthquakes and Tsunamis

    Science.gov (United States)

    Rundle, J. B.; Turcotte, D. L.

    2017-12-01

    The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk

  16. Source processes of strong earthquakes in the North Tien-Shan region

    Science.gov (United States)

    Kulikova, G.; Krueger, F.

    2013-12-01

    The Tien-Shan region attracts the attention of scientists worldwide due to its complexity and tectonic uniqueness. A series of very strong, destructive earthquakes occurred in Tien-Shan at the turn of the XIX and XX centuries. Such large intraplate earthquakes are rare in seismology, which increases the interest in the Tien-Shan region. The presented study focuses on the source processes of large earthquakes in Tien-Shan. The amount of seismic data is limited for those early times. In 1889, when a major earthquake occurred in Tien-Shan, seismic instruments were installed in very few locations in the world, and these analog records did not survive to the present day. Although around a hundred seismic stations were operating worldwide at the beginning of the XX century, it is not always possible to obtain high-quality analog seismograms. Digitizing seismograms is a very important step in the work with analog seismic records. While working with historical seismic records one has to take into account all the aspects and uncertainties of manual digitizing and the lack of accurate timing and instrument characteristics. In this study, we develop an easy-to-handle and fast digitization program on the basis of already existing software, which speeds up the digitizing process and accounts for the recording-system uncertainties. Owing to the lack of absolute timing for the historical earthquakes (due to the absence of a universal clock at that time), we used time differences between P and S phases to relocate the earthquakes in North Tien-Shan and the body-wave amplitudes to estimate their magnitudes. Combining our results with geological data, five earthquakes in North Tien-Shan were precisely relocated. The digitizing of records can introduce steps into the seismograms, which makes restitution (removal of the instrument response) undesirable. To avoid restitution, we simulated historic seismograph recordings with given values for damping and free period of the respective instrument and

  17. Earthquake cycles and physical modeling of the process leading up to a large earthquake

    Science.gov (United States)

    Ohnaka, Mitiyasu

    2004-08-01

    A thorough discussion is made on what the rational constitutive law for earthquake ruptures ought to be from the standpoint of the physics of rock friction and fracture on the basis of solid facts observed in the laboratory. From this standpoint, it is concluded that the constitutive law should be a slip-dependent law with parameters that may depend on slip rate or time. With the long-term goal of establishing a rational methodology of forecasting large earthquakes, the entire process of one cycle for a typical, large earthquake is modeled, and a comprehensive scenario that unifies individual models for intermediate- and short-term (immediate) forecasts is presented within the framework based on the slip-dependent constitutive law and the earthquake cycle model. The earthquake cycle includes the phase of accumulation of elastic strain energy with tectonic loading (phase II), and the phase of rupture nucleation at the critical stage where an adequate amount of the elastic strain energy has been stored (phase III). Phase II plays a critical role in physical modeling of intermediate-term forecasting, and phase III in physical modeling of short-term (immediate) forecasting. The seismogenic layer and individual faults therein are inhomogeneous, and some of the physical quantities inherent in earthquake ruptures exhibit scale-dependence. It is therefore critically important to incorporate the properties of inhomogeneity and physical scaling, in order to construct realistic, unified scenarios with predictive capability. The scenario presented may be significant and useful as a necessary first step for establishing the methodology for forecasting large earthquakes.

  18. Effect of changing of the parameters of the cable network of monitoring systems of high-rise buildings on the basis of string converters on their operability

    Science.gov (United States)

    Gusev, Nikolay; Svatovskaya, Larisa; Kucherenko, Alexandr

    2018-03-01

    The article is devoted to the problem of improving the reliability of monitoring systems for the technical condition of high-rise buildings. The improvement is based on string sensors with an impulse excitation method ensuring the maximum signal-to-noise ratio at their output. The influence of the parameters of the monitoring system on the shape of the excitation impulses of the string, and consequently on the amplitude of vibration of the string converter, is also considered in the article. It has been experimentally shown that the parameters of the excitation impulses affect the operability of the string converters. The article presents the results of experiments showing the effect of the duration of the fronts of the excitation impulses on the amplitude of the oscillations of the strings. The influence of the front duration of the excitation impulse is studied for front lengths up to 0.5 ms, at an excitation impulse duration not exceeding 0.5 times the natural oscillation period of the string. The experimental data are compared with theoretical ones, and hypotheses explaining their difference are advanced. The article suggests some methods of reducing the influence of the parameters of the cable network and switching equipment on the amplitude of string oscillations. The possibilities of improving the reliability of systems developed on the basis of string sensors with an impulse excitation method and used for monitoring the technical condition of high-rise buildings are proposed.

  19. Overview of power plant and industrial facility performance in earthquakes in 1985 through 1987

    International Nuclear Information System (INIS)

    Horstman, N.G.; Yanev, P.I.; McCormick, D.L.

    1987-01-01

    This paper briefly documents the performance of power and industrial facilities during five destructive earthquakes in 1985 and 1986. These earthquakes represent varying levels of intensity, duration, frequency content, epicentral distance and construction practice. All of the earthquakes reinforce the findings of earlier earthquake investigations. Damage to equipment in power and industrial facilities is rare, as long as the equipment is adequately anchored. The ceramic components of switchyard equipment and the actuation of electro-mechanical relays remain concerns in the design of facilities which must remain operational during and following strong motion earthquakes. (orig.)

  20. Earthquakes and Tectonics Expert Judgment Elicitation Project

    International Nuclear Information System (INIS)

    Coppersmith, K.J.; Perman, R.C.; Youngs, R.R.

    1993-02-01

    This report summarizes the results of the Earthquakes and Tectonics Expert Judgment Elicitation Project sponsored by the Electric Power Research Institute (EPRI). The objectives of this study were two-fold: (1) to demonstrate methods for the elicitation of expert judgment, and (2) to quantify the uncertainties associated with earthquake and tectonics issues for use in the EPRI-HLW performance assessment. Specifically, the technical issue considered is the probability of differential fault displacement through the proposed repository at Yucca Mountain, Nevada. For this study, a strategy for quantifying uncertainties was developed that relies on the judgments of multiple experts. A panel of seven geologists and seismologists was assembled to quantify the uncertainties associated with earthquake and tectonics issues for the performance assessment model. A series of technical workshops focusing on these issues were conducted. Finally, each expert was individually interviewed in order to elicit his judgment regarding the technical issues and to provide the technical basis for his assessment. This report summarizes the methodologies used to elicit the judgments of the earthquakes and tectonics experts (termed ''specialists''), and summarizes the technical assessments made by the expert panel

  1. New characteristics of intensity assessment of Sichuan Lushan "4.20" M s7.0 earthquake

    Science.gov (United States)

    Sun, Baitao; Yan, Peilei; Chen, Xiangzhao

    2014-08-01

    The post-earthquake rapid and accurate assessment of the macroscopic influence of seismic ground motion is of significance for earthquake emergency relief, post-earthquake reconstruction and scientific research. The seismic intensity distribution map released by the Lushan earthquake field team of the China Earthquake Administration (CEA) five days after the strong earthquake (M7.0) that occurred in Lushan County of Ya'an City, Sichuan, at 8:02 on April 20, 2013, provides a scientific basis for emergency relief, economic loss assessment and post-earthquake reconstruction. In this paper, the means for blind estimation of macroscopic intensity, field estimation of macroscopic intensity, and review of intensity, as well as corresponding problems, are discussed in detail, and the intensity distribution characteristics of the Lushan "4.20" M7.0 earthquake and its influential factors are analyzed, providing a reference for future seismic intensity assessments.

  2. Design basis 2

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, G.; Soerensen, P. [Risoe National Lab., Roskilde (Denmark)

    1996-09-01

    Design Basis Program 2 (DBP2) is a comprehensive, fully coupled code which has the capability to operate in the time domain as well as in the frequency domain. The code was developed during the period 1991-93 and succeeds Design Basis 1, which is a one-blade model presuming a stiff tower, transmission system and hub. The package is designed for use on a personal computer and offers a user-friendly environment based on menu-driven editing and control facilities, with graphics used extensively for the data presentation. Moreover, input data as well as results are dumped to files in ASCII format. The input data are organized in a database with a structure that easily allows for arbitrary combinations of defined structural components and load cases. (au)

  3. Earthquake number forecasts testing

    Science.gov (United States)

    Kagan, Yan Y.

    2017-10-01

    We study the distributions of earthquake numbers in two global earthquake catalogues: Global Centroid-Moment Tensor and Preliminary Determinations of Epicenters. The properties of these distributions are especially required to develop the number test for our forecasts of future seismic activity rate, tested by the Collaboratory for Study of Earthquake Predictability (CSEP). A common assumption, as used in the CSEP tests, is that the numbers are described by the Poisson distribution. It is clear, however, that the Poisson assumption for the earthquake number distribution is incorrect, especially for the catalogues with a lower magnitude threshold. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrences, the negative-binomial distribution (NBD) has two parameters. The second parameter can be used to characterize the clustering or overdispersion of a process. We also introduce and study a more complex three-parameter beta negative-binomial distribution. We investigate the dependence of parameters for both Poisson and NBD distributions on the catalogue magnitude threshold and on temporal subdivision of catalogue duration. First, we study whether the Poisson law can be statistically rejected for various catalogue subdivisions. We find that for most cases of interest, the Poisson distribution can be shown to be rejected statistically at a high significance level in favour of the NBD. Thereafter, we investigate whether these distributions fit the observed distributions of seismicity. For this purpose, we study upper statistical moments of earthquake numbers (skewness and kurtosis) and compare them to the theoretical values for both distributions. Empirical values for the skewness and the kurtosis increase for the smaller magnitude threshold and increase with even greater intensity for small temporal subdivision of catalogues. The Poisson distribution for large rate values approaches the Gaussian law, therefore its skewness
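
    The sketch below illustrates, on synthetic counts, the kind of check discussed above: compare the empirical mean, variance and skewness of earthquake numbers per interval with the values a Poisson model implies, and fit a negative-binomial distribution by the method of moments when the counts are overdispersed. It is not the authors' test code and does not use the GCMT or PDE catalogues.

# Illustrative sketch (synthetic data, not the GCMT/PDE catalogues): compare
# empirical moments of earthquake counts per interval with Poisson-implied
# values, and fit a negative-binomial distribution by the method of moments.
import numpy as np

rng = np.random.default_rng(3)
# Synthetic yearly counts with clustering (overdispersed relative to Poisson)
counts = rng.negative_binomial(n=5, p=0.1, size=100)

mean, var = counts.mean(), counts.var(ddof=1)
skew = np.mean(((counts - mean) / counts.std(ddof=1)) ** 3)

print(f"mean={mean:.1f}, variance={var:.1f} (a Poisson model would require variance ~ mean)")
print(f"empirical skewness={skew:.2f}, Poisson skewness=1/sqrt(mean)={1/np.sqrt(mean):.2f}")

if var > mean:
    # Method-of-moments NBD fit: mean = r(1-p)/p, variance = r(1-p)/p**2
    p = mean / var
    r = mean * p / (1.0 - p)
    print(f"moment-fitted NBD parameters: r={r:.2f}, p={p:.2f}")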

  4. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    Science.gov (United States)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model is developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulations by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.

  5. Rupture, waves and earthquakes.

    Science.gov (United States)

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation by rupture (fracture) of the solid Earth. However, the physics of the dynamic process around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, has not been fully understood yet. Instead, much former investigation in seismology evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake, and more detailed study of rupture and wave dynamics, namely, the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) wave interaction that connects rupture (1) and failures (2), would be indispensable.

  6. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    Science.gov (United States)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

    In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.
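
    A minimal sketch, assuming simple triangular membership functions, of how a scalar seismicity index might be mapped to fuzzy grades of activity and classified by maximum membership is given below. The membership functions, grade labels and index values are illustrative assumptions; the paper's actual indices and functions are not specified here.

# Minimal sketch of the fuzzy-set flavour of the approach above: map a scalar
# seismicity index to fuzzy grades of activity via triangular membership
# functions and classify by maximum membership. All functions and values are
# illustrative assumptions, not the paper's actual definitions.
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(index):
    grades = {
        "quiet": triangular(index, -1.0, 0.0, 1.0),
        "normal": triangular(index, 0.0, 1.0, 2.0),
        "highly active": triangular(index, 1.0, 2.0, 3.0),
    }
    label = max(grades, key=grades.get)   # grade with the largest membership
    return label, grades

label, grades = classify(1.6)   # e.g. a standardized seismicity-rate index
print(label, grades)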

  7. Earthquakes and Earthquake Engineering. LC Science Tracer Bullet.

    Science.gov (United States)

    Buydos, John F., Comp.

    An earthquake is a shaking of the ground resulting from a disturbance in the earth's interior. Seismology is the (1) study of earthquakes; (2) origin, propagation, and energy of seismic phenomena; (3) prediction of these phenomena; and (4) investigation of the structure of the earth. Earthquake engineering or engineering seismology includes the…

  8. Dynamic oxygen transfer measurements under operating conditions as a basis for the optimization of ventilation systems; Dynamische Sauerstoffeintragsmessungen unter Betriebsbedingungen als Grundlage zur Optimierung von Belueftungssystemen

    Energy Technology Data Exchange (ETDEWEB)

    Libra, J.A.; Biskup, M.; Wiesmann, U. [Technische Univ. Berlin (Germany). Inst. fuer Verfahrenstechnik; Sahlmann, C.; Gnirss, R. [Berliner Wasserbetriebe, Berlin (Germany)

    1999-07-01

    The largest single energy consumer at sewage treatment plants is the ventilation system of activated sludge tanks. This is why controlling and optimizing ventilation systems is the most appropriate approach to the cutting down of energy costs. The present paper reports on measurements of dynamic oxygen transfer by means of the off-gas method under operating conditions at the Berlin-Ruhleben sewage treatment plant. (orig.) [German] Der groesste Einzelenergieverbraucher auf Klaerwerken ist das Belueftungssystem von Belebungsbecken. Deshalb ist die Kontrolle und Optimierung der Belueftungssysteme der geeignete Weg zur Verringerung der Energiekosten. In diesem Beitrag wird ueber Messungen des dynamischen Sauerstoffeintrags mit der Abluft-Methode unter Betriebsbedingungen im Klaerwerk Berlin-Ruhleben berichtet. (orig.)

  9. Trial application of guidelines for nuclear plant response to an earthquake

    International Nuclear Information System (INIS)

    Schmidt, W.; Oliver, R.; O'Connor, W.

    1993-09-01

    Guidelines have been developed to assist nuclear plant personnel in the preparation of earthquake response procedures for nuclear power plants. These guidelines are published in EPRI report NP-6695, ''Guidelines for Nuclear Plant Response to an Earthquake,'' dated December 1989. This report includes two sets of nuclear plant procedures which were prepared to implement the guidelines of EPRI report NP-6695. The first set was developed by the Toledo Edison Company Davis-Besse plant. Davis-Besse is a pressurized water reactor (PWR) and contains relatively standard seismic monitoring instrumentation typical of many domestic nuclear plants. The second set of procedures was prepared by Yankee Atomic Electric Company for the Vermont Yankee facility. This plant is a boiling water reactor (BWR) with state-of-the-art seismic monitoring and PC-based data processing equipment, including software developed specifically to implement the OBE Exceedance Criterion presented in EPRI report NP-5930, ''A Criterion for Determining Exceedance of the Operating Basis Earthquake.'' The two sets of procedures are intended to demonstrate how two different nuclear utilities have interpreted and applied the EPRI guidance given in report NP-6695
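
    As an illustration of the kind of computation such OBE-exceedance software performs, the sketch below evaluates a standardized cumulative absolute velocity (CAV) from an accelerogram. The windowing rule and the 0.025 g gate used here are stated as assumptions; the authoritative definition and thresholds are those given in EPRI report NP-5930.

# Hedged illustration: one quantity commonly used in OBE-exceedance checks is
# a standardized cumulative absolute velocity (CAV). This sketch sums |a| dt
# over 1-second windows whose peak acceleration exceeds an assumed 0.025 g
# gate; the exact rules and thresholds should be taken from EPRI NP-5930.
import numpy as np

def standardized_cav(accel_g, dt, window=1.0, gate=0.025):
    """Sum |a| dt over fixed windows whose peak |a| exceeds the gate (in g)."""
    accel_g = np.asarray(accel_g)
    samples_per_window = int(round(window / dt))
    cav = 0.0
    for start in range(0, len(accel_g), samples_per_window):
        seg = accel_g[start:start + samples_per_window]
        if np.max(np.abs(seg)) > gate:
            cav += np.sum(np.abs(seg)) * dt
    return cav   # in g-seconds

# Synthetic accelerogram: 20 s of low-amplitude noise with a short strong pulse
dt = 0.01
t = np.arange(0.0, 20.0, dt)
accel = 0.005 * np.random.default_rng(4).normal(size=t.size)
accel[500:700] += 0.1 * np.sin(2 * np.pi * 5.0 * t[500:700])

print(f"standardized CAV = {standardized_cav(accel, dt):.3f} g-s")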

  10. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    Science.gov (United States)

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
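
    The core exposure calculation that PAGER performs, counting the population subject to each level of shaking, can be sketched as follows. This is not the USGS implementation; the intensity and population grids below are invented and assumed to be co-registered on the same cells:

```python
import numpy as np

# Hypothetical co-registered grids (same shape): modeled MMI and population counts.
mmi = np.array([[4.2, 5.1, 6.3],
                [5.8, 7.0, 7.6],
                [4.9, 6.1, 8.2]])
population = np.array([[1000,  500, 2000],
                       [ 300, 8000, 1200],
                       [ 700, 4000,  900]])

# Sum the population falling in each one-unit MMI bin (IV through IX).
for lo in np.arange(4, 10):
    mask = (mmi >= lo) & (mmi < lo + 1)
    print(f"MMI {lo}-{lo + 1}: {population[mask].sum():>6d} people exposed")
```

    In the operational system these counts feed the alert level and the one-page report; here they are simply printed per intensity bin.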

  11. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.; Mai, Paul Martin; Schorlemmer, Danijel

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data

  12. Earthquakes; May-June 1982

    Science.gov (United States)

    Person, W.J.

    1982-01-01

    There were four major earthquakes (7.0-7.9) during this reporting period: two struck in Mexico, one in El Salvador, and one in teh Kuril Islands. Mexico, El Salvador, and China experienced fatalities from earthquakes.

  13. Preliminary quantitative assessment of earthquake casualties and damages

    DEFF Research Database (Denmark)

    Badal, J.; Vázquez-Prada, M.; González, Á.

    2005-01-01

    Prognostic estimations of the expected number of killed or injured people and of the approximate cost associated with the damages caused by earthquakes are made following a suitable methodology of wide-ranging application. For the preliminary assessment of human life losses due to the occurrence...... of a relatively strong earthquake we use a quantitative model consisting of a correlation between the number of casualties and the earthquake magnitude as a function of population density. The macroseismic intensity field is determined in accordance with an updated anelastic attenuation law, and the number...... the local social wealth as a function of the gross domestic product of the country. This last step is performed on the basis of the relationship of the macroseismic intensity to the earthquake economic loss in percentage of the wealth. Such an approach to the human casualty and damage levels is carried out...
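
    The abstract is truncated and does not give the regression itself. As a rough illustration of the kind of magnitude-casualty correlation conditioned on population density that the authors describe (the functional form and every coefficient below are invented for the sketch, not taken from the paper):

```python
def expected_casualties(magnitude, density_class, coeffs):
    """Illustrative model: log10(casualties) = a + b * M, with (a, b) chosen
    by population-density class. All coefficients are hypothetical."""
    a, b = coeffs[density_class]
    return 10.0 ** (a + b * magnitude)

coeffs = {              # population-density class -> (a, b), illustrative only
    "low":    (-4.0, 0.6),
    "medium": (-3.5, 0.7),
    "high":   (-3.0, 0.8),
}
for density in ("low", "medium", "high"):
    print(f"{density:>6} density, M 6.5: ~{expected_casualties(6.5, density, coeffs):.0f} casualties")
```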

  14. Earthquake forecasting studies using radon time series data in Taiwan

    Science.gov (United States)

    Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong

    2017-04-01

    For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters, in order to create a real-time database that helps our earthquake precursory study. In recent years, an automatic operating real-time database has been developed using R, an open-source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open-source web application solution, AMP (Apache, MySQL, and PHP), creating a website that can effectively display and help us manage the real-time database.
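
    The record mentions an R-based real-time database for screening radon variations against environmental factors, without giving the screening rule. A minimal sketch of one plausible rule (written in Python rather than R for consistency with the other sketches in this listing, with synthetic half-hourly data and an assumed rolling two-sigma threshold):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
# Hypothetical half-hourly radon record (Bq/m^3) with one injected spike.
t = pd.date_range("2016-01-01", periods=48 * 30, freq="30min")
radon = pd.Series(20 + 3 * rng.standard_normal(t.size), index=t)
radon.iloc[1000:1010] += 25                       # synthetic anomaly

baseline = radon.rolling("7D").mean()             # 7-day rolling baseline
spread = radon.rolling("7D").std()
anomalous = radon > baseline + 2 * spread         # simple 2-sigma exceedance rule
print(radon[anomalous].head())
```

    A production system would additionally correct for rainfall, pressure, and temperature before flagging anomalies, which is the environmental filtration step the abstract refers to.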

  15. Large magnitude (M > 7.5) offshore earthquakes in 2012: few examples of absent or little tsunamigenesis, with implications for tsunami early warning

    Science.gov (United States)

    Pagnoni, Gianluca; Armigliato, Alberto; Tinti, Stefano

    2013-04-01

    We take into account some examples of offshore earthquakes that occurred worldwide in 2012 and were characterised by a "large" magnitude (Mw equal to or larger than 7.5) but which produced no or little tsunami effects. Here, "little" is intended as "lower than expected on the basis of the parent earthquake magnitude". The examples we analyse include three earthquakes that occurred along the Pacific coasts of Central America (20 March, Mw=7.8, Mexico; 5 September, Mw=7.6, Costa Rica; 7 November, Mw=7.5, Mexico), the Mw=7.6 and Mw=7.7 earthquakes that occurred respectively on 31 August and 28 October offshore the Philippines and offshore Alaska, and the two Indian Ocean earthquakes registered on a single day (11 April) and characterised by Mw=8.6 and Mw=8.2. For each event, we try to face the problem related to its tsunamigenic potential from two different perspectives. The first can be considered purely scientific and coincides with the question: why was the ensuing tsunami so weak? The answer can be related partly to the particular tectonic setting in the source area, partly to the particular position of the source with respect to the coastline, and finally to the focal mechanism of the earthquake and to the slip distribution on the ruptured fault. The first two pieces of information are available soon after the earthquake occurrence, while the third requires time periods in the order of tens of minutes. The second perspective is more "operational" and coincides with the tsunami early warning perspective, for which the question is: will the earthquake generate a significant tsunami and if so, where will it strike? The Indian Ocean events of 11 April 2012 are perfect examples of the fact that the information on the earthquake magnitude and position alone may not be sufficient to produce reliable tsunami warnings. We emphasise that it is of utmost importance that the focal mechanism determination is obtained in the future much more quickly than it is at present and that this

  16. Turkish Children's Ideas about Earthquakes

    Science.gov (United States)

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquakes and arrange their lifestyles accordingly, the damage they suffer will be reduced correspondingly. In particular, good training regarding earthquakes received in primary schools is considered…

  17. Earthquakes, May-June 1991

    Science.gov (United States)

    Person, W.J.

    1992-01-01

    One major earthquake occurred during this reporting period. This was a magnitude 7.1 earthquake in Indonesia (Minahassa Peninsula) on June 20. Earthquake-related deaths were reported in the Western Caucasus (Georgia, USSR) on May 3 and June 15. One earthquake-related death was also reported in El Salvador on June 21.

  18. Organizational changes at Earthquakes & Volcanoes

    Science.gov (United States)

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  19. Aseismic blocks and destructive earthquakes in the Aegean

    Science.gov (United States)

    Stiros, Stathis

    2017-04-01

    Aseismic areas are not identified only in vast, geologically stable regions, but also within regions of active, intense, distributed deformation such as the Aegean. In the latter, "aseismic blocks" about 200m wide were recognized in the 1990s on the basis of the absence of instrumentally-derived earthquake foci, in contrast to surrounding areas. This pattern was supported by the available historical seismicity data, as well as by geologic evidence. Interestingly, GPS evidence indicates that such blocks are among the areas characterized by small deformation rates relative to surrounding areas of higher deformation. Still, the largest and most destructive earthquake of the 1990s, the 1995 M6.6 earthquake, occurred at the center of one of these "aseismic" zones in the northern part of Greece, found unprotected against seismic hazard. This case was indeed a repeat of the case of the tsunami-associated 1956 Amorgos Island M7.4 earthquake, the largest 20th century event in the Aegean back-arc region: the 1956 earthquake occurred at the center of a geologically distinct region (Cyclades Massif in the Central Aegean), till then assumed aseismic. Interestingly, after 1956, the overall idea of aseismic regions remained valid, though a "promontory" of earthquake-prone areas intruding into the aseismic central Aegean was assumed. Exploitation of the archaeological excavation evidence and careful, combined analysis of historical and archaeological data and other palaeoseismic, mostly coastal, data indicated that destructive and major earthquakes have left their traces in previously assumed aseismic blocks. In the latter, earthquakes typically recur at relatively long intervals (>200-300 years), much less frequently than in adjacent active areas. Interestingly, areas assumed aseismic in antiquity are among the most active in the last centuries, while areas hit by major earthquakes in the past are usually classified as areas of low seismic risk in official maps. Some reasons

  20. The 1976 Tangshan earthquake

    Science.gov (United States)

    Fang, Wang

    1979-01-01

    The Tangshan earthquake of 1976 was one of the largest earthquakes in recent years. It occurred on July 28 at 3:42 a.m., Beijing (Peking) local time, and had magnitude 7.8, a focal depth of 15 kilometers, and an epicentral intensity of XI on the New Chinese Seismic Intensity Scale; it caused serious damage and loss of life in this densely populated industrial city. Now, with the help of people from all over China, the city of Tangshan is being rebuilt.

  1. [Earthquakes in El Salvador].

    Science.gov (United States)

    de Ville de Goyet, C

    2001-02-01

    The Pan American Health Organization (PAHO) has 25 years of experience dealing with major natural disasters. This piece provides a preliminary review of the events taking place in the weeks following the major earthquakes in El Salvador on 13 January and 13 February 2001. It also describes the lessons that have been learned over the last 25 years and the impact that the El Salvador earthquakes and other disasters have had on the health of the affected populations. Topics covered include mass-casualties management, communicable diseases, water supply, managing donations and international assistance, damages to the health-facilities infrastructure, mental health, and PAHO's role in disasters.

  2. ITER technical basis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-01-01

    Following on from the Final Report of the EDA (DS/21), and the summary of the ITER Final Design Report (DS/22), the technical basis gives further details of the design of ITER. It is in two parts. The first, the Plant Design Specification, summarises the main constraints on the plant design and operation from the viewpoint of engineering and physics assumptions, compliance with safety regulations, and siting requirements and assumptions. The second, the Plant Description Document, describes the physics performance and engineering characteristics of the plant design, illustrates the potential operational consequences for the locality of a generic site, gives the construction, commissioning, exploitation and decommissioning schedule, and reports the estimated lifetime costing based on data from the industry of the EDA parties.

  3. ITER technical basis

    International Nuclear Information System (INIS)

    2002-01-01

    Following on from the Final Report of the EDA (DS/21), and the summary of the ITER Final Design Report (DS/22), the technical basis gives further details of the design of ITER. It is in two parts. The first, the Plant Design Specification, summarises the main constraints on the plant design and operation from the viewpoint of engineering and physics assumptions, compliance with safety regulations, and siting requirements and assumptions. The second, the Plant Description Document, describes the physics performance and engineering characteristics of the plant design, illustrates the potential operational consequences for the locality of a generic site, gives the construction, commissioning, exploitation and decommissioning schedule, and reports the estimated lifetime costing based on data from the industry of the EDA parties

  4. Extending the ISC-GEM Global Earthquake Instrumental Catalogue

    Science.gov (United States)

    Di Giacomo, Domenico; Engdhal, Bob; Storchak, Dmitry; Villaseñor, Antonio; Harris, James

    2015-04-01

    After a 27-month project funded by the GEM Foundation (www.globalquakemodel.org), in January 2013 we released the ISC-GEM Global Instrumental Earthquake Catalogue (1900-2009) (www.isc.ac.uk/iscgem/index.php) as a special product to use for seismic hazard studies. The new catalogue was necessary as improved seismic hazard studies necessitate that earthquake catalogues are homogeneous (to the largest extent possible) over time in their fundamental parameters, such as location and magnitude. Due to time and resource limitations, the ISC-GEM catalogue (1900-2009) included earthquakes selected according to the following time-variable cut-off magnitudes: Ms=7.5 for earthquakes occurring before 1918; Ms=6.25 between 1918 and 1963; and Ms=5.5 from 1964 onwards. Because of the importance of having a reliable seismic input for seismic hazard studies, funding from GEM and two commercial companies in the US and UK allowed us to start working on the extension of the ISC-GEM catalogue both for earthquakes that occurred beyond 2009 and for earthquakes listed in the International Seismological Summary (ISS) which fell below the cut-off magnitude of 6.25. This extension is part of a four-year program that aims at including in the ISC-GEM catalogue large global earthquakes that occurred before the beginning of the ISC Bulletin in 1964. In this contribution we present the updated ISC-GEM catalogue, which will include over 1000 more earthquakes that occurred in 2010-2011 and several hundred more between 1950 and 1959. The catalogue extension between 1935 and 1949 is currently underway. The extension of the ISC-GEM catalogue will also be helpful for regional cross-border seismic hazard studies, as the ISC-GEM catalogue should be used as the basis for cross-checking the consistency in location and magnitude of those earthquakes listed both in the ISC-GEM global catalogue and in regional catalogues.

  5. Earthquake Culture: A Significant Element in Earthquake Disaster Risk Assessment and Earthquake Disaster Risk Management

    OpenAIRE

    Ibrion, Mihaela

    2018-01-01

    This book chapter brings to attention the dramatic impact of large earthquake disasters on local communities and society and highlights the necessity of building and enhancing the earthquake culture. Iran was considered as a research case study and fifteen large earthquake disasters in Iran were investigated and analyzed over a period of more than a century. It was found that the earthquake culture in Iran was and is still conditioned by many factors or parameters which are not integrated and...

  6. The mechanism of earthquake

    Science.gov (United States)

    Lu, Kunquan; Cao, Zexian; Hou, Meiying; Jiang, Zehui; Shen, Rong; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    The physical mechanism of earthquake remains a challenging issue to be clarified. Seismologists used to attribute shallow earthquakes to the elastic rebound of crustal rocks. The seismic energy calculated following the elastic rebound theory and with the data of experimental results upon rocks, however, shows a large discrepancy with measurement — a fact that has been dubbed as “the heat flow paradox”. For the intermediate-focus and deep-focus earthquakes, both occurring in the region of the mantle, there is no reasonable explanation either. This paper will discuss the physical mechanism of earthquake from a new perspective, starting from the fact that both the crust and the mantle are discrete collective systems of matter with slow dynamics, as well as from the basic principles of physics, especially some new concepts of condensed matter physics that emerged in recent years. (1) Stress distribution in earth’s crust: Without taking the tectonic force into account, according to the rheological principle of “everything flows”, the normal stress and transverse stress must be balanced due to the effect of gravitational pressure over a long period of time, thus no differential stress in the original crustal rocks is to be expected. The tectonic force is successively transferred and accumulated via stick-slip motions of rock blocks to squeeze the fault gouge and then exerted upon other rock blocks. The superposition of such additional lateral tectonic force and the original stress gives rise to the real-time stress in crustal rocks. The mechanical characteristics of fault gouge are different from rocks as it consists of granular matter. The elastic moduli of the fault gouges are much smaller than those of rocks, and they become larger with increasing pressure. This peculiarity of the fault gouge leads to a tectonic force increasing with depth in a nonlinear fashion. The distribution and variation of the tectonic stress in the crust are specified. (2) The

  7. Memory effect in M ≥ 6 earthquakes of South-North Seismic Belt, Mainland China

    Science.gov (United States)

    Wang, Jeen-Hwa

    2013-07-01

    The M ≥ 6 earthquakes that occurred in the South-North Seismic Belt, Mainland China, during 1901-2008 are taken to study the possible existence of a memory effect in large earthquakes. The fluctuation analysis technique is applied to analyze the sequences of earthquake magnitude and inter-event time represented in the natural time domain. Calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of earthquake magnitude and inter-event time. The migration of the earthquakes in this study is taken into account to discuss the possible correlation between events. The phase portraits of two sequent magnitudes and two sequent inter-event times are also applied to explore if large (or small) earthquakes are followed by large (or small) events. Together with all kinds of given information, we conclude that the earthquakes in this study are short-term correlated and thus a short-term memory effect would be operative.
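
    The fluctuation analysis referred to above can be illustrated with a generic (non-detrended) version applied to a synthetic magnitude sequence. The windows and data below are invented, and the 0.5 reference value for uncorrelated data is a standard property of the method, not a result from the paper:

```python
import numpy as np

def fluctuation_exponent(series, windows):
    """Fluctuation F(n) of the cumulative, mean-removed profile versus window length n;
    the log-log slope is the scaling exponent (about 0.5 for uncorrelated data)."""
    y = np.cumsum(series - series.mean())         # profile of the sequence
    F = []
    for n in windows:
        n_segments = len(y) // n
        rms = [np.sqrt(np.mean((y[i * n:(i + 1) * n] - y[i * n:(i + 1) * n].mean()) ** 2))
               for i in range(n_segments)]
        F.append(np.mean(rms))
    return np.polyfit(np.log(windows), np.log(F), 1)[0]

rng = np.random.default_rng(0)
magnitudes = 6.0 + 0.4 * rng.standard_normal(300)  # hypothetical magnitude sequence
windows = np.array([4, 8, 16, 32, 64])
print("scaling exponent ~", round(fluctuation_exponent(magnitudes, windows), 2))
```

    An exponent below 0.5, as reported in the abstract for the real catalogue, would indicate anti-persistent short-term correlation rather than a purely random sequence.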

  8. LLL/DOR seismic conservatism of operating plants project. Interim report on Task II.1.3: soil-structure interaction. Deconvolution of the June 7, 1975, Ferndale Earthquake at the Humboldt Bay Power Plant

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Smith, P.D.

    1978-01-01

    The Ferndale Earthquake of June 7, 1975, provided a unique opportunity to study the accuracy of seismic soil-structure interaction methods used in the nuclear industry because, other than this event, there have been no cases of significant earthquakes for which moderate motions of nuclear plants have been recorded. Future studies are planned which will evaluate the soil-structure interaction methodology further, using increasingly complex methods as required. The first step in this task was to perform deconvolution and soil-structure interaction analyses for the effects of the Ferndale earthquake at the Humboldt Bay Power Plant site. The deconvolution analyses of bedrock motions that were performed are compared, along with additional studies on analytical sensitivity.

  9. On a method of evaluation of failure rate of equipment and pipings under excess-earthquake loadings

    International Nuclear Information System (INIS)

    Shibata, H.; Okamura, H.

    1979-01-01

    This paper deals with a method of evaluating the failure rate of equipment and pipings in nuclear power plants under an earthquake which exceeds the design basis earthquake. If we denote by n the ratio of the maximum ground acceleration of an earthquake to that of the design basis earthquake, then the failure rate, or the probability of failure, is a function of n, p(n). The purpose of this study is to establish a procedure for evaluating the relation between n and p(n). (orig.)
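
    The abstract does not state the functional form of p(n). A common engineering assumption, used here only as an illustration of how such a relation might look, is a lognormal fragility curve in the acceleration ratio n; the median ratio and dispersion below are invented:

```python
import math

def failure_probability(n, median_ratio=2.5, beta=0.45):
    """Lognormal fragility: probability of failure at acceleration ratio n,
    where n = peak ground acceleration / design-basis acceleration.
    median_ratio and beta are illustrative values, not results from the paper."""
    z = (math.log(n) - math.log(median_ratio)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

for n in (0.5, 1.0, 1.5, 2.0, 3.0, 4.0):
    print(f"n = {n:>3}: p(n) = {failure_probability(n):.3f}")
```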

  10. Seismogeodesy for rapid earthquake and tsunami characterization

    Science.gov (United States)

    Bock, Y.

    2016-12-01

    dozens of seismogeodetic stations available through the Pacific Northwest Seismic Network (University of Washington), the Plate Boundary Observatory (UNAVCO) and the Pacific Northwest Geodetic Array (Central Washington University) as the basis for local tsunami warnings for a large subduction zone earthquake in Cascadia.

  11. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    Science.gov (United States)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision 55 Ma, the Himalaya has accommodated 2000 km of convergence along its arc. The strain energy is being accumulated at a convergence rate of 37-44 mm/yr and is released from time to time as earthquakes. The Garhwal Himalaya is located at the western side of a Seismic Gap, where a great earthquake has been overdue for at least 200 years. This seismic gap (Central Seismic Gap: CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, is on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations of the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can ably be described and studied. In the present study, we present the spatio-temporal analysis of the relocated local micro-to-moderate earthquakes recorded by a seismicity monitoring network, which has been operational since 2007. The earthquake locations are relocated using the HypoDD (double-difference hypocenter method for earthquake relocations) program. The dataset from July 2007 to September 2015 has been used in this study to estimate their spatio-temporal relationships, moment tensor (MT) solutions for the earthquakes of M>3.0, stress tensors and their interactions. We have also used the composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show a thrust-type mechanism and are located near the mid-crustal-ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer lesser Himalaya and higher Himalaya regions. The prevailing stress has been identified to be compressional towards NNE-SSW, which is the direction of relative plate motion between the India and Eurasia continental plates. The low friction coefficient estimated along with the stress inversions

  12. The EM Earthquake Precursor

    Science.gov (United States)

    Jones, K. B., II; Saxton, P. T.

    2013-12-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After the 1989 Loma Prieta Earthquake, American earthquake investigators predetermined magnetometer use and a minimum earthquake magnitude necessary for EM detection. This action was set in motion, due to the extensive damage incurred and public outrage concerning earthquake forecasting; however, the magnetometers employed, grounded or buried, are completely subject to static and electric fields and have yet to correlate to an identifiable precursor. Secondly, there is neither a networked array for finding any epicentral locations, nor have there been any attempts to find even one. This methodology needs dismissal, because it is overly complicated, subject to continuous change, and provides no response time. As for the minimum magnitude threshold, which was set at M5, this is simply higher than what modern technological advances have gained. Detection can now be achieved at approximately M1, which greatly improves forecasting chances. A propagating precursor has now been detected in both the field and laboratory. Field antenna testing conducted outside the NE Texas town of Timpson in February, 2013, detected three strong EM sources along with numerous weaker signals. The antenna had mobility, and observations were noted for recurrence, duration, and frequency response. Next, two

  13. Simulation analysis of earthquake response of nuclear power plant to the 2003 Miyagi-Oki earthquake

    International Nuclear Information System (INIS)

    Yoshihiro Ogata; Kiyoshi Hirotani; Masayuki Higuchi; Shingo Nakayama

    2005-01-01

    On May 26, 2003 an earthquake of magnitude 7.1 (Japan Meteorological Agency) occurred just offshore of Miyagi Prefecture. This was the largest earthquake ever experienced by the nuclear power plant of Tohoku Electric Power Co. in Onagawa (hereafter the Onagawa Nuclear Power Plant) during the 19 years since it started operations in 1984. In this report, we review the vibration characteristics of the reactor building of the Onagawa Nuclear Power Plant Unit 1 based on acceleration records observed at the building, and give an account of a simulation analysis of the earthquake response carried out to ascertain the appropriateness of the design procedure and the seismic safety of the building. (authors)

  14. Simulated earthquake ground motions

    International Nuclear Information System (INIS)

    Vanmarcke, E.H.; Gasparini, D.A.

    1977-01-01

    The paper reviews current methods for generating synthetic earthquake ground motions. Emphasis is on the special requirements demanded of procedures to generate motions for use in nuclear power plant seismic response analysis. Specifically, very close agreement is usually sought between the response spectra of the simulated motions and prescribed, smooth design response spectra. The features and capabilities of the computer program SIMQKE, which has been widely used in power plant seismic work, are described. Problems and pitfalls associated with the use of synthetic ground motions in seismic safety assessment are also pointed out. The limitations and paucity of recorded accelerograms, together with the widespread use of time-history dynamic analysis for obtaining structural and secondary systems' response, have motivated the development of earthquake simulation capabilities. A common model for synthesizing earthquakes is that of superposing sinusoidal components with random phase angles. The input parameters for such a model are, then, the amplitudes and phase angles of the contributing sinusoids as well as the characteristics of the variation of motion intensity with time, especially the duration of the motion. The amplitudes are determined from estimates of the Fourier spectrum or the spectral density function of the ground motion. These amplitudes may be assumed to be varying in time or constant for the duration of the earthquake. In the nuclear industry, the common procedure is to specify a set of smooth response spectra for use in aseismic design. This development and the need for time histories have generated much practical interest in synthesizing earthquakes whose response spectra 'match', or are compatible with, a set of specified smooth response spectra.
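
    The superposition model described above (sinusoids with random phases, amplitudes taken from a spectral density, and a time-varying intensity envelope) can be sketched as follows. This is not SIMQKE itself, and the flat spectral density and trapezoidal envelope are assumptions made purely for brevity:

```python
import numpy as np

rng = np.random.default_rng(42)
dt, duration = 0.01, 20.0
t = np.arange(0.0, duration, dt)

freqs = np.linspace(0.5, 20.0, 80)                # contributing frequencies (Hz)
G = np.ones_like(freqs)                           # assumed flat one-sided spectral density
dfreq = freqs[1] - freqs[0]
amplitudes = np.sqrt(2.0 * G * dfreq)             # sinusoid amplitudes from the density
phases = rng.uniform(0.0, 2.0 * np.pi, freqs.size)

stationary = (amplitudes[:, None] *
              np.sin(2.0 * np.pi * freqs[:, None] * t[None, :] + phases[:, None])).sum(axis=0)

# Trapezoidal intensity envelope: 2 s build-up, 10 s strong phase, decay to the end.
envelope = np.interp(t, [0.0, 2.0, 12.0, duration], [0.0, 1.0, 1.0, 0.05])
accel = envelope * stationary                     # simulated ground acceleration history
print("peak simulated acceleration:", round(float(np.abs(accel).max()), 3))
```

    A spectrum-compatible generator would additionally iterate on the amplitudes until the response spectrum of the simulated motion matches the prescribed smooth design spectrum; that matching loop is omitted here.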

  15. Approach to developing a ground-motion design basis for facilities important to safety at Yucca Mountain

    International Nuclear Information System (INIS)

    King, J.L.

    1990-01-01

    This paper discusses a methodology for developing a ground-motion design basis for prospective facilities at Yucca Mountain that are important to safety. The methodology utilizes a quasi-deterministic construct called the 10,000-year cumulative-slip earthquake that is designed to provide a conservative, robust, and reproducible estimate of ground motion that has a one-in-ten chance of occurring during the preclosure period. This estimate is intended to define a ground-motion level for which the seismic design would ensure minimal disruption to operations; engineering analyses to ensure safe performance are included

  16. Romanian earthquakes analysis using BURAR seismic array

    International Nuclear Information System (INIS)

    Borleanu, Felix; Rogozea, Maria; Nica, Daniela; Popescu, Emilia; Popa, Mihaela; Radulian, Mircea

    2008-01-01

    The Bucovina seismic array (BURAR) is a medium-aperture array, installed in 2002 in the northern part of Romania (47.61480 N latitude, 25.21680 E longitude, 1150 m altitude), as a result of the cooperation between the Air Force Technical Applications Center, USA and the National Institute for Earth Physics, Romania. The array consists of ten elements, located in boreholes and distributed over a 5 x 5 km² area; nine with short-period vertical sensors and one with a broadband three-component sensor. Since the new station has been operating, the earthquake survey of Romania's territory has been significantly improved. Data recorded by BURAR during the 01.01.2005 - 12.31.2005 time interval are first processed and analyzed, in order to establish the array's detection capability for local earthquakes occurring in the different Romanian seismic zones. Subsequently, a spectral ratios technique was applied in order to determine the calibration relationships for magnitude, using only the information gathered by the BURAR station. The spectral ratios are computed relative to a reference event, considered representative for each seismic zone. This method has the advantage of eliminating path effects. The new calibration procedure is tested for the case of Vrancea intermediate-depth earthquakes and proved to be very efficient in constraining the size of these earthquakes. (authors)
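
    The calibration procedure itself is not spelled out in the abstract. One common single-station variant of the spectral-ratio idea is sketched below: compare the low-frequency Fourier amplitude level of a new event with that of a reference event of known magnitude at the same station, and map the log of the ratio to a magnitude difference. The synthetic records, the reference magnitude, and the simple 1:1 mapping are all assumptions, not the relationships derived for BURAR:

```python
import numpy as np

def low_freq_spectral_level(record, dt, fmax=1.0):
    """Mean Fourier amplitude below fmax (Hz) of a single-component record."""
    spec = np.abs(np.fft.rfft(record)) * dt
    freqs = np.fft.rfftfreq(record.size, dt)
    return spec[(freqs > 0) & (freqs <= fmax)].mean()

dt = 0.01
rng = np.random.default_rng(3)
reference = rng.standard_normal(4096)             # stand-in for the reference event
new_event = 3.0 * rng.standard_normal(4096)       # stand-in for the event to be sized

ratio = low_freq_spectral_level(new_event, dt) / low_freq_spectral_level(reference, dt)
M_reference = 4.0                                 # assumed known reference magnitude
M_new = M_reference + np.log10(ratio)             # assumed 1:1 log-ratio to magnitude mapping
print("estimated magnitude of new event:", round(float(M_new), 2))
```

    Because both events share the same source-to-station path, the path term cancels in the ratio, which is the advantage the abstract refers to.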

  17. Single-earthquake design for piping systems in advanced light water reactors

    International Nuclear Information System (INIS)

    Terao, D.

    1993-01-01

    Appendix A to Part 100 of Title 10 of the Code of Federal Regulations (10 CFR Part 100) requires, in part, that all structures, systems, and components of the nuclear power plant necessary for continued operation without undue risk to the health and safety of the public shall be designed to remain functional and within applicable stress and deformation limits when subject to an operating basis earthquake (OBE). The US Nuclear Regulatory Commission (NRC) is proposing changes to Appendix A to Part 100 to redefine the OBE at a level such that its purpose can be satisfied without the need to perform explicit response analyses. Consequently, only the safe-shutdown earthquake (SSE) would be required for the seismic design of safety-related structures, systems and components. The purpose of this paper is to discuss the proposed changes to existing seismic design criteria that the NRC staff has found acceptable for implementing the proposed rule change in the design of safety-related piping systems in the advanced light water reactor (ALWR) lead plant. These criteria apply only to the ALWR lead plant design and are not intended to replace the seismic design criteria approved by the Commission in the licensing bases of currently operating facilities. Although the guidelines described herein have been proposed for use as a pilot program for implementing the proposed rule change specifically for the ALWR lead plant, the NRC staff expects that these guidelines will also be applied to other ALWRs

  18. Investigation of the relationship between earthquakes and indoor radon concentrations at a building in Gyeongju, Korea

    Directory of Open Access Journals (Sweden)

    Jae Wook Kim

    2018-04-01

    This article measured and analyzed the indoor radon concentrations at one university building in Gyeongju, Republic of Korea, to investigate whether there is any relationship between earthquakes and indoor radon concentration. Since 12 September 2016, when two 5.1 and 5.8 magnitude earthquakes occurred, hundreds of aftershocks affected Gyeongju until January 2017. The measurements were made at the ground floor of the Energy Engineering Hall of Dongguk University in Gyeongju over a period between February 2016 and January 2017. The measurements were made with a RAD7 detector on the basis of the US Environmental Protection Agency measurement protocol. Each measurement was continuously made every 30 minutes over the measurement period every month. Among earthquakes with 2.0 or greater magnitude, the earthquakes whose occurrence timings fell into the measurement periods were screened for further analysis. We observed similar spike-like patterns between the indoor radon concentration distributions and earthquakes: a sudden increase in the peak indoor radon concentration 1–4 days before an earthquake, a gradual decrease before the earthquake, and a sudden drop on the day of the earthquake if the interval between successive earthquakes was moderately longer, for example, 3 days in this article. Keywords: Earthquakes, Gyeongju, Indoor Radon Concentration, RAD7, Radon Anomaly

  19. Measures for groundwater security during and after the Hanshin-Awaji earthquake (1995) and the Great East Japan earthquake (2011), Japan

    Science.gov (United States)

    Tanaka, Tadashi

    2016-03-01

    Many big earthquakes have occurred in the tectonic regions of the world, especially in Japan. Earthquakes often cause damage to crucial life services such as water, gas and electricity supply systems and even the sewage system in urban and rural areas. The most severe problem for people affected by earthquakes is access to water for their drinking/cooking and toilet flushing. Securing safe water for daily life in an earthquake emergency requires the establishment of countermeasures, especially in a mega city like Tokyo. This paper describes some examples of groundwater use in earthquake emergencies, with reference to reports, books and newspapers published in Japan. The consensus is that groundwater, as a source of water, plays a major role in earthquake emergencies, especially where the accessibility of wells coincides with the emergency need. It is also important to introduce a registration system for citizen-owned and company wells that can form the basis of a cooperative during a disaster; such a registration system was implemented by many Japanese local governments after the Hanshin-Awaji Earthquake in 1995 and the Great East Japan Earthquake in 2011, and is one of the most effective countermeasures for groundwater use in an earthquake emergency. Emphasis is also placed on the importance of establishing a continuous monitoring system of groundwater conditions for both quantity and quality during non-emergency periods.

  20. The HayWired Earthquake Scenario—Earthquake Hazards

    Science.gov (United States)

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  1. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    Science.gov (United States)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system of Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. The unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity in terms of its high population and its economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events, and then on August 3, 2014 a local magnitude 4.1 event occurred; more than 1000 events followed until August 31, 2014. Thus we tentatively call this a swarm-like activity. Therefore, investigation of the micro-earthquake activity of the Armutlu Peninsula has become important to understand the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005 and currently equipped with 27 active seismic stations operated by the Kocaeli University Earth and Space Sciences Research Center (ESSRC) and Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network tool able to record even micro-earthquakes in this region. In the 30-day period of August 02 to 31, 2014, the Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, but ARNET provided more than 1000 earthquakes for analysis in the same time period. In this study, earthquakes of the swarm area and neighbouring regions determined by ARNET were investigated. The focal mechanism of the August 03, 2014 22:22:42 (GMT) earthquake with local magnitude (Ml) 4.0 was obtained from the moment tensor solution. According to the solution, it indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  2. Lower bound earthquake magnitude for probabilistic seismic hazard evaluation

    International Nuclear Information System (INIS)

    McCann, M.W. Jr.; Reed, J.W.

    1990-01-01

    This paper presents the results of a study that develops an engineering and seismological basis for selecting a lower-bound magnitude (LBM) for use in seismic hazard assessment. As part of a seismic hazard analysis the range of earthquake magnitudes that are included in the assessment of the probability of exceedance of ground motion must be defined. The upper-bound magnitude is established by earth science experts based on their interpretation of the maximum size of earthquakes that can be generated by a seismic source. The lower-bound or smallest earthquake that is considered in the analysis must also be specified. The LBM limits the earthquakes that are considered in assessing the probability that specified ground motion levels are exceeded. In the past there has not been a direct consideration of the appropriate LBM value that should be used in a seismic hazard assessment. This study specifically looks at the selection of a LBM for use in seismic hazard analyses that are input to the evaluation/design of nuclear power plants (NPPs). Topics addressed in the evaluation of a LBM are earthquake experience data at heavy industrial facilities, engineering characteristics of ground motions associated with small-magnitude earthquakes, probabilistic seismic risk assessments (seismic PRAs), and seismic margin evaluations. The results of this study and the recommendations concerning a LBM for use in seismic hazard assessments are discussed. (orig.)
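
    The effect of the lower-bound magnitude on a computed hazard value can be illustrated with a toy single-source hazard integral using a truncated Gutenberg-Richter magnitude distribution and an invented ground-motion model; none of the coefficients below come from the paper, and a real analysis integrates over many sources and distances:

```python
import numpy as np
from math import erf, sqrt, log

def exceedance_rate(a_target_g, m_min, m_max=7.5, b=1.0, rate_m_ge_4=0.5,
                    distance_km=20.0, sigma_ln=0.6):
    """Annual rate of PGA > a_target_g from one toy point source.
    Magnitudes follow a truncated Gutenberg-Richter distribution above m_min;
    the ground-motion model ln(PGA) = -3.5 + 0.9*M - 1.1*ln(R + 10) is invented."""
    rate_above_mmin = rate_m_ge_4 * 10.0 ** (-b * (m_min - 4.0))   # events/yr with M >= m_min
    m = np.linspace(m_min, m_max, 400)
    dm = m[1] - m[0]
    beta = b * log(10.0)
    pdf = beta * np.exp(-beta * (m - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
    ln_median = -3.5 + 0.9 * m - 1.1 * log(distance_km + 10.0)
    z = (log(a_target_g) - ln_median) / sigma_ln
    p_exceed = np.array([0.5 * (1.0 - erf(zi / sqrt(2.0))) for zi in z])
    return rate_above_mmin * float(np.sum(p_exceed * pdf) * dm)

for m_min in (4.0, 4.5, 5.0):
    print(f"LBM = {m_min}: annual rate of PGA > 0.2 g ~ {exceedance_rate(0.2, m_min):.2e}")
```

    Raising the lower-bound magnitude removes the abundant small events from the integral; whether the resulting change in the hazard estimate matters for engineered facilities is exactly the question the study addresses.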

  3. Meeting the Challenge of Earthquake Risk Globalisation: Towards the Global Earthquake Model GEM (Sergey Soloviev Medal Lecture)

    Science.gov (United States)

    Zschau, J.

    2009-04-01

    Earthquake risk, like natural risks in general, has become a highly dynamic and globally interdependent phenomenon. Due to the "urban explosion" in the Third World, an increasingly complex cross-linking of critical infrastructure and lifelines in the industrial nations and a growing globalisation of the world's economies, we are presently facing a dramatic increase of our society's vulnerability to earthquakes in practically all seismic regions on our globe. Such fast and global changes cannot be captured with conventional earthquake risk models anymore. The sciences in this field are, therefore, asked to come up with new solutions that are no longer exclusively aiming at the best possible quantification of the present risks but also keep an eye on their changes with time and allow these to be projected into the future. This does not apply to the vulnerability component of earthquake risk alone, but also to its hazard component, which has been realized to be time-dependent, too. The challenges of earthquake risk dynamics and globalisation have recently been accepted by the Global Science Forum of the Organisation for Economic Co-operation and Development (OECD - GSF), who initiated the "Global Earthquake Model (GEM)", a public-private partnership for establishing an independent standard to calculate, monitor and communicate earthquake risk globally, raise awareness and promote mitigation.

  4. Earthquake hazard evaluation for Switzerland

    International Nuclear Information System (INIS)

    Ruettener, E.

    1995-01-01

    Earthquake hazard analysis is of considerable importance for Switzerland, a country with moderate seismic activity but high economic values at risk. The evaluation of earthquake hazard, i.e. the determination of return periods versus ground motion parameters, requires a description of earthquake occurrences in space and time. In this study the seismic hazard for major cities in Switzerland is determined. The seismic hazard analysis is based on historic earthquake records as well as instrumental data. The historic earthquake data show considerable uncertainties concerning epicenter location and epicentral intensity. A specific concept is required, therefore, which permits the description of the uncertainties of each individual earthquake. This is achieved by probability distributions for earthquake size and location. Historical considerations, which indicate changes in public earthquake awareness at various times (mainly due to large historical earthquakes), as well as statistical tests have been used to identify time periods of complete earthquake reporting as a function of intensity. As a result, the catalog is judged to be complete since 1878 for all earthquakes with epicentral intensities greater than IV, since 1750 for intensities greater than VI, since 1600 for intensities greater than VIII, and since 1300 for intensities greater than IX. Instrumental data provide accurate information about the depth distribution of earthquakes in Switzerland. In the Alps, focal depths are restricted to the uppermost 15 km of the crust, whereas below the northern Alpine foreland earthquakes are distributed throughout the entire crust (30 km). This depth distribution is considered in the final hazard analysis by probability distributions. (author) figs., tabs., refs

  5. Earthquake likelihood model testing

    Science.gov (United States)

    Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.

    2007-01-01

    INTRODUCTION: The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
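
    The consistency tests described above rest on the joint Poisson likelihood of the observed earthquake counts in the forecast bins. A minimal sketch with invented bins and rates (the full RELM tests add simulation-based significance levels and pairwise model comparisons, which are omitted here):

```python
import numpy as np
from math import lgamma

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of independent Poisson counts in forecast bins."""
    lam = np.asarray(forecast_rates, dtype=float)
    n = np.asarray(observed_counts, dtype=float)
    log_factorial = np.array([lgamma(k + 1.0) for k in n])
    return float(np.sum(-lam + n * np.log(lam) - log_factorial))

# Hypothetical space-magnitude bins: forecast rates (events per test period)
# and the counts actually observed in the catalogue.
forecast = [0.5, 1.2, 0.1, 2.0, 0.3]
observed = [1,   1,   0,   3,   0]
print("log-likelihood:", round(poisson_log_likelihood(forecast, observed), 3))
```

    Higher (less negative) log-likelihood means the observed catalogue is more consistent with the forecast; comparing this value against the distribution obtained from catalogues simulated under the forecast gives the consistency test.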

  6. Smartphone-Based Earthquake and Tsunami Early Warning in Chile

    Science.gov (United States)

    Brooks, B. A.; Baez, J. C.; Ericksen, T.; Barrientos, S. E.; Minson, S. E.; Duncan, C.; Guillemot, C.; Smith, D.; Boese, M.; Cochran, E. S.; Murray, J. R.; Langbein, J. O.; Glennie, C. L.; Dueitt, J.; Parra, H.

    2016-12-01

    Many locations around the world face high seismic hazard, but do not have the resources required to establish traditional earthquake and tsunami warning systems (E/TEW) that utilize scientific-grade seismological sensors. MEMS accelerometers and GPS chips embedded in, or added inexpensively to, smartphones are sensitive enough to provide robust E/TEW if they are deployed in sufficient numbers. We report on a pilot project in Chile, one of the most productive earthquake regions worldwide. There, magnitude 7.5+ earthquakes occurring roughly every 1.5 years, and larger tsunamigenic events, pose significant local and trans-Pacific hazard. The smartphone-based network described here is being deployed in parallel to the build-out of a scientific-grade network for E/TEW. Our sensor package comprises a smartphone with internal MEMS and an external GPS chipset that provides satellite-based augmented positioning and phase-smoothing. Each station is independent of local infrastructure: they are solar-powered and rely on cellular SIM cards for communications. An Android app performs initial onboard processing and transmits both accelerometer and GPS data to a server employing the FinDer-BEFORES algorithm to detect earthquakes, producing an acceleration-based line source model for smaller magnitude earthquakes or a joint seismic-geodetic finite-fault distributed slip model for sufficiently large magnitude earthquakes. Either source model provides accurate ground shaking forecasts, while distributed slip models for larger offshore earthquakes can be used to infer seafloor deformation for local tsunami warning. The network will comprise 50 stations by Sept. 2016 and 100 stations by Dec. 2016. Since Nov. 2015, batch processing has detected, located, and estimated the magnitude of Mw>5 earthquakes. Operational since June 2016, we have successfully detected two earthquakes > M5 (M5.5, M5.1) that occurred within 100 km of our network while producing zero false alarms.

  7. Magnitude Estimation for Large Earthquakes from Borehole Recordings

    Science.gov (United States)

    Eshaghi, A.; Tiampo, K. F.; Ghofrani, H.; Atkinson, G.

    2012-12-01

    We present a simple and fast magnitude determination technique for earthquake and tsunami early warning systems, based on strong ground motion prediction equations (GMPEs) in Japan. This method incorporates borehole strong motion records provided by the Kiban Kyoshin network (KiK-net) stations. We analyzed strong ground motion data from large magnitude earthquakes (5.0 ≤ M ≤ 8.1) with focal depths < 50 km and epicentral distances of up to 400 km from 1996 to 2010. Using both peak ground acceleration (PGA) and peak ground velocity (PGV) we derived GMPEs for Japan. These GMPEs are used as the basis for regional magnitude determination. Predicted magnitudes from PGA values (Mpga) and predicted magnitudes from PGV values (Mpgv) were defined. Mpga and Mpgv correlate strongly with the moment magnitude of the event, provided sufficient records for each event are available. The results show that Mpgv has a smaller standard deviation than Mpga when compared with the estimated magnitudes and provides a more accurate early assessment of earthquake magnitude. We test this new method by estimating the magnitude of the 2011 Tohoku earthquake and present the results of this estimation. PGA and PGV from borehole recordings allow us to estimate the magnitude of this event 156 s and 105 s after the earthquake onset, respectively. We demonstrate that the incorporation of borehole strong ground-motion records immediately available after the occurrence of large earthquakes significantly increases the accuracy of earthquake magnitude estimation, with an associated improvement in earthquake and tsunami early warning system performance. (Figure: moment magnitude versus predicted magnitude, Mpga and Mpgv.)
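
    The idea of turning a GMPE into a magnitude estimator can be sketched by inverting an assumed relation of the form log10(PGV) = c0 + c1*M + c2*log10(R) at each station and averaging the per-station values. The coefficients and the readings below are illustrative only, not the equations or data derived in the study:

```python
import numpy as np

# Assumed GMPE coefficients (illustrative, not the values fitted in the paper):
C0, C1, C2 = -1.5, 0.70, -1.30    # log10(PGV [cm/s]) = C0 + C1*M + C2*log10(R [km])

def magnitude_from_pgv(pgv_cm_s, distance_km):
    """Invert the assumed GMPE for magnitude at one station."""
    return (np.log10(pgv_cm_s) - C0 - C2 * np.log10(distance_km)) / C1

# Hypothetical PGV readings (cm/s) and epicentral distances (km) at four stations.
pgv = np.array([12.0, 5.5, 2.1, 0.9])
dist = np.array([40.0, 80.0, 150.0, 260.0])
estimates = magnitude_from_pgv(pgv, dist)
print("per-station estimates:", estimates.round(2))
print("event magnitude (Mpgv) ~", round(float(estimates.mean()), 2))
```

    In an early warning setting the average is updated as more stations report, which is how the magnitude of a large event can be refined within a couple of minutes of onset.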

  8. Scientific aspects of the Tohoku earthquake and Fukushima nuclear accident

    Science.gov (United States)

    Koketsu, Kazuki

    2016-04-01

    We investigated the 2011 Tohoku earthquake, the accident of the Fukushima Daiichi nuclear power plant, and assessments conducted beforehand for earthquake and tsunami potential in the Pacific offshore region of the Tohoku District. The results of our investigation show that all the assessments failed to foresee the earthquake and its related tsunami, which was the main cause of the accident. Therefore, the disaster caused by the earthquake, and the accident were scientifically unforeseeable at the time. However, for a zone neighboring the reactors, a 2008 assessment showed tsunamis higher than the plant height. As a lesson learned from the accident, companies operating nuclear power plants should be prepared using even such assessment results for neighboring zones.

  9. Identified EM Earthquake Precursors

    Science.gov (United States)

    Jones, Kenneth, II; Saxton, Patrick

    2014-05-01

    Many attempts have been made to determine a sound forecasting method regarding earthquakes and warn the public in turn. Presently, the animal kingdom leads the precursor list alluding to a transmission related source. By applying the animal-based model to an electromagnetic (EM) wave model, various hypotheses were formed, but the most interesting one required the use of a magnetometer with a differing design and geometry. To date, numerous, high-end magnetometers have been in use in close proximity to fault zones for potential earthquake forecasting; however, something is still amiss. The problem still resides with what exactly is forecastable and the investigating direction of EM. After a number of custom rock experiments, two hypotheses were formed which could answer the EM wave model. The first hypothesis concerned a sufficient and continuous electron movement either by surface or penetrative flow, and the second regarded a novel approach to radio transmission. Electron flow along fracture surfaces was determined to be inadequate in creating strong EM fields, because rock has a very high electrical resistance making it a high quality insulator. Penetrative flow could not be corroborated as well, because it was discovered that rock was absorbing and confining electrons to a very thin skin depth. Radio wave transmission and detection worked with every single test administered. This hypothesis was reviewed for propagating, long-wave generation with sufficient amplitude, and the capability of penetrating solid rock. Additionally, fracture spaces, either air or ion-filled, can facilitate this concept from great depths and allow for surficial detection. A few propagating precursor signals have been detected in the field occurring with associated phases using custom-built loop antennae. Field testing was conducted in Southern California from 2006-2011, and outside the NE Texas town of Timpson in February, 2013. The antennae have mobility and observations were noted for

  10. Geophysical Anomalies and Earthquake Prediction

    Science.gov (United States)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  11. Pain after earthquake

    Directory of Open Access Journals (Sweden)

    Angeletti Chiara

    2012-06-01

    Full Text Available Abstract Introduction On 6 April 2009, at 03:32 local time, an Mw 6.3 earthquake hit the Abruzzi region of central Italy, causing widespread damage in the city of L'Aquila and its nearby villages. The earthquake caused 308 casualties and over 1,500 injuries, displaced more than 25,000 people and induced significant damage to more than 10,000 buildings in the L'Aquila region. Objectives This observational retrospective study evaluated the prevalence and drug treatment of pain in the five weeks following the L'Aquila earthquake (April 6, 2009). Methods 958 triage documents were analysed for patients' pain severity, pain type, and treatment efficacy. Results About a third of patients reported pain (prevalence 34.6%). More than half of the pain patients reported severe pain (58.8%). Analgesic agents were limited to available drugs: anti-inflammatory agents, paracetamol, and weak opioids. Reduction in verbal numerical pain scores within the first 24 hours after treatment was achieved with the medications at hand. Pain prevalence and characterization exhibited a biphasic pattern, with acute pain syndromes owing to trauma occurring in the first 15 days after the earthquake; traumatic pain then decreased and re-surged at around week five, owing to rebuilding efforts. In the second through fourth week, reports of pain occurred mainly owing to relapses of chronic conditions. Conclusions This study indicates that pain is prevalent during natural disasters, may exhibit a discernible pattern over the weeks following the event, and current drug treatments in this region may be adequate for emergency situations.

  12. Fault lubrication during earthquakes.

    Science.gov (United States)

    Di Toro, G; Han, R; Hirose, T; De Paola, N; Nielsen, S; Mizoguchi, K; Ferri, F; Cocco, M; Shimamoto, T

    2011-03-24

    The determination of rock friction at seismic slip rates (about 1 m s(-1)) is of paramount importance in earthquake mechanics, as fault friction controls the stress drop, the mechanical work and the frictional heat generated during slip. Given the difficulty of determining friction by seismological methods, constraints are instead derived from experimental studies. Here we review a large set of published and unpublished experiments (∼300) performed in rotary shear apparatus at slip rates of 0.1-2.6 m s(-1). The experiments indicate a significant decrease in friction (of up to one order of magnitude), which we term fault lubrication, both for cohesive (silicate-built, quartz-built and carbonate-built) rocks and non-cohesive rocks (clay-rich, anhydrite, gypsum and dolomite gouges) typical of crustal seismogenic sources. The available mechanical work and the associated temperature rise in the slipping zone trigger a number of physicochemical processes (gelification, decarbonation and dehydration reactions, melting and so on) whose products are responsible for fault lubrication. The similarity between (1) experimental and natural fault products and (2) mechanical work measures resulting from these laboratory experiments and seismological estimates suggests that it is reasonable to extrapolate experimental data to conditions typical of earthquake nucleation depths (7-15 km). It seems that faults are lubricated during earthquakes, irrespective of the fault rock composition and of the specific weakening mechanism involved.

  13. Housing Damage Following Earthquake

    Science.gov (United States)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  14. Do Earthquakes Shake Stock Markets?

    Science.gov (United States)

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  15. Earthquake engineering for nuclear facilities

    CERN Document Server

    Kuno, Michiya

    2017-01-01

    This book is a comprehensive compilation of earthquake- and tsunami-related technologies and knowledge for the design and construction of nuclear facilities. As such, it covers a wide range of fields including civil engineering, architecture, geotechnical engineering, mechanical engineering, and nuclear engineering, for the development of new technologies providing greater resistance against earthquakes and tsunamis. It is crucial both for students of nuclear energy courses and for young engineers in nuclear power generation industries to understand the basics and principles of earthquake- and tsunami-resistant design of nuclear facilities. In Part I, "Seismic Design of Nuclear Power Plants", the design of nuclear power plants to withstand earthquakes and tsunamis is explained, focusing on buildings, equipment, and civil engineering structures. In Part II, "Basics of Earthquake Engineering", fundamental knowledge of earthquakes and tsunamis as well as the dynamic response of structures and foundation ground...

  16. Plant data evaluation of performance confirmation test in HTTR after Tohoku-Pacific Ocean Earthquake

    International Nuclear Information System (INIS)

    Ono, Masato; Tochio, Daisuke; Shinohara, Masanori; Shimazaki, Yosuke; Yanagi, Shunki; Iigaki, Kazuhiko

    2012-03-01

    The Tohoku-Pacific Ocean Earthquake occurred on March 11th, 2011, and an earthquake intensity of upper 5 on the Japanese scale was observed in Oarai town. The HTTR conducted a confirmation test in the cold state in order to ensure that the facilities and instruments of the reactor building operate normally. In this test, plant data during the start-up phase and the subsequent steady-operation phase of the facilities/instruments were measured and compared with previous operation data, and the soundness of the facilities/instruments was evaluated. As a result, after the earthquake the facilities/instruments operate normally and the reactor cooling function of the HTTR was ensured. (author)

  18. Assessment of earthquake-induced landslides hazard in El Salvador after the 2001 earthquakes using macroseismic analysis

    Science.gov (United States)

    Esposito, Eliana; Violante, Crescenzo; Giunta, Giuseppe; Ángel Hernández, Miguel

    2016-04-01

    Two strong earthquakes and a number of smaller aftershocks struck El Salvador in the year 2001. The January 13, 2001 earthquake, Mw 7.7, occurred along the Cocos plate, 40 km off El Salvador's southern coast. It resulted in about 1300 deaths and widespread damage, mainly due to massive landsliding. Two of the largest earthquake-induced landslides, Las Barioleras and Las Colinas (about 2×10⁵ m³), produced major damage to buildings and infrastructure and 500 fatalities. A neighborhood in Santa Tecla, west of San Salvador, was destroyed. The February 13, 2001 earthquake, Mw 6.5, occurred 40 km east-southeast of San Salvador. This earthquake caused over 300 fatalities and triggered several landslides over an area of 2,500 km², mostly in poorly consolidated volcaniclastic deposits. The La Leona landslide (5-7×10⁵ m³) caused 12 fatalities and extensive damage to the Panamerican Highway. Two very large landslides of 1.5 km³ and 12 km³ produced hazardous barrier lakes at Rio El Desague and Rio Jiboa, respectively. More than 16,000 landslides occurred throughout the country after both quakes; most of them occurred in pyroclastic deposits, with a volume less than 1×10³ m³. The present work aims to define the relationship between the above-described earthquake intensity, size and areal distribution of induced landslides, as well as to refine the earthquake intensity in sparsely populated zones by using landslide effects. Landslides triggered by the 2001 seismic sequences provided useful indications for a realistic seismic hazard assessment, providing a basis for understanding, evaluating, and mapping the hazard and risk associated with earthquake-induced landslides.

  19. Earthquake resistant design of structures

    International Nuclear Information System (INIS)

    Choi, Chang Geun; Kim, Gyu Seok; Lee, Dong Geun

    1990-02-01

    This book covers the occurrence of earthquakes and earthquake damage analysis, the equivalent static analysis method and its application, dynamic analysis methods such as time-history analysis by mode superposition and by direct integration, and design spectrum analysis in the context of earthquake-resistant design in Korea. Topics include the analysis model and vibration modes, calculation of base shear, calculation of story seismic loads, and combination of analysis results.
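    As a rough illustration of the equivalent static analysis steps listed above (base shear and story seismic loads), the following sketch implements a generic code-type lateral force distribution. The seismic coefficient Cs, the exponent k, and the numerical values are illustrative assumptions, not provisions of the Korean design code discussed in the book.

```python
# Hedged sketch: generic equivalent static (lateral force) procedure.
# Cs and k are illustrative placeholders, not values from any particular code.

def equivalent_static_forces(story_weights, story_heights, Cs=0.10, k=1.0):
    """Distribute the base shear V = Cs * W over the stories in proportion
    to w_i * h_i**k (a common code-type distribution)."""
    W = sum(story_weights)                      # total seismic weight
    V = Cs * W                                  # base shear
    wh = [w * h**k for w, h in zip(story_weights, story_heights)]
    story_forces = [V * x / sum(wh) for x in wh]
    return V, story_forces

if __name__ == "__main__":
    # three-story example: weights in kN, heights above base in m (illustrative)
    V, F = equivalent_static_forces([2000, 2000, 1500], [3.0, 6.0, 9.0])
    print(f"base shear = {V:.1f} kN")
    for i, f in enumerate(F, start=1):
        print(f"story {i} seismic load = {f:.1f} kN")
```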

  20. The severity of an earthquake

    Science.gov (United States)

    ,

    1997-01-01

    The severity of an earthquake can be expressed in terms of both intensity and magnitude. However, the two terms are quite different, and they are often confused. Intensity is based on the observed effects of ground shaking on people, buildings, and natural features. It varies from place to place within the disturbed region depending on the location of the observer with respect to the earthquake epicenter. Magnitude is related to the amount of seismic energy released at the hypocenter of the earthquake. It is based on the amplitude of the earthquake waves recorded on instruments
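    As a rough illustration of how magnitude relates to radiated seismic energy (in contrast to intensity, which describes observed effects), the sketch below evaluates the classical Gutenberg-Richter energy relation, log10 E[J] ≈ 1.5 M + 4.8. This relation is a standard approximation and is not taken from this record.

```python
# Hedged sketch: Gutenberg-Richter energy relation, log10(E[J]) ~ 1.5*M + 4.8.
# Illustrates why each unit of magnitude corresponds to roughly 32x more energy.

def radiated_energy_joules(magnitude):
    return 10 ** (1.5 * magnitude + 4.8)

if __name__ == "__main__":
    for m in (5.0, 6.0, 7.0):
        print(f"M{m:.1f}: ~{radiated_energy_joules(m):.2e} J")
    print(radiated_energy_joules(7.0) / radiated_energy_joules(6.0))  # ~31.6
```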

  1. Human casualties in earthquakes: Modelling and mitigation

    Science.gov (United States)

    Spence, R.J.S.; So, E.K.M.

    2011-01-01

    Earthquake risk modelling is needed for the planning of post-event emergency operations, for the development of insurance schemes, for the planning of mitigation measures in the existing building stock, and for the development of appropriate building regulations; in all of these applications estimates of casualty numbers are essential. But there are many questions about casualty estimation which are still poorly understood. These questions relate to the causes and nature of the injuries and deaths, and the extent to which they can be quantified. This paper looks at the evidence on these questions from recent studies. It then reviews casualty estimation models available, and finally compares the performance of some casualty models in making rapid post-event casualty estimates in recent earthquakes.

  2. Structural performance of the DOE's Idaho National Engineering Laboratory during the 1983 Borah Peak Earthquake

    International Nuclear Information System (INIS)

    Guenzler, R.C.; Gorman, V.W.

    1985-01-01

    The 1983 Borah Peak Earthquake (7.3 Richter magnitude) was the largest earthquake ever experienced by the DOE's Idaho National Engineering Laboratory (INEL). Reactor and plant facilities are generally located about 90 to 110 km (56 to 68 miles) from the epicenter. Several reactors were operating normally at the time of the earthquake. Based on detailed inspections, comparisons of measured accelerations with design levels, and instrumental seismograph information, it was concluded that the 1983 Borah Peak Earthquake created no safety problems for INEL reactors or other facilities. 10 references, 16 figures, 2 tables

  3. Modified mercalli intensities for nine earthquakes in central and western Washington between 1989 and 1999

    Science.gov (United States)

    Brocher, Thomas M.; Dewey, James W.; Cassidy, John F.

    2017-08-15

    We determine Modified Mercalli (Seismic) Intensities (MMI) for nine onshore earthquakes of magnitude 4.5 and larger that occurred in central and western Washington between 1989 and 1999, on the basis of effects reported in postal questionnaires, the press, and professional collaborators. The earthquakes studied include four earthquakes of M5 and larger: the M5.0 Deming earthquake of April 13, 1990, the M5.0 Point Robinson earthquake of January 29, 1995, the M5.4 Duvall earthquake of May 3, 1996, and the M5.8 Satsop earthquake of July 3, 1999. The MMI are assigned using data and procedures that evolved at the U.S. Geological Survey (USGS) and its Department of Commerce predecessors and that were used to assign MMI to felt earthquakes occurring in the United States between 1931 and 1986. We refer to the MMI assigned in this report as traditional MMI, because they are based on responses to postal questionnaires and on newspaper reports, and to distinguish them from MMI calculated from data contributed by the public by way of the internet. Maximum traditional MMI documented for the M5 and larger earthquakes are VII for the 1990 Deming earthquake, V for the 1995 Point Robinson earthquake, VI for the 1996 Duvall earthquake, and VII for the 1999 Satsop earthquake; the five other earthquakes were variously assigned maximum intensities of IV, V, or VI. Starting in 1995, the Pacific Northwest Seismic Network (PNSN) published MMI maps for four of the studied earthquakes, based on macroseismic observations submitted by the public by way of the internet. With the availability now of the traditional USGS MMI interpreted for all the sites from which USGS postal questionnaires were returned, the four Washington earthquakes join a rather small group of earthquakes for which both traditional USGS MMI and some type of internet-based MMI have been assigned. The values and distributions of the traditional MMI are broadly similar to the internet-based PNSN intensities; we discuss some

  4. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    Science.gov (United States)

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.

  5. Earthquake precursory events around epicenters and local active faults; the cases of two inland earthquakes in Iran

    Science.gov (United States)

    Valizadeh Alvan, H.; Mansor, S.; Haydari Azad, F.

    2012-12-01

    The possibility of earthquake prediction in the frame of several days to a few minutes before its occurrence has recently stirred interest among researchers. Scientists believe that the new theories and explanations of the mechanism of this natural phenomenon are trustworthy and can be the basis of future prediction efforts. During the last thirty years, experimental research has identified some pre-earthquake events which are now recognized as confirmed warning signs (precursors) of past known earthquakes. With the advances in in-situ measurement devices and data analysis capabilities and the emergence of satellite-based data collectors, monitoring the earth's surface is now routine work. Data providers are supplying researchers from all over the world with high-quality, validated imagery and non-imagery data. Surface Latent Heat Flux (SLHF), or the amount of energy exchange in the form of water vapor between the earth's surface and atmosphere, has been frequently reported as an earthquake precursor during the past years. The accumulated stress in the earth's crust during the preparation phase of earthquakes is said to be the main cause of temperature anomalies weeks to days before the main event and subsequent shakes. Chemical and physical interactions in the presence of underground water lead to higher water evaporation prior to inland earthquakes. On the other hand, the leak of radon gas as rocks break during earthquake preparation causes the formation of airborne ions and higher Air Temperature (AT) prior to the main event. Although co-analysis of direct and indirect observations of precursory events is considered a promising method for future successful earthquake prediction, without proper and thorough knowledge about the geological setting, atmospheric factors and geodynamics of the earthquake-prone regions we will not be able to identify anomalies due to seismic activity in the earth's crust. Active faulting is a key factor in identification of the

  6. Neoliberalism and criticisms of earthquake insurance arrangements in New Zealand.

    Science.gov (United States)

    Hay, I

    1996-03-01

    Global collapse of the Fordist-Keynesian regime of accumulation and an attendant philosophical shift in New Zealand politics to neoliberalism have prompted criticisms of, and changes to, the Earthquake and War Damage Commission. Earthquake insurance arrangements made 50 years ago in an era of collectivist, welfarist political action are now set in an environment in which emphasis is given to competitive relations and individualism. Six specific criticisms of the Commission are identified, each of which is founded in the rhetoric and ideology of a neoliberal political project which has underpinned radical social and economic changes in New Zealand since the early 1980s. On the basis of those criticisms, and in terms of the Earthquake Commission Act 1993, the Commission has been restructured. The new Commission is withdrawing from its primary position as the nation's non-residential property hazards insurer and is restricting its coverage of residential properties.

  7. Leveraging geodetic data to reduce losses from earthquakes

    Science.gov (United States)

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: The EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories. The USGS consistently provides timely, high quality geodetic data to stakeholders. Significant earthquakes are better characterized by incorporating geodetic data into USGS

  8. Generation of earthquake signals

    International Nuclear Information System (INIS)

    Kjell, G.

    1994-01-01

    Seismic verification can be performed either as a full-scale test on a shaker table or as numerical calculations. In both cases it is necessary to have an earthquake acceleration time history. This report describes generation of such time histories by filtering white noise. Analogue and digital filtering methods are compared. Different methods of predicting the response spectrum of a white noise signal filtered by a band-pass filter are discussed. Prediction of both the average response level and the statistical variation around this level is considered. Examples with both the IEEE 301 standard response spectrum and a ground spectrum suggested for Swedish nuclear power stations are included in the report
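    A minimal sketch of the digital approach described above, under assumed parameters: Gaussian white noise is shaped with a band-pass filter and a simple envelope to produce an artificial acceleration time history. The corner frequencies, duration and envelope are illustrative choices, not the IEEE 301 or Swedish spectrum parameters from the report.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Hedged sketch: band-pass filtered white noise as an artificial accelerogram.
# All numerical values are illustrative assumptions.

def simulated_accelerogram(duration=20.0, dt=0.005, f_lo=0.5, f_hi=25.0, seed=0):
    rng = np.random.default_rng(seed)
    n = int(duration / dt)
    noise = rng.standard_normal(n)

    # 4th-order Butterworth band-pass, applied forward-backward (zero phase)
    b, a = butter(4, [f_lo, f_hi], btype="bandpass", fs=1.0 / dt)
    acc = filtfilt(b, a, noise)

    # simple trapezoidal envelope: 2 s rise, strong-motion plateau, 5 s decay
    t = np.arange(n) * dt
    env = np.clip(t / 2.0, 0.0, 1.0) * np.clip((duration - t) / 5.0, 0.0, 1.0)
    acc *= env

    return t, acc / np.max(np.abs(acc))   # normalised to unit peak acceleration

t, acc = simulated_accelerogram()
print(len(acc), float(np.max(np.abs(acc))))
```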

  9. Earthquakes Threaten Many American Schools

    Science.gov (United States)

    Bailey, Nancy E.

    2010-01-01

    Millions of U.S. children attend schools that are not safe from earthquakes, even though they are in earthquake-prone zones. Several cities and states have worked to identify and repair unsafe buildings, but many others have done little or nothing to fix the problem. The reasons for ignoring the problem include political and financial ones, but…

  10. Make an Earthquake: Ground Shaking!

    Science.gov (United States)

    Savasci, Funda

    2011-01-01

    The main purposes of this activity are to help students explore possible factors affecting the extent of the damage of earthquakes and learn the ways to reduce earthquake damages. In these inquiry-based activities, students have opportunities to develop science process skills and to build an understanding of the relationship among science,…

  11. Future Earth: Reducing Loss By Automating Response to Earthquake Shaking

    Science.gov (United States)

    Allen, R. M.

    2014-12-01

    Earthquakes pose a significant threat to society in the U.S. and around the world. The risk is easily forgotten given the infrequent recurrence of major damaging events, yet the likelihood of a major earthquake in California in the next 30 years is greater than 99%. As our societal infrastructure becomes ever more interconnected, the potential impacts of these future events are difficult to predict. Yet, the same inter-connected infrastructure also allows us to rapidly detect earthquakes as they begin, and provide seconds, tens of seconds, or a few minutes of warning. A demonstration earthquake early warning system is now operating in California and is being expanded to the west coast (www.ShakeAlert.org). In recent earthquakes in the Los Angeles region, alerts were generated that could have provided warning to the vast majority of Los Angelenos who experienced the shaking. Efforts are underway to build a public system. Smartphone technology will be used not only to issue the alerts, but also to collect data and improve the warnings. The MyShake project at UC Berkeley is currently testing an app that attempts to turn millions of smartphones into earthquake detectors. As our development of the technology continues, we can anticipate ever-more automated response to earthquake alerts. Already, the BART system in the San Francisco Bay Area automatically stops trains based on the alerts. In the future, elevators will stop, machinery will pause, hazardous materials will be isolated, and self-driving cars will pull over to the side of the road. In this presentation we will review the current status of the earthquake early warning system in the US. We will illustrate how smartphones can contribute to the system. Finally, we will review applications of the information to reduce future losses.

  12. A Crowdsourcing-based Taiwan Scientific Earthquake Reporting System

    Science.gov (United States)

    Liang, W. T.; Lee, J. C.; Lee, C. F.

    2017-12-01

    To collect field observations immediately after any earthquake-induced ground damage, such as surface fault rupture, landslide, rock fall, liquefaction, and landslide-triggered dams or lakes, we are developing an earthquake damage reporting system which relies particularly on school teachers serving as volunteers after taking a series of training courses organized by this project. This Taiwan Scientific Earthquake Reporting (TSER) system is based on the Ushahidi mapping platform, which has been widely used for crowdsourcing for different purposes. Participants may add an app-like icon for mobile devices to this website at https://ies-tser.iis.sinica.edu.tw. Right after a potentially damaging earthquake occurs in the Taiwan area, trained volunteers will be notified/dispatched to the source area to carry out field surveys and to describe the ground damage through this system. If the internet is available, they may also upload relevant images from the field right away. This collected information will be shared with the public after a quick screening by the on-duty scientists. To prepare for the next strong earthquake, we set up a specific project on TSER for sharing spectacular/remarkable geologic features wherever possible. This is to help volunteers get used to this system and share any teachable material on this platform. This experimental, science-oriented crowdsourcing system was launched early this year. Together with a DYFI-like intensity reporting system, the Taiwan Quake-Catcher Network, and some online games and teaching materials, citizen seismology has improved considerably in Taiwan over the last decade. All these products are now either operated or promoted at the Taiwan Earthquake Research Center (TEC). With these newly developed platforms and materials, we aim not only to raise earthquake awareness and preparedness, but also to encourage public participation in earthquake science in Taiwan.

  13. Earthquake Catalogue of the Caucasus

    Science.gov (United States)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  14. Quantitative prediction of strong motion for a potential earthquake fault

    Directory of Open Access Journals (Sweden)

    Shamita Das

    2010-02-01

    Full Text Available This paper describes a new method for calculating strong motion records for a given seismic region on the basis of the laws of physics, using information on the tectonics and physical properties of the earthquake fault. Our method is based on an earthquake model, called a «barrier model», which is characterized by five source parameters: fault length, width, maximum slip, rupture velocity, and barrier interval. The first three parameters may be constrained from plate tectonics, and the fourth parameter is roughly a constant. The most important parameter controlling the earthquake strong motion is the last parameter, the «barrier interval». There are three methods to estimate the barrier interval for a given seismic region: (1) surface measurement of slip across fault breaks, (2) model fitting with observed near- and far-field seismograms, and (3) scaling law data for small earthquakes in the region. The barrier intervals were estimated for a dozen earthquakes and four seismic regions by the above three methods. Our preliminary results for California suggest that the barrier interval may be determined if the maximum slip is given. The relation between the barrier interval and maximum slip varies from one seismic region to another. For example, the interval appears to be unusually long for Kilauea, Hawaii, which may explain why only scattered evidence of strong ground shaking was observed in the epicentral area of the Island of Hawaii earthquake of November 29, 1975. The stress drop associated with an individual fault segment estimated from the barrier interval and maximum slip lies between 100 and 1000 bars. These values are about one order of magnitude greater than those estimated earlier by the use of crack models without barriers. Thus, the barrier model can resolve, at least partially, the well known discrepancy between the stress drops measured in the laboratory and those estimated for earthquakes.
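    As an order-of-magnitude check on the stress-drop range quoted above, the sketch below applies a circular-crack relation, taking the segment radius as half the barrier interval. This choice of relation and radius is an illustrative assumption, not the exact formulation of the barrier model.

```python
import math

# Hedged sketch: rough per-segment stress drop from maximum slip and barrier
# interval via a circular-crack relation (segment radius = barrier interval / 2).
# Rigidity and the example numbers are illustrative assumptions.

def segment_stress_drop(max_slip_m, barrier_interval_m, rigidity_pa=3.0e10):
    radius = barrier_interval_m / 2.0
    return (7.0 * math.pi / 16.0) * rigidity_pa * max_slip_m / radius  # in Pa

if __name__ == "__main__":
    dsig = segment_stress_drop(max_slip_m=1.0, barrier_interval_m=5000.0)
    print(f"stress drop ~ {dsig / 1e5:.0f} bar")   # 1 bar = 1e5 Pa; ~165 bar here
```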

  15. Testing earthquake source inversion methodologies

    KAUST Repository

    Page, Morgan T.

    2011-01-01

    Source Inversion Validation Workshop; Palm Springs, California, 11-12 September 2010; Nowadays earthquake source inversions are routinely performed after large earthquakes and represent a key connection between recorded seismic and geodetic data and the complex rupture process at depth. The resulting earthquake source models quantify the spatiotemporal evolution of ruptures. They are also used to provide a rapid assessment of the severity of an earthquake and to estimate losses. However, because of uncertainties in the data, assumed fault geometry and velocity structure, and chosen rupture parameterization, it is not clear which features of these source models are robust. Improved understanding of the uncertainty and reliability of earthquake source inversions will allow the scientific community to use the robust features of kinematic inversions to more thoroughly investigate the complexity of the rupture process and to better constrain other earthquake-related computations, such as ground motion simulations and static stress change calculations.

  16. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    Science.gov (United States)

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
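    A minimal sketch, under assumed parameters, of the kind of server-side trigger association described above: an event is declared when enough distinct hosts trigger within a short time window, and a crude location is taken from the trigger cluster. The window length, host count, and the Trigger fields are illustrative, not QCN's operational values or data format.

```python
from dataclasses import dataclass
from statistics import median

# Hedged sketch: associate incoming host triggers into a single detection.
# Thresholds and fields are illustrative assumptions, not QCN's actual design.

@dataclass
class Trigger:
    t: float      # trigger time in seconds
    lat: float
    lon: float
    host: str     # identifier of the volunteer host computer

def associate(triggers, window_s=10.0, min_hosts=5):
    triggers = sorted(triggers, key=lambda tr: tr.t)
    for i, first in enumerate(triggers):
        cluster = [tr for tr in triggers[i:] if tr.t - first.t <= window_s]
        if len({tr.host for tr in cluster}) >= min_hosts:
            return {
                "time": first.t,
                "lat": median(tr.lat for tr in cluster),
                "lon": median(tr.lon for tr in cluster),
                "n_triggers": len(cluster),
            }
    return None

reports = [Trigger(10.0 + 0.4 * i, -43.5, 172.6, f"host{i}") for i in range(6)]
print(associate(reports))
```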

  17. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    Science.gov (United States)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high-quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award-winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging Service (MMS), RICHTER allows access to high-definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are typically compressed to 20-30 KB of data for fast transfer and to avoid network overload. Full-size images can be requested by the EMSC either fully automatically or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake, where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. The EMSC is the second

  18. The CATDAT damaging earthquakes database

    Directory of Open Access Journals (Sweden)

    J. E. Daniell

    2011-08-01

    Full Text Available The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes.

    Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon.

    Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured).

    Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto ($214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>$300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons.

    This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global

  19. The CATDAT damaging earthquakes database

    Science.gov (United States)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. Lack of consistency and errors in other earthquake loss databases frequently cited and used in analyses was a major shortcoming in the view of the authors which needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. The 1923 Great Kanto (214 billion USD damage; 2011 HNDECI-adjusted dollars) compared to the 2011 Tohoku (>300 billion USD at time of writing), 2008 Sichuan and 1995 Kobe earthquakes show the increasing concern for economic loss in urban areas as the trend should be expected to increase. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  20. Effects of the northern Ohio earthquake on the Perry nuclear power plant

    International Nuclear Information System (INIS)

    Stevenson, J.D.

    1987-01-01

    On January 31, 1986, at 11:47 A.M. EST, a shallow (10 km focal depth) earthquake of Richter magnitude 5.0 with a brief strong-motion duration occurred. Its epicenter was located near Leroy, Ohio, south of Lake Erie, approximately ten (10) miles from the Perry Nuclear Power Plant site at Perry, Ohio. The potential safety significance of the Leroy 1986 earthquake is that it produced a recorded component of earthquake motion with a zero-period acceleration approximately equal to the 0.15g zero-period ground acceleration defined as the Safe Shutdown Earthquake for the site. The Leroy 1986 earthquake is the first recorded instance in the U.S. of a nuclear power plant being subjected to some level of OBE exceedance. In general, the short-duration, high-frequency, non-damaging character of the Leroy 1986 earthquake cannot be equated directly, on the basis of peak ground acceleration alone, with the longer-duration, lower-frequency content of earthquakes which are expected to do structural damage. However, all the available evidence suggests that the Leroy 1986 earthquake is not atypical of the earthquake activity to be expected in the eastern U.S. with 1-10 year return periods. On this basis, it is essential that new methods be developed which properly characterize the damage potential of these types of earthquakes, rather than simply using recorded peak acceleration as the basis for nuclear plant shutdown and potentially lengthy examination

  1. Earthquake Emergency Education in Dushanbe, Tajikistan

    Science.gov (United States)

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  2. Radon observation for earthquake prediction

    Energy Technology Data Exchange (ETDEWEB)

    Wakita, Hiroshi [Tokyo Univ. (Japan)]

    1998-12-31

    Systematic observation of groundwater radon for the purpose of earthquake prediction began in Japan in late 1973. Continuous observations are conducted at fixed stations using deep wells and springs. During the observation period, significant precursory changes, including those before the 1978 Izu-Oshima-kinkai (M7.0) earthquake, as well as numerous coseismic changes were observed. At the time of the 1995 Kobe (M7.2) earthquake, significant changes in chemical components, including radon dissolved in groundwater, were observed near the epicentral region. Precursory changes are presumably caused by permeability changes due to micro-fracturing in basement rock or migration of water from different sources during the preparation stage of earthquakes. Coseismic changes may be caused by seismic shaking and by changes in regional stress. Significant drops of radon concentration in groundwater have been observed after earthquakes at the KSM site. The occurrence of such drops appears to be time-dependent, and possibly reflects changes in the regional stress state of the observation area. The absence of radon drops seems to be correlated with periods of reduced regional seismic activity. Experience accumulated over the past two decades allows us to reach some conclusions: 1) changes in groundwater radon do occur prior to large earthquakes; 2) some sites are particularly sensitive to earthquake occurrence; and 3) the sensitivity changes over time. (author)

  3. Earthquake prediction by Kina Method

    International Nuclear Information System (INIS)

    Kianoosh, H.; Keypour, H.; Naderzadeh, A.; Motlagh, H.F.

    2005-01-01

    Earthquake prediction has been one of the earliest desires of man. Scientists have worked hard to predict earthquakes for a long time. The results of these efforts can generally be divided into two methods of prediction: 1) Statistical Method, and 2) Empirical Method. In the first method, earthquakes are predicted using statistics and probabilities, while the second method utilizes a variety of precursors for earthquake prediction. The latter method is time consuming and more costly. However, the results of neither method have fully satisfied mankind up to now. In this paper a new method entitled the 'Kiana Method' is introduced for earthquake prediction. This method offers more accurate results at lower cost compared to other conventional methods. In the Kiana method the electrical and magnetic precursors are measured in an area. Then the time and magnitude of a future earthquake are calculated using electrical formulas, in particular those for capacitors. In this method, daily measurement of electrical resistance in an area makes clear whether or not the area is capable of producing an earthquake in the future. If the result shows a positive sign, then the occurrence time and the magnitude can be estimated from the measured quantities. This paper explains the procedure and details of this prediction method. (authors)

  4. Precisely locating the Klamath Falls, Oregon, earthquakes

    Science.gov (United States)

    Qamar, A.; Meagher, K.L.

    1993-01-01

    The Klamath Falls earthquakes on September 20, 1993, were the largest earthquakes centered in Oregon in more than 50 yrs. Only the magnitude 5.75 Milton-Freewater earthquake in 1936, which was centered near the Oregon-Washington border and felt in an area of about 190,000 sq km, compares in size with the recent Klamath Falls earthquakes. Although the 1993 earthquakes surprised many local residents, geologists have long recognized that strong earthquakes may occur along potentially active faults that pass through the Klamath Falls area. These faults are geologically related to similar faults in Oregon, Idaho, and Nevada that occasionally spawn strong earthquakes

  5. Ionospheric phenomena before strong earthquakes

    Directory of Open Access Journals (Sweden)

    A. S. Silina

    2001-01-01

    Full Text Available A statistical analysis of several ionospheric parameters before earthquakes with magnitude M > 5.5 located less than 500 km from an ionospheric vertical sounding station is performed. Ionospheric effects preceding "deep" (depth h > 33 km) and "crust" (h ≤ 33 km) earthquakes were analysed separately. Data of nighttime measurements of the critical frequencies foF2 and foEs, the frequency fbEs and Es-spread at the middle latitude station Dushanbe were used. The frequencies foF2 and fbEs are proportional to the square root of the ionization density at heights of 300 km and 100 km, respectively. It is shown that two days before the earthquakes the values of foF2 averaged over the morning hours (00:00 LT–06:00 LT) and of fbEs averaged over the nighttime hours (18:00 LT–06:00 LT) decrease; the effect is stronger for the "deep" earthquakes. Analysing the coefficient of semitransparency which characterizes the degree of small-scale turbulence, it was shown that this value increases 1–4 days before "crust" earthquakes, and it does not change before "deep" earthquakes. Studying Es-spread which manifests itself as diffuse Es track on ionograms and characterizes the degree of large-scale turbulence, it was found that the number of Es-spread observations increases 1–3 days before the earthquakes; for "deep" earthquakes the effect is more intensive. Thus it may be concluded that different mechanisms of energy transfer from the region of earthquake preparation to the ionosphere occur for "deep" and "crust" events.

  6. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    Science.gov (United States)

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.
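    A simplified, hypothetical stand-in for the on-phone screening step described above (the actual MyShake classifier is more sophisticated): a window is flagged as earthquake-like when the shaking is strong but dominated by relatively low frequencies, which helps reject footsteps and hand motion. The feature choices and thresholds are illustrative assumptions.

```python
import numpy as np

# Hedged sketch: crude amplitude + dominant-frequency screen for phone
# accelerometer windows.  Thresholds are illustrative assumptions.

def looks_like_earthquake(acc, fs, amp_threshold_g=0.02, freq_threshold_hz=5.0):
    """acc: 1-D acceleration window in g, fs: sampling rate in Hz."""
    acc = acc - np.mean(acc)
    if np.percentile(np.abs(acc), 98) < amp_threshold_g:
        return False                                  # too weak to matter
    spec = np.abs(np.fft.rfft(acc)) ** 2
    freqs = np.fft.rfftfreq(acc.size, d=1.0 / fs)
    dominant = freqs[np.argmax(spec[1:]) + 1]         # skip the DC bin
    return dominant < freq_threshold_hz               # quakes are low-frequency rich

fs = 50.0
t = np.arange(0, 5, 1 / fs)
quake_like = 0.05 * np.sin(2 * np.pi * 1.5 * t)       # strong 1.5 Hz shaking
hand_shake = 0.05 * np.sin(2 * np.pi * 12.0 * t)      # strong 12 Hz shaking
print(looks_like_earthquake(quake_like, fs), looks_like_earthquake(hand_shake, fs))
```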

  7. Latitude-Time Total Electron Content Anomalies as Precursors to Japan's Large Earthquakes Associated with Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Jyh-Woei Lin

    2011-01-01

    Full Text Available The goal of this study is to determine whether principal component analysis (PCA) can be used to process latitude-time ionospheric TEC data on a monthly basis to identify earthquake-associated TEC anomalies. PCA is applied to latitude-time (mean-of-a-month) ionospheric total electron content (TEC) records collected from the Japan GEONET network to detect TEC anomalies associated with 18 earthquakes in Japan (M≥6.0) from 2000 to 2005. According to the results, PCA was able to discriminate clear TEC anomalies in the months when all 18 earthquakes occurred. After reviewing months when no M≥6.0 earthquakes occurred but geomagnetic storm activity was present, it is possible that the maximal principal eigenvalues PCA returned for these 18 earthquakes indicate earthquake-associated TEC anomalies. Previously PCA has been used to discriminate earthquake-associated TEC anomalies recognized by other researchers, who found that a statistical association between large earthquakes and TEC anomalies could be established in the 5 days before earthquake nucleation; however, since PCA uses the characteristics of principal eigenvalues to determine earthquake-related TEC anomalies, it is possible to show that such anomalies existed earlier than this 5-day statistical window.
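    A minimal sketch of the kind of processing described above, under assumed normalisation: one month of latitude-time TEC values is arranged into a matrix, PCA is applied, and the dominance of the largest principal eigenvalue is examined. The synthetic test data and the anomaly criterion are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

# Hedged sketch: PCA on a latitude-by-time TEC matrix; a localised TEC
# enhancement concentrates variance into the first principal component.

def principal_eigenvalue_fraction(tec):
    """tec: 2-D array, rows = latitude bins, columns = time samples."""
    x = tec - tec.mean(axis=1, keepdims=True)   # remove each latitude's mean
    cov = np.cov(x)                             # latitude-by-latitude covariance
    eigvals = np.linalg.eigvalsh(cov)           # ascending order
    return eigvals[-1] / eigvals.sum()          # variance fraction in PC1

rng = np.random.default_rng(1)
quiet_month = rng.normal(20.0, 1.0, size=(40, 720))    # 40 latitude bins, 720 samples
anomalous_month = quiet_month.copy()
anomalous_month[15:25, 300:360] += 8.0                 # localised TEC enhancement
print(principal_eigenvalue_fraction(quiet_month))      # ~1/40: no dominant PC
print(principal_eigenvalue_fraction(anomalous_month))  # much larger: candidate anomaly
```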

  8. A study on generation of simulated earthquake ground motion for seismic design of nuclear power plant

    International Nuclear Information System (INIS)

    Ichiki, Tadaharu; Matsumoto, Takuji; Kitada, Yoshio; Osaki, Yorihiko; Kanda, Jun; Masao, Toru.

    1985-01-01

    The aseismatic design of nuclear power generation facilities carried out in Japan at present must conform to the ''Guideline for aseismatic design examination regarding power reactor facilities'' decided by the Atomic Energy Commission in 1978. In this guideline, the earthquake motion used for the analysis of dynamic earthquake response is to be given in the form of a magnitude determined on the basis of the investigation of historical earthquakes and active faults around construction sites, and of response spectra corresponding to the distance from epicenters. Accordingly, when the analysis of dynamic earthquake response is actually carried out, simulated earthquake motion generated in conformity with these specified response spectra is used as the design input earthquake motion. This research was carried out to establish techniques for generating simulated earthquake motion that are more appropriate and rational from an engineering viewpoint, and the results are summarized in this paper. The techniques for generating simulated earthquake motion, the response of buildings, and the floor response spectra are described. (Kako, I.)
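    A minimal sketch of one step implied above: computing the pseudo-acceleration response spectrum of a candidate simulated motion so that it can be compared with the specified design spectrum. The Newmark average-acceleration integration and 5% damping are standard but illustrative choices, not details taken from the paper.

```python
import numpy as np

# Hedged sketch: pseudo-acceleration response spectrum of a ground acceleration
# record via Newmark average-acceleration integration of SDOF oscillators.

def response_spectrum(ag, dt, periods, damping=0.05):
    gamma, beta = 0.5, 0.25
    sa = []
    for T in periods:
        wn = 2.0 * np.pi / T
        k, c, m = wn**2, 2.0 * damping * wn, 1.0
        keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
        a1 = m / (beta * dt) + gamma / beta * c
        a2 = m / (2.0 * beta) + dt * (gamma / (2.0 * beta) - 1.0) * c
        u = v = 0.0
        a = -ag[0]                      # initial relative acceleration (u = v = 0)
        umax = 0.0
        for i in range(len(ag) - 1):
            dp = -(ag[i + 1] - ag[i]) + a1 * v + a2 * a
            du = dp / keff
            dv = gamma / (beta * dt) * du - gamma / beta * v + dt * (1.0 - gamma / (2.0 * beta)) * a
            da = du / (beta * dt**2) - v / (beta * dt) - a / (2.0 * beta)
            u, v, a = u + du, v + dv, a + da
            umax = max(umax, abs(u))
        sa.append(wn**2 * umax)         # pseudo-spectral acceleration
    return np.array(sa)

dt = 0.01
ag = np.random.default_rng(0).standard_normal(2000) * 0.05   # stand-in accelerogram
print(response_spectrum(ag, dt, np.linspace(0.05, 2.0, 10))[:5])
```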

  9. Earthquake rupture below the brittle-ductile transition in continental lithospheric mantle.

    Science.gov (United States)

    Prieto, Germán A; Froment, Bérénice; Yu, Chunquan; Poli, Piero; Abercrombie, Rachel

    2017-03-01

    Earthquakes deep in the continental lithosphere are rare and hard to interpret in our current understanding of temperature control on brittle failure. The recent lithospheric mantle earthquake with a moment magnitude of 4.8 at a depth of ~75 km in the Wyoming Craton was exceptionally well recorded and thus enabled us to probe the cause of these unusual earthquakes. On the basis of complete earthquake energy balance estimates using broadband waveforms and temperature estimates using surface heat flow and shear wave velocities, we argue that this earthquake occurred in response to ductile deformation at temperatures above 750°C. The high stress drop, low rupture velocity, and low radiation efficiency are all consistent with a dissipative mechanism. Our results imply that earthquake nucleation in the lithospheric mantle is not exclusively limited to the brittle regime; weakening mechanisms in the ductile regime can allow earthquakes to initiate and propagate. This finding has significant implications for understanding deep earthquake rupture mechanics and rheology of the continental lithosphere.

  10. The Pocatello Valley, Idaho, earthquake

    Science.gov (United States)

    Rogers, A. M.; Langer, C.J.; Bucknam, R.C.

    1975-01-01

    A Richter magnitude 6.3 earthquake occurred at 8:31 p.m. mountain daylight time on March 27, 1975, near the Utah-Idaho border in Pocatello Valley. The epicenter of the main shock was located at 42.094° N, 112.478° W, and had a focal depth of 5.5 km. This earthquake was the largest in the continental United States since the destructive San Fernando earthquake of February 1971. The main shock was preceded by a magnitude 4.5 foreshock on March 26.

  11. The threat of silent earthquakes

    Science.gov (United States)

    Cervelli, Peter

    2004-01-01

    Not all earthquakes shake the ground. The so-called silent types are forcing scientists to rethink their understanding of the way quake-prone faults behave. In rare instances, silent earthquakes that occur along the flanks of seaside volcanoes may cascade into monstrous landslides that crash into the sea and trigger towering tsunamis. Silent earthquakes that take place within fault zones created by one tectonic plate diving under another may increase the chance of ground-shaking shocks. In other locations, however, silent slip may decrease the likelihood of destructive quakes, because it releases stress along faults that might otherwise seem ready to snap.

  12. USGS Earthquake Program GPS Use Case : Earthquake Early Warning

    Science.gov (United States)

    2015-03-12

    USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...

  13. EARTHQUAKE-INDUCED DEFORMATION STRUCTURES AND RELATED TO EARTHQUAKE MAGNITUDES

    Directory of Open Access Journals (Sweden)

    Savaş TOPAL

    2003-02-01

    Full Text Available Earthquake-induced deformation structures, which are called seismites, may be helpful in classifying the paleoseismic history of a location and in estimating the magnitudes of potential future earthquakes. In this paper, seismites were investigated according to the types formed in deep and shallow lake sediments. Seismites are observed in the form of sand dikes, intruded and fractured gravels and pillow structures in shallow lake sediments, and pseudonodules, mushroom-like silts protruding into laminites, mixed layers, disturbed varved lamination and loop bedding in deep lake sediments. Drawing on previous studies, the earthquake-induced deformation structures were ordered according to their mode of formation and the corresponding earthquake magnitudes. In this ordering, the structure recording the lowest earthquake magnitude is loop bedding and the highest is intruded and fractured gravels in lacustrine deposits.

  14. OSR encapsulation basis -- 100-KW

    International Nuclear Information System (INIS)

    Meichle, R.H.

    1995-01-01

    The purpose of this report is to provide the basis for a change in the Operations Safety Requirement (OSR) encapsulated fuel storage requirements in the 105 KW fuel storage basin which will permit the handling and storage of encapsulated fuel in canisters which no longer have a water-free space in the top of the canister. The scope of this report is limited to providing the change from the perspective of the safety envelope (bases) of the Safety Analysis Report (SAR) and Operations Safety Requirements (OSR). It does not change the encapsulation process itself.

  15. Twitter earthquake detection: Earthquake monitoring in a social world

    Science.gov (United States)

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
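
    A minimal sketch of a short-term-average/long-term-average (STA/LTA) detector applied to a per-minute tweet-count series, in the spirit of the procedure described above; the window lengths, threshold, and synthetic series are illustrative, not the USGS operating values.

```python
import numpy as np

def sta_lta_detections(tweet_counts, sta_win=2, lta_win=60, threshold=8.0):
    """Flag minutes where the short-term average of tweet counts exceeds
    `threshold` times the long-term average.

    tweet_counts: 1-D array of per-minute counts of tweets containing "earthquake".
    Window lengths and threshold here are illustrative placeholders.
    """
    counts = np.asarray(tweet_counts, dtype=float)
    detections = []
    for i in range(lta_win, len(counts)):
        sta = counts[i - sta_win:i].mean()
        lta = counts[i - lta_win:i].mean()
        if lta > 0 and sta / lta > threshold:
            detections.append(i)
    return detections

# Illustrative run: quiet background with a burst of tweets around minute 300.
rng = np.random.default_rng(1)
series = rng.poisson(2, size=600)
series[300:305] += 80
print(sta_lta_detections(series))
```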

  16. Extreme value statistics and thermodynamics of earthquakes. Large earthquakes

    Energy Technology Data Exchange (ETDEWEB)

    Lavenda, B. [Camerino Univ., Camerino, MC (Italy); Cipollone, E. [ENEA, Centro Ricerche Casaccia, S. Maria di Galeria, RM (Italy). National Centre for Research on Thermodynamics

    2000-06-01

    A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Frechet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into non-scaling exponential distributions, so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Frechet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same catalogue of Chinese earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Frechet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  17. The key role of eyewitnesses in rapid earthquake impact assessment

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

    Uncertainties in rapid earthquake impact models are intrinsically large, even when excluding potential indirect losses (fires, landslides, tsunami…). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, the building type inventory and the vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty share a comparable dimension of about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 out of 30, 84 out of 176 and 115 out of 185 of the casualties perished in a single building failure. In contrast, for major earthquakes (M>7), the point source approximation is not valid anymore, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral, bilateral etc.… and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed into maps of effects. Geo-located pictures are collected and then

  18. Automatic Earthquake Shear Stress Measurement Method Developed for Accurate Time- Prediction Analysis of Forthcoming Major Earthquakes Along Shallow Active Faults

    Science.gov (United States)

    Serata, S.

    2006-12-01

    basis to disclose an acting earthquake shear stress S at the top of the tectonic plate is established at a depth of 600-800 m (Window). This concept is supported by the outcome of the Japanese government stress measurement made at the epicenter of the Kobe earthquake of 1995, where S was found to be less than 5 MPa. At the same time, S at the earthquake-active Ashio mining district was found to be 36 MPa (90 percent of the maximum S) at the Window. These findings led to the formulation of a quantitative method for monitoring earthquake triggering potential in and around any growing earthquake stress nucleus along shallow active faults. For future earthquake time prediction, the Stressmeter can be applied first to survey the general distribution of earthquake shear stress S along major active faults. A site with a shear stress greater than 30 MPa may be identified as a site of a growing stress nucleus. A Stressmeter must then be permanently buried at the site to monitor future stress growth toward a possible triggering, through mathematical analysis of the stress excursion dynamics. This is made possible by the automatic stress measurement capability of the Stressmeter, at a frequency of up to 100 times per day. The significance of this approach is the possibility of saving lives by time-prediction of a forthcoming major earthquake with an accuracy of hours and minutes.

  19. Importance of weak minerals on earthquake mechanics

    Science.gov (United States)

    Kaneki, S.; Hirono, T.

    2017-12-01

    The role of weak minerals such as smectite and talc in earthquake mechanics is one of the important issues and has been debated for the past several decades. Traditionally, weak minerals in faults have been reported to weaken fault strength owing to their low frictional resistance. Furthermore, the velocity-strengthening behavior of such a weak mineral (talc) is considered to be responsible for fault creep (aseismic slip) in the San Andreas fault. In contrast, recent studies reported that the large amount of weak smectite in the Japan Trench could have facilitated the gigantic seismic slip during the 2011 Tohoku-oki earthquake. To investigate the role of weak minerals in the rupture propagation process and the magnitude of slip, we focus on the frictional properties of carbonaceous materials (CMs), which are representative weak materials widely distributed in and around convergent boundaries. Field observation and geochemical analyses revealed that a graphitized CM layer is distributed along the slip surface of a fossil plate-subduction fault. Laboratory friction experiments demonstrated that pure quartz, bulk mixtures with bituminous coal (1 wt.%), and quartz with layered coal samples exhibited almost identical frictional properties (initial, yield, and dynamic friction). However, mixtures of quartz (99 wt.%) and layered graphite (1 wt.%) showed significantly lower initial and yield friction coefficients (0.31 and 0.50, respectively). Furthermore, the stress ratio S, defined as (yield stress - initial stress)/(initial stress - dynamic stress), increased in the layered graphite samples (1.97) compared to the quartz samples (0.14). A similar trend was observed in smectite-rich fault gouge. By referring to the reported results of dynamic rupture propagation simulation using S ratios of 1.4 (a typical value for the Japan Trench) and 2.0 (this study), we confirmed that a higher S ratio results in an approximately 20 % smaller slip distance. On the basis of these results, we could conclude that weak minerals have lower
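
    A small worked example of the stress ratio S defined in the abstract; the dynamic friction value below is an assumed placeholder, since the abstract quotes only the initial and yield coefficients for the graphite-bearing mixture.

```python
def stress_ratio(initial, yield_, dynamic):
    """S = (yield - initial) / (initial - dynamic), as defined in the abstract.
    At constant normal stress, friction coefficients can stand in for stresses."""
    return (yield_ - initial) / (initial - dynamic)

# Initial 0.31 and yield 0.50 are the quoted values for the layered-graphite
# mixture; dynamic friction of 0.21 is an assumed placeholder for illustration.
print(stress_ratio(0.31, 0.50, 0.21))   # ~1.9, close to the S of 1.97 quoted above
```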

  20. Regional dependence in earthquake early warning and real time seismology

    International Nuclear Information System (INIS)

    Caprio, M.

    2013-01-01

    An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is to provide the maximum available information as soon as possible. This can help in reducing the impact of the quake on the population and in better organizing rescue operations in the case of post-event actions. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in Real-Time Seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient EEW system can inform numerous people and thus potentially reduce the economic and human losses by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, usually, in the first minutes to days after the main shock, the common communication channels can be overloaded or broken. In such cases, a precise knowledge of the macroseismic intensity distribution represents a decisive contribution to the management of relief efforts and to the evaluation of losses. In this work, I focused on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, in chapter 1 we present an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve based on the part of the spectra constrained by available data. Our method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection and potentially stabilize within 10 seconds of the initial earthquake detection

  1. Regional dependence in earthquake early warning and real time seismology

    Energy Technology Data Exchange (ETDEWEB)

    Caprio, M.

    2013-07-01

    An effective earthquake prediction method is still a chimera. What we can do at the moment, after the occurrence of a seismic event, is to provide the maximum available information as soon as possible. This can help in reducing the impact of the quake on the population and in better organizing rescue operations in the case of post-event actions. This study strives to improve the evaluation of earthquake parameters shortly after the occurrence of a major earthquake, and the characterization of regional dependencies in Real-Time Seismology. The recent earthquake experience from Tohoku (M 9.0, 11.03.2011) showed how an efficient EEW system can inform numerous people and thus potentially reduce the economic and human losses by distributing warning messages several seconds before the arrival of seismic waves. In the case of devastating earthquakes, usually, in the first minutes to days after the main shock, the common communication channels can be overloaded or broken. In such cases, a precise knowledge of the macroseismic intensity distribution represents a decisive contribution to the management of relief efforts and to the evaluation of losses. In this work, I focused on improving the adaptability of EEW systems (chapters 1 and 2) and on deriving a global relationship for converting peak ground motion into macroseismic intensity and vice versa (chapter 3). For EEW applications, in chapter 1 we present an evolutionary approach to magnitude estimation for earthquake early warning based on real-time inversion of displacement spectra. The Spectrum Inversion (SI) method estimates magnitude and its uncertainty by inferring the shape of the entire displacement spectral curve based on the part of the spectra constrained by available data. Our method can be applied in any region without the need for calibration. SI magnitude and uncertainty estimates are updated each second following the initial P detection and potentially stabilize within 10 seconds of the initial earthquake detection

  2. Guide to post-earthquake investigation of lifelines

    International Nuclear Information System (INIS)

    Schiff, A.J.

    1991-01-01

    This book contains the proceedings of the Guide to Post-Earthquake Investigation of Lifelines. Topics covered include: the types of facilities and equipment that were damaged and a count of each type; the information needed to advance the state of the art of seismic design of facilities and equipment; failure modes and the factors contributing to them; the impacts of failures on system operations and the resources and time required to restore facilities; and information on the response of the overall system, facilities and equipment to past earthquakes.

  3. Centrality in earthquake multiplex networks

    Science.gov (United States)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

    Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
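
    A minimal single-layer sketch of the kind of earthquake network described above (square cells as nodes, consecutive events defining edges) with eigenvector centrality computed via networkx; the cell size and synthetic catalog are illustrative, and the paper's multiplex, temporal construction is not reproduced here.

```python
import numpy as np
import networkx as nx

def earthquake_network(lats, lons, cell_deg=0.5):
    """Build a single-layer earthquake network: square cells are nodes, and an
    edge links the cells of two consecutive events in the catalog."""
    cells = [(int(np.floor(la / cell_deg)), int(np.floor(lo / cell_deg)))
             for la, lo in zip(lats, lons)]
    G = nx.Graph()
    for a, b in zip(cells[:-1], cells[1:]):
        if a != b:                 # skip self-loops from repeats within one cell
            G.add_edge(a, b)
    return G

# Illustrative catalog: random epicenters in a 2 x 2 degree box.
rng = np.random.default_rng(2)
lats, lons = 35 + 2 * rng.random(500), -120 + 2 * rng.random(500)
G = earthquake_network(lats, lons)
centrality = nx.eigenvector_centrality_numpy(G)
print(max(centrality, key=centrality.get))   # most central cell
```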

  4. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    Science.gov (United States)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

    Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico, to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. The traditional definition of an aftershock is an event that occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit this definition, occurring within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  5. Elastic energy release in great earthquakes and eruptions

    Directory of Open Access Journals (Sweden)

    Agust eGudmundsson

    2014-05-01

    Full Text Available The sizes of earthquakes are measured using well-defined, measurable quantities such as seismic moment and released (transformed) elastic energy. No similar measures exist for the sizes of volcanic eruptions, making it difficult to compare the energies released in earthquakes and eruptions. Here I provide a new measure of the elastic energy (the potential mechanical energy) associated with magma chamber rupture and contraction (shrinkage) during an eruption. For earthquakes and eruptions, elastic energy derives from two sources: (1) the strain energy stored in the volcano/fault zone before rupture, and (2) the external applied load (force, pressure, stress, displacement) on the volcano/fault zone. From thermodynamic considerations it follows that the elastic energy released or transformed (dU) during an eruption is directly proportional to the excess pressure (pe) in the magma chamber at the time of rupture multiplied by the volume decrease (-dVc) of the chamber, so that dU = -pe dVc. This formula can be used as a basis for a new eruption magnitude scale, based on elastic energy released, which can be related to the moment-magnitude scale for earthquakes. For very large eruptions (>100 km3), the volume of the feeder-dike is negligible, so that the decrease in chamber volume during an eruption corresponds roughly to the associated volume of erupted materials (Ve), so that the elastic energy is dU ≈ peVe. Using a typical excess pressure of 5 MPa, it is shown that the largest known eruptions on Earth, such as the explosive La Garita Caldera eruption (27-28 million years ago) and the largest single (effusive) Columbia River basalt lava flows (15-16 million years ago), both of which have estimated volumes of about 5000 km3, released elastic energy of the order of 10 EJ. For comparison, the seismic moment of the largest earthquake ever recorded, the M9.5 1960 Chile earthquake, is estimated at 100 ZJ and the associated elastic energy release at 10 EJ.
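
    A short numerical check of the order-of-magnitude estimate quoted above, using dU ≈ peVe with the stated excess pressure and eruption volume.

```python
# Worked check of the abstract's order-of-magnitude estimate, dU ~ p_e * V_e,
# using the quoted excess pressure (5 MPa) and eruption volume (5000 km^3).
p_e = 5e6            # Pa
V_e = 5000 * 1e9     # m^3  (1 km^3 = 1e9 m^3)
dU = p_e * V_e
print(f"{dU:.1e} J")   # ~2.5e19 J, i.e. of the order of 10 EJ, as quoted
```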

  6. The GIS and analysis of earthquake damage distribution of the 1303 Hongtong M=8 earthquake

    Science.gov (United States)

    Gao, Meng-Tan; Jin, Xue-Shen; An, Wei-Ping; Lü, Xiao-Jian

    2004-07-01

    The geographic information system of the 1303 Hongtong M=8 earthquake has been established. Using the spatial analysis functions of GIS, the spatial distribution characteristics of the damage and the isoseismals of the earthquake are studied. By comparison with a standard earthquake intensity attenuation relationship, abnormal damage distributions of the earthquake are found, and the relationship of these abnormal distributions to tectonics, site conditions and basins is analyzed. The influence on ground motion of the earthquake source and of the underground structures near the source is also studied. The influence of the abnormal damage distribution on seismic zonation, anti-earthquake design, earthquake prediction and earthquake emergency response is discussed.

  7. Seismicity and earthquake risk in western Sicily

    Directory of Open Access Journals (Sweden)

    P. COSENTINO

    1978-06-01

    Full Text Available The seismicity and the earthquake risk in Western Sicily are here evaluated on the basis of the experimental data referring to the historical and instrumentally recorded earthquakes in this area (from 1248 up to 1968), which have been thoroughly collected, analyzed, tested and normalized in order to assure the quasi-stationarity of the series of events. The approximated magnitude values, obtained by means of a comparative analysis of the magnitude and epicentral intensity values of the latest events, have allowed the parameters of the frequency-magnitude relation to be studied with both the classical exponential model and the truncated exponential one previously proposed by the author. The basic parameters, including the maximum possible regional magnitude, have thus been estimated by means of different procedures, and their behaviour has been studied as a function of the threshold magnitude.
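
    A minimal sketch of fitting the frequency-magnitude relation with both a classical exponential (Gutenberg-Richter) model and an exponential truncated at an upper magnitude bound; the estimators, bounds, and synthetic catalog below are generic illustrations, not the author's exact procedure.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def aki_b_value(mags, m_min):
    """Maximum-likelihood b-value for the classical (untruncated) exponential model."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

def truncated_b_value(mags, m_min, m_max):
    """Maximum-likelihood b-value when magnitudes follow an exponential law
    truncated at a maximum possible regional magnitude m_max."""
    m = np.asarray(mags)
    m = m[(m >= m_min) & (m <= m_max)]
    n, s = len(m), (m - m_min).sum()

    def neg_loglik(beta):
        norm = 1.0 - np.exp(-beta * (m_max - m_min))
        return -(n * np.log(beta) - beta * s - n * np.log(norm))

    res = minimize_scalar(neg_loglik, bounds=(0.1, 10.0), method="bounded")
    return res.x / np.log(10.0)    # convert beta (natural log) to b (log10)

# Illustrative catalog drawn from a b = 1 exponential law above magnitude 3.
rng = np.random.default_rng(3)
mags = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=2000)
print(aki_b_value(mags, 3.0), truncated_b_value(mags, 3.0, 7.0))
```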

  8. Earthquake data base for Romania

    International Nuclear Information System (INIS)

    Rizescu, M.; Ghica, D.; Grecu, B.; Popa, M.; Borcia, I. S.

    2002-01-01

    A new earthquake database for Romania is being constructed, comprising complete earthquake information, up-to-date, user-friendly and rapidly accessible. One main component of the database consists of the catalog of earthquakes that have occurred in Romania from 984 up to the present. The catalog contains information related to locations and other source parameters, when available, and links to waveforms of important earthquakes. The other very important component is the 'strong motion database', developed for strong intermediate-depth Vrancea earthquakes for which instrumental data were recorded. Different parameters characterizing strong motion properties, such as effective peak acceleration, effective peak velocity, corner periods Tc and Td, and global response spectrum based intensities, were computed and recorded in this database. Also included is information on the recording seismic stations, such as maps giving their positions, photographs of the instruments, and site conditions (free-field or on buildings). Through the huge volume and quality of the gathered data, and through its friendly user interface, the Romanian earthquake database provides a very useful tool for the geosciences and civil engineering in their effort towards reducing seismic risk in Romania. (authors)

  9. Mapping Tectonic Stress Using Earthquakes

    International Nuclear Information System (INIS)

    Arnold, Richard; Townend, John; Vignaux, Tony

    2005-01-01

    An earthquake occurs when the forces acting on a fault overcome its intrinsic strength and cause it to slip abruptly. Understanding more specifically why earthquakes occur at particular locations and times is complicated because in many cases we do not know what these forces actually are, or indeed what processes ultimately trigger slip. The goal of this study is to develop, test, and implement a Bayesian method of reliably determining tectonic stresses using the most abundant stress gauges available - earthquakes themselves. Existing algorithms produce reasonable estimates of the principal stress directions, but yield unreliable error bounds as a consequence of the generally weak constraint on stress imposed by any single earthquake, observational errors, and an unavoidable ambiguity between the fault normal and the slip vector. A statistical treatment of the problem can take into account observational errors, combine data from multiple earthquakes in a consistent manner, and provide realistic error bounds on the estimated principal stress directions. We have developed a realistic physical framework for modelling multiple earthquakes and show how the strong physical and geometrical constraints present in this problem allow inference to be made about the orientation of the principal axes of stress in the earth's crust

  10. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that most Swedish earthquakes have shallow focal depths; the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10^-5 has been proposed as of interest. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we obtain an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with the existing distant instrumental data. (author)

  11. Building with Earthquakes in Mind

    Science.gov (United States)

    Mangieri, Nicholas

    2016-04-01

    Earthquakes are some of the most elusive and destructive disasters humans interact with on this planet. Engineering structures to withstand earthquake shaking is critical to ensure minimal loss of life and property. However, the majority of buildings today in non-traditional earthquake prone areas are not built to withstand this devastating force. Understanding basic earthquake engineering principles and the effect of limited resources helps students grasp the challenge that lies ahead. The solution can be found in retrofitting existing buildings with proper reinforcements and designs to deal with this deadly disaster. The students were challenged in this project to construct a basic structure, using limited resources, that could withstand a simulated tremor through the use of an earthquake shake table. Groups of students had to work together to creatively manage their resources and ideas to design the most feasible and realistic type of building. This activity provided a wealth of opportunities for the students to learn more about a type of disaster they do not experience in this part of the country. Due to the fact that most buildings in New York City were not designed to withstand earthquake shaking, the students were able to gain an appreciation for how difficult it would be to prepare every structure in the city for this type of event.

  12. Large earthquakes and creeping faults

    Science.gov (United States)

    Harris, Ruth A.

    2017-01-01

    Faults are ubiquitous throughout the Earth's crust. The majority are silent for decades to centuries, until they suddenly rupture and produce earthquakes. With a focus on shallow continental active-tectonic regions, this paper reviews a subset of faults that have a different behavior. These unusual faults slowly creep for long periods of time and produce many small earthquakes. The presence of fault creep and the related microseismicity helps illuminate faults that might not otherwise be located in fine detail, but there is also the question of how creeping faults contribute to seismic hazard. It appears that well-recorded creeping fault earthquakes of up to magnitude 6.6 that have occurred in shallow continental regions produce similar fault-surface rupture areas and similar peak ground shaking as their locked fault counterparts of the same earthquake magnitude. The behavior of much larger earthquakes on shallow creeping continental faults is less well known, because there is a dearth of comprehensive observations. Computational simulations provide an opportunity to fill the gaps in our understanding, particularly of the dynamic processes that occur during large earthquake rupture and arrest.

  13. Global earthquake fatalities and population

    Science.gov (United States)

    Holzer, Thomas L.; Savage, James C.

    2013-01-01

    Modern global earthquake fatalities can be separated into two components: (1) fatalities from an approximately constant annual background rate that is independent of world population growth and (2) fatalities caused by earthquakes with large human death tolls, the frequency of which is dependent on world population. Earthquakes with death tolls greater than 100,000 (and 50,000) have increased with world population and obey a nonstationary Poisson distribution with rate proportional to population. We predict that the number of earthquakes with death tolls greater than 100,000 (50,000) will increase in the 21st century to 8.7±3.3 (20.5±4.3) from 4 (7) observed in the 20th century if world population reaches 10.1 billion in 2100. Combining fatalities caused by the background rate with fatalities caused by catastrophic earthquakes (>100,000 fatalities) indicates global fatalities in the 21st century will be 2.57±0.64 million if the average post-1900 death toll for catastrophic earthquakes (193,000) is assumed.

  14. Mw 8.5 BENGKULU EARTHQUAKES FROM CONTINUOUS GPS DATA

    Directory of Open Access Journals (Sweden)

    W. A. W. Aris

    2016-09-01

    Full Text Available The Mw 8.5 Bengkulu earthquake of 30 September 2007 and the Mw 8.6 earthquake of 28 March 2005 are considered amongst the largest earthquakes ever recorded in Southeast Asia. The resulting tectonic deformation was recorded by a network of Global Positioning System (GPS) Continuously Operating Reference Stations (CORS) within southern Sumatra and the west coast of Peninsular Malaysia. Data from the GPS CORS network were used to investigate the characteristics of the postseismic deformation due to the earthquakes. Analytical logarithmic and exponential functions were applied to investigate the decay period of the postseismic deformation. This investigation provides a preliminary insight into the postseismic cycle along the Sumatra subduction zone in particular and into the dynamics of Peninsular Malaysia in general.
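
    A minimal sketch of fitting logarithmic and exponential decay functions to a postseismic displacement series with scipy's curve_fit; the model forms, units, and synthetic data below are illustrative assumptions, not the study's actual GPS solutions.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_decay(t, a, tau, c):
    """Logarithmic postseismic model: u(t) = a * ln(1 + t / tau) + c."""
    return a * np.log(1.0 + t / tau) + c

def exp_decay(t, a, tau, c):
    """Exponential postseismic model: u(t) = a * (1 - exp(-t / tau)) + c."""
    return a * (1.0 - np.exp(-t / tau)) + c

# Illustrative daily east-component offsets (mm) at one CORS site after the event.
t = np.arange(1, 401, dtype=float)                 # days since the earthquake
rng = np.random.default_rng(4)
obs = log_decay(t, 25.0, 30.0, 0.0) + rng.normal(0, 1.0, t.size)

p_log, _ = curve_fit(log_decay, t, obs, p0=(10.0, 10.0, 0.0), maxfev=10000)
p_exp, _ = curve_fit(exp_decay, t, obs, p0=(10.0, 100.0, 0.0), maxfev=10000)
print("log-model decay time (days):", p_log[1])
print("exp-model decay time (days):", p_exp[1])
```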

  15. PRELIMINARY SELECTION OF MGR DESIGN BASIS EVENTS

    International Nuclear Information System (INIS)

    Kappes, J.A.

    1999-01-01

    The purpose of this analysis is to identify the preliminary design basis events (DBEs) for consideration in the design of the Monitored Geologic Repository (MGR). For external events and natural phenomena (e.g., earthquake), the objective is to identify those initiating events that the MGR will be designed to withstand. Design criteria will ensure that radiological release scenarios resulting from these initiating events are beyond design basis (i.e., have a scenario frequency less than once per million years). For internal events (i.e., human-induced and random equipment failures), the objective is to identify credible event sequences that result in bounding radiological releases. These sequences will be used to establish the design basis criteria for MGR structures, systems, and components (SSCs) in order to prevent or mitigate radiological releases. The safety strategy presented in this analysis for preventing or mitigating DBEs is based on the preclosure safety strategy outlined in ''Strategy to Mitigate Preclosure Offsite Exposure'' (CRWMS M&O 1998f). DBE analysis is necessary to provide feedback and requirements to the design process, and also to demonstrate compliance with proposed 10 CFR 63 (Dyer 1999b) requirements. DBE analysis is also required to identify and classify the SSCs that are important to safety (ITS).

  16. Memory effect in M ≥ 7 earthquakes of Taiwan

    Science.gov (United States)

    Wang, Jeen-Hwa

    2014-07-01

    The M ≥ 7 earthquakes that occurred in the Taiwan region during 1906-2006 are used to study the possibility of a memory effect in the sequence of those large earthquakes. Those events are all mainshocks. The fluctuation analysis technique is applied to analyze two sequences, earthquake magnitude and inter-event time, represented in the natural time domain. For both magnitude and inter-event time, the calculations are made for three data sets: the original-order data, the reverse-order data, and the mean values. The calculated results show that the exponents of the scaling law of fluctuation versus window length are less than 0.5 for the sequences of both magnitude and inter-event time data. In addition, phase portraits of two sequential magnitudes and of two sequential inter-event times are also examined to explore whether large (or small) earthquakes are followed by large (or small) events. The results lead to a negative answer. Together with all of the information in this study, we conclude that the earthquake sequence in question is short-term correlated and thus a short-term memory effect would be operative.
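
    A minimal sketch of a simple (non-detrended) fluctuation analysis of the kind referred to above, applied to an inter-event-time sequence; the window choices and synthetic data are illustrative, and the paper's exact variant of the technique may differ.

```python
import numpy as np

def fluctuation_exponent(series, windows=None):
    """Simple (non-detrended) fluctuation analysis of a 1-D sequence.

    The cumulative sum of the mean-removed series is divided into windows of
    length L; F(L) is the r.m.s. deviation of the profile from its window mean.
    The slope of log F(L) versus log L is the scaling exponent (about 0.5 for
    white noise, below 0.5 for anti-correlated sequences).
    """
    x = np.asarray(series, dtype=float)
    profile = np.cumsum(x - x.mean())
    if windows is None:
        windows = np.unique(np.logspace(0.5, np.log10(len(x) // 4), 12).astype(int))
    F = []
    for L in windows:
        n_seg = len(profile) // L
        segs = profile[:n_seg * L].reshape(n_seg, L)
        F.append(np.sqrt(np.mean((segs - segs.mean(axis=1, keepdims=True)) ** 2)))
    slope, _ = np.polyfit(np.log(windows), np.log(F), 1)
    return slope

# Illustrative inter-event times (days) for a small synthetic mainshock sequence.
rng = np.random.default_rng(5)
inter_event = rng.exponential(scale=200.0, size=60)
print(fluctuation_exponent(inter_event))
```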

  17. Twitter earthquake detection: earthquake monitoring in a social world

    Directory of Open Access Journals (Sweden)

    Daniel C. Bowden

    2011-06-01

    Full Text Available The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word “earthquake” clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.

  18. The deadly Morelos-Puebla, Mexico Intraslab Earthquake of 19 September 2017 (Mw7.1): Was the Earthquake Unexpected and Were the Ground Motions and Damage Pattern in Mexico City Abnormal?

    Science.gov (United States)

    Perez-Campos, X.; Singh, S. K.; Arroyo, D.; Cruz-Atienza, V. M.; Ordaz, M.; Hjorleifsdottir, V.; Iglesias, A.

    2017-12-01

    On 19 September 2017, thirty-two years after the 1985 Michoacan interplate earthquake (Mw8.0), the city was once again devastated, this time by a Mw7.1 intraslab earthquake. The 2017 earthquake was located near the border of the states of Morelos and Puebla (18.410N, -98.710E; H=57 km), to the SSE of Mexico City, at a hypocentral distance of about 127 km. It caused great panic in Mexico City, the collapse of 44 buildings, and severe damage to many others. More than 200 persons were killed in the city. It was the second most destructive earthquake in the history of Mexico City, next only to the 1985 earthquake. A strong-motion station at CU, located on basalt lava flows on the main UNAM campus, has been in continuous operation since 1964. The PGA of 59 gal at CU during the 2017 earthquake is the largest ever recorded there, two times greater than that recorded during the 1985 earthquake (29 gal). The 2017 earthquake raised questions that are critical in fathoming the seismic vulnerability of the city and in its reconstruction. Was such an intraslab earthquake (Mw 7 at a hypocentral distance of 127 km) unexpected? Were the recorded ground motions in the city unusually high for such an earthquake? Why did the damage pattern during the earthquake differ from that observed during the 1985 earthquake? The earthquake was the closest M>5 intraslab earthquake to Mexico City ever recorded. However, Mw 5.9 events have occurred in recent years in the vicinity of the 2017 earthquake (R ≈ 145 km). Three Mw≥6.9 earthquakes have occurred since 1964 in the distance range 184-225 km. Thus, the Mw and R of the earthquake were not surprising. However, a comparison of the Fourier acceleration spectra at CU of the 10 intraslab earthquakes with the largest PGA, reduced to a common distance of R=127 km, shows that the amplitudes of the 2017 event were abnormally high in the 1-2 s range. Spectra of intraslab events at CU are enriched at higher frequencies relative to interplate ones because of closer distance, greater depth and higher

  19. Biological Indicators in Studies of Earthquake Precursors

    Science.gov (United States)

    Sidorin, A. Ya.; Deshcherevskii, A. V.

    2012-04-01

    Time series of data on variations in the electric activity (EA) of four specimens of the weakly electric fish Gnathonemus leopoldianus and the moving activity (MA) of two catfishes Hoplosternum thoracatum and two groups of Columbian cockroaches Blaberus craniifer were analyzed. The observations were carried out in the Garm region of Tajikistan within the framework of experiments aimed at searching for earthquake precursors. An automatic recording system continuously recorded EA and MA over a period of several years. Hourly mean EA and MA values were processed. Approximately 100 different parameters were calculated on the basis of the six initial EA and MA time series, characterizing different aspects of the EA and MA structure: the amplitude of the signal and of the fluctuations of activity, parameters of diurnal rhythms, correlated changes in the activity of the various biological indicators, and others. A detailed analysis of the statistical structure of the full array of parametric time series obtained in the experiment showed that the behavior of all the animals exhibits strong temporal variability. All calculated parameters are unstable and subject to frequent changes. A comparison of the data obtained with seismicity allows us to draw the following conclusions: (1) The structure of variations in the studied parameters is represented by flicker noise or an even more complex process with permanent changes in its characteristics. Significant statistics are required to prove a cause-and-effect relationship between specific features of such time series and seismicity. (2) The calculation of reconstruction statistics in the EA and MA series structure demonstrated an increase in their frequency in the last hours to few days before an earthquake when the hypocenter distance is comparable to the source size. Sufficiently dramatic anomalies in the behavior of the catfishes and cockroaches (changes in the amplitude of activity variation, distortions of diurnal rhythms, increase in the

  20. USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation

    Science.gov (United States)

    Hudnut, K. W.; Murray, J. R.; Minson, S. E.

    2015-12-01

    Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor - Cucapah, 2012 Brawley Swarm and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing work flow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site, operated for many hours, then data retrieved, processed and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing, has led to sub-second overall system latency. Within the past few years, the final challenges of

  1. Evidence for Ancient Mesoamerican Earthquakes

    Science.gov (United States)

    Kovach, R. L.; Garcia, B.

    2001-12-01

    Evidence for past earthquake damage at Mesoamerican ruins is often overlooked because of the invasive effects of tropical vegetation and is usually not considered as a causal factor when restoration and reconstruction of many archaeological sites are undertaken. Yet the proximity of many ruins to zones of seismic activity would argue otherwise. Clues as to the types of damage which should be sought were offered in September 1999 when the M = 7.5 Oaxaca earthquake struck the ruins of Monte Alban, Mexico, where archaeological renovations were underway. More than 20 structures were damaged, 5 of them seriously. Damage features noted were walls out of plumb, fractures in walls, floors, basal platforms and tableros, toppling of columns, and deformation, settling and tumbling of walls. A Modified Mercalli Intensity of VII (ground accelerations 18-34 %g) occurred at the site. Within the diffuse landward extension of the Caribbean plate boundary zone, M = 7+ earthquakes occur with repeat times of hundreds of years, arguing that many Maya sites were subjected to earthquakes. Damage to re-erected and reinforced stelae, walls, and buildings was witnessed at Quirigua, Guatemala, during an expedition underway when the 1976 M = 7.5 Guatemala earthquake on the Motagua fault struck. Excavations also revealed evidence (domestic pottery vessels and the skeleton of a child crushed under fallen walls) of an ancient earthquake occurring about the time of the demise and abandonment of Quirigua in the late 9th century. Striking evidence for sudden earthquake building collapse at the end of the Mayan Classic Period ~A.D. 889 was found at Benque Viejo (Xunantunich), Belize, located 210 km north of Quirigua. It is argued that a M = 7.5 to 7.9 earthquake at the end of the Maya Classic period, centered in the vicinity of the Chixoy-Polochic and Motagua fault zones, could have produced the contemporaneous earthquake damage to the above sites. As a consequence, this earthquake may have accelerated the

  2. Impact-based earthquake alerts with the U.S. Geological Survey's PAGER system: what's next?

    Science.gov (United States)

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Garcia, D.; So, E.; Hearne, M.

    2012-01-01

    In September 2010, the USGS began publicly releasing earthquake alerts for significant earthquakes around the globe based on estimates of potential casualties and economic losses with its Prompt Assessment of Global Earthquakes for Response (PAGER) system. These estimates significantly enhanced the utility of the USGS PAGER system which had been, since 2006, providing estimated population exposures to specific shaking intensities. Quantifying earthquake impacts and communicating estimated losses (and their uncertainties) to the public, the media, humanitarian, and response communities required a new protocol—necessitating the development of an Earthquake Impact Scale—described herein and now deployed with the PAGER system. After two years of PAGER-based impact alerting, we now review operations, hazard calculations, loss models, alerting protocols, and our success rate for recent (2010-2011) events. This review prompts analyses of the strengths, limitations, opportunities, and pressures, allowing clearer definition of future research and development priorities for the PAGER system.

  3. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    Science.gov (United States)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To mitigate future threats from natural disasters, it is important to understand how a disaster happened, why lives were lost, and what lessons have been learned. In this way, the attitude of society toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real time earthquake games competition into the traditional curricula in schools. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans around the practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds pick P- and S-waves and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real time earthquake games competition) to make earthquake science fun.

  4. The role of post-earthquake structural safety in pre-earthquake retrofit decisions: guidelines and applications

    International Nuclear Information System (INIS)

    Bazzurro, P.; Telleen, K.; Maffei, J.; Yin, J.; Cornell, C.A.

    2009-01-01

    Critical structures such as hospitals, police stations, local administrative office buildings, and critical lifeline facilities are expected to be operational immediately after earthquakes. Any rational decision about whether these structures are strong enough to meet this goal or whether pre-emptive retrofitting is needed cannot be made without an explicit consideration of post-earthquake safety and functionality with respect to aftershocks. The Advanced Seismic Assessment Guidelines offer an improvement over previous methods for the seismic evaluation of buildings where post-earthquake safety and usability are a concern. The new method allows engineers to evaluate the likelihood that a structure may have restricted access or no access after an earthquake. Building performance is measured in terms of the post-earthquake occupancy classifications Green Tag, Yellow Tag, and Red Tag, and these performance levels are defined quantitatively, based on the structure's remaining capacity to withstand aftershocks. These color-coded placards, which constitute an established practice in the US, could be replaced by the standard results of inspections (A to E) performed by the Italian Dept. of Civil Protection after an event. The article also shows some applications of these Guidelines to buildings of the largest utility company in California, Pacific Gas and Electric Company (PG&E).

  5. Do earthquakes exhibit self-organized criticality?

    International Nuclear Information System (INIS)

    Yang Xiaosong; Ma Jin; Du Shuming

    2004-01-01

    If earthquakes are phenomena of self-organized criticality (SOC), the statistical characteristics of the earthquake time series should be invariant after the sequence of events in an earthquake catalog is randomly rearranged. In this Letter we argue that earthquakes are unlikely to be phenomena of SOC, because our analysis of the Southern California Earthquake Catalog shows that the first-return-time probability P_M(T) is appreciably changed after the time series is rearranged. This suggests that the SOC theory should not be used to oppose the efforts of earthquake prediction.
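
    A minimal sketch of the shuffle test implied by the argument above: compare the first-return-time distribution of the catalog in its original order with that of a randomly reordered catalog. The magnitude threshold, the choice of a two-sample KS statistic, and the synthetic magnitudes are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
from scipy.stats import ks_2samp

def first_return_times(mags, threshold):
    """Return times (in number of events) between successive events with M >= threshold."""
    idx = np.flatnonzero(np.asarray(mags) >= threshold)
    return np.diff(idx)

def soc_shuffle_test(mags, threshold, seed=0):
    """Compare the first-return-time distribution of the real catalog order with
    that of a randomly reordered catalog.  If the statistics were invariant under
    reordering (as strict SOC would imply), the two samples should be
    indistinguishable; a small KS p-value argues against that invariance."""
    rng = np.random.default_rng(seed)
    real = first_return_times(mags, threshold)
    shuffled = first_return_times(rng.permutation(mags), threshold)
    return ks_2samp(real, shuffled)

# Illustrative use with a synthetic magnitude sequence (replace with a real catalog).
rng = np.random.default_rng(6)
mags = 2.0 + rng.exponential(0.43, size=5000)
print(soc_shuffle_test(mags, threshold=4.0))
```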

  6. Earthquake, GIS and multimedia. The 1883 Casamicciola earthquake

    Directory of Open Access Journals (Sweden)

    M. Rebuffat

    1995-06-01

    Full Text Available A series of multimedia monographs concerning the main seismic events that have affected the Italian territory are being produced for the Documental Integrated Multimedia Project (DIMP) started by the Italian National Seismic Survey (NSS). The purpose of the project is to reconstruct the historical record of earthquakes and promote earthquake public education. Producing the monographs, developed in ARC/INFO and working under UNIX, involved designing a special filing and management methodology to integrate heterogeneous information (images, papers, cartographies, etc.). This paper describes the possibilities of a GIS (Geographic Information System) in the filing and management of documental information. As an example we present the first monograph, on the 1883 Casamicciola earthquake on the island of Ischia (Campania, Italy). This earthquake is particularly interesting for the following reasons: 1) its historical-cultural context (the first destructive seismic event after the unification of Italy); 2) its features (a volcanic earthquake); 3) the socioeconomic consequences for such an important seaside resort.

  7. Extreme value statistics and thermodynamics of earthquakes: large earthquakes

    Directory of Open Access Journals (Sweden)

    B. H. Lavenda

    2000-06-01

    Full Text Available A compound Poisson process is used to derive a new shape parameter which can be used to discriminate between large earthquakes and aftershock sequences. Sample exceedance distributions of large earthquakes are fitted to the Pareto tail and the actual distribution of the maximum to the Fréchet distribution, while the sample distribution of aftershocks is fitted to a Beta distribution and the distribution of the minimum to the Weibull distribution for the smallest value. The transition between initial sample distributions and asymptotic extreme value distributions shows that self-similar power laws are transformed into nonscaling exponential distributions so that neither self-similarity nor the Gutenberg-Richter law can be considered universal. The energy-magnitude transformation converts the Fréchet distribution into the Gumbel distribution, originally proposed by Epstein and Lomnitz, and not the Gompertz distribution as in the Lomnitz-Adler and Lomnitz generalization of the Gutenberg-Richter law. Numerical comparison is made with the Lomnitz-Adler and Lomnitz analysis using the same Catalogue of Chinese Earthquakes. An analogy is drawn between large earthquakes and high energy particle physics. A generalized equation of state is used to transform the Gamma density into the order-statistic Fréchet distribution. Earthquake temperature and volume are determined as functions of the energy. Large insurance claims based on the Pareto distribution, which does not have a right endpoint, show why there cannot be a maximum earthquake energy.

  8. Laboratory generated M -6 earthquakes

    Science.gov (United States)

    McLaskey, Gregory C.; Kilgore, Brian D.; Lockner, David A.; Beeler, Nicholas M.

    2014-01-01

    We consider whether mm-scale earthquake-like seismic events generated in laboratory experiments are consistent with our understanding of the physics of larger earthquakes. This work focuses on a population of 48 very small shocks that are foreshocks and aftershocks of stick–slip events occurring on a 2.0 m by 0.4 m simulated strike-slip fault cut through a large granite sample. Unlike the larger stick–slip events that rupture the entirety of the simulated fault, the small foreshocks and aftershocks are contained events whose properties are controlled by the rigidity of the surrounding granite blocks rather than by characteristics of the experimental apparatus. The large size of the experimental apparatus, high-fidelity sensors, rigorous treatment of wave propagation effects, and in situ system calibration separate this study from traditional acoustic emission analyses and allow these sources to be studied with as much rigor as larger natural earthquakes. The tiny events have short (3–6 μs) rise times and are well modeled by simple double couple focal mechanisms that are consistent with left-lateral slip occurring on a mm-scale patch of the precut fault surface. The repeatability of the experiments indicates that they are the result of frictional processes on the simulated fault surface rather than grain crushing or fracture of fresh rock. Our waveform analysis shows no significant differences (other than size) between the M -7 to M -5.5 earthquakes reported here and larger natural earthquakes. Their source characteristics such as stress drop (1–10 MPa) appear to be entirely consistent with earthquake scaling laws derived for larger earthquakes.

  9. Guidelines for determining design basis ground motions

    International Nuclear Information System (INIS)

    1993-11-01

    This report develops and applies a method for estimating strong earthquake ground motion. The emphasis of this study is on ground motion estimation in Eastern North America (east of the Rocky Mountains), with particular emphasis on the Eastern United States and southeastern Canada. Specifically considered are ground motions resulting from earthquakes with magnitudes from 5 to 8, fault distances from 0 to 500 km, and frequencies from 1 to 35 Hz. The two main objectives were: (1) to develop generic relations for estimating ground motion appropriate for site screening; and (2) to develop a guideline for conducting a thorough site investigation needed to define the seismic design basis. For the first objective, an engineering model was developed to predict the expected ground motion on rock sites, with an additional set of amplification factors to account for the response of the soil column over rock at soil sites. The results incorporate best estimates of ground motion as well as the randomness and uncertainty associated with those estimates. For the second objective, guidelines were developed for gathering geotechnical information at a site and using this information in calculating site response. As a part of this development, an extensive set of geotechnical and seismic investigations was conducted at three reference sites. Together, the engineering model and guidelines provide the means to select and assess the seismic suitability of a site
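
    The two-step structure described above (a rock-site relation plus soil amplification factors) can be sketched generically. The functional form and every coefficient below are illustrative assumptions for demonstration, not the report's engineering model.

```python
# Illustrative two-step ground-motion estimate: rock-site relation, then soil amplification.
import math

def ln_rock_sa(magnitude: float, distance_km: float,
               c1=-2.0, c2=0.9, c3=-1.8, h_km=10.0) -> float:
    """Natural log of an assumed rock-site spectral acceleration (g) at one frequency."""
    r = math.sqrt(distance_km ** 2 + h_km ** 2)    # effective distance
    return c1 + c2 * magnitude + c3 * math.log(r)

def soil_sa(magnitude: float, distance_km: float, amp_factor: float = 1.8) -> float:
    """Apply a soil-over-rock amplification factor (assumed constant here)."""
    return math.exp(ln_rock_sa(magnitude, distance_km)) * amp_factor

for m in (5.0, 6.5, 8.0):
    rock = math.exp(ln_rock_sa(m, 30.0))
    print(f"M{m}: rock {rock:.3f} g, soil {soil_sa(m, 30.0):.3f} g at 30 km")
```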

  10. Combining multiple earthquake models in real time for earthquake early warning

    Science.gov (United States)

    Minson, Sarah E.; Wu, Stephen; Beck, James L; Heaton, Thomas H.

    2017-01-01

    The ultimate goal of earthquake early warning (EEW) is to provide local shaking information to users before the strong shaking from an earthquake reaches their location. This is accomplished by operating one or more real‐time analyses that attempt to predict shaking intensity, often by estimating the earthquake’s location and magnitude and then predicting the ground motion from that point source. Other EEW algorithms use finite rupture models or may directly estimate ground motion without first solving for an earthquake source. EEW performance could be improved if the information from these diverse and independent prediction models could be combined into one unified, ground‐motion prediction. In this article, we set the forecast shaking at each location as the common ground to combine all these predictions and introduce a Bayesian approach to creating better ground‐motion predictions. We also describe how this methodology could be used to build a new generation of EEW systems that provide optimal decisions customized for each user based on the user’s individual false‐alarm tolerance and the time necessary for that user to react.
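
    One simple special case of the Bayesian combination described above is precision-weighted averaging of independent Gaussian estimates of log ground motion. The sketch below assumes each algorithm reports a mean and standard deviation of ln(shaking) at a site; the numbers are invented and the operational framework in the article is more elaborate.

```python
# Minimal sketch: combine independent Gaussian predictions of ln(shaking) at one site.
import numpy as np

def combine_predictions(means, sigmas):
    """Precision-weighted combination of independent Gaussian estimates."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(sigmas, dtype=float) ** 2
    combined_mean = np.sum(precisions * means) / np.sum(precisions)
    combined_sigma = np.sqrt(1.0 / np.sum(precisions))
    return combined_mean, combined_sigma

# e.g. a point-source algorithm, a finite-fault algorithm and a direct ground-motion estimate
mean, sigma = combine_predictions(means=[-1.2, -0.9, -1.0], sigmas=[0.5, 0.4, 0.6])
print(f"combined ln(PGA): {mean:.2f} +/- {sigma:.2f}")
```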

  11. The search for Infrared radiation prior to major earthquakes

    Science.gov (United States)

    Ouzounov, D.; Taylor, P.; Pulinets, S.

    2004-12-01

    This work describes our search for a relationship between tectonic stresses and electro-chemical and thermodynamic processes in the Earth and increases in mid-IR flux, as part of a possible ensemble of electromagnetic (EM) phenomena that may be related to earthquake activity. Recent analysis of continuous ongoing long-wavelength Earth radiation (OLR) indicates significant and anomalous variability prior to some earthquakes. The cause of these anomalies is not well understood but could be the result of a triggering by an interaction between the lithosphere-hydrosphere and atmosphere related to changes in the near-surface electrical field and gas composition prior to the earthquake. The OLR anomaly covers large areas surrounding the main epicenter. We have used the NOAA IR data to differentiate between the global and seasonal variability and these transient local anomalies. Indeed, on the basis of a temporal and spatial distribution analysis, an anomaly pattern is found to occur several days prior to some major earthquakes. The significance of these observations was explored using data sets of some recent worldwide events.

  12. Earthquake behavior of steel cushion-implemented reinforced concrete frames

    Science.gov (United States)

    Özkaynak, Hasan

    2018-04-01

    The earthquake performance of vulnerable structures can be increased by the implementation of supplementary energy-dissipative metallic elements. The main aim of this paper is to describe the earthquake behavior of steel cushion-implemented reinforced concrete frames (SCI-RCFR) in terms of displacement demands and energy components. Several quasi-static experiments were performed on steel cushions (SC) installed in reinforced concrete (RC) frames. The test results served as the basis of the analytical models of SCs and a bare reinforced concrete frame (B-RCFR). These models were integrated in order to obtain the resulting analytical model of the SCI-RCFR. Nonlinear time-history analyses (NTHA) were performed on the SCI-RCFR under the effects of the selected earthquake data set. According to the NTHA, SC application is an effective technique for increasing the seismic performance of RC structures. The main portion of the earthquake input energy was dissipated through SCs. SCs succeeded in decreasing the plastic energy demand on structural elements by almost 50% at distinct drift levels.

  13. Phase characteristics of earthquake accelerogram and its application

    International Nuclear Information System (INIS)

    Ohsaki, Y.; Iwasaki, R.; Ohkawa, I.; Masao, T.

    1979-01-01

    As the input earthquake motion for seismic design of nuclear power plant structures and equipment, an artificial time history compatible with a smoothed design response spectrum is frequently used. This paper deals with a wave generation technique based on phase characteristics of earthquake accelerograms as an alternative to the envelope time function. The concept of the 'phase differences' distribution' is defined to represent the phase characteristics of earthquake motion. The procedure proposed in this paper consists of the following steps: (1) Specify a design response spectrum and derive a corresponding initial modal amplitude. (2) Determine a phase differences' distribution corresponding to an envelope function, the shape of which depends on the magnitude and epicentral distance of an earthquake. (3) Derive the phase angles at all modal frequencies from the phase differences' distribution. (4) Generate a time history by inverse Fourier transform on the basis of the amplitudes and the phase angles thus determined. (5) Calculate the response spectrum. (6) Compare the specified and calculated response spectra, and correct the amplitude at each frequency so that the response spectrum will be consistent with the specified one. (7) Repeat steps 4 through 6 until the specified and calculated response spectra become consistent with sufficient accuracy. (orig.)
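
    The seven-step procedure lends itself to a compact numerical sketch. The code below is a simplified illustration of steps 4-7 only, with admitted substitutions: random phase angles stand in for phases derived from a phase differences' distribution, the target spectrum is an invented analytic curve, and the single-degree-of-freedom response is computed in the frequency domain. It is not the authors' implementation.

```python
# Simplified iterative matching of a time history to a target response spectrum.
import numpy as np

dt, n = 0.01, 4096
freqs = np.fft.rfftfreq(n, dt)                      # FFT frequencies (Hz)
modal = (freqs > 0.2) & (freqs < 25.0)              # modal frequencies used for matching

def target_spectrum(f):                             # step 1: invented design response spectrum
    return np.where(f < 2.0, 0.3 * f, 0.6) * np.exp(-f / 25.0)

def response_spectrum(acc, f_n, zeta=0.05):         # step 5: 5%-damped pseudo-acceleration
    A = np.fft.rfft(acc)
    w = 2.0 * np.pi * freqs
    sa = np.zeros_like(f_n)
    for i, fn in enumerate(f_n):
        wn = 2.0 * np.pi * fn
        H = -1.0 / (wn**2 - w**2 + 2j * zeta * wn * w)   # ground acc. -> relative displacement
        u = np.fft.irfft(H * A, n)
        sa[i] = wn**2 * np.max(np.abs(u))
    return sa

rng = np.random.default_rng(1)
phase = rng.uniform(0.0, 2.0 * np.pi, freqs.size)   # steps 2-3: random phases as a stand-in
amp = np.where(modal, 1.0, 0.0)                     # initial modal amplitudes

for _ in range(10):                                 # steps 4-7: generate, compare, correct
    acc = np.fft.irfft(amp * np.exp(1j * phase), n) # step 4: inverse Fourier transform
    sa = response_spectrum(acc, freqs[modal])
    ratio = target_spectrum(freqs[modal]) / np.maximum(sa, 1e-12)
    amp[modal] *= ratio                             # step 6: amplitude correction

print("max spectral misfit after iteration:", float(np.max(np.abs(ratio - 1.0))))
```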

  14. The evolution of hillslope strength following large earthquakes

    Science.gov (United States)

    Brain, Matthew; Rosser, Nick; Tunstall, Neil

    2017-04-01

    Earthquake-induced landslides play an important role in the evolution of mountain landscapes. Earthquake ground shaking triggers near-instantaneous landsliding, but has also been shown to weaken hillslopes, preconditioning them for failure during subsequent seismicity and/or precipitation events. The temporal evolution of hillslope strength during and following primary seismicity, and if and how this ultimately results in failure, is poorly constrained due to the rarity of high-magnitude earthquakes and limited availability of suitable field datasets. We present results obtained from novel geotechnical laboratory tests to better constrain the mechanisms that control strength evolution in Earth materials of differing rheology. We consider how the strength of hillslope materials responds to ground-shaking events of different magnitude and if and how this persists to influence landslide activity during interseismic periods. We demonstrate the role of stress path and stress history, strain rate and foreshock and aftershock sequences in controlling the evolution of hillslope strength and stability. Critically, we show how hillslopes can be strengthened rather than weakened in some settings, challenging conventional assumptions. On the basis of our laboratory data, we consider the implications for earthquake-induced geomorphic perturbations in mountain landscapes over multiple timescales and in different seismogenic settings.

  15. Authorization basis requirements comparison report

    Energy Technology Data Exchange (ETDEWEB)

    Brantley, W.M.

    1997-08-18

    The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation.

  16. Authorization basis requirements comparison report

    International Nuclear Information System (INIS)

    Brantley, W.M.

    1997-01-01

    The TWRS Authorization Basis (AB) consists of a set of documents identified by TWRS management with the concurrence of DOE-RL. Upon implementation of the TWRS Basis for Interim Operation (BIO) and Technical Safety Requirements (TSRs), the AB list will be revised to include the BIO and TSRs. Some documents that currently form part of the AB will be removed from the list. This SD identifies each requirement from those documents, and recommends a disposition for each to ensure that necessary requirements are retained when the AB is revised to incorporate the BIO and TSRs. This SD also identifies documents that will remain part of the AB after the BIO and TSRs are implemented. This document does not change the AB, but provides guidance for the preparation of change documentation

  17. Determination of Road Functionality for Küçükçekmece District Following a Scenario Earthquake for Istanbul

    Directory of Open Access Journals (Sweden)

    Betül Ergün Konukcu

    2016-03-01

    Istanbul has been affected by earthquakes throughout its history. The most recent earthquake to shake Istanbul occurred on August 17, 1999, along the North Anatolian Fault, 12 km southeast of the Izmit Province, with a magnitude of 7.4. Following the 1999 Izmit earthquake, the earthquake risk in Istanbul started to draw attention and many scientific studies were conducted on the potential earthquake risk in this city. Based on these studies, predictions are that Istanbul is going to face a major earthquake in the near future and that this will cause severe damage to the built environment. It is estimated that the damage caused by the anticipated earthquake will be extensive as a consequence of Istanbul's low-quality building stock. Buildings that are damaged generate debris around them; if roadside buildings collapse during the earthquake, the scattered parts of the buildings can cause roads to lose their functionality. Not only building damage but also transportation damage analysis is necessary for risk mitigation studies and decisions, since experience has shown that the functionality of transportation infrastructure seriously affects post-earthquake emergency response and recovery operations. This study aims to present a method for assessing road functionality in Küçükçekmece following a potential Istanbul earthquake, using building collapse direction and bridge damage.

  18. Comparison of injury epidemiology between the Wenchuan and Lushan earthquakes in Sichuan, China.

    Science.gov (United States)

    Hu, Yang; Zheng, Xi; Yuan, Yong; Pu, Qiang; Liu, Lunxu; Zhao, Yongfan

    2014-12-01

    We aimed to compare injury characteristics and the timing of admissions and surgeries in the Wenchuan earthquake in 2008 and the Lushan earthquake in 2013. We retrospectively compared the admission and operating times and injury profiles of patients admitted to our medical center during both earthquakes. We also explored the relationship between seismic intensity and injury type. The time from earthquake onset to the peak in patient admissions and surgeries differed between the 2 earthquakes. In the Wenchuan earthquake, injuries due to being struck by objects or being buried were more frequent than other types of injuries, and more patients suffered injuries of the extremities than thoracic injuries or brain trauma. In the Lushan earthquake, falls were the most common injury, and more patients suffered thoracic trauma or brain injuries. The types of injury seemed to vary with seismic intensity, whereas the anatomical location of the injury did not. Greater seismic intensity of an earthquake is associated with longer delay between the event and the peak in patient admissions and surgeries, higher frequencies of injuries due to being struck or buried, and lower frequencies of injuries due to falls and injuries to the chest and brain. These insights may prove useful for planning rescue interventions in trauma centers near the epicenter.

  19. Hospital stay as a proxy indicator for severe injury in earthquakes: a retrospective analysis.

    Science.gov (United States)

    Zhao, Lu-Ping; Gerdin, Martin; Westman, Lina; Rodriguez-Llanes, Jose Manuel; Wu, Qi; van den Oever, Barbara; Pan, Liang; Albela, Manuel; Chen, Gao; Zhang, De-Sheng; Guha-Sapir, Debarati; von Schreeb, Johan

    2013-01-01

    Earthquakes are the most violent type of natural disaster, and injuries are the dominant medical problem in the early phases after earthquakes. However, likely because of poor data availability, high-quality research on injuries after earthquakes is lacking. Length of hospital stay (LOS) has been validated as a proxy indicator for injury severity in high-income settings and could potentially be used in retrospective research of injuries after earthquakes. In this study, we assessed LOS as an adequate proxy indicator for severe injury in trauma survivors of an earthquake. A retrospective analysis was conducted using a database of 1,878 injured patients from the 2008 Wenchuan earthquake. Our primary outcome was severe injury, defined as a composite measure of serious injury or resource use. Secondary outcomes were serious injury and resource use, analysed separately. Non-parametric receiver operating characteristics (ROC) and area under the curve (AUC) analysis was used to test the discriminatory accuracy of LOS when used to identify severe injury, with an AUC above 0.7 taken as adequate; LOS did not reach this level as a proxy for severe injury in these earthquake survivors. However, LOS was found to be a proxy for major nonorthopaedic surgery and blood transfusion. These findings can be useful for retrospective research on earthquake-injured patients when detailed hospital records are not available.

  20. Quasi real-time estimation of the moment magnitude of large earthquake from static strain changes

    Science.gov (United States)

    Itaba, S.

    2016-12-01

    The 2011 Tohoku-Oki (off the Pacific coast of Tohoku) earthquake, of moment magnitude 9.0, was accompanied by large static strain changes (of the order of 10^-7), as measured by borehole strainmeters operated by the Geological Survey of Japan in the Tokai, Kii Peninsula, and Shikoku regions. A fault model for the earthquake on the boundary between the Pacific and North American plates, based on these borehole strainmeter data, yielded a moment magnitude of 8.7. On the other hand, the prompt magnitude which the Japan Meteorological Agency (JMA) announced just after the earthquake, based on seismic waves, was 7.9. Such geodetic moment magnitudes, derived from static strain changes, can be estimated almost as rapidly as determinations using seismic waves. The validity of this method still has to be verified in further cases. For this earthquake's largest aftershock, which occurred 29 minutes after the mainshock, the prompt report issued by JMA assigned a magnitude of 7.3, whereas the moment magnitude derived from borehole strain data is 7.6, which is much closer to the actual moment magnitude of 7.7. In order to grasp the magnitude of a great earthquake earlier, several methods are now being suggested to reduce earthquake disasters, including tsunami. Our simple method using static strain changes is one of the strong methods for rapid estimation of the magnitude of large earthquakes, and is useful for improving the accuracy of Earthquake Early Warning.
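
    Once a fault model has been fitted to the strainmeter data, the conversion from seismic moment to moment magnitude is the standard Hanks-Kanamori relation. The sketch below shows only that final conversion; the rigidity, fault dimensions and slip are illustrative assumptions, not values from the paper.

```python
# Moment magnitude from an assumed fault model (rigidity * area * slip).
import math

def moment_magnitude(m0_nm: float) -> float:
    """Moment magnitude from seismic moment in N*m: Mw = (2/3)(log10 M0 - 9.1)."""
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

mu = 40e9                       # Pa, assumed rigidity
area = 400e3 * 150e3            # m^2, assumed fault plane (400 km x 150 km)
slip = 15.0                     # m, assumed average slip
m0 = mu * area * slip
print(f"M0 = {m0:.2e} N*m -> Mw = {moment_magnitude(m0):.1f}")
```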

  1. Web-Based Real Time Earthquake Forecasting and Personal Risk Management

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Turcotte, D. L.; Donnellan, A.

    2012-12-01

    Earthquake forecasts have been computed by a variety of countries and economies world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. One example is the Working Group on California Earthquake Probabilities, which has been responsible for the official California earthquake forecast since 1988. However, in a time of increasingly severe global financial constraints, we are now moving inexorably towards personal risk management, wherein mitigating risk is becoming the responsibility of individual members of the public. Under these circumstances, open access to a variety of web-based tools, utilities and information is a necessity. Here we describe a web-based system that has been operational since 2009 at www.openhazards.com and www.quakesim.org. Models for earthquake physics and forecasting require input data, along with model parameters. The models we consider are the Natural Time Weibull (NTW) model for regional earthquake forecasting, together with models for activation and quiescence. These models use small earthquakes ("seismicity-based models") to forecast the occurrence of large earthquakes, either through varying rates of small earthquake activity, or via an accumulation of this activity over time. These approaches use data-mining algorithms combined with the ANSS earthquake catalog. The basic idea is to compute large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Each of these approaches has computational challenges associated with computing forecast information in real time. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we show that real-time forecasting is possible at a grid scale of 0.1°. We have analyzed the performance of these models using Reliability/Attributes and standard Receiver Operating Characteristic (ROC) tests. We show how the Reliability and
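
    The basic idea (large-earthquake probability from the count of small earthquakes since the last large one) can be sketched with a Weibull-type survival function in "natural time". The code below is a hedged toy version; the scale and shape parameters are invented, and it is not the operational openhazards.com implementation.

```python
# Toy natural-time Weibull forecast: conditional probability of a large event
# within the next delta_n small earthquakes, given n_elapsed counted so far.
import math

def conditional_probability(n_elapsed: float, delta_n: float,
                            scale: float = 500.0, shape: float = 1.4) -> float:
    def survival(n: float) -> float:
        return math.exp(-((n / scale) ** shape))
    return 1.0 - survival(n_elapsed + delta_n) / survival(n_elapsed)

# e.g. 350 small earthquakes since the last large one; forecast window of 50 more
print(f"P = {conditional_probability(350, 50):.1%}")
```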

  2. The music of earthquakes and Earthquake Quartet #1

    Science.gov (United States)

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  3. Is Your Class a Natural Disaster? It can be... The Real Time Earthquake Education (RTEE) System

    Science.gov (United States)

    Whitlock, J. S.; Furlong, K.

    2003-12-01

    In cooperation with the U.S. Geological Survey (USGS) and its National Earthquake Information Center (NEIC) in Golden, Colorado, we have implemented an autonomous version of the NEIC's real-time earthquake database management and earthquake alert system (Earthworm). This is the same system used professionally by the USGS in its earthquake response operations. Utilizing this system, Penn State University students participating in natural hazard classes receive real-time alerts of worldwide earthquake events on cell phones distributed to the class. The students are then responsible for reacting to actual earthquake events, in real-time, with the same data (or lack thereof) as earthquake professionals. The project was first implemented in Spring 2002, and although it had an initial high intrigue and "coolness" factor, the interest of the students waned with time. Through student feedback, we observed that scientific data presented on its own without an educational context does not foster student learning. In order to maximize the impact of real-time data and the accompanying e-media, the students need to become personally involved. Therefore, in collaboration with the Incorporated Research Institutes of Seismology (IRIS), we have begun to develop an online infrastructure that will help teachers and faculty effectively use real-time earthquake information. The Real-Time Earthquake Education (RTEE) website promotes student learning by integrating inquiry-based education modules with real-time earthquake data. The first module guides the students through an exploration of real-time and historic earthquake datasets to model the most important criteria for determining the potential impact of an earthquake. Having provided the students with content knowledge in the first module, the second module presents a more authentic, open-ended educational experience by setting up an earthquake role-play situation. Through the Earthworm system, we have the ability to "set off

  4. Prevent recurrence of nuclear disaster (3). Agenda on nuclear safety from earthquake engineering

    International Nuclear Information System (INIS)

    Kameda, Hiroyuki; Takada, Tsuyoshi; Ebisawa, Katsumi; Nakamura, Susumu

    2012-01-01

    Based on the results of the activities of the committee on seismic safety of nuclear power plants (NPPs) of the Japan Association for Earthquake Engineering, which started its activities after the Chuetsu-oki earthquake and then experienced the Great East Japan Earthquake (in close collaboration with the committee of the Atomic Energy Society of Japan, which started its activities at the same time), and taking account of further development of the concepts, an agenda on nuclear safety was proposed from the standpoint of earthquake engineering. In order to prevent the recurrence of a nuclear disaster, individual technical issues of earthquake engineering and comprehensive issues of integration technology, multidisciplinary collaboration and the establishment of technology governance based on them are of prime importance. This article describes important problems to be solved: (1) technical issues and the mission of seismic safety of NPPs, (2) decision making based on risk assessment - the basis of technical governance, (3) the framework of risk, design and regulation - the framework of the required technology governance, (4) technical issues of earthquake engineering for nuclear safety, (5) the role of earthquake engineering in nuclear power risk communication and (6) the importance of multidisciplinary collaboration. The responsibility of engineering lies in the establishment of technology governance, the cultivation of individual technologies and integration technology, and social communication. (T. Tanaka)

  5. Book review: Earthquakes and water

    Science.gov (United States)

    Bekins, Barbara A.

    2012-01-01

    It is really nice to see assembled in one place a discussion of the documented and hypothesized hydrologic effects of earthquakes. The book is divided into chapters focusing on particular hydrologic phenomena including liquefaction, mud volcanism, stream discharge increases, groundwater level, temperature and chemical changes, and geyser period changes. These hydrologic effects are inherently fascinating, and the large number of relevant publications in the past decade makes this summary a useful milepost. The book also covers hydrologic precursors and earthquake triggering by pore pressure. A natural need to limit the topics covered resulted in the omission of tsunamis and the vast literature on the role of fluids and pore pressure in frictional strength of faults. Regardless of whether research on earthquake-triggered hydrologic effects ultimately provides insight into the physics of earthquakes, the text provides welcome common ground for interdisciplinary collaborations between hydrologists and seismologists. Such collaborations continue to be crucial for investigating hypotheses about the role of fluids in earthquakes and slow slip. 

  6. Urgent Safety Measures in Japan after Great East Japan Earthquake

    International Nuclear Information System (INIS)

    Taniura, Wataru; Otani, Hiroyasu

    2012-01-01

    Due to the tsunami triggered by the Great East Japan Earthquake, a nuclear hazard occurred at the operating and refueling reactor facilities of the Fukushima Dai-ichi and Dai-ni Nuclear Power Plants. Given this fact, Japanese electric power companies voluntarily began to compile various urgent measures against tsunami. The Nuclear and Industrial Safety Agency (NISA) then ordered the licensees to put the voluntarily compiled urgent safety measures into practice, in order to ensure the effectiveness of the means for recovering cooling functions and to keep the release of radioactive substances to the minimum possible, even if a huge tsunami following a severe earthquake hits a nuclear power plant. The following describes the status and effect of the urgent safety measures implemented for 44 reactors (in operation) and 1 reactor (under construction) in Japan, and also describes the measures to be implemented by the licensees of reactor operation in the future.

  7. Global Earthquake Hazard Frequency and Distribution

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Earthquake Hazard Frequency and Distribution is a 2.5 minute grid utilizing Advanced National Seismic System (ANSS) Earthquake Catalog data of actual...

  8. Unbonded Prestressed Columns for Earthquake Resistance

    Science.gov (United States)

    2012-05-01

    Modern structures are able to survive significant shaking caused by earthquakes. By implementing unbonded post-tensioned tendons in bridge columns, the damage caused by an earthquake can be significantly lower than that of a standard reinforced concr...

  9. Extreme value distribution of earthquake magnitude

    Science.gov (United States)

    Zi, Jun Gan; Tung, C. C.

    1983-07-01

    Probability distribution of maximum earthquake magnitude is first derived for an unspecified probability distribution of earthquake magnitude. A model for energy release of large earthquakes, similar to that of Adler-Lomnitz and Lomnitz, is introduced from which the probability distribution of earthquake magnitude is obtained. An extensive set of world data for shallow earthquakes, covering the period from 1904 to 1980, is used to determine the parameters of the probability distribution of maximum earthquake magnitude. Because of the special form of probability distribution of earthquake magnitude, a simple iterative scheme is devised to facilitate the estimation of these parameters by the method of least-squares. The agreement between the empirical and derived probability distributions of maximum earthquake magnitude is excellent.
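
    A generic step in such derivations is the passage from the magnitude distribution of individual events to the distribution of the maximum. For a Poissonian catalogue with annual rate lam and magnitude CDF F(m), the annual maximum has CDF G(m) = exp(-lam * (1 - F(m))). The sketch below evaluates this with a truncated Gutenberg-Richter F purely as a stand-in; the paper's specific magnitude model differs.

```python
# Illustrative annual-maximum magnitude distribution under Poisson occurrence.
import numpy as np

def gr_cdf(m, b=1.0, m_min=4.0, m_max=9.0):
    """Truncated Gutenberg-Richter CDF of magnitude (stand-in model)."""
    beta = b * np.log(10.0)
    num = 1.0 - np.exp(-beta * (np.asarray(m, dtype=float) - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return np.clip(num / den, 0.0, 1.0)

def annual_max_cdf(m, lam=10.0, **kwargs):
    """CDF of the annual maximum magnitude for a Poisson process of rate lam."""
    return np.exp(-lam * (1.0 - gr_cdf(m, **kwargs)))

for m in (6.0, 7.0, 8.0):
    print(f"P(annual maximum <= M{m}) = {float(annual_max_cdf(m)):.3f}")
```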

  10. The "Tsunami Earthquake" of 13 April 1923 in Northern Kamchatka: Seismological and Hydrodynamic Investigations

    Science.gov (United States)

    Salaree, Amir; Okal, Emile A.

    2018-04-01

    We present a seismological and hydrodynamic investigation of the earthquake of 13 April 1923 at Ust'-Kamchatsk, Northern Kamchatka, which generated a more powerful and damaging tsunami than the larger event of 03 February 1923, thus qualifying as a so-called "tsunami earthquake". On the basis of modern relocations, we suggest that it took place outside the fault area of the mainshock, across the oblique Pacific-North America plate boundary, a model confirmed by a limited dataset of mantle waves, which also confirms the slow nature of the source, characteristic of tsunami earthquakes. However, numerical simulations for a number of legitimate seismic models fail to reproduce the sharply peaked distribution of tsunami wave amplitudes reported in the literature. By contrast, we can reproduce the distribution of reported wave amplitudes using an underwater landslide as a source of the tsunami, itself triggered by the earthquake inside the Kamchatskiy Bight.

  11. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    Science.gov (United States)

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; P.H. Seward,; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  12. Irregular recurrence of large earthquakes along the san andreas fault: evidence from trees.

    Science.gov (United States)

    Jacoby, G C; Sheppard, P R; Sieh, K E

    1988-07-08

    Old trees growing along the San Andreas fault near Wrightwood, California, record in their annual ring-width patterns the effects of a major earthquake in the fall or winter of 1812 to 1813. Paleoseismic data and historical information indicate that this event was the "San Juan Capistrano" earthquake of 8 December 1812, with a magnitude of 7.5. The discovery that at least 12 kilometers of the Mojave segment of the San Andreas fault ruptured in 1812, only 44 years before the great January 1857 rupture, demonstrates that intervals between large earthquakes on this part of the fault are highly variable. This variability increases the uncertainty of forecasting destructive earthquakes on the basis of past behavior and accentuates the need for a more fundamental knowledge of San Andreas fault dynamics.

  13. BWR NSSS design basis documentation

    International Nuclear Information System (INIS)

    Vij, R.S.; Bates, R.E.

    2004-01-01

    programs that GE has participated in and describes the different options and approaches that have been used by various utilities in their design basis programs. Some of these variations deal with the scope and depth of coverage of the information, while others are related to the process (how the work is done). Both of these topics can have a significant effect on the program cost. Some insight into these effects is provided. The final section of the paper presents a set of lessons learned and a recommendation for an optimum approach to a design basis information program. The lessons learned reflect the knowledge that GE has gained by participating in design basis programs with nineteen domestic and international BWR owner/operators. The optimum approach described in this paper is GE's attempt to define a set of information and a work process for a utility/GE NSSS Design Basis Information program that will maximize the cost effectiveness of the program for the utility. (author)

  14. Characterisation of Liquefaction Effects for Beyond-Design Basis Safety Assessment of Nuclear Power Plants

    Science.gov (United States)

    Bán, Zoltán; Győri, Erzsébet; János Katona, Tamás; Tóth, László

    2015-04-01

    Preparedness of nuclear power plants for beyond-design-basis external effects became highly important after the Great Tohoku Earthquake of 11 March 2011. In the case of some nuclear power plants constructed at soft soil sites, liquefaction should be considered as a beyond-design-basis hazard. The consequences of liquefaction have to be analysed with the aim of defining the post-event plant condition, identifying plant vulnerabilities and planning the necessary measures for accident management. In this paper, the methodology for analysing liquefaction effects at nuclear power plants is outlined. The case of the Paks Nuclear Power Plant in Hungary is used as an example to demonstrate the practical importance of the presented results and considerations. In contrast to design, the conservatism of the methodology for evaluating beyond-design-basis liquefaction effects for an operating plant has to be limited to a reasonable level. Consequently, the applicability of all existing methods has to be considered for the best estimate. The adequacy and conclusiveness of the results are mainly limited by the epistemic uncertainty of the methods used for defining the liquefaction hazard and the engineering parameters characterizing the consequences of liquefaction. The methods have to comply with conflicting requirements. They have to be consistent and widely accepted and used in practice. They have to be based on a comprehensive database. They have to provide a basis for the evaluation of the dominant engineering parameters that control the post-liquefaction response of the plant structures. Experience from the Kashiwazaki-Kariwa plant, hit by the Niigata-ken Chuetsu-oki earthquake of 16 July 2007, and analysis of site conditions and plant layout at the Paks plant have shown that differential settlement is the dominating effect in the case considered. The methods have to be based on probabilistic seismic hazard assessment and allow the integration into logic

  15. 340 waste handling facility interim safety basis

    Energy Technology Data Exchange (ETDEWEB)

    VAIL, T.S.

    1999-04-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people.

  16. 340 waste handling facility interim safety basis

    International Nuclear Information System (INIS)

    VAIL, T.S.

    1999-01-01

    This document presents an interim safety basis for the 340 Waste Handling Facility classifying the 340 Facility as a Hazard Category 3 facility. The hazard analysis quantifies the operating safety envelope for this facility and demonstrates that the facility can be operated without a significant threat to onsite or offsite people

  17. PRECURSORS OF EARTHQUAKES: VLF SIGNALS-IONOSPHERE RELATION

    Directory of Open Access Journals (Sweden)

    Mustafa ULAS

    2013-01-01

    A lot of people die because of earthquakes every year. Therefore, it is crucial to predict the time of an earthquake a reasonable time before it happens. This paper presents recent information published in the literature about precursors of earthquakes. The relationships between earthquakes and the ionosphere are targeted to guide new research in order to find novel prediction methods.

  18. EARTHQUAKE RESEARCH PROBLEMS OF NUCLEAR POWER GENERATORS

    Energy Technology Data Exchange (ETDEWEB)

    Housner, G. W.; Hudson, D. E.

    1963-10-15

    Earthquake problems associated with the construction of nuclear power generators require a more extensive and a more precise knowledge of earthquake characteristics and the dynamic behavior of structures than was considered necessary for ordinary buildings. Economic considerations indicate the desirability of additional research on the problems of earthquakes and nuclear reactors. The nature of these earthquake-resistant design problems is discussed and programs of research are recommended. (auth)

  19. Fault geometry and earthquake mechanics

    Directory of Open Access Journals (Sweden)

    D. J. Andrews

    1994-06-01

    Earthquake mechanics may be determined by the geometry of a fault system. Slip on a fractal branching fault surface can explain: (1) regeneration of stress irregularities in an earthquake; (2) the concentration of stress drop in an earthquake into asperities; (3) starting and stopping of earthquake slip at fault junctions; and (4) self-similar scaling of earthquakes. Slip at fault junctions provides a natural realization of barrier and asperity models without appealing to variations of fault strength. Fault systems are observed to have a branching fractal structure, and slip may occur at many fault junctions in an earthquake. Consider the mechanics of slip at one fault junction. In order to avoid a stress singularity of order 1/r, an intersection of faults must be a triple junction and the Burgers vectors on the three fault segments at the junction must sum to zero. In other words, to lowest order the deformation consists of rigid block displacement, which ensures that the local stress due to the dislocations is zero. The elastic dislocation solution, however, ignores the fact that the configuration of the blocks changes at the scale of the displacement. A volume change occurs at the junction; either a void opens or intense local deformation is required to avoid material overlap. The volume change is proportional to the product of the slip increment and the total slip since the formation of the junction. Energy absorbed at the junction, equal to confining pressure times the volume change, is not large enough to prevent slip at a new junction. The ratio of energy absorbed at a new junction to elastic energy released in an earthquake is no larger than P/µ, where P is confining pressure and µ is the shear modulus. At a depth of 10 km this dimensionless ratio has the value P/µ = 0.01. As slip accumulates at a fault junction in a number of earthquakes, the fault segments are displaced such that they no longer meet at a single point. For this reason the

  20. Historical earthquake investigations in Greece

    Directory of Open Access Journals (Sweden)

    K. Makropoulos

    2004-06-01

    The active tectonics of the area of Greece and its seismic activity have always been present in the country's history. Many researchers, tempted to work on Greek historical earthquakes, have realized that this is a task not easily fulfilled. The existing catalogues of strong historical earthquakes are useful tools for performing general SHA studies. However, a variety of supporting datasets, non-uniformly distributed in space and time, need to be further investigated. In the present paper, a review of historical earthquake studies in Greece is attempted. The seismic history of the country is divided into four main periods. In each one of them, characteristic examples, studies and approaches are presented.

  1. Ground Motion Characteristics of Induced Earthquakes in Central North America

    Science.gov (United States)

    Atkinson, G. M.; Assatourians, K.; Novakovic, M.

    2017-12-01

    The ground motion characteristics of induced earthquakes in central North America are investigated based on empirical analysis of a compiled database of 4,000,000 digital ground-motion records from events in induced-seismicity regions (especially Oklahoma). Ground-motion amplitudes are characterized non-parametrically by computing median amplitudes and their variability in magnitude-distance bins. We also use inversion techniques to solve for regional source, attenuation and site response effects. Ground motion models are used to interpret the observations and compare the source and attenuation attributes of induced earthquakes to those of their natural counterparts. Significant conclusions are that the stress parameter that controls the strength of high-frequency radiation is similar for induced earthquakes (depths of about 5 km) and shallow natural earthquakes of comparable depth. By contrast, deeper natural earthquakes (depths of about 10 km or more) have stronger high-frequency ground motions. At distances close to the epicenter, a greater focal depth (which increases distance from the hypocenter) counterbalances the effects of a larger stress parameter, resulting in motions of similar strength close to the epicenter, regardless of event depth. The felt effects of induced versus natural earthquakes are also investigated using USGS "Did You Feel It?" reports; 400,000 reports from natural events and 100,000 reports from induced events are considered. The felt reports confirm the trends that we expect based on ground-motion modeling, considering the offsetting effects of the stress parameter versus focal depth in controlling the strength of motions near the epicenter. Specifically, felt intensity for a given magnitude is similar near the epicenter, on average, for all event types and depths. At distances more than 10 km from the epicenter, deeper events are felt more strongly than shallow events. These ground-motion attributes imply that the induced-seismicity hazard is most critical for facilities in
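
    The non-parametric step described above (median amplitudes and their variability in magnitude-distance bins) is straightforward to sketch. The records below are synthetic and the bin edges are assumptions; the point is only the binning itself.

```python
# Median log ground-motion amplitude and scatter in magnitude-distance bins.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 20000
df = pd.DataFrame({
    "magnitude": rng.uniform(2.0, 5.0, n),
    "distance_km": 10 ** rng.uniform(0.5, 2.5, n),
})
# Synthetic log10(PGA): magnitude scaling, geometric spreading and random scatter.
df["log_pga"] = (-2.0 + 0.8 * df["magnitude"]
                 - 1.3 * np.log10(df["distance_km"])
                 + rng.normal(0.0, 0.3, n))

df["m_bin"] = pd.cut(df["magnitude"], bins=np.arange(2.0, 5.5, 0.5))
df["r_bin"] = pd.cut(df["distance_km"], bins=[1, 10, 30, 100, 300])
summary = (df.groupby(["m_bin", "r_bin"], observed=True)["log_pga"]
             .agg(median="median", sigma="std", count="size"))
print(summary.head(10))
```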

  2. The Value, Protocols, and Scientific Ethics of Earthquake Forecasting

    Science.gov (United States)

    Jordan, Thomas H.

    2013-04-01

    Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. They should

  3. Theoretical basis for dosimetry

    International Nuclear Information System (INIS)

    Carlsson, G.A.

    1985-01-01

    Radiation dosimetry is fundamental to all fields of science dealing with radiation effects and is concerned with problems which are often intricate as hinted above. A firm scientific basis is needed to face increasing demands on accurate dosimetry. This chapter is an attempt to review and to elucidate the elements for such a basis. Quantities suitable for radiation dosimetry have been defined in the unique work to coordinate radiation terminology and usage by the International Commission on Radiation Units and Measurements, ICRU. Basic definitions and terminology used in this chapter conform with the recent ''Radiation Quantities and Units, Report 33'' of the ICRU

  4. Fault failure with moderate earthquakes

    Science.gov (United States)

    Johnston, M. J. S.; Linde, A. T.; Gladwin, M. T.; Borcherdt, R. D.

    1987-12-01

    High resolution strain and tilt recordings were made in the near-field of, and prior to, the May 1983 Coalinga earthquake (ML = 6.7, Δ = 51 km), the August 4, 1985, Kettleman Hills earthquake (ML = 5.5, Δ = 34 km), the April 1984 Morgan Hill earthquake (ML = 6.1, Δ = 55 km), the November 1984 Round Valley earthquake (ML = 5.8, Δ = 54 km), the January 14, 1978, Izu, Japan earthquake (ML = 7.0, Δ = 28 km), and several other smaller magnitude earthquakes. These recordings were made with near-surface instruments (resolution 10^-8), with borehole dilatometers (resolution 10^-10) and a 3-component borehole strainmeter (resolution 10^-9). While observed coseismic offsets are generally in good agreement with expectations from elastic dislocation theory, and while post-seismic deformation continued, in some cases, with a moment comparable to that of the main shock, preseismic strain or tilt perturbations from hours to seconds (or less) before the main shock are not apparent above the present resolution. Precursory slip for these events, if any occurred, must have had a moment less than a few percent of that of the main event. To the extent that these records reflect general fault behavior, the strong constraint on the size and amount of slip triggering major rupture makes prediction of the onset times and final magnitudes of the rupture zones a difficult task unless the instruments are fortuitously installed near the rupture initiation point. These data are best explained by an inhomogeneous failure model for which various areas of the fault plane have either different stress-slip constitutive laws or spatially varying constitutive parameters. Other work on seismic waveform analysis and synthetic waveforms indicates that the rupturing process is inhomogeneous and controlled by points of higher strength. These models indicate that rupture initiation occurs at smaller regions of higher strength which, when broken, allow runaway catastrophic failure.

  5. Modeling, Forecasting and Mitigating Extreme Earthquakes

    Science.gov (United States)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  6. 13 CFR 120.174 - Earthquake hazards.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Earthquake hazards. 120.174... Applying to All Business Loans Requirements Imposed Under Other Laws and Orders § 120.174 Earthquake..., the construction must conform with the “National Earthquake Hazards Reduction Program (“NEHRP...

  7. Operating function tests of the PWR type RHR pump for engineering safety system under simulated strong ground excitation

    International Nuclear Information System (INIS)

    Uga, Takeo; Shiraki, Kazuhiro; Homma, Toshiaki; Inazuka, Hisashi; Nakajima, Norifumi.

    1979-08-01

    Results are described of operating function verification tests of a PWR RHR pump during an earthquake. Of the active reactor components, the PWR residual heat removal pump was chosen from the viewpoints of aseismic classification, safety function, structural complexity and past aseismic tests. Through a survey of the service conditions and structure of this pump, seismic test conditions such as acceleration level, simulated seismic waveform and earthquake duration were decided for seismic testing of the operating pump. Then, plans were prepared to evaluate the vibration characteristics of the pump and to estimate its aseismic design margins. Subsequently, the test facility and instrumentation system were designed and constructed. Experimental results could thus be acquired on the vibration characteristics of the pump and its dynamic behavior during different kinds and levels of simulated earthquakes. In conclusion: (1) Stiffeners attached to the auxiliary system piping do improve the aseismic performance of the pump. (2) The rotor-shaft-bearing system is secure unless it is subjected to transient disturbances having high-frequency content. (3) The motor and pump casing, having resonance frequencies much higher than the frequency content of the seismic wave, show only small amplifications. (4) The RHR pump possesses an aseismic design margin of more than 2.6 times the expected ultimate design basis earthquake. (author)

  8. Current status of JRR-3. After the 3.11 earthquake

    International Nuclear Information System (INIS)

    Arai, Masaji; Murayama, Yoji; Wada, Shigeru

    2012-01-01

    JRR-3 at the Tokai site of JAEA was in its regular maintenance period when the Great East Japan Earthquake took place on 11 March 2011. The reactor building, with its solid foundation, and the equipment important to safety survived the earthquake without serious damage, and no radioactive leakage occurred. Recovery work is planned to be completed by the end of this March. At the same time, checks and tests of the integrity of all components and a seismic assessment to demonstrate resistance to the 3.11 earthquake have been carried out. JRR-3 will restart its operation after completing the above-mentioned procedures. (author)

  9. Computational methods in earthquake engineering

    CERN Document Server

    Plevris, Vagelis; Lagaros, Nikos

    2017-01-01

    This is the third book in a series on Computational Methods in Earthquake Engineering. The purpose of this volume is to bring together the scientific communities of Computational Mechanics and Structural Dynamics, offering a wide coverage of timely issues on contemporary Earthquake Engineering. This volume will facilitate the exchange of ideas in topics of mutual interest and can serve as a platform for establishing links between research groups with complementary activities. The computational aspects are emphasized in order to address difficult engineering problems of great social and economic importance.

  10. Earthquake Education in Prime Time

    Science.gov (United States)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  11. Radon as an earthquake precursor

    International Nuclear Information System (INIS)

    Planinic, J.; Radolic, V.; Vukovic, B.

    2004-01-01

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined
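
    The multiple-regression step mentioned above (radon as a linear function of barometric pressure, rainfall and air temperature, so that residuals can be screened for anomalies) can be sketched as an ordinary least-squares fit. The data below are synthetic and the coefficients are not those of the paper.

```python
# Least-squares regression of radon concentration on meteorological parameters.
import numpy as np

rng = np.random.default_rng(0)
days = 365
pressure = rng.normal(1013.0, 8.0, days)                            # hPa
rainfall = rng.exponential(2.0, days)                               # mm
temperature = (12.0 + 8.0 * np.sin(np.arange(days) * 2 * np.pi / 365)
               + rng.normal(0.0, 2.0, days))                        # deg C
radon = (2.5 - 0.01 * (pressure - 1013.0) + 0.05 * rainfall
         - 0.02 * temperature + rng.normal(0.0, 0.3, days))         # kBq/m^3, synthetic

X = np.column_stack([np.ones(days), pressure, rainfall, temperature])
coeffs, *_ = np.linalg.lstsq(X, radon, rcond=None)
residuals = radon - X @ coeffs
print("regression coefficients:", np.round(coeffs, 4))
print(f"residual std (basis for an anomaly threshold): {residuals.std():.3f}")
```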

  12. Radon as an earthquake precursor

    Energy Technology Data Exchange (ETDEWEB)

    Planinic, J. E-mail: planinic@pedos.hr; Radolic, V.; Vukovic, B

    2004-09-11

    Radon concentrations in soil gas were continuously measured by the LR-115 nuclear track detectors during a four-year period. Seismic activities, as well as barometric pressure, rainfall and air temperature were also observed. The influence of meteorological parameters on temporal radon variations was investigated, and a respective equation of the multiple regression was derived. The earthquakes with magnitude ≥3 at epicentral distances ≤200 km were recognized by means of radon anomaly. Empirical equations between earthquake magnitude, epicentral distance and precursor time were examined, and respective constants were determined.

  13. Earthquake location in island arcs

    Science.gov (United States)

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high

  14. Radon anomalies preceding earthquakes which occurred in the UK, in summer and autumn 2002

    International Nuclear Information System (INIS)

    Crockett, R.G.M.; Gillmore, G.K.; Phillips, P.S.; Denman, A.R.; Groves-Kirkby, C.J.

    2006-01-01

    During the course of an investigation into domestic radon levels in Northamptonshire, two hourly-sampling real-time radon detectors were operated simultaneously at separate locations 2.25 km apart in Northampton, in the English East Midlands, for a 25-week period. This period of operation encompassed the period in September 2002 during which the Dudley earthquake (magnitude ~5.0) and smaller aftershocks occurred in the English West Midlands, UK. We report herein our observations regarding the occurrence of simultaneous short-period radon anomalies and their timing in relation to the Dudley, and other, earthquakes which occurred during the monitoring period. Analysis of the radon time-series reveals a short period during which the two time-series displayed simultaneous in-phase short-term (6-9 h) radon anomalies prior to the main Dudley earthquake. Subsequent investigation revealed that a similar period occurred prior to another smaller but recorded earthquake in the English Channel.

  15. Dancing Earthquake Science Assists Recovery from the Christchurch Earthquakes

    Science.gov (United States)

    Egan, Candice J.; Quigley, Mark C.

    2015-01-01

    The 2010-2012 Christchurch (Canterbury) earthquakes in New Zealand caused loss of life and psychological distress in residents throughout the region. In 2011, student dancers of the Hagley Dance Company and dance professionals choreographed the performance "Move: A Seismic Journey" for the Christchurch Body Festival that explored…

  16. From BASIS to MIRACLES

    DEFF Research Database (Denmark)

    Tsapatsaris, Nikolaos; Willendrup, Peter Kjær; E. Lechner, Ruep

    2015-01-01

    Results based on virtual instrument models for the first high-flux, high-resolution, spallation based, backscattering spectrometer, BASIS are presented in this paper. These were verified using the Monte Carlo instrument simulation packages McStas and VITESS. Excellent agreement of the neutron count...... are pivotal to the conceptual design of the next generation backscattering spectrometer, MIRACLES at the European Spallation Source....

  17. TrigDB for improving the reliability of the epicenter locations by considering the neighborhood station's trigger and cutting out of outliers in operation of Earthquake Early Warning System.

    Science.gov (United States)

    Chi, H. C.; Park, J. H.; Lim, I. S.; Seong, Y. J.

    2016-12-01

    TrigDB was initially developed to discriminate teleseismic-origin false alarms in cases where unreasonably associated triggers produce mis-located epicenters. We have applied TrigDB to the current EEWS (Earthquake Early Warning System) since 2014. During the early testing stage of the EEWS, starting in 2011, we adapted ElarmS from the UC Berkeley Seismological Laboratory (BSL) to the Korean seismic network and operated it for more than 5 years. The real-time testing results of the EEWS in Korea showed that all events inside the seismic network with magnitude greater than 3.0 were well detected. However, two events located offshore were given false locations with magnitudes over 4.0, due to long-period, relatively high-amplitude signals related to teleseismic waves or deep regional sources. These teleseismic-related false events were caused by spurious correlations during the association procedure, and the geometric distribution of the associated stations was crescent-shaped. Because seismic stations are not deployed uniformly, the expected bias ratio varies with the evaluated epicentral location. This ratio is calculated in advance and stored in a database, called TrigDB, for discriminating teleseismic-origin false alarms. We upgraded this method with 'TrigDB back filling', which updates the location by additionally associating stations that had not been associated previously, comparing trigger times between sandwiched stations against predefined criteria such as travel time. We have also tested a module that rejects outlier trigger times using a criterion that compares them with a statistical measure (sigma). The outlier cut is somewhat slow to take effect until more than 8 stations are available, but the resulting locations are much improved.
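    A minimal sketch of the sigma-based rejection of outlier trigger times mentioned at the end of the abstract is given below; the threshold, the minimum station count and the sample data are assumptions for illustration, not the operational EEWS values.

```python
# Hedged sketch: discard station trigger times that deviate from the group mean
# by more than a chosen multiple of the standard deviation.
import numpy as np

def reject_outlier_triggers(trigger_times, n_sigma=3.0, min_stations=8):
    """Return the trigger times kept after a simple sigma-based cut."""
    t = np.asarray(trigger_times, dtype=float)
    if t.size < min_stations:       # with few stations the statistics are unstable
        return t
    mu, sigma = t.mean(), t.std()
    if sigma == 0.0:
        return t
    return t[np.abs(t - mu) <= n_sigma * sigma]

print(reject_outlier_triggers([10.1, 10.3, 10.2, 10.4, 10.2, 10.3, 10.1, 10.2, 25.0]))
```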

  18. Predictability of Landslide Timing From Quasi-Periodic Precursory Earthquakes

    Science.gov (United States)

    Bell, Andrew F.

    2018-02-01

    Accelerating rates of geophysical signals are observed before a range of material failure phenomena. They provide insights into the physical processes controlling failure and the basis for failure forecasts. However, examples of accelerating seismicity before landslides are rare, and their behavior and forecasting potential are largely unknown. Here I use a Bayesian methodology to apply a novel gamma point process model to investigate a sequence of quasiperiodic repeating earthquakes preceding a large landslide at Nuugaatsiaq in Greenland in June 2017. The evolution in earthquake rate is best explained by an inverse power law increase with time toward failure, as predicted by material failure theory. However, the commonly accepted power law exponent value of 1.0 is inconsistent with the data. Instead, the mean posterior value of 0.71 indicates a particularly rapid acceleration toward failure and suggests that only relatively short warning times may be possible for similar landslides in future.
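    A minimal sketch of the kind of accelerating-rate model referred to above is shown below: an inverse power-law event rate evaluated through a Poisson point-process log-likelihood. The event times and parameter values are invented, and this is not the Bayesian gamma point-process implementation used in the study.

```python
# Hedged sketch: inverse power-law rate lambda(t) = k / (t_f - t)**p and the
# corresponding Poisson point-process log-likelihood over an observation window.
import numpy as np

def rate(t, k, p, t_f):
    return k / (t_f - t) ** p

def log_likelihood(event_times, k, p, t_f, t_start, t_end):
    """Poisson process log-likelihood for events observed in [t_start, t_end]."""
    lam = rate(np.asarray(event_times), k, p, t_f)
    # closed-form integral of the rate over the window (valid for p != 1)
    integral = k / (1.0 - p) * ((t_f - t_start) ** (1.0 - p) - (t_f - t_end) ** (1.0 - p))
    return np.sum(np.log(lam)) - integral

events = np.array([1.0, 3.5, 5.2, 6.4, 7.1, 7.6, 7.9])  # days, illustrative
print(log_likelihood(events, k=2.0, p=0.71, t_f=8.0, t_start=0.0, t_end=7.95))
```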

  19. Temporal properties of seismicity and largest earthquakes in SE Carpathians

    Directory of Open Access Journals (Sweden)

    S. Byrdina

    2006-01-01

    In order to estimate the hazard rate distribution of the largest seismic events in Vrancea, South-Eastern Carpathians, we study temporal properties of historical and instrumental catalogues of seismicity. First, on the basis of Generalized Extreme Value theory, we estimate the average return period of the largest events. Then, following Bak et al. (2002) and Corral (2005a), we study scaling properties of recurrence times between earthquakes in appropriate spatial volumes. We come to the conclusion that the seismicity is temporally clustered, and that the distribution of recurrence times is significantly different from a Poisson process even for times largely exceeding the corresponding periods of foreshock and aftershock activity. Modeling the recurrence times by a gamma distributed variable, we finally estimate hazard rates with respect to the time elapsed since the last large earthquake.
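    A minimal sketch of the final step described above, evaluating a hazard rate from gamma-distributed recurrence times, is given below (using scipy). The gamma parameters and elapsed times are illustrative assumptions, not values estimated from the Vrancea catalogues.

```python
# Hedged sketch: hazard rate h(t) = pdf(t) / survival(t) for a gamma model of
# recurrence times, as a function of the time elapsed since the last large event.
from scipy.stats import gamma

shape, scale = 0.7, 120.0          # assumed gamma parameters (scale in years)
elapsed_years = [10.0, 50.0, 150.0, 300.0]

for t in elapsed_years:
    h = gamma.pdf(t, a=shape, scale=scale) / gamma.sf(t, a=shape, scale=scale)
    print(f"time since last large event: {t:6.1f} yr   hazard rate: {h:.5f} per yr")
```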

  20. Earthquake prediction in Japan and natural time analysis of seismicity

    Science.gov (United States)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    The M9 super-giant earthquake and its huge tsunami devastated East Japan on 11 March, causing more than 20,000 casualties and serious damage to the Fukushima nuclear plant. This earthquake was predicted neither short-term nor long-term. Seismologists were shocked because such an event was not even considered possible at the East Japan subduction zone. However, it was not the only unpredicted earthquake. In fact, throughout several decades of the National Earthquake Prediction Project, not a single earthquake was predicted. In reality, practically no effective research has been conducted on the most important task, short-term prediction. This happened because the Japanese national project was devoted to the construction of elaborate seismic networks, which was not the best route to short-term prediction. After the Kobe disaster, in order to parry the mounting criticism of their history of no success, they changed their policy to "stop aiming at short-term prediction because it is impossible and concentrate resources on fundamental research", which in effect meant obtaining "more funding for no-prediction research". The public was not, and is not, informed about this change. Obviously, earthquake prediction will be possible only when reliable precursory phenomena are caught, and we have insisted that this would most likely be achieved through non-seismic means such as geochemical/hydrological and electromagnetic monitoring. Admittedly, the lack of convincing precursors for the M9 super-giant earthquake has an adverse effect on our case, although its epicenter was far offshore, out of the range of the operating monitoring systems. In this presentation, we show a new possibility of finding remarkable precursory signals, ironically, from ordinary seismological catalogs. In the framework of the new time domain termed natural time, an order parameter of seismicity, κ1, has been introduced. This is the variance of natural time χ weighted by the normalised energy release at χ. In the case that Seismic Electric Signals
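    A minimal sketch of the order parameter mentioned above is given below: κ1 is computed as the variance of natural time χ_k = k/N weighted by the normalised energy release of each event. The magnitudes and the energy-magnitude conversion used here are illustrative assumptions.

```python
# Hedged sketch: order parameter of seismicity in natural time,
# kappa_1 = sum_k p_k*chi_k**2 - (sum_k p_k*chi_k)**2 with chi_k = k/N and
# p_k the normalised energy release of the k-th event.
import numpy as np

def kappa_1(energies):
    e = np.asarray(energies, dtype=float)
    n = e.size
    chi = np.arange(1, n + 1) / n          # natural time of the k-th event
    p = e / e.sum()                        # normalised energy release
    return np.sum(p * chi**2) - np.sum(p * chi) ** 2

# energies estimated from magnitudes via E ~ 10**(1.5*M) (a common assumption)
mags = np.array([3.1, 3.4, 2.9, 4.0, 3.2, 3.7])
print("kappa_1 =", kappa_1(10 ** (1.5 * mags)))
```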

  1. Significance of earthquake and weapons-test ground motion to structure response and NRC licensing

    International Nuclear Information System (INIS)

    Blume, J.A.

    1984-01-01

    The author feels that of all the problems to be resolved before a nuclear power plant can be licensed to operate, the earthquake problem is the most difficult from the emotional and public relations point of view, as well as technically. It is the one that intervenors and their lawyers thrive upon, as do the demonstrators. These earthquakes can be tectonic, reservoir induced, and/or imaginary. 9 references, 29 figures

  2. Mental Health of Survivors of the 2010 Haitian Earthquake Living in the United States

    Centers for Disease Control (CDC) Podcasts

    2010-04-16

    Thousands of survivors of the 2010 Haitian Earthquake are currently living in the United States. This podcast features a brief non-disease-specific interview with Dr. Marc Safran, CDC's longest serving psychiatrist, about a few of the mental health challenges such survivors may face.  Created: 4/16/2010 by CDC Center of Attribution: Mental and Behavioral Health Team, 2010 CDC Haiti Earthquake Mission, CDC Emergency Operations Center.   Date Released: 5/6/2010.

  3. Earthquake predictions using seismic velocity ratios

    Science.gov (United States)

    Sherburne, R. W.

    1979-01-01

    Since the beginning of modern seismology, seismologists have contemplated predicting earthquakes. The usefulness of earthquake predictions to the reduction of human and economic losses and the value of long-range earthquake prediction to planning is obvious. Not as clear are the long-range economic and social impacts of earthquake prediction on a specific area. The general consensus of opinion among scientists and government officials, however, is that the quest for earthquake prediction is a worthwhile goal and should be pursued with a sense of urgency.

  4. Measuring the size of an earthquake

    Science.gov (United States)

    Spence, W.; Sipkin, S.A.; Choy, G.L.

    1989-01-01

    Earthquakes range broadly in size. A rock-burst in an Idaho silver mine may involve the fracture of 1 meter of rock; the 1965 Rat Island earthquake in the Aleutian arc involved a 650-kilometer length of the Earth's crust. Earthquakes can be even smaller and even larger. If an earthquake is felt or causes perceptible surface damage, then its intensity of shaking can be subjectively estimated. But many large earthquakes occur in oceanic areas or at great focal depths and are either simply not felt or their felt pattern does not really indicate their true size.

  5. Earthquakes-Rattling the Earth's Plumbing System

    Science.gov (United States)

    Sneed, Michelle; Galloway, Devin L.; Cunningham, William L.

    2003-01-01

    Hydrogeologic responses to earthquakes have been known for decades, and have occurred both close to, and thousands of miles from earthquake epicenters. Water wells have become turbid, dry or begun flowing, discharge of springs and ground water to streams has increased and new springs have formed, and well and surface-water quality have become degraded as a result of earthquakes. Earthquakes affect our Earth’s intricate plumbing system—whether you live near the notoriously active San Andreas Fault in California, or far from active faults in Florida, an earthquake near or far can affect you and the water resources you depend on.

  6. Summary of earthquake experience database

    International Nuclear Information System (INIS)

    1999-01-01

    Strong-motion earthquakes frequently occur throughout the Pacific Basin, where power plants or industrial facilities are included in the affected areas. By studying the performance of these earthquake-affected (or database) facilities, a large inventory of various types of equipment installations can be compiled that have experienced substantial seismic motion. The primary purposes of the seismic experience database are summarized as follows: to determine the most common sources of seismic damage, or adverse effects, on equipment installations typical of industrial facilities; to determine the thresholds of seismic motion corresponding to various types of seismic damage; to determine the general performance of equipment during earthquakes, regardless of the levels of seismic motion; to determine minimum standards in equipment construction and installation, based on past experience, to assure the ability to withstand anticipated seismic loads. To summarize, the primary assumption in compiling an experience database is that the actual seismic hazard to industrial installations is best demonstrated by the performance of similar installations in past earthquakes

  7. Earthquake design for controlled structures

    Directory of Open Access Journals (Sweden)

    Nikos G. Pnevmatikos

    2017-04-01

    An alternative design philosophy is described for structures equipped with control devices, capable of resisting an expected earthquake while remaining in the elastic range. The idea is that a portion of the earthquake loading is undertaken by the control system and the remainder by the structure, which is designed to resist elastically. The earthquake forces assuming elastic behavior (elastic forces) and elastoplastic behavior (design forces) are first calculated according to the codes. The required control forces are calculated as the difference between the elastic and the design forces. The maximum capacity of the control devices is then compared with the required control force. If the capacity of the control devices is larger than the required control force, then the control devices are accepted and installed in the structure, and the structure is designed according to the design forces. If the capacity is smaller than the required control force, then a scale factor, α, reducing the elastic forces to new design forces is calculated. The structure is redesigned and the devices are installed. The proposed procedure ensures that the structure behaves elastically (without damage) for the expected earthquake at no additional cost, excluding that of buying and installing the control devices.
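    A minimal sketch of the force bookkeeping described above is given below: the control devices must carry the difference between the elastic and the design forces, and if their capacity is insufficient a scale factor α is computed. The formula used for α is one plausible reading of the abstract, and all force values are illustrative.

```python
# Hedged sketch of the design check: devices carry (elastic - design); if their
# capacity falls short, the elastic forces are rescaled by alpha to obtain new
# design forces for the structure. Not the paper's exact procedure.
def control_design(elastic_force, design_force, device_capacity):
    """Return (force the structure is designed to resist, scale factor alpha)."""
    required_control = elastic_force - design_force
    if device_capacity >= required_control:
        return design_force, 1.0           # devices accepted; code design forces kept
    # one plausible reading: the structure must elastically resist whatever the
    # devices cannot carry, so the elastic forces are scaled down by alpha
    alpha = (elastic_force - device_capacity) / elastic_force
    return alpha * elastic_force, alpha

print(control_design(elastic_force=1000.0, design_force=400.0, device_capacity=350.0))
```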

  8. Using Smartphones to Detect Earthquakes

    Science.gov (United States)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may work as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation between sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record the acceleration in real time. These records can be saved on the local phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer while located on controlled shake tables for a variety of tests. The results show that the accelerometer in the smartphone can reproduce the characteristics of the shaking very well, even when the phone is left unattached on the shake table. The nature of these datasets is also quite different from traditional networks due to the fact that smartphones move around with their owners. Therefore, we must distinguish earthquake signals from other daily use. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records. It shows a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now at the stage of developing the basic infrastructure for a smartphone seismic network.
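    A minimal sketch of the classification idea described above is given below: simple features are extracted from three-axis acceleration windows and a small neural network separates earthquake-like shaking from everyday activity. The features, the synthetic data and the classifier settings are assumptions for illustration, not the authors' pipeline.

```python
# Hedged sketch: feature extraction from 3-axis acceleration windows plus a
# small scikit-learn neural network classifier on synthetic stand-in data.
import numpy as np
from sklearn.neural_network import MLPClassifier

def features(window):
    """window: (n_samples, 3) acceleration; return a small feature vector."""
    mag = np.linalg.norm(window, axis=1)
    return np.array([mag.max(), mag.std(),
                     np.percentile(mag, 75) - np.percentile(mag, 25)])

rng = np.random.default_rng(0)
quakes  = [rng.normal(0, 0.5, (100, 3)) for _ in range(50)]   # stand-ins for shaking
walking = [rng.normal(0, 0.1, (100, 3)) for _ in range(50)]   # stand-ins for human activity
X = np.array([features(w) for w in quakes + walking])
y = np.array([1] * 50 + [0] * 50)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```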

  9. Explanation of earthquake response spectra

    OpenAIRE

    Douglas, John

    2017-01-01

    This is a set of five slides explaining how earthquake response spectra are derived from strong-motion records and simple models of structures and their purpose within seismic design and assessment. It dates from about 2002 and I have used it in various introductory lectures on engineering seismology.

  10. Solar eruptions - soil radon - earthquakes

    International Nuclear Information System (INIS)

    Saghatelyan, E.; Petrosyan, L.; Aghbalyan, Yu.; Baburyan, M.; Araratyan, L.

    2004-01-01

    For the first time a new natural phenomenon was established: a contrasting increase in the soil radon level under the influence of solar flares. Such an increase is one of the geochemical indicators of earthquakes. Most researchers consider this a phenomenon of exclusively terrestrial processes. Investigations of the link between earthquakes and solar activity carried out during the last decade in different countries are based on the analysis of statistical data ΣΕ (t) and W (t). As established, the overall seismicity of the Earth and of its separate regions depends on the 11-year cycle of solar activity. The data provided in the paper, based on experimental studies, serve as a first step towards experimental evidence of cause-and-effect solar-terrestrial links in the chain 'solar eruption - lithospheric radon - earthquakes'; further accumulation of experimental data is needed. For the first time, the radon component of terrestrial radiation was used to objectify the elementary lattice of the Hartmann network contoured by the biolocation method. It was found that radon concentration variations at the nodes of the Hartmann network determine the dynamics of solar-terrestrial relationships. Of the three types of fast processes conditioned by solar-terrestrial links, earthquakes belong to the fast destructive processes that occur most intensely at the junctions of tectonic massifs, along transform and deep faults. The basic factors provoking the earthquakes are both magnetic-structural effects and a long-term (over 5 months) bombardment of the lithosphere surface by highly energetic particles of corpuscular solar flows, as confirmed by photometry. As a result of the solar flares that occurred from 29 October to 4 November 2003, a sharply contrasting increase in soil radon was established, which is an earthquake indicator on the territory of Yerevan City. A month and a half later, earthquakes occurred in San-Francisco, Iran, Turkey

  11. About Block Dynamic Model of Earthquake Source.

    Science.gov (United States)

    Gusev, G. A.; Gufeld, I. L.

    One may state that little progress has been made in earthquake prediction research. Short-term prediction (on a time scale of days, with the location also predicted) is what has practical meaning. The failure is due to the absence of adequate notions about the geological medium, particularly its block structure, especially within faults. Geological and geophysical monitoring provides the basis for treating the geological medium as an open, block-structured dissipative system with limiting energy saturation. Variations of the volumetric stress state close to critical states are associated with the interaction of an inhomogeneous ascending stream of light gases (helium and hydrogen) with the solid phase, an interaction that is more pronounced in faults. In the background state, small blocks of the fault medium allow large blocks to slide within the faults; however, under considerable variations of the ascending gas streams, bound chains of small blocks can form, so that a bound state of large blocks may result (an earthquake source). Using these notions, we recently proposed a dynamical earthquake source model based on a generalized chain of non-linear coupled oscillators of Fermi-Pasta-Ulam (FPU) type. The generalization concerns the chain's inhomogeneity and various external actions imitating physical processes in the real source. Earlier, a weakly inhomogeneous approximation without dissipation was considered, which permitted study of the FPU recurrence (return to the initial state). Probabilistic properties of the quasi-periodic motion were found. The problem of chain decay due to non-linearity and external perturbations was posed; the thresholds and the dependence of the chain lifetime were studied, and large fluctuations of the lifetimes were discovered. In the present paper a rigorous treatment of the inhomogeneous chain, including dissipation, is given. For the strong-dissipation case, when oscillatory motion is suppressed, specific effects are discovered. For noise action and constantly arising

  12. Dynamical basis set

    International Nuclear Information System (INIS)

    Blanco, M.; Heller, E.J.

    1985-01-01

    A new Cartesian basis set is defined that is suitable for the representation of molecular vibration-rotation bound states. The Cartesian basis functions are superpositions of semiclassical states generated through the use of classical trajectories that conform to the intrinsic dynamics of the molecule. Although semiclassical input is employed, the method becomes ab initio through the standard matrix diagonalization variational method. Special attention is given to classical-quantum correspondences for angular momentum. In particular, it is shown that the use of semiclassical information preferentially leads to angular momentum eigenstates with magnetic quantum number |M| equal to the total angular momentum J. The present method offers a reliable technique for representing highly excited vibrational-rotational states where perturbation techniques are no longer applicable.

  13. Automated radon-thoron monitoring for earthquake prediction research

    International Nuclear Information System (INIS)

    Shapiro, M.H.; Melvin, J.D.; Copping, N.A.; Tombrello, T.A.; Whitcomb, J.H.

    1980-01-01

    This paper describes an automated instrument for earthquake prediction research which monitors the emission of radon (²²²Rn) and thoron (²²⁰Rn) from rock. The instrument uses aerosol filtration techniques and beta counting to determine radon and thoron levels. Data from the first year of operation of a field prototype suggest an annual cycle in the radon level at the site which is related to thermoelastic strains in the crust. Two anomalous increases in the radon level of short duration have been observed during the first year of operation. One anomaly appears to have been a precursor for a nearby earthquake (2.8 magnitude, Richter scale), and the other may have been associated with changing hydrological conditions resulting from heavy rainfall.

  14. Survey of awareness and analyses of related factors to volunteer activities of pharmacy students after the Great East Japan Earthquake

    OpenAIRE

    小武家, 優子; 吉田, 健; 吉武, 毅人

    2012-01-01

    The Great East Japan Earthquake occurred on March 11, 2011. At the time of the earthquake, pharmacists and pharmacy students engaged in volunteer activities such as providing disaster medicine and relief supplies to the disaster areas. A questionnaire survey of pharmacy students was carried out in order to clarify their awareness of volunteer activities for disaster areas and to use the data as a basis for Service-Learning in the 6-year pharmacy education. We divided subjects into pharmacy students those wo...

  15. RAPID EXTRACTION OF LANDSLIDE AND SPATIAL DISTRIBUTION ANALYSIS AFTER JIUZHAIGOU Ms7.0 EARTHQUAKE BASED ON UAV IMAGES

    OpenAIRE

    Q. S. Jiao; Y. Luo; W. H. Shen; Q. Li; X. Wang

    2018-01-01

    The Jiuzhaigou earthquake led to the collapse of mountain slopes and formed many landslides in the Jiuzhaigou scenic area and along surrounding roads, which caused road blockages and serious ecological damage. Due to the urgency of the rescue, the authors carried an unmanned aerial vehicle (UAV) and entered the disaster area as early as August 9 to obtain aerial images near the epicenter. On the basis of summarizing the characteristics of earthquake landslides in aerial images, by using the object-oriented an...

  16. Report on the seismic safety examination of nuclear facilities based on the 1995 Hyogoken-Nanbu earthquake

    International Nuclear Information System (INIS)

    2001-01-01

    Just after the Hyogoken-Nanbu Earthquake occurred, the Nuclear Safety Commission of Japan established a committee to examine the validity of the related guidelines on seismic design used for the safety examination. After an 8-month study, the committee confirmed that the validity of the guidelines governing the seismic design of nuclear facilities is not impaired, even in light of the Hyogoken-Nanbu earthquake. This report outlines the committee's study results. (author)

  17. Earthquake protection of nuclear power plant equipment

    Energy Technology Data Exchange (ETDEWEB)

    Nawrotzki, Peter [GERB Vibration Control Systems, Berlin (Germany)

    2010-05-15

    Power plant machinery can be dynamically decoupled from the substructure by the effective use of helical steel springs and viscous dampers. Turbine foundations, boiler feed pumps and other machine foundations benefit from this type of elastic support systems to mitigate the transmission of operational vibration. The application of these devices may also be used to protect against earthquakes and other catastrophic events, i.e. airplane crash, of particular importance in nuclear facilities. This article illustrates basic principles of elastic support systems and applications on power plant buildings in medium and high seismic areas. Spring-damper combinations with special stiffness properties are used to reduce seismic acceleration levels of turbine components and other safety or non-safety related structures. For turbine buildings, the integration of the turbine substructure into the machine building can further reduce stress levels in all structural members. (orig.)

  18. Earthquake protection of nuclear power plant equipment

    International Nuclear Information System (INIS)

    Nawrotzki, Peter

    2010-01-01

    Power plant machinery can be dynamically decoupled from the substructure by the effective use of helical steel springs and viscous dampers. Turbine foundations, boiler feed pumps and other machine foundations benefit from this type of elastic support systems to mitigate the transmission of operational vibration. The application of these devices may also be used to protect against earthquakes and other catastrophic events, i.e. airplane crash, of particular importance in nuclear facilities. This article illustrates basic principles of elastic support systems and applications on power plant buildings in medium and high seismic areas. Spring-damper combinations with special stiffness properties are used to reduce seismic acceleration levels of turbine components and other safety or non-safety related structures. For turbine buildings, the integration of the turbine substructure into the machine building can further reduce stress levels in all structural members. (orig.)

  19. Feasibility Study of Earthquake Early Warning in Hawai`i For the Mauna Kea Thirty Meter Telescope

    Science.gov (United States)

    Okubo, P.; Hotovec-Ellis, A. J.; Thelen, W. A.; Bodin, P.; Vidale, J. E.

    2014-12-01

    Earthquakes, including large damaging events, are as central to the geologic evolution of the Island of Hawai`i as its more famous volcanic eruptions and lava flows. Increasing and expanding development of facilities and infrastructure on the island continues to increase exposure and risk associated with strong ground shaking resulting from future large local earthquakes. Damaging earthquakes over the last fifty years have shaken the most heavily developed areas and critical infrastructure of the island to levels corresponding to at least Modified Mercalli Intensity VII. Hawai`i's most recent damaging earthquakes, the M6.7 Kiholo Bay and M6.0 Mahukona earthquakes, struck within seven minutes of one another off of the northwest coast of the island in October 2006. These earthquakes resulted in damage at all thirteen of the telescopes near the summit of Mauna Kea that led to gaps in telescope operations ranging from days up to four months. With the experiences of 2006 and Hawai`i's history of damaging earthquakes, we have begun a study to explore the feasibility of implementing earthquake early warning systems to provide advanced warnings to the Thirty Meter Telescope of imminent strong ground shaking from future local earthquakes. One of the major challenges for earthquake early warning in Hawai`i is the variety of earthquake sources, from shallow crustal faults to deeper mantle sources, including the basal decollement separating the volcanic pile from the ancient oceanic crust. Infrastructure on the Island of Hawai`i may only be tens of kilometers from these sources, allowing warning times of only 20 s or less. We assess the capability of the current seismic network to produce alerts for major historic earthquakes, and we will provide recommendations for upgrades to improve performance.

  20. Earthquake and welded structures 5: Earthquake damages and anti-earthquake measures of oil storage tanks; 5 kikenbutsu chozo tank no jishin higai to taishin taisaku

    Energy Technology Data Exchange (ETDEWEB)

    Kawano, K. [Chiyoda Chemical Engineering and Construction Co. Ltd., Tokyo (Japan)

    1997-09-05

    The results of a field investigation of the damage to 236 of the 687 hazardous material storage tanks affected by the 1995 Hyogoken-Nanbu Earthquake are introduced, together with damage cases and a description of the countermeasures. Inclination and settlement of the tank body were confirmed in 44% of the tanks investigated, particularly tanks with a capacity of less than 1000 kl. As for the basements and ground settlement, sand spouting caused by liquefaction of the ground was witnessed at as many as 81% of the tanks investigated, and the area surrounding the tanks roughly coincided with the area where ground cracks appeared. A great number of other damages, such as cracking of rain-water seals and breakdown of oil retention dikes, were also confirmed. In the latter half of the report, the aseismic standards of the old and new regulations, as well as the new criteria concerning the outdoor storage tank body, its basement and the ground, are tabulated, and four anti-earthquake measures are enumerated in accordance with the report of the investigation and examination of the earthquake resistance of hazardous material storage equipment: a final structural check for an earthquake exceeding the design allowable stress, consolidation of the tank body structure on the basis of the revised seismic coefficient method, assurance of a steadfast basement, and prevention of the elevated platform from falling down together with strengthening of water-proof seals and oil retention dikes. 3 refs., 5 figs., 3 tabs.

  1. Overview of Mobile Equipment Used in Case of Beyond Design Basis Accident at NPP Krsko

    International Nuclear Information System (INIS)

    Lukacevic, H.; Kopinc, D.; Ivanjko, M.

    2016-01-01

    The terrorist attack in the USA on September 11, 2001 and the accident at the Fukushima Daiichi Nuclear Power Station on March 11, 2011 highlighted the importance of mitigating strategies for responding to a Beyond Design Basis Accident (BDBA), while ensuring cooling of the reactor core, containment and spent fuel pool. Nuclear Power Plant Krsko (NEK) has acquired additional mobile equipment and made the necessary modifications to existing systems for the connection of this equipment (fast couplers). The use of mobile equipment is not limited to design basis accidents (DBA); it also serves to prevent and mitigate the consequences of a BDBA, when other plant systems are not available. NEK also decided to take steps to upgrade its safety measures and prepared a Safety Upgrade Program (SUP), which is consistent with the nuclear industry response to the Fukushima accident, and is implementing the main projects and modifications related to the SUP. NEK mobile equipment is not required to operate during normal reactor plant operation except for periodic surveillance testing, and it is incorporated into the normal training process. The equipment is located away from the reactor building, with most of it housed in a new building able to withstand extreme natural events, including earthquakes and tornadoes. The use of all mobile equipment is prescribed as an additional option in NEK operating procedures and enables the following options: filling various tanks, filling the steam generators, filling the containment, providing an additional compressed air source, refilling and spraying the spent fuel pool, and providing an alternative power supply. This document provides an overview of NEK mobile equipment, which consists of various mobile fire protection pumps, air compressors, protective equipment, fire trucks, and diesel generators. Sufficient fuel for a minimum of three days of operation is stored on site. (author).

  2. What caused a large number of fatalities in the Tohoku earthquake?

    Science.gov (United States)

    Ando, M.; Ishida, M.; Nishikawa, Y.; Mizuki, C.; Hayashi, Y.

    2012-04-01

    The Mw9.0 earthquake caused 20,000 deaths and missing persons in northeastern Japan. In the 115 years prior to this event, three historical tsunamis struck the region, one of them generated by a "tsunami earthquake" with a death toll of 22,000. Since then, numerous breakwaters were constructed along the entire northeastern coast, tsunami evacuation drills were carried out, and hazard maps were distributed to local residents in numerous communities. However, despite these constructions and preparedness efforts, the March 11 Tohoku earthquake caused numerous fatalities. The strong shaking lasted three minutes or longer, so all residents recognized it as the strongest and longest earthquake they had ever experienced. The tsunami inundated an enormous area of about 560 km2 across 35 cities along the coast of northeast Japan. To find the reasons behind the high number of fatalities in the March 11 tsunami, we interviewed 150 tsunami survivors at public evacuation shelters in 7 cities, mainly in Iwate prefecture, in mid-April and early June 2011. Interviews lasted about 30 minutes or longer and focused on the interviewees' evacuation behavior and on the behavior they had observed. On the basis of the interviews, we found that residents' decisions not to evacuate immediately were partly due to, or influenced by, earthquake science results. Some of the factors that affected residents' decisions are listed below. 1. Earthquake hazard assessments turned out to be incorrect: the expected earthquake magnitudes and resultant hazards in northeastern Japan, assessed and publicized by the government, were significantly smaller than the actual Tohoku earthquake. 2. Many residents did not receive accurate tsunami warnings: the first tsunami warnings were too small compared with the actual tsunami heights. 3. Previous frequent warnings with overestimated tsunami heights influenced the behavior of the residents. 4. Many local residents above 55 years old experienced

  3. The limits of earthquake early warning: Timeliness of ground motion estimates

    Science.gov (United States)

    Minson, Sarah E.; Meier, Men-Andrin; Baltay, Annemarie S.; Hanks, Thomas C.; Cochran, Elizabeth S.

    2018-01-01

    The basic physics of earthquakes is such that strong ground motion cannot be expected from an earthquake unless the earthquake itself is very close or has grown to be very large. We use simple seismological relationships to calculate the minimum time that must elapse before such ground motion can be expected at a distance from the earthquake, assuming that the earthquake magnitude is not predictable. Earthquake early warning (EEW) systems are in operation or development for many regions around the world, with the goal of providing enough warning of incoming ground shaking to allow people and automated systems to take protective actions to mitigate losses. However, the question of how much warning time is physically possible for specified levels of ground motion has not been addressed. We consider a zero-latency EEW system to determine possible warning times a user could receive in an ideal case. In this case, the only limitation on warning time is the time required for the earthquake to evolve and the time for strong ground motion to arrive at a user’s location. We find that users who wish to be alerted at lower ground motion thresholds will receive more robust warnings with longer average warning times than users who receive warnings for higher ground motion thresholds. EEW systems have the greatest potential benefit for users willing to take action at relatively low ground motion thresholds, whereas users who set relatively high thresholds for taking action are less likely to receive timely and actionable information.
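    A minimal sketch of the timeliness argument above is given below: the available warning time at a site is roughly the strong-shaking (S-wave) travel time minus the time needed to detect and process the earthquake. The velocities, station distance and processing delay are assumed round numbers, not the values used in the study.

```python
# Hedged sketch: back-of-the-envelope warning time for an earthquake early
# warning user at a given epicentral distance. All numbers are assumptions.
def warning_time(epicentral_distance_km, detection_delay_s=4.0,
                 vp_km_s=6.0, vs_km_s=3.5, station_distance_km=10.0):
    t_detect = station_distance_km / vp_km_s + detection_delay_s  # P pick + processing
    t_shaking = epicentral_distance_km / vs_km_s                  # strong shaking (S) arrival
    return max(0.0, t_shaking - t_detect)

for d in (10, 30, 60, 100):
    print(f"{d:4d} km -> ~{warning_time(d):.1f} s of warning")
```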

  4. Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA

    Science.gov (United States)

    Lorito, S.

    2013-05-01

    The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations in our capability of assessing earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate models of the rupture process of mega-thrust earthquakes than ever before, thus paving the way for future improved hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Recent worldwide research efforts of the tsunami science community have started to fill this gap and to define some best practices that are being progressively employed in PTHA for different regions and coasts at threat. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes and highlight some of their surprising features, which likely result in bigger error bars associated with PTHA results. More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions which prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit more into a specific ongoing effort for improving PTHA methods, in particular as regards the determination of epistemic and aleatory uncertainties, and the computational feasibility of PTHA when considering the full assumed source variability. Only logic trees are usually made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior. The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit

  5. Seismicity and seismic hazard in Sabah, East Malaysia from earthquake and geodetic data

    Science.gov (United States)

    Gilligan, A.; Rawlinson, N.; Tongkul, F.; Stephenson, R.

    2017-12-01

    While the levels of seismicity are low in most of Malaysia, the state of Sabah in northern Borneo has moderate levels of seismicity. Notable earthquakes in the region include the 1976 M6.2 Lahad Datu earthquake and the 2015 M6 Ranau earthquake. The recent Ranau earthquake resulted in the deaths of 18 people on Mt Kinabalu, an estimated 100 million RM (approximately US$23 million) of damage to buildings, roads, and infrastructure from shaking, and flooding, reduced water quality, and damage to farms from landslides. Over the last 40 years the population of Sabah has increased to over four times what it was in 1976, yet seismic hazard in Sabah remains poorly understood. Using seismic and geodetic data we hope to better quantify the hazards posed by earthquakes in Sabah, and thus help to minimize risk. In order to do this we need to know about the locations of earthquakes, the types of earthquakes that occur, and the faults that are generating them. We use data from 15 MetMalaysia seismic stations currently operating in Sabah to develop a region-specific velocity model from receiver functions and a pre-existing surface wave model. We use this new velocity model to (re)locate earthquakes that occurred in Sabah from 2005-2016, including a large number of aftershocks from the 2015 Ranau earthquake. We use a probabilistic nonlinear earthquake location program to locate the earthquakes and then refine their relative locations using a double-difference method. The recorded waveforms are further used to obtain moment tensor solutions for these earthquakes. Earthquake locations and moment tensor solutions are then compared with the locations of faults throughout Sabah. Faults are identified from high-resolution IFSAR images and subsequent fieldwork, with a particular focus on the Lahad Datu and Ranau areas. Used together, these seismic and geodetic data can help us to develop a new seismic hazard model for Sabah, as well as aiding in the delivery of outreach activities regarding seismic hazard

  6. A proposal on restart rule of nuclear power plants with piping having local wall thinning subjected to an earthquake. Former part. Aiming at further application

    International Nuclear Information System (INIS)

    Urabe, Yoshio

    2011-01-01

    A restart rule for nuclear power plants (NPPs) with piping having local wall thinning subjected to an earthquake is proposed, taking account of the local wall thinning, seismic effects, restart of NPPs, and the applicability in Japan of the 'Guidelines for NPP Response to an Earthquake (EPRI NP-6695)'. A Japan Earthquake Damage Intensity Scale (JEDIS) and an Earthquake Ground Motion Level (EGML) are introduced. The JEDIS is classified into four scales obtained from the damage level of components and structures of NPPs subjected to an earthquake, while the EGML is divided into four levels by the safe shutdown earthquake ground motion (So), the elastic design earthquake ground motion (Sd) and the design earthquake ground motion (Ss). The combination of JEDIS and EGML forms a 4 x 4 matrix that determines the detailed conditions for restart of an NPP. As the response to an earthquake, operator walkdown inspections and evaluation of the earthquake ground motion are conducted to establish the JEDIS level. Each JEDIS level requests its own allowable conditions for restart of the NPP, which are scale-dependent and consist of a weighted combination of damage inspection (operator walkdown inspections, focused inspections/tests and expanded inspections), integrity evaluation, and repair/replacement. If the JEDIS is assigned a value greater than 3, requiring expanded inspections, then inspection of piping with local wall thinning, its integrity evaluation, and repair/replacement if necessary are requested. Inspection and evaluation of piping with local wall thinning are performed based on the JSME or ASME codes. Detailed work flow charts are presented. Carbon steel piping and an elbow were chosen for evaluation. (T. Tanaka)

  7. Anomalous variation in GPS based TEC measurements prior to the 30 September 2009 Sumatra Earthquake

    Science.gov (United States)

    Karia, Sheetal; Pathak, Kamlesh

    This paper investigates the features of pre-earthquake ionospheric anomalies in the total electron content (TEC) data obtained from regular GPS observations by the GPS receiver at SVNIT Surat (21.16 N, 72.78 E Geog), located at the northern crest of the equatorial anomaly region. The data have been analysed for 5 different earthquakes that occurred during 2009 in India and its neighbouring regions. Our observations show that, for earthquakes whose preparation area lies between the crests of the equatorial anomaly close to the geomagnetic equator, the enhancement in TEC was followed by a depletion in TEC on the day of the earthquake, which may be connected to distortions of the equatorial anomaly shape. For the analysis of the ionospheric effects of one such case, the 30 September 2009 Sumatra earthquake, Global Ionospheric Maps of TEC were used. The possible influence of the earthquake preparation processes on the main low-latitude ionospheric peculiarity, the equatorial anomaly, is discussed.

  8. Spatial distribution of earthquake hypocenters in the Crimea—Black Sea region

    Science.gov (United States)

    Burmin, V. Yu; Shumlianska, L. O.

    2018-03-01

    Some aspects of the seismicity of the Crimea-Black Sea region are considered on the basis of the catalogued data on earthquakes that occurred between 1970 and 2012. The complete list of the Crimean earthquakes for this period contains about 2140 events with magnitude ranging from -1.5 to 5.5. The bulletins contain information about compressional and shear wave arrival times for nearly 2000 earthquakes. A new approach to the definition of the coordinates of all of the events was applied to re-establish the hypocenters of the catalogued earthquakes. The obtained results indicate that the bulk of the earthquake foci in the region are located in the crust. However, some 2.5% of the foci are located at depths ranging from 50 to 250 km. The new distribution of earthquake foci shows a concentration of foci in the form of two inclined branches, the center of which is located under the Yalto-Alushta seismic focal zone. The overall distribution of foci with depth corresponds to the relief of the lithosphere.

  9. Radioactive Waste Management Basis

    International Nuclear Information System (INIS)

    Perkins, B.K.

    2009-01-01

    The purpose of this Radioactive Waste Management Basis is to describe the systematic approach for planning, executing, and evaluating the management of radioactive waste at LLNL. The implementation of this document will ensure that waste management activities at LLNL are conducted in compliance with the requirements of DOE Order 435.1, Radioactive Waste Management, and the Implementation Guide for DOE Manual 435.1-1, Radioactive Waste Management Manual. Technical justification is provided where methods for meeting the requirements of DOE Order 435.1 deviate from the DOE Manual 435.1-1 and Implementation Guide.

  10. Napa earthquake: An earthquake in a highly connected world

    Science.gov (United States)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

    The Napa earthquake recently occurred close to Silicon Valley. This makes it a good candidate for studying what social networks, wearable objects and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratio of people publishing tweets and the ratio of people visiting the EMSC (European Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with the results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question to evaluate is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up as shown by the Jawbone data. If so, this supports the premise that all methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves and that 2 minutes of website traffic is sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare this with the publication time of messages on Twitter. Finally, we check whether the number of tweets and the number of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking, as observed through Twitter and the EMSC website analysis, is tool-specific (i.e. specific to Twitter or the EMSC website) or whether it reflects people's actual reactions.

  11. Countermeasures to earthquakes in nuclear plants

    International Nuclear Information System (INIS)

    Sato, Kazuhide

    1979-01-01

    The contribution of atomic energy to mankind is immeasurable, but the danger of radioactivity is a matter apart. Therefore, in the design of nuclear power plants safety has been regarded as paramount, and in Japan, where earthquakes occur frequently, countermeasures against earthquakes have naturally been incorporated in the safety examination. The radioactive substances handled in nuclear power stations and spent fuel reprocessing plants are briefly explained. The occurrence of earthquakes cannot be predicted effectively, and the disaster due to an earthquake can be remarkably large. In nuclear plants, prevention of damage to the facilities and maintenance of their functions are required at the time of an earthquake. Regarding the siting of nuclear plants, the history of earthquakes, the possible magnitude of earthquakes, the properties of the ground and the position of the plant should be examined. After the place of installation has been decided, the earthquake used for design is selected by evaluating active faults and determining the standard earthquakes. As the fundamentals of aseismic design, the classification according to importance, the design earthquakes corresponding to the classes of importance, the combination of loads and the allowable stress are explained. (Kako, I.)

  12. Update earthquake risk assessment in Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2017-07-01

    The Cairo earthquake (12 October 1992; mb = 5.8) is still, after 25 years, one of the most painful events etched into Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead, 10,000 injured and 3000 families who lost their homes). Nowadays, the most frequent and important question that should arise is: what if this earthquake were repeated today? In this study, we simulate the ground motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the earthquake risk assessment clearly indicates that the losses and damage may be two or three times larger in Cairo compared with the 1992 earthquake. The earthquake risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) are at high seismic risk, and three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk. Deteriorating buildings and infrastructure make the city particularly vulnerable to earthquake risks. For instance, more than 90 % of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb). Moreover, about 75 % of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety

  13. A 30-year history of earthquake crisis communication in California and lessons for the future

    Science.gov (United States)

    Jones, L.

    2015-12-01

    The first statement from the US Geological Survey to the California Office of Emergency Services quantifying the probability of a possible future earthquake was made in October 1985, concerning the probability (approximately 5%) that a M4.7 earthquake located directly beneath the Coronado Bay Bridge in San Diego would be a foreshock to a larger earthquake. In the next 30 years, publication of aftershock advisories has become routine, and formal statements about the probability of a larger event have been developed in collaboration with the California Earthquake Prediction Evaluation Council (CEPEC) and sent to CalOES more than a dozen times. Most of these were subsequently released to the public. These communications have spanned a variety of approaches, with and without quantification of the probabilities, and using different ways to express the spatial extent and the magnitude distribution of possible future events. The USGS is re-examining its approach to aftershock probability statements and to operational earthquake forecasting with the goal of creating pre-vetted automated statements that can be released quickly after significant earthquakes. All of the previous formal advisories were written during the earthquake crisis. The time to create and release a statement became shorter with experience after the first public advisory (for the 1988 Lake Elsman earthquake), which was released 18 hours after the triggering event, but the process was never completed in less than 2 hours. As was done for the Parkfield experiment, the process will be reviewed by CEPEC and NEPEC (National Earthquake Prediction Evaluation Council) so that statements can be sent to the public automatically. This talk will review the advisories, the variations in wording and the public response, and compare these with social science research about successful crisis communication, to create recommendations for future advisories.

  14. Short- and Long-Term Earthquake Forecasts Based on Statistical Models

    Science.gov (United States)

    Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner

    2017-04-01

    Epidemic-type aftershock sequence (ETAS) models have been used experimentally to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake, the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasts, the application of a newly developed simulation algorithm to the Calabria region provided typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
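    For readers unfamiliar with ETAS, the sketch below evaluates the standard ETAS conditional intensity, lambda(t) = mu + sum_i K*exp(alpha*(M_i - m_c))/(t - t_i + c)^p, for a tiny illustrative catalogue. The parameter values are invented and are not those calibrated for the L'Aquila or Emilia sequences.

```python
# Hedged sketch: ETAS conditional intensity at time t given a small catalogue
# of past event times (days) and magnitudes. Parameters are illustrative only.
import numpy as np

def etas_rate(t, times, mags, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m_c=3.0):
    times = np.asarray(times)
    mags = np.asarray(mags)
    past = times < t
    contrib = K * np.exp(alpha * (mags[past] - m_c)) / (t - times[past] + c) ** p
    return mu + contrib.sum()

cat_times = [0.0, 0.3, 1.2, 2.5]       # days since sequence start
cat_mags  = [5.9, 4.2, 3.8, 4.5]
print("expected event rate at t = 3 days:", etas_rate(3.0, cat_times, cat_mags))
```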

  15. Portals for Real-Time Earthquake Data and Forecasting: Challenge and Promise (Invited)

    Science.gov (United States)

    Rundle, J. B.; Holliday, J. R.; Graves, W. R.; Feltstykket, R.; Donnellan, A.; Glasscoe, M. T.

    2013-12-01

    Earthquake forecasts have been computed by a variety of countries world-wide for over two decades. For the most part, forecasts have been computed for insurance, reinsurance and underwriters of catastrophe bonds. However, recent events clearly demonstrate that mitigating personal risk is becoming the responsibility of individual members of the public. Open access to a variety of web-based forecasts, tools, utilities and information is therefore required. Portals for data and forecasts present particular challenges, and require the development of both apps and the client/server architecture to deliver the basic information in real time. The basic forecast model we consider is the Natural Time Weibull (NTW) method (JBR et al., Phys. Rev. E, 86, 021106, 2012). This model uses small earthquakes ('seismicity-based models') to forecast the occurrence of large earthquakes, via data-mining algorithms combined with the ANSS earthquake catalog. This method computes large earthquake probabilities using the number of small earthquakes that have occurred in a region since the last large earthquake. Localizing these forecasts in space so that global forecasts can be computed in real time presents special algorithmic challenges, which we describe in this talk. Using 25 years of data from the ANSS California-Nevada catalog of earthquakes, we compute real-time global forecasts at a grid scale of 0.1°. We analyze and monitor the performance of these models using the standard tests, which include the Reliability/Attributes and Receiver Operating Characteristic (ROC) tests. It is clear from much of the analysis that data quality is a major limitation on the accurate computation of earthquake probabilities. We discuss the challenges of serving up these datasets over the web on web-based platforms such as those at www.quakesim.org, www.e-decider.org, and www.openhazards.com.
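    The natural-time idea described above can be illustrated with a Weibull distribution whose variable is the count of small earthquakes since the last large one; the conditional probability of a large event within the next few small events then follows from the survival function. This is a hedged sketch of the general idea, not the exact parameterization of the NTW method, and all numbers are hypothetical.

```python
import numpy as np

def ntw_conditional_probability(n_small, delta_n, tau, beta):
    """Conditional probability that a large earthquake occurs within the
    next `delta_n` small events, given that `n_small` small events have
    already occurred since the last large one, assuming a Weibull
    distribution in natural time (event count). `tau` and `beta` are the
    Weibull scale and shape in event-count units and would have to be
    fit to a catalog; the values used below are placeholders."""
    survival = lambda n: np.exp(-(n / tau) ** beta)
    return 1.0 - survival(n_small + delta_n) / survival(n_small)

# Illustrative numbers only: 120 small events observed, window of 30 more
print(ntw_conditional_probability(n_small=120, delta_n=30, tau=200.0, beta=1.4))
```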

  16. How citizen seismology is transforming rapid public earthquake information and interactions between seismologists and society

    Science.gov (United States)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Fréderic; Caroline, Etivant

    2015-04-01

    Historical earthquakes are only known to us through written recollections, so seismologists have a long experience of interpreting the reports of eyewitnesses, which probably explains why seismology has been a pioneer in crowdsourcing and citizen science. Today, the Internet is transforming this situation; it can be considered a digital nervous system comprising digital veins and intertwined sensors that capture the pulse of our planet in near real-time. How can both seismology and the public benefit from this new monitoring system? This paper will present the strategy implemented at the Euro-Mediterranean Seismological Centre (EMSC) to leverage this new nervous system to detect and diagnose the impact of earthquakes within minutes rather than hours, and how it has transformed information systems and interactions with the public. We will show how social network monitoring and flashcrowds (massive traffic increases on the EMSC website) are used to automatically detect felt earthquakes before seismic detections, how damaged areas can be mapped through the concomitant loss of Internet sessions (visitors being disconnected), and the benefit of collecting felt reports and geolocated pictures to further constrain rapid impact assessment of global earthquakes. We will also describe how public expectations within tens of seconds of ground shaking are at the basis of improved, diversified information tools which integrate this user-generated content. Special attention will be given to LastQuake, the most complex and sophisticated Twitter QuakeBot, smartphone application and browser add-on, which deals with the only earthquakes that matter to the public: felt and damaging earthquakes. In conclusion we will demonstrate that eyewitnesses are today real-time earthquake sensors and active actors in rapid earthquake information.

  17. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    Science.gov (United States)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents the forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy (a × 10^20 ergs/year) expected to be released per year exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area, showing that the proposed techniques and models yield good forecasting accuracy.
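    The model-selection and conditional-probability steps described here can be sketched with scipy.stats: fit each candidate distribution to the energy sample, compare log-likelihoods (ln L), and form the conditional exceedance probability from the best model's survival function. The energy values below are synthetic placeholders, not the catalog of the study.

```python
import numpy as np
from scipy import stats

# Hypothetical seismic-energy sample (x 10^20 ergs); real values come from the catalog.
energy = np.array([0.4, 0.7, 1.1, 1.6, 2.3, 3.0, 4.8, 6.5, 9.2, 14.0])

candidates = {
    "gamma": stats.gamma,
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
    "log-logistic": stats.fisk,   # scipy's name for the log-logistic distribution
}

best_name, best_frozen, best_ll = None, None, -np.inf
for name, dist in candidates.items():
    params = dist.fit(energy, floc=0)          # fit with location fixed at zero
    ll = np.sum(dist.logpdf(energy, *params))  # log-likelihood ln L
    print(f"{name:12s} ln L = {ll:7.2f}")
    if ll > best_ll:
        best_name, best_frozen, best_ll = name, dist(*params), ll

# Conditional exceedance probability P(X > E | X > e) from the best-fitting model
e_level, E_level = 1.0, 5.0
p_cond = best_frozen.sf(E_level) / best_frozen.sf(e_level)
print(f"best model: {best_name}, P(X > {E_level} | X > {e_level}) = {p_cond:.3f}")
```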

  18. Coping with the challenges of early disaster response: 24 years of field hospital experience after earthquakes.

    Science.gov (United States)

    Bar-On, Elhanan; Abargel, Avi; Peleg, Kobi; Kreiss, Yitshak

    2013-10-01

    To propose strategies and recommendations for future planning and deployment of field hospitals after earthquakes by comparing the experience of 4 field hospitals deployed by the Israel Defense Forces (IDF) Medical Corps in Armenia, Turkey, India and Haiti. Quantitative data regarding the earthquakes were collected from published sources; data regarding hospital activity were collected from IDF records; and qualitative information was obtained from structured interviews with key figures involved in the missions. The hospitals started operating between 89 and 262 hours after the earthquakes. Their sizes ranged from 25 to 72 beds, and their personnel numbered between 34 and 100. The number of patients treated varied from 1111 to 2400. The proportion of earthquake-related diagnoses ranged from 28% to 67%. Across the earthquakes, patient caseload and treatment requirements varied widely. The variables affecting the patient profile most significantly were time until deployment, total number of injured, availability of adjacent medical facilities, and possibility of evacuation from the disaster area. When deploying a field hospital in the early phase after an earthquake, a wide variability in patient caseload should be anticipated. Customization is difficult due to the paucity of information. Therefore, early deployment necessitates full logistic self-sufficiency and operational versatility. Also, collaboration with local and international medical teams can greatly enhance treatment capabilities.

  19. Induced earthquakes. Sharp increase in central Oklahoma seismicity since 2008 induced by massive wastewater injection.

    Science.gov (United States)

    Keranen, K M; Weingarten, M; Abers, G A; Bekins, B A; Ge, S

    2014-07-25

    Unconventional oil and gas production provides a rapidly growing energy source; however, high-production states in the United States, such as Oklahoma, face sharply rising numbers of earthquakes. Subsurface pressure data required to unequivocally link earthquakes to wastewater injection are rarely accessible. Here we use seismicity and hydrogeological models to show that fluid migration from high-rate disposal wells in Oklahoma is potentially responsible for the largest swarm. Earthquake hypocenters occur within disposal formations and upper basement, between 2- and 5-kilometer depth. The modeled fluid pressure perturbation propagates throughout the same depth range and tracks earthquakes to distances of 35 kilometers, with a triggering threshold of ~0.07 megapascals. Although thousands of disposal wells operate aseismically, four of the highest-rate wells are capable of inducing 20% of 2008 to 2013 central U.S. seismicity. Copyright © 2014, American Association for the Advancement of Science.

  20. Building Infrastructure for Preservation and Publication of Earthquake Engineering Research Data

    Directory of Open Access Journals (Sweden)

    Stanislav Pejša

    2014-10-01

    The objective of this paper is to showcase the progress of the earthquake engineering community during a decade-long effort supported by the National Science Foundation in the George E. Brown, Jr. Network for Earthquake Engineering Simulation (NEES). During the four years that NEES network operations have been headquartered at Purdue University, the NEEScomm management team has facilitated an unprecedented cultural change in the ways research is performed in earthquake engineering. NEES has not only played a major role in advancing the cyberinfrastructure required for transformative engineering research, but NEES research outcomes are making an impact by contributing to safer structures throughout the USA and abroad. This paper reflects on some of the developments and initiatives that helped instil change in the ways that the earthquake engineering and tsunami community share and reuse data and collaborate in general.

  1. Control strategy to limit duty cycle impact of earthquakes on the LIGO gravitational-wave detectors

    Science.gov (United States)

    Biscans, S.; Warner, J.; Mittleman, R.; Buchanan, C.; Coughlin, M.; Evans, M.; Gabbard, H.; Harms, J.; Lantz, B.; Mukund, N.; Pele, A.; Pezerat, C.; Picart, P.; Radkins, H.; Shaffer, T.

    2018-03-01

    Advanced gravitational-wave detectors such as the laser interferometer gravitational-wave observatories (LIGO) require an unprecedented level of isolation from the ground. When in operation, they measure motion of less than 10^-19 m. Strong teleseismic events like earthquakes disrupt the proper functioning of the detectors and result in a loss of data. An earthquake early-warning system, as well as a prediction model, have been developed to understand the impact of earthquakes on LIGO. This paper describes a control strategy that uses this early-warning system to reduce LIGO downtime by ~30%. It also presents a plan to implement this new earthquake configuration in the LIGO automation system.

  2. Limiting the effects of earthquakes on gravitational-wave interferometers

    Science.gov (United States)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Buchanan, Christopher; Coughlin, Eric; Donovan, Fred; Fee, Jeremy; Gabbard, Hunter; Guy, Michelle; Mukund, Nikhil; Perry, Matthew

    2017-01-01

    Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system, with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low-latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake, depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. By using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation for 40 to 100 earthquake events in a 6-month period.
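    The final step, turning predicted ground motion into a probability of losing lock, can be illustrated with a simple classifier. The paper's actual machine learning algorithm and feature set are not specified here, so the sketch below uses a logistic regression on hypothetical magnitude, distance and predicted-velocity features.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training set: each row is (magnitude, log10 distance [km],
# log10 predicted peak ground velocity [m/s]); label 1 = detector lost lock.
X = np.array([
    [7.8, 3.9, -5.2], [6.1, 3.3, -6.4], [8.2, 4.0, -4.9],
    [5.9, 2.8, -6.1], [7.0, 3.6, -5.6], [6.6, 3.1, -5.9],
])
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression().fit(X, y)

# Probability that an incoming event will knock the detector out of observing mode
incoming = np.array([[7.3, 3.7, -5.4]])
print("lockloss probability:", model.predict_proba(incoming)[0, 1])
```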

  3. Limiting the effects of earthquakes on gravitational-wave interferometers

    International Nuclear Information System (INIS)

    Coughlin, Michael; Earle, Paul; Harms, Jan; Biscans, Sebastien; Donovan, Fred; Buchanan, Christopher; Coughlin, Eric; Fee, Jeremy; Guy, Michelle; Gabbard, Hunter; Mukund, Nikhil; Perry, Matthew

    2017-01-01

    Ground-based gravitational wave interferometers such as the Laser Interferometer Gravitational-wave Observatory (LIGO) are susceptible to ground shaking from high-magnitude teleseismic events, which can interrupt their operation in science mode and significantly reduce their duty cycle. It can take several hours for a detector to stabilize enough to return to its nominal state for scientific observations. The down time can be reduced if advance warning of impending shaking is received and the impact is suppressed in the isolation system, with the goal of maintaining stable operation even at the expense of increased instrumental noise. Here, we describe an early warning system for modern gravitational-wave observatories. The system relies on near real-time earthquake alerts provided by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA). Preliminary low-latency hypocenter and magnitude information is generally available within 5 to 20 min of a significant earthquake, depending on its magnitude and location. The alerts are used to estimate arrival times and ground velocities at the gravitational-wave detectors. In general, 90% of the predictions for ground-motion amplitude are within a factor of 5 of measured values. The error in both arrival time and ground-motion prediction introduced by using preliminary, rather than final, hypocenter and magnitude information is minimal. By using a machine learning algorithm, we develop a prediction model that calculates the probability that a given earthquake will prevent a detector from taking data. Our initial results indicate that by using detector control configuration changes, we could prevent interruption of operation for 40 to 100 earthquake events in a 6-month period. (paper)

  4. PROSPECTS OF ESTABLISHING EARTHQUAKE RESISTANT BUILDINGS FROM TUBE CONCRETE CONSTRUCTIONS

    Directory of Open Access Journals (Sweden)

    Abdujafar I. Akaev

    2017-01-01

    Objectives: The aim of the research is to find optimal design solutions for the erection of buildings that will ensure their reliability and durability, and compliance with environmental requirements, fire resistance and earthquake resistance. In this regard, the task is to determine the advantages and prospects of erecting earthquake-resistant buildings from tube concrete constructions, since they are distinguished by constructive, technological and economic efficiency when used as the vertical load-bearing struts of high-rise buildings. Method: The technique for calculating the strength of normal sections of eccentrically compressed tube concrete elements uses a nonlinear deformation model, taking into account the joint operation of the steel shell and the concrete core under conditions of triaxial compression. Results: The article reviews the latest world experience of using tube concrete as vertical load-bearing structures for public facilities from the standpoint of earthquake-resistant construction. International practice in the construction of public facilities ranging in height from 100 to 600 m with the use of tube concrete technology, including in regions with dangerous natural and man-made conditions, has been studied. The structural, operational and technological advantages and disadvantages of tube concrete technology are analysed. Methods for calculating the strength of concrete tube elements under central compression are considered: according to the so-called deformation theory, the state of total destruction of the concrete and of tube yielding attained at maximum pressure is indicated by the beginning of "tube flow along the longitudinal axis". The advantages and disadvantages of both methods are shown. Factors constraining the introduction and wider application of tube concrete constructions in Russia are considered. Conclusion: While the advantages of concrete tube constructions in their extensive

  5. Renormalization group theory of earthquakes

    Directory of Open Access Journals (Sweden)

    H. Saleur

    1996-01-01

    We study theoretically the physical origin of the proposed discrete scale invariance of earthquake processes, at the origin of the universal log-periodic corrections to scaling recently discovered in regional seismic activity (Sornette and Sammis, 1995). The discrete scaling symmetries which may be present at smaller scales are shown to be robust on a global scale with respect to disorder. Furthermore, a single complex exponent is sufficient in practice to capture the essential properties of the leading correction to scaling, whose real part may be renormalized by disorder, and thus be specific to the system. We then propose a new mechanism for discrete scale invariance, based on the interplay between dynamics and disorder. The existence of non-linear corrections to the renormalization group flow implies that an earthquake is not an isolated 'critical point', but is accompanied by an embedded set of 'critical points', its foreshocks and any subsequent shocks for which it may be a foreshock.

  6. The 2016 Kumamoto earthquake sequence.

    Science.gov (United States)

    Kato, Aitaro; Nakamura, Kouji; Hiyama, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest.

  7. Earthquake lights and rupture processes

    Directory of Open Access Journals (Sweden)

    T. V. Losseva

    2005-01-01

    A physical model of earthquake lights is proposed. It is suggested that magnetic diffusion from the source region of the electric and magnetic fields is the dominant process, explaining the rather high localization of the light flashes. A 3D numerical code has been developed that allows an arbitrary distribution of currents caused by ground motion and of conductivity in the ground and at its surface to be taken into account, including the presence of sea water above the epicenter and/or near the ruptured segments of the fault. Simulations for the 1995 Kobe earthquake were conducted taking into account the presence of sea water with a realistic geometry of the shores. The results do not contradict the eyewitness reports and the scarce measurements of the electric and magnetic fields at large distances from the epicenter.

  8. The 2016 Kumamoto earthquake sequence

    Science.gov (United States)

    KATO, Aitaro; NAKAMURA, Kouji; HIYAMA, Yohei

    2016-01-01

    Beginning in April 2016, a series of shallow, moderate to large earthquakes with associated strong aftershocks struck the Kumamoto area of Kyushu, SW Japan. An Mj 7.3 mainshock occurred on 16 April 2016, close to the epicenter of an Mj 6.5 foreshock that occurred about 28 hours earlier. The intense seismicity released the accumulated elastic energy by right-lateral strike slip, mainly along two known, active faults. The mainshock rupture propagated along multiple fault segments with different geometries. The faulting style is reasonably consistent with regional deformation observed on geologic timescales and with the stress field estimated from seismic observations. One striking feature of this sequence is intense seismic activity, including a dynamically triggered earthquake in the Oita region. Following the mainshock rupture, postseismic deformation has been observed, as well as expansion of the seismicity front toward the southwest and northwest. PMID:27725474

  9. Advancing Understanding of Earthquakes by Drilling an Eroding Convergent Margin

    Science.gov (United States)

    von Huene, R.; Vannucchi, P.; Ranero, C. R.

    2010-12-01

    A program of IODP with great societal relevance is sampling and instrumenting the seismogenic zone. The zone generates great earthquakes that trigger tsunamis and submarine slides, thereby endangering coastal communities containing over sixty percent of the Earth's population. To assess and mitigate this endangerment it is urgent to advance understanding of fault dynamics that allows more timely anticipation of hazardous seismicity. Seismogenesis at accreting and eroding convergent plate boundaries apparently differs because of dissimilar materials along the interplate fault. As the history of instrumentally recorded earthquakes expands, the difference becomes clearer. The more homogeneous clay, silt and sand subducted at accreting margins is associated with great earthquakes (M 9), whereas the fragmented upper plate rock that can dominate subducted material along an eroding margin plate interface is associated with many tsunamigenic earthquakes (Bilek, 2010). Few areas have been identified where the seismogenic zone can be reached with scientific drilling. In IODP, accreting margins are studied on the NanTroSeize drill transect off Japan, where the ultimate drilling of the seismogenic interface may occur by the end of IODP. The eroding Costa Rica margin will be studied in CRISP, where a drill program will begin in 2011. The Costa Rican geophysical site survey will be complete with the acquisition and processing of 3D seismic data in 2011, but the entire drilling will not be accomplished within IODP. It is appropriate that the accreting margin study be accomplished soon, considering the indications of a pending great earthquake that will affect a country that has devoted enormous resources to IODP. However, understanding the erosional end-member is scientifically as important to an understanding of fault mechanics. Transoceanic tsunamis affect the entire Pacific rim, where most subduction zones are eroding margins. The Costa Rican subduction zone is less complex operationally and

  10. Dense Ocean Floor Network for Earthquakes and Tsunamis; DONET/ DONET2, Part2 -Development and data application for the mega thrust earthquakes around the Nankai trough-

    Science.gov (United States)

    Kaneda, Y.; Kawaguchi, K.; Araki, E.; Matsumoto, H.; Nakamura, T.; Nakano, M.; Kamiya, S.; Ariyoshi, K.; Baba, T.; Ohori, M.; Hori, T.; Takahashi, N.; Kaneko, S.; Donet Research; Development Group

    2010-12-01

    Yoshiyuki Kaneda, Katsuyoshi Kawaguchi*, Eiichiro Araki*, Shou Kaneko*, Hiroyuki Matsumoto*, Takeshi Nakamura*, Masaru Nakano*, Shinichirou Kamiya*, Keisuke Ariyoshi*, Toshitaka Baba*, Michihiro Ohori*, Narumi Takahashi*, and Takane Hori** (* Earthquake and Tsunami Research Project for Disaster Prevention, Leading Project, Japan Agency for Marine-Earth Science and Technology (JAMSTEC); ** Institute for Research on Earth Evolution, Japan Agency for Marine-Earth Science and Technology (JAMSTEC)). DONET (Dense Ocean Floor Network for Earthquakes and Tsunamis) is a real-time monitoring system for the Tonankai seismogenic zones around the Nankai trough, southwestern Japan. We started developing DONET to perform real-time monitoring of crustal activities there and to build an advanced early warning system. DONET will provide important and useful data for understanding the Nankai trough mega thrust earthquake seismogenic zones and for improving the accuracy of earthquake recurrence cycle simulations. The details of the DONET concept are as follows. 1) Redundancy, extendability and an advanced maintenance system using a looped cable system, junction boxes and ROV/AUV: DONET has 20 observatories and incorporates a double land station concept. We also developed an ROV for 10 km cable extensions and heavy-weight operations. 2) Multiple kinds of sensors to observe broadband phenomena such as long-period tremors, very low frequency earthquakes and strong motions of mega thrust earthquakes over M8: sensors such as a broadband seismometer, an accelerometer, a hydrophone, a precise pressure gauge, a differential pressure gauge and a thermometer are installed at each observatory in DONET. 3) Speedy detection, evaluation and notification of earthquakes and tsunamis: the DONET system will be deployed around the Tonankai seismogenic zone. 4) Provision of data on ocean floor crustal deformation derived from pressure sensors: simultaneously, the development of data

  11. Seismic ACROSS Transmitter Installed at Morimachi above the Subducting Philippine Sea Plate for the Test Monitoring of the Seismogenic Zone of Tokai Earthquake not yet to Occur

    Science.gov (United States)

    Kunitomo, T.; Kumazawa, M.; Masuda, T.; Morita, N.; Torii, T.; Ishikawa, Y.; Yoshikawa, S.; Katsumata, A.; Yoshida, Y.

    2008-12-01

    Here we report the first seismic monitoring system in active and constant operation for measuring the wave propagation characteristics of the tectonic region just above the subducting plate that drives the coming catastrophic earthquakes. Development of such a system (ACROSS, an acronym for Accurately Controlled, Routinely Operated Signal System) started in 1994 at Nagoya University and, since 1996, also at TGC (Tono Geoscience Center) of JAEA, prompted by the Hyogoken Nanbu Earthquake (17 January 1995, Mj = 7.3). The work done on the first-generation system is reported at IWAM04 and in a JAEA report (Kumazawa et al., 2007). The Meteorological Research Institute of JMA started a project for test monitoring of the Tokai area in 2004, in cooperation with Shizuoka University, to realize the practical use of the seismic ACROSS for earthquake prediction research. The first target was set to the Tokai Earthquake, which has not yet taken place. The seismic ACROSS transmitter was designed to be appropriate for sensitive monitoring of the deep active fault zone on the basis of the technology elements accumulated so far. The ground coupler (antenna) is a large steel-reinforced concrete block (over 20 m^3) installed in the basement rocks in order to preserve stability. The eccentric moment of the rotary transmitter is 82 kgm at maximum, 10 times larger than that of the first generation. The carrier frequency of the FM signal for practical use can be from 3.5 to 15 Hz, and the signal phase is accurately controlled by a motor with a vector inverter synchronized to a GPS clock with a precision of 10^-4 radian or better. By referring to the existing structure model in this area (Iidaka et al., 2003), the site of the transmitting station was chosen at Morimachi so as to be appropriate for detecting the

  12. Technical basis of safeguards

    International Nuclear Information System (INIS)

    Buechler, C.

    1975-01-01

    Definition of nuclear materials control. Materials accountancy and physical control as technical possibilities. Legal possibilities and levels of responsibility: material holders, national and international authority. Detection vs. prevention. Physical security and containment surveillance. Accountancy: materials balance concept. Materials measurement: inventory taking, flow determination. IAEA safeguards; verification of operator's statement. (HP) [de

  13. Earthquake Early Warning: A Prospective User's Perspective (Invited)

    Science.gov (United States)

    Nishenko, S. P.; Savage, W. U.; Johnson, T.

    2009-12-01

    With more than 25 million people at risk from high hazard faults in California alone, Earthquake Early Warning (EEW) presents a promising public safety and emergency response tool. EEW represents the real-time end of an earthquake information spectrum which also includes near real-time notifications of earthquake location, magnitude, and shaking levels; as well as geographic information system (GIS)-based products for compiling and visually displaying processed earthquake data such as ShakeMap and ShakeCast. Improvements to and increased multi-national implementation of EEW have stimulated interest in how such information products could be used in the future. Lifeline organizations, consisting of utilities and transportation systems, can use both onsite and regional EEW information as part of their risk management and public safety programs. Regional EEW information can provide improved situational awareness to system operators before automatic system protection devices activate, and allow trained personnel to take precautionary measures. On-site EEW is used for earthquake-actuated automatic gas shutoff valves, triggered garage door openers at fire stations, system controls, etc. While there is no public policy framework for preemptive, precautionary electricity or gas service shutdowns by utilities in the United States, gas shut-off devices are being required at the building owner level by some local governments. In the transportation sector, high-speed rail systems have already demonstrated the ‘proof of concept’ for EEW in several countries, and more EEW systems are being installed. Recently the Bay Area Rapid Transit District (BART) began collaborating with the California Integrated Seismic Network (CISN) and others to assess the potential benefits of EEW technology to mass transit operations and emergency response in the San Francisco Bay region. A key issue in this assessment is that significant earthquakes are likely to occur close to or within the BART

  14. Dim prospects for earthquake prediction

    Science.gov (United States)

    Geller, Robert J.

    I was misquoted by C. Lomnitz's [1998] Forum letter (Eos, August 4, 1998, p. 373), which said: "I wonder whether Sasha Gusev [1998] actually believes that branding earthquake prediction a 'proven nonscience' [Geller, 1997a] is a paradigm for others to copy." Readers are invited to verify for themselves that neither "proven nonscience" nor any similar phrase was used by Geller [1997a].

  15. Advanced Test Reactor Safety Basis Upgrade Lessons Learned Relative to Design Basis Verification and Safety Basis Management

    International Nuclear Information System (INIS)

    G. L. Sharp; R. T. McCracken

    2004-01-01

    The Advanced Test Reactor (ATR) is a pressurized light-water reactor with a design thermal power of 250 MW. The principal function of the ATR is to provide a high neutron flux for testing reactor fuels and other materials. The reactor also provides other irradiation services such as radioisotope production. The ATR and its support facilities are located at the Test Reactor Area of the Idaho National Engineering and Environmental Laboratory (INEEL). An audit conducted by the Department of Energy's Office of Independent Oversight and Performance Assurance (DOE OA) raised concerns that design conditions at the ATR were not adequately analyzed in the safety analysis and that legacy design basis management practices had the potential to further impact safe operation of the facility. The concerns identified by the audit team, and issues raised during additional reviews performed by ATR safety analysts, were evaluated through the unreviewed safety question process, resulting in shutdown of the ATR for more than three months while these concerns were resolved. Past management of the ATR safety basis, relative to facility design basis management and change control, led to concerns that discrepancies in the safety basis may have developed. Although not required by DOE orders or regulations, not performing design basis verification in conjunction with development of the 10 CFR 830 Subpart B upgraded safety basis allowed these potential weaknesses to be carried forward. Configuration management and a clear definition of the existing facility design basis have a direct relation to developing and maintaining a high quality safety basis which properly identifies and mitigates all hazards and postulated accident conditions. These relations and the impact of past safety basis management practices have been reviewed in order to identify lessons learned from the safety basis upgrade process and appropriate actions to resolve possible concerns with respect to the current ATR safety

  16. Earthquake forewarning in the Cascadia region

    Science.gov (United States)

    Gomberg, Joan S.; Atwater, Brian F.; Beeler, Nicholas M.; Bodin, Paul; Davis, Earl; Frankel, Arthur; Hayes, Gavin P.; McConnell, Laura; Melbourne, Tim; Oppenheimer, David H.; Parrish, John G.; Roeloffs, Evelyn A.; Rogers, Gary D.; Sherrod, Brian; Vidale, John; Walsh, Timothy J.; Weaver, Craig S.; Whitmore, Paul M.

    2015-08-10

    This report, prepared for the National Earthquake Prediction Evaluation Council (NEPEC), is intended as a step toward improving communications about earthquake hazards between information providers and users who coordinate emergency-response activities in the Cascadia region of the Pacific Northwest. NEPEC charged a subcommittee of scientists with writing this report about forewarnings of increased probabilities of a damaging earthquake. We begin by clarifying some terminology; a "prediction" refers to a deterministic statement that a particular future earthquake will or will not occur. In contrast to the 0- or 100-percent likelihood of a deterministic prediction, a "forecast" describes the probability of an earthquake occurring, which may range from >0 to <100 percent. The report then considers processes or conditions that may warrant a forewarning, which may include increased rates of M>4 earthquakes on the plate interface north of the Mendocino region.

  17. Implementation of the project for the construction and operation of a nuclear heat and power plant on the basis of a floating power unit with KLT-40C reactors

    Energy Technology Data Exchange (ETDEWEB)

    Polushkin, A K; Kuzin, E A [JSC Malaya Energetika, Moscow (Russian Federation); Vorobiov, V M [JSC Atomenergo, St. Petersburg (Russian Federation); Klykov, D M [JSC Iceberg, St. Petersburg (Russian Federation); Panov, J K [OKBM, Nizhny Novgorod (Russian Federation)

    2000-09-01

    This paper presents the results of research and development on a floating nuclear power plant (FNPP) for electricity and heat production for remote locations and small island or coastal communities. Evaluations of the construction period, social and economic factors, as well as safety and operational issues of the non-self-propelled barge-mounted NPP are given. (author)

  18. Implementation of the project for the construction and operation of a nuclear heat and power plant on the basis of a floating power unit with KLT-40C reactors

    International Nuclear Information System (INIS)

    Polushkin, A.K.; Kuzin, E.A.; Vorobiov, V.M.; Klykov, D.M.; Panov, J.K.

    2000-01-01

    This paper presents the results of research and development on a floating nuclear power plant (FNPP) for electricity and heat production for remote locations and small island or coastal communities. Evaluations of the construction period, social and economic factors, as well as safety and operational issues of the non-self-propelled barge-mounted NPP are given. (author)

  19. Understanding Great Earthquakes in Japan's Kanto Region

    Science.gov (United States)

    Kobayashi, Reiji; Curewitz, Daniel

    2008-10-01

    Third International Workshop on the Kanto Asperity Project; Chiba, Japan, 16-19 February 2008; The 1703 (Genroku) and 1923 (Taisho) earthquakes in Japan's Kanto region (M 8.2 and M 7.9, respectively) caused severe damage in the Tokyo metropolitan area. These great earthquakes occurred along the Sagami Trough, where the Philippine Sea slab is subducting beneath Japan. Historical records, paleoseismological research, and geophysical/geodetic monitoring in the region indicate that such great earthquakes will repeat in the future.

  20. Earthquake-triggered landslides in southwest China

    OpenAIRE

    X. L. Chen; Q. Zhou; H. Ran; R. Dong

    2012-01-01

    Southwest China is located in the southeastern margin of the Tibetan Plateau and it is a region of high seismic activity. Historically, strong earthquakes that occurred here usually generated lots of landslides and brought destructive damages. This paper introduces several earthquake-triggered landslide events in this region and describes their characteristics. Also, the historical data of earthquakes with a magnitude of 7.0 or greater, having occurred in this region, is col...

  1. Systems required during and after an earthquake. Summary report. WWER-1000 nuclear power plants

    International Nuclear Information System (INIS)

    Monette, P.

    1995-01-01

    The scope of this document is to list the mechanical, instrumentation and electrical components required during and after an earthquake in order to achieve and maintain safe shutdown conditions of a WWER-1000 type nuclear power plant. The main objective pursued in establishing the systems and equipment list is to provide guidance for the design and implementation of the backfits which are necessary to increase the seismic resistance of the components required after an earthquake. The presented list is established on a generic basis, i.e. it is applicable to any specific WWER-1000

  2. A fast combinatorial enhancement technique for earthquake damage identification based on remote sensing image

    Science.gov (United States)

    Dou, Aixia; Wang, Xiaoqing; Ding, Xiang; Du, Zecheng

    2010-11-01

    Based on a study of enhancement methods for remote sensing images obtained after several earthquakes, the paper designs a new, optimized image enhancement model implemented by combining different single methods. The patterns of elementary model units and the combined types of model are defined. Based on the enhancement model database, the combinatorial model algorithm was implemented in C++. The combined model was tested by processing the aerial remote sensing images obtained after the 1976 Tangshan earthquake. It was shown that the definition and implementation of the combined enhancement model can efficiently improve the capability and flexibility of the image enhancement algorithm.
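    The combinatorial idea, chaining elementary enhancement units into one model, can be sketched as follows. The original implementation was in C++; this is an illustrative Python sketch with two placeholder unit methods (percentile contrast stretch and global histogram equalization), not the model database of the paper.

```python
import numpy as np

def linear_stretch(img, low=2, high=98):
    """Contrast stretch between the given percentiles of the image."""
    lo, hi = np.percentile(img, [low, high])
    return np.clip((img - lo) / (hi - lo + 1e-9), 0.0, 1.0)

def hist_equalize(img, bins=256):
    """Simple global histogram equalization on a [0, 1] image."""
    hist, edges = np.histogram(img.ravel(), bins=bins, range=(0.0, 1.0))
    cdf = np.cumsum(hist).astype(float)
    cdf /= cdf[-1]
    return np.interp(img.ravel(), edges[:-1], cdf).reshape(img.shape)

def combined_model(img, units):
    """Apply a sequence of elementary enhancement units in order."""
    for unit in units:
        img = unit(img)
    return img

# Hypothetical grayscale image; a real application would load a post-event scene.
image = np.random.rand(128, 128)
enhanced = combined_model(image, [linear_stretch, hist_equalize])
print(enhanced.min(), enhanced.max())
```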

  3. IAEA safety guides in the light of recent developments in earthquake engineering

    International Nuclear Information System (INIS)

    Gurpinar, A.

    1988-11-01

    The IAEA safety guides 50-SG-S1 and 50-SG-S2 emphasize the determination of the design basis earthquake ground motion and earthquake-resistant design considerations for nuclear power plants, respectively. Years have elapsed since the elaboration of these safety guides, and a review of some of their concepts is necessary, taking into account the information collected and the technical developments since then. In this article, topics within the scope of these safety guides are discussed. In particular, the results of some recent research which may have a bearing on the nuclear industry are highlighted. Conclusions and recommendations are presented. 6 fig., 19 refs. (F.M.)

  4. On fundamental concept of anti-earthquake design of equipment and pipings

    International Nuclear Information System (INIS)

    Shibata, H.; Kato, M.

    1979-01-01

    This paper deals with a new concept for the anti-earthquake design of equipment and piping in nuclear power plants. The usual anti-earthquake design of such items starts from the design basis ground motions, proceeds via floor responses, and ends at the stress analysis of each structural element. However, the same types of equipment are used in plants under various site conditions, and the ordinarily used method obliges the repetition of this design procedure for each plant. The new design method has been developed to avoid such time-consuming repetitions. (orig.)

  5. A Method for Estimation of Death Tolls in Disastrous Earthquake

    Science.gov (United States)

    Pai, C.; Tien, Y.; Teng, T.

    2004-12-01

    whether the districts are more urbanized or not. As far as present research is concerned, no good and reliable relationship between mortality and the characteristics of ground motions has been established. We propose the concept of Equal Population Gaps to resolve the influence of mortality in rural versus urban districts and to decide the weighting function for each district. The relationship between the PGA Index and the mortality determined in this study can be expressed as M = 28.9 / [1 + exp(1.67 - 0.0029 × PGA)], where M is the mortality in % and PGA is the PGA Index in gals. The corresponding curve matches the data reasonably well, with R² = 0.91. We carry out the estimation for districts at different scales to verify the feasibility of the method. The PGA Index-based mortality is particularly useful in real-time applications for death toll prediction and assessment, a piece of information most critical for post-earthquake emergency response operations.
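    The logistic relation quoted above can be evaluated directly; the coefficients in the sketch below are exactly those stated in the abstract, while the example PGA Index values are arbitrary.

```python
import numpy as np

def mortality_percent(pga_index_gal):
    """Mortality (%) as a function of the PGA Index (gal), using the logistic
    relation quoted in the abstract: M = 28.9 / (1 + exp(1.67 - 0.0029 * PGA))."""
    pga = np.asarray(pga_index_gal, dtype=float)
    return 28.9 / (1.0 + np.exp(1.67 - 0.0029 * pga))

# Example: expected mortality at PGA Index values of 200, 400 and 800 gal
for pga in (200, 400, 800):
    print(f"PGA Index {pga:4d} gal -> mortality {mortality_percent(pga):5.2f} %")
```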

  6. Emergency feature. Great east Japan earthquake disaster Fukushima Daiichi accident

    International Nuclear Information System (INIS)

    Kawata, Tomio; Tsujikura, Yonezo; Kitamura, Toshiro

    2011-01-01

    The Tohoku Pacific Ocean earthquake occurred on March 11, 2011. The disastrous tsunami struck the Fukushima Daiichi nuclear power plants after they had automatically shut down due to the earthquake, and all motor-operated pumps became inoperable due to station blackout. Despite the strenuous efforts of operators, this caused a serious accident involving loss of cooling function, hydrogen explosions and the release of a large amount of radioactive materials into the environment, leading to a nuclear emergency in which residents were ordered to evacuate or remain indoors. This emergency feature consists of four articles. The first is an interview with the president of JAIF (Japan Atomic Industrial Forum) on how to identify the cause of the accident completely, intensify safety assurance measures and promote discussion of the role of nuclear power in the nation's entire energy policy toward reconstruction. The others cover the reactor states and event sequence after the accident with trend data on radiation at the reactor site, a statement by the president of AESJ (Atomic Energy Society of Japan) on the nuclear crisis following the Tohoku Pacific Ocean earthquake and our response, and a personal account of evacuation life. (T. Tanaka)

  7. Tsunami Prediction and Earthquake Parameters Estimation in the Red Sea

    KAUST Repository

    Sawlan, Zaid A

    2012-12-01

    Tsunami concerns have increased worldwide after the 2004 Indian Ocean tsunami and the 2011 Tohoku tsunami. Consequently, tsunami models have developed rapidly in the last few years. One of the advanced tsunami models is the GeoClaw tsunami model introduced by LeVeque (2011). This model is adaptive and consistent. Because of different sources of uncertainty in the model, observations are needed to improve model prediction through a data assimilation framework. The model inputs are earthquake parameters and topography. This thesis introduces a real-time tsunami forecasting method that combines the tsunami model with observations using a hybrid ensemble Kalman filter and ensemble Kalman smoother. The filter is used for state prediction while the smoother estimates the earthquake parameters. This method reduces the error produced by uncertain inputs. In addition, a state-parameter EnKF is implemented to estimate the earthquake parameters. Although the number of observations is small, the estimated parameters generate a better tsunami prediction than the model alone. Methods and results of prediction experiments in the Red Sea are presented, and the prospect of developing an operational tsunami prediction system in the Red Sea is discussed.
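    The assimilation step at the core of such a method can be illustrated with a generic stochastic EnKF analysis update (a minimal sketch, not the hybrid filter/smoother scheme of the thesis); the state dimension, ensemble size and observation operator below are hypothetical.

```python
import numpy as np

def enkf_analysis(ensemble, H, obs, obs_err_std, rng):
    """Stochastic EnKF analysis step.
    ensemble: (n_state, n_members) forecast ensemble
    H:        (n_obs, n_state) linear observation operator
    obs:      (n_obs,) observation vector
    Returns the updated (analysis) ensemble."""
    n_obs, n_members = H.shape[0], ensemble.shape[1]
    R = np.eye(n_obs) * obs_err_std**2

    X_mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - X_mean                        # state anomalies
    HA = H @ A                                   # observed anomalies
    P_hh = HA @ HA.T / (n_members - 1) + R
    P_xh = A @ HA.T / (n_members - 1)
    K = P_xh @ np.linalg.inv(P_hh)               # Kalman gain

    # Perturb observations so the analysis spread stays statistically consistent
    perturbed = obs[:, None] + rng.normal(0.0, obs_err_std, (n_obs, n_members))
    return ensemble + K @ (perturbed - H @ ensemble)

rng = np.random.default_rng(0)
ens = rng.normal(size=(4, 50))                   # hypothetical 4-variable state, 50 members
H = np.array([[1.0, 0.0, 0.0, 0.0]])             # observe the first state variable only
print(enkf_analysis(ens, H, np.array([0.8]), 0.1, rng).mean(axis=1))
```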

  8. Methods of qualifying electrical cabinets for the load case earthquake

    International Nuclear Information System (INIS)

    Henkel, F.-O.; Kennerknecht, H.; Haefeli, T.; Jorgensen, F.

    2005-01-01

    In the qualification of electrical system cabinets for the earthquake load case, two objectives are distinguished: a) stability of the cabinet, and b) functionality of the built-in electrical modules during and after the earthquake. There are three methods to attain these goals: analyses, tests and proof by analogy. A common method is shaking a complete cabinet on a shaking table, with the advantage that stability and functionality can be proved at the same time, but with the disadvantage that quite expensive test equipment, especially a multi-axis shaking table, is necessary, and that a cabinet proved for SSE is generally pre-affected and thus may not be incorporated into the plant offhand; in the extreme case the cabinet must be built twice. As a rule, analyses are currently carried out by means of finite-element models of the supporting structure, with the electrical components considered at least through their masses. Such an analysis can prove the stability and trace the excitation down to the anchoring points of the electrical components (Henkel et al., 1987). The combination of the two aforementioned methods often constitutes the best way: the stability of the cabinet is proved by calculations, and the functionality of the safety-relevant modules by tests. Once tested, modules identical in construction can be used in cabinets without further testing for earthquakes of similar or lower levels. Proof by analogy is possible only if tests or analyses of similar cabinets were done in advance. By comparing the supporting structure, mass allocation and distribution, and the level and shape of the earthquake excitation, it can be shown that the planned cabinet is covered by cabinets already tested or analysed (Katona et al., 1995). All facets of the various methods, with their advantages and disadvantages, are discussed and explained on the basis of numerous examples. (authors)

  9. Retrospective analysis of the Spitak earthquake

    Directory of Open Access Journals (Sweden)

    A. K. Tovmassian

    1995-06-01

    Based on the retrospective analysis of numerous data and studies of the Spitak earthquake, the present work attempts to shed light on different aspects of that catastrophic seismic event which occurred in Northern Armenia on December 7, 1988. The authors follow a chronological order of presentation, namely: changes in geosphere, atmosphere, biosphere during the preparation of the Spitak earthquake, foreshocks, main shock, aftershocks, focal mechanisms, historical seismicity; seismotectonic position of the source, strong motion records, site effects; the macroseismic effect, collapse of buildings and structures; rescue activities; earthquake consequences; and the lessons of the Spitak earthquake.

  10. Smoking prevalence increases following Canterbury earthquakes.

    Science.gov (United States)

    Erskine, Nick; Daley, Vivien; Stevenson, Sue; Rhodes, Bronwen; Beckert, Lutz

    2013-01-01

    A magnitude 7.1 earthquake hit Canterbury in September 2010. This earthquake and associated aftershocks took the lives of 185 people and drastically changed residents' living, working, and social conditions. To explore the impact of the earthquakes on smoking status and levels of tobacco consumption in the residents of Christchurch, semistructured interviews were carried out in two city malls and the central bus exchange 15 months after the first earthquake. A total of 1001 people were interviewed. In August 2010, prior to any earthquake, 409 (41%) participants had never smoked, 273 (27%) were currently smoking, and 316 (32%) were ex-smokers. Since the September 2010 earthquake, 76 (24%) of the 316 ex-smokers had smoked at least one cigarette and 29 (38.2%) had smoked more than 100 cigarettes. Of the 273 participants who were current smokers in August 2010, 93 (34.1%) had increased consumption following the earthquake, 94 (34.4%) had not changed, and 86 (31.5%) had decreased their consumption. 53 (57%) of the 93 people whose consumption increased cited the earthquake and subsequent lifestyle changes as a reason for smoking more. In summary, 24% of ex-smokers resumed smoking following the earthquake, resulting in increased smoking prevalence, and tobacco consumption levels increased in around one-third of current smokers.

  11. Thermal infrared anomalies of several strong earthquakes.

    Science.gov (United States)

    Wei, Congxin; Zhang, Yuansheng; Guo, Xiao; Hui, Shaoxing; Qin, Manzhong; Zhang, Ying

    2013-01-01

    In the history of earthquake thermal infrared research, it is undeniable that before and after strong earthquakes there are significant thermal infrared anomalies, which have been interpreted as preseismic precursors for earthquake prediction and forecasting. In this paper, we studied the characteristics of thermal radiation observed before and after 8 great earthquakes with magnitude up to Ms 7.0, using satellite infrared remote sensing information. We used new types of data and methods to extract the useful anomaly information. Based on the analyses of the 8 earthquakes, we obtained the following results. (1) There are significant thermal radiation anomalies before and after the earthquakes in all cases. The overall behaviour of the anomalies includes two main stages: expanding first and narrowing later. We easily extracted and identified such seismic anomalies by the method of "time-frequency relative power spectrum." (2) There exist evident and different characteristic periods and magnitudes of anomalous thermal radiation for each case. (3) The thermal radiation anomalies are closely related to the geological structure. (4) The thermal radiation has obvious characteristics in anomaly duration, range, and morphology. In summary, earthquake thermal infrared anomalies can be used as a useful precursor in earthquake prediction and forecasting.

  12. Impact- and earthquake- proof roof structure

    International Nuclear Information System (INIS)

    Shohara, Ryoichi.

    1990-01-01

    Building roofs are constituted of roof slabs, an earthquake-proof layer on their upper surface, and an impact-proof layer made of iron-reinforced concrete disposed above that. Since the roof forms an earthquake-proof structure in which the concrete layer loads building dampers on the upper surface of the slabs, the seismic input of earthquakes to the building can be moderated, and the impact-proof layer ensures safety against external events such as earthquakes or airplane crashes at important facilities such as reactor buildings. (T.M.)

  13. A minimalist model of characteristic earthquakes

    DEFF Research Database (Denmark)

    Vázquez-Prada, M.; González, Á.; Gómez, J.B.

    2002-01-01

    In a spirit akin to the sandpile model of self-organized criticality, we present a simple statistical model of the cellular-automaton type which simulates the role of an asperity in the dynamics of a one-dimensional fault. This model produces an earthquake spectrum similar to the characteristic-earthquake behaviour of some seismic faults. This model, which has no parameters, is amenable to an algebraic description as a Markov chain. This possibility illuminates some important results, obtained by Monte Carlo simulations, such as the earthquake size-frequency relation and the recurrence time of the characteristic earthquake.

  14. Global Significant Earthquake Database, 2150 BC to present

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Significant Earthquake Database is a global listing of over 5,700 earthquakes from 2150 BC to the present. A significant earthquake is classified as one that...

  15. Hazus® estimated annualized earthquake losses for the United States

    Science.gov (United States)

    Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean

    2017-01-01

    Large earthquakes can cause social and economic disruption that can be unprecedented to any given community, and the full recovery from these impacts may or may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history; and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from 2008 M7.9 Wenchuan China, ~20 billion USD from 2010 M8.8 Maule earthquake in Chile, ~220 billion USD from 2011 M9.0 Tohoku Japan earthquake, ~25 billion USD from 2011 M6.3 Christchurch New Zealand, and ~22 billion USD from 2016 M7.0 Kumamoto Japan). Recent earthquakes show a pattern of steadily increasing damages and losses that are primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) an increased interdependency in terms of supply and demand for the businesses that operate among different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development even though the earthquake hazard has remained relatively stable except for the regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people—especially in high-risk areas. Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the
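    Annualized loss estimates of this kind integrate scenario losses over their annual frequencies of exceedance. The sketch below shows that generic calculation with made-up numbers; it is not the Hazus implementation or its data.

```python
import numpy as np

# Hypothetical loss exceedance points: losses (million USD) estimated for
# ground-motion scenarios with the given annual exceedance frequencies.
annual_frequency = np.array([1/100, 1/250, 1/500, 1/1000, 1/2500])
loss_musd        = np.array([  120,   480,  1100,   2600,   5400])

# Annualized loss = area under the loss vs. annual-frequency curve,
# approximated here with the trapezoidal rule (a generic approach,
# not the specific Hazus AEL implementation).
order = np.argsort(annual_frequency)
ael = np.trapz(loss_musd[order], annual_frequency[order])
print(f"annualized earthquake loss ~ {ael:.1f} million USD per year")
```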

  16. Transuranic waste storage and assay facility (TRUSAF) interim safety basis

    International Nuclear Information System (INIS)

    Gibson, K.D.

    1995-09-01

    The TRUSAF ISB is based upon the current facility configuration and procedures. The purpose of the document is to provide the basis for interim operation or restrictions on interim operations, and the authorization basis for the TRUSAF at the Hanford Site. The previous safety analysis document, TRUSAF Hazards Identification and Evaluation (WHC 1977), is superseded by this document.

  17. Fractal analysis of the ULF geomagnetic data obtained at Izu Peninsula, Japan in relation to the nearby earthquake swarm of June–August 2000

    Directory of Open Access Journals (Sweden)

    K. Gotoh

    2003-01-01

    In our recent papers we applied fractal methods to extract earthquake precursory signatures from the scaling characteristics of ULF geomagnetic data obtained in the seismically active region of Guam Island during the large earthquake of 8 August 1993. We found specific dynamics of their fractal characteristics (spectral exponents and fractal dimensions) before the earthquake: the appearance of flicker-noise signatures and an increase of the time series fractal dimension. Here we analyze ULF geomagnetic data obtained in the seismically active region of the Izu Peninsula, Japan, during a swarm of strong nearby earthquakes in June–August 2000 and compare the results obtained in both regions. We apply the same methodology of data processing, using the FFT procedure, the Higuchi method and the Burlaga-Klein approach to calculate the spectral exponents and fractal dimensions of the ULF time series. We found common features and specific peculiarities in the behavior of the fractal characteristics of the ULF time series before the Izu and Guam earthquakes. As a common feature, we obtained the same increase of the ULF time series fractal dimension before the earthquakes; as a specific peculiarity, this increase appears to be sharp for the Izu earthquake, in comparison with the gradual increase of the ULF time series fractal dimension for the Guam earthquake. The results obtained in both regions are discussed on the basis of the SOC (self-organized criticality) concept, taking into account the differences in the depths of the earthquake foci. On the basis of the peculiarities revealed, we advance a methodology for the extraction of earthquake precursory signatures. As an adjacent step, we suggest the combined analysis of the ULF time series in the parametric space polarization ratio – fractal dimension. We reason also upon the advantage of the multifractal approach with resp
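    The Higuchi method mentioned above is a standard way to estimate the fractal dimension of a time series; the sketch below implements its usual textbook form and applies it to a synthetic random-walk signal in place of real ULF geomagnetic data.

```python
import numpy as np

def higuchi_fd(x, k_max=10):
    """Higuchi fractal dimension of a 1-D time series.
    The curve length L(k) is computed for lags k = 1..k_max and the fractal
    dimension is the slope of log L(k) versus log(1/k)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    lengths = []
    for k in range(1, k_max + 1):
        Lk = []
        for m in range(k):
            idx = np.arange(m, N, k)
            if len(idx) < 2:
                continue
            dist = np.abs(np.diff(x[idx])).sum()
            norm = (N - 1) / ((len(idx) - 1) * k)   # Higuchi normalization factor
            Lk.append(dist * norm / k)
        lengths.append(np.mean(Lk))
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lengths), 1)
    return slope

# Fractional Brownian-like test signal; ULF data would be used in practice
rng = np.random.default_rng(1)
signal = np.cumsum(rng.normal(size=4096))
print("Higuchi fractal dimension:", round(higuchi_fd(signal), 3))
```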