WorldWideScience

Sample records for conventional seismic standards

  1. Expanding Conventional Seismic Stratigraphy into the Multicomponent Seismic Domain

    Energy Technology Data Exchange (ETDEWEB)

    Innocent Aluka

    2008-08-31

    Multicomponent seismic data are composed of three independent vector-based seismic wave modes: the compressional mode (P) and the shear modes SV and SH. The three modes are generated using three orthogonal source-displacement vectors and recorded using three orthogonal vector sensors. The components travel through the earth at differing velocities and directions. The velocities of SH and SV as they travel through the subsurface differ by only a few percent, but the shear-wave velocity (Vs) is appreciably lower than the P-wave velocity (Vp). The velocity ratio Vp/Vs varies in the earth by an order of magnitude, from 15 to 1.5, depending on the degree of sedimentary lithification. The data used in this study were acquired by a nine-component (9C) vertical seismic profile (VSP) using three orthogonal vector sources. The 9C vertical seismic profile is capable of generating the P-wave mode and the fundamental S-wave modes (SH-SH and SV-SV) directly at the source station, and it permits the basic components of the elastic wavefield (P, SH-SH and SV-SV) to be separated from one another for the purposes of imaging. Analysis and interpretation of data from the study area show that an incident full-elastic seismic wavefield can reflect four different wave modes, P, SH, SV and C, which can be utilized to fully understand the architecture and heterogeneities of geologic sequences. Conventional seismic stratigraphy utilizes only reflected P-wave modes. The notation SH mode is the same as SH-SH; SV mode means SV-SV; and the C mode, a converted shear wave, is a special SV mode, the same as P-SV. These four wave modes image unique geologic stratigraphy and facies and at the same time reflect independent stratal surfaces because of the unique orientations of their particle-displacement vectors. As a result of the distinct orientation of each mode's particle-displacement vector, one mode may react to a critical subsurface sequence
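
    As a quick illustration of the Vp/Vs range quoted above, the same P-wave velocity implies very different shear velocities at the two ends of the lithification scale (the numeric values below are illustrative, not from the study):

```python
def shear_velocity(vp, vp_vs_ratio):
    """Return S-wave velocity given a P-wave velocity and a Vp/Vs ratio."""
    if vp_vs_ratio <= 0:
        raise ValueError("Vp/Vs ratio must be positive")
    return vp / vp_vs_ratio

# The same 3000 m/s P-wave velocity at the two extremes of lithification:
print(shear_velocity(3000.0, 1.5))   # well-lithified rock: Vs = 2000 m/s
print(shear_velocity(3000.0, 15.0))  # unconsolidated sediment: Vs = 200 m/s
```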

  2. Standardizing Naming Conventions in Radiation Oncology

    Energy Technology Data Exchange (ETDEWEB)

    Santanam, Lakshmi [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, MO (United States); Hurkmans, Coen [Department of Radiation Oncology, Catharina Hospital, Eindhoven (Netherlands); Mutic, Sasa [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, MO (United States); Vliet-Vroegindeweij, Corine van [Department of Radiation Oncology, Thomas Jefferson University Hospital, Philadelphia, PA (United States); Brame, Scott; Straube, William [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, MO (United States); Galvin, James [Department of Radiation Oncology, Thomas Jefferson University Hospital, Philadelphia, PA (United States); Tripuraneni, Prabhakar [Department of Radiation Oncology, Scripps Clinic, LaJolla, CA (United States); Michalski, Jeff [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, MO (United States); Bosch, Walter, E-mail: wbosch@radonc.wustl.edu [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, MO (United States); Advanced Technology Consortium, Image-guided Therapy QA Center, St. Louis, MO (United States)

    2012-07-15

    Purpose: The aim of this study was to report on the development of a standardized target and organ-at-risk naming convention for use in radiation therapy and to present the nomenclature for structure naming for interinstitutional data sharing, clinical trial repositories, integrated multi-institutional collaborative databases, and quality control centers. This taxonomy should also enable improved plan benchmarking between clinical institutions and vendors and facilitation of automated treatment plan quality control. Materials and Methods: The Advanced Technology Consortium, Washington University in St. Louis, Radiation Therapy Oncology Group, Dutch Radiation Oncology Society, and the Clinical Trials RT QA Harmonization Group collaborated in creating this new naming convention. The International Commission on Radiation Units and Measurements guidelines have been used to create standardized nomenclature for target volumes (clinical target volume, internal target volume, planning target volume, etc.), organs at risk, and planning organ-at-risk volumes in radiation therapy. The nomenclature also includes rules for specifying laterality and margins for various structures. The naming rules distinguish tumor and nodal planning target volumes, with correspondence to their respective tumor/nodal clinical target volumes. It also provides rules for basic structure naming, as well as an option for more detailed names. Names of nonstandard structures used mainly for plan optimization or evaluation (rings, islands of dose avoidance, islands where additional dose is needed [dose painting]) are identified separately. Results: In addition to its use in 16 ongoing Radiation Therapy Oncology Group advanced technology clinical trial protocols and several new European Organization for Research and Treatment of Cancer protocols, a pilot version of this naming convention has been evaluated using patient data sets with varying treatment sites. All structures in these data sets were
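
    Naming rules of the kind described above (target-volume type, laterality, margin) lend themselves to programmatic construction and validation. The sketch below is a hypothetical illustration of that idea; the field names and format are invented and do not reproduce the published nomenclature:

```python
# Hypothetical rule-based structure naming, in the spirit of the convention
# described above. VALID_TYPES follows ICRU target-volume terminology; the
# underscore format and zero-padded margin are assumptions for illustration.
VALID_TYPES = {"GTV", "CTV", "ITV", "PTV"}
VALID_LATERALITY = {"L", "R"}

def structure_name(volume_type, site, laterality=None, margin_mm=None):
    if volume_type not in VALID_TYPES:
        raise ValueError(f"unknown target volume type: {volume_type}")
    parts = [volume_type, site]
    if laterality is not None:
        if laterality not in VALID_LATERALITY:
            raise ValueError("laterality must be 'L' or 'R'")
        parts.append(laterality)
    if margin_mm is not None:
        parts.append(f"{margin_mm:02d}")  # e.g. "05" for a 5 mm margin
    return "_".join(parts)

print(structure_name("PTV", "Prost", margin_mm=5))    # PTV_Prost_05
print(structure_name("CTV", "Lung", laterality="L"))  # CTV_Lung_L
```

    A consistent, machine-checkable format like this is what enables the automated plan quality control and interinstitutional pooling the study aims at.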

  3. Sensitivity of Seismic Interferometry and Conventional Reflection Seismics at a Landfill to Processing and Survey Errors

    NARCIS (Netherlands)

    Konstantaki, L.A.; Draganov, D.S.; Heimovaara, T.J.; Ghose, R.

    2013-01-01

    Understanding how sensitive the seismic method is to errors that can occur during a seismic survey or during the processing of the seismic data is of high importance for any exploration geophysical project. Our aim is to image the subsurface of a landfill, which is typically a heterogeneous system

  4. Sandia software guidelines. Volume 3. Standards, practices, and conventions

    Energy Technology Data Exchange (ETDEWEB)

    1986-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies software standards, conventions, and practices. These guidelines are the result of a collective effort within Sandia National Laboratories to define recommended deliverables and to document standards, practices, and conventions which will help ensure quality software. 66 refs., 5 figs., 6 tabs.

  5. Development and Demand Analysis of Convention and Exhibition Industry Standardization

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This article points out the necessity and urgency of accelerating convention and exhibition industrial standardization on the basis of an analysis of the industry demands, developing tendencies, and existing problems during development.

  6. OGC Web Services standards by example : the European Seismic Portal

    Science.gov (United States)

    Frobert, L.; Kamb, L.; Trani, L.; Spinuso, A.; Bossu, R.; Van Eck, T.

    2011-12-01

    NERIES (2006-2010) was an Integrated Infrastructure Initiative (I3) project in the Sixth Framework Program (FP6) of the European Commission (EC), aimed at networking the European seismic networks, improving access to data, allowing access to specific seismic infrastructures, and pursuing targeted research to develop the next generation of tools for improved service and data analysis. During this project, a web portal was developed that used web services to access data and visual web applications to display them. However, these web services did not conform to any standard, making them difficult to consume from any new user interface. Therefore, for the NERA project, the follow-up to NERIES, we have proposed the use of web service standards to access our data. We decided to use standards defined by the Open Geospatial Consortium (OGC). The OGC defines standards for web service interfaces to access geo-tagged data. Events and seismic stations are also geo-tagged, making these web services suitable for our purpose. Using standard web services gives us the opportunity to serve our data to all consumers that conform to these standards, across various programming languages and applications. We have implemented a preliminary version of web services conforming to the Web Map Service (WMS) and Web Feature Service (WFS) standards to access our catalog of seismic events (nearly 200 000 events). To visualize them we have made four example demos on our web site using different technologies (Adobe Flash, JavaScript, Java with NASA World Wind, and uDig, a desktop GIS application). In the future we hope to implement other OGC web service standards, such as: - Sensor Observation Service (SOS) to provide seismic waveform records; - Web Notification Service (WNS); - Catalog Service for the Web (CSW) to provide a search engine over all our web services; - Web Processing Service (WPS) to process data between different services. The power of the use of OGC standards is the easy
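
    As an illustration of why such standards ease consumption, a WFS GetFeature request is just a parameterized URL. The endpoint and feature-type name below are invented placeholders, while the query parameters themselves are standard WFS 1.1.0:

```python
# Build a WFS GetFeature request URL for a (hypothetical) seismic event layer.
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, bbox=None, max_features=100):
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": type_name,
        "maxFeatures": str(max_features),
    }
    if bbox is not None:  # (min_lon, min_lat, max_lon, max_lat)
        params["bbox"] = ",".join(str(v) for v in bbox)
    return base_url + "?" + urlencode(params)

# Placeholder endpoint and layer name, roughly Europe-sized bounding box:
url = wfs_getfeature_url("https://example.org/wfs", "seismology:events",
                         bbox=(-10.0, 35.0, 30.0, 60.0), max_features=50)
print(url)
```

    Any WFS-conformant client, whether a desktop GIS like uDig or a browser application, can issue the same request without custom glue code.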

  7. Conventionalism and Methodological Standards in Contending with Skepticism about Uncertainty

    Science.gov (United States)

    Brumble, K. C.

    2012-12-01

    What it means to measure and interpret confidence and uncertainty in a result is often particular to a specific scientific community and its methodology of verification. Additionally, methodology in the sciences varies greatly across disciplines and scientific communities. Understanding the accuracy of predictions of a particular science thus depends largely upon having an intimate working knowledge of the methods, standards, and conventions utilized and underpinning discoveries in that scientific field. Thus, valid criticism of scientific predictions and discoveries must be conducted by those who are literate in the field in question: they must have intimate working knowledge of the methods of the particular community and of the particular research under question. The interpretation and acceptance of uncertainty is one such shared, community-based convention. In the philosophy of science, this methodological and community-based way of understanding scientific work is referred to as conventionalism. By applying the conventionalism of historian and philosopher of science Thomas Kuhn to recent attacks upon methods of multi-proxy mean temperature reconstructions, I hope to illuminate how climate skeptics and their adherents fail to appreciate the need for community-based fluency in the methodological standards for understanding uncertainty shared by the wider climate science community. Further, I will flesh out a picture of climate science community standards of evidence and statistical argument following the work of philosopher of science Helen Longino. I will describe how failure to appreciate the conventions of professionalism and standards of evidence accepted in the climate science community results in the application of naïve falsification criteria. Appeal to naïve falsification in turn has allowed scientists outside the standards and conventions of the mainstream climate science community to consider themselves and to be judged by climate skeptics as valid

  8. Finite element analyses for seismic shear wall international standard problem

    Energy Technology Data Exchange (ETDEWEB)

    Park, Y.J.; Hofmayer, C.H.

    1998-04-01

    Two identical reinforced concrete (RC) shear walls, which consist of web, flanges and massive top and bottom slabs, were tested up to ultimate failure under earthquake motions at the Nuclear Power Engineering Corporation's (NUPEC) Tadotsu Engineering Laboratory, Japan. NUPEC provided the dynamic test results to the OECD (Organization for Economic Cooperation and Development), Nuclear Energy Agency (NEA) for use as an International Standard Problem (ISP). The shear walls were intended to be part of a typical reactor building. One of the major objectives of the Seismic Shear Wall ISP (SSWISP) was to evaluate various seismic analysis methods for concrete structures used for design and seismic margin assessment. It also offered a unique opportunity to assess the state-of-the-art in nonlinear dynamic analysis of reinforced concrete shear wall structures under severe earthquake loadings. As a participant of the SSWISP workshops, Brookhaven National Laboratory (BNL) performed finite element analyses under the sponsorship of the U.S. Nuclear Regulatory Commission (USNRC). Three types of analysis were performed, i.e., monotonic static (push-over), cyclic static and dynamic analyses. Additional monotonic static analyses were performed by two consultants, F. Vecchio of the University of Toronto (UT) and F. Filippou of the University of California at Berkeley (UCB). The analysis results by BNL and the consultants were presented during the second workshop in Yokohama, Japan in 1996. A total of 55 analyses were presented during the workshop by 30 participants from 11 different countries. The major findings on the presented analysis methods, as well as engineering insights regarding the applicability and reliability of the FEM codes are described in detail in this report. 16 refs., 60 figs., 16 tabs.

  9. Global seismic inversion as the next standard step in the processing sequence

    Energy Technology Data Exchange (ETDEWEB)

    Maver, Kim G.; Hansen, Lars S.; Jepsen, Anne-Marie; Rasmussen, Klaus B.

    1998-12-31

    Seismic inversion of post-stack seismic data has until recently been regarded as a reservoir-oriented method, since the standard inversion techniques rely on extensive well control and a detailed user-derived input model. Most seismic inversion techniques further require a stable wavelet. As a consequence, seismic inversion is mainly utilised in mature areas, focusing on specific zones, and only after the seismic data have been interpreted and are well understood. By using an advanced 3-D global technique, seismic inversion is presented as the next standard step in the processing sequence. The technique is robust towards noise within the seismic data, utilises a time-variant wavelet, and derives a low-frequency model from the stacking velocities and only limited well control. 4 figs.
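
    For intuition, the classical recursive (trace-integration) step that underlies post-stack impedance inversion can be sketched as follows. Production schemes such as the global technique described above add wavelet handling, a low-frequency model, and noise control; this is only the core recursion Z(i+1) = Z(i)·(1+r)/(1−r), inverted from the reflection-coefficient definition r = (Z(i+1) − Z(i)) / (Z(i+1) + Z(i)):

```python
import numpy as np

def recursive_inversion(reflectivity, z0):
    """Integrate a reflection-coefficient series into acoustic impedance,
    starting from the impedance z0 of the first layer."""
    z = np.empty(len(reflectivity) + 1)
    z[0] = z0
    for i, r in enumerate(reflectivity):
        z[i + 1] = z[i] * (1.0 + r) / (1.0 - r)
    return z

# Illustrative reflectivity series and starting impedance:
r = np.array([0.1, -0.05, 0.2])
z = recursive_inversion(r, 2000.0)
print(z)
```

    Because this recursion compounds every error in the reflectivity estimate, practical inversion needs the wavelet and low-frequency constraints the abstract emphasises.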

  10. Studies on the Needs of Seismic Base Isolation Concept and its Standardization

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Min-Seok; Kim, Jong-Hae [Korea Electric Association, Seoul (Korea, Republic of)

    2015-05-15

    seismic protection of the structures of a nuclear facility rather than to depend solely on seismic resistance design. This can also reduce costs, both in initial construction and in post-disaster recovery. Domestically, various types of devices are used for seismic base isolation systems in bridge projects. The 2005 revised edition of 'the highway bridge design standard' includes seismic base isolation standards under seismic resistance design for the first time; however, no efforts have been made to implement this technology in other fields. Therefore, KEPIC is developing seismic base isolation technology standards for the electrical industry.

  11. Experimental and analytical studies on the seismic behavior of conventional and hybrid braced frames

    Science.gov (United States)

    Lai, Jiun-Wei

    This dissertation summarizes both experimental and analytical studies of the seismic response of conventional steel concentrically braced frame systems of the type widely used in North America, along with preliminary studies of an innovative hybrid braced frame system: the Strong-Back System. The research is part of a NEES small-group project entitled "International Hybrid Simulation of Tomorrow's Braced Frames." In the experimental phase, a total of four full-scale, one-bay, two-story conventional braced frame specimens with different bracing-member section shapes and gusset plate-to-beam connection details were designed and tested at the NEES Berkeley Laboratory. Three braced frame specimens were tested quasi-statically using the same predefined loading protocol to investigate the inelastic cyclic behavior of code-compliant braced frames at both the global and local levels. The last braced frame specimen was nearly identical to one of those tested quasi-statically; however, it was tested using hybrid simulation techniques to examine the sensitivity of inelastic behavior to loading sequence and to relate the observed behavior to different levels of seismic hazard. Computer models of the test specimens were developed using two different software programs. In the software framework OpenSees, fiber-based line elements were used to simulate global buckling of members and yielding and low-cycle fatigue failure at sections. The LS-DYNA analysis program was also used to model individual struts and the test specimens using shell elements with adaptive meshing and element-erosion features. This program provided enhanced ability to simulate local section buckling, strain concentrations, and crack development. The numerical results were compared with test results to assess and refine the ability of the models to predict braced frame behavior. A series of OpenSees numerical cyclic component simulations were then conducted using the validated modeling approach. Two

  12. Multicomponent Seismic Imaging of the Cheyenne Belt: Data Improvement Through Non-Conventional Filtering

    Science.gov (United States)

    Johnson, R. A.; Shoshitaishvili, E.; Sorenson, L. S.

    2001-12-01

    The Cheyenne Belt in southeastern Wyoming separates the Archean Wyoming Craton from accreted juvenile Proterozoic crust, making it one of the fundamental sutures in the Proterozoic assemblage of western North America. As one of the multidisciplinary components of the Continental Dynamics - Rocky Mountains Transect project (CDROM), reflection seismic data were acquired from south-central Wyoming to central Colorado to characterize the crustal structure associated with this boundary and with younger Proterozoic shear zones to the south. In addition to the acquisition of more conventional vertical-component data, 3-component data were acquired to better constrain rock properties and reflection directionality, providing information that tends to be lost with one-component recording. In order to achieve the highest possible signal-to-noise ratios in the processed data, considerable work was focused on removing noise caused by private vehicles driving on forest roads during active recording and, perhaps more problematic, harmonic noise generated by power-line and other electrical-equipment interference. Noise from these sources was successfully attenuated using 1) short-window 2D FFT filtering to remove irregular, high-amplitude vehicular noise, and 2) harmonic-noise-subtraction algorithms developed at the University of Arizona to remove harmonic electrical-induction noise. The latter filtering procedure used a time-domain method of automatically estimating the noise frequencies and their amplitudes, followed by subtraction of these estimated anomalous harmonics from the data. Since the technique estimates the best fit of the noise for the entire trace, subtracting the noise avoids many of the deleterious effects of simple notch filtering. After noise removal, it was possible to pick both P-wave and S-wave first arrivals and to model shallow subsurface rock properties. This model provides a link between deeper events and the surface geology.
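
    The harmonic-subtraction idea described above can be sketched as a whole-trace least-squares fit of sinusoids at the powerline harmonics, which are then subtracted. This is a simplified illustration: the harmonic frequencies are assumed known here, whereas the cited University of Arizona algorithm also estimates the frequencies themselves.

```python
import numpy as np

def subtract_harmonics(trace, dt, fundamental=60.0, n_harmonics=3):
    """Fit cos/sin pairs at each harmonic over the whole trace, then subtract."""
    t = np.arange(len(trace)) * dt
    cols = []
    for k in range(1, n_harmonics + 1):
        w = 2 * np.pi * k * fundamental
        cols += [np.cos(w * t), np.sin(w * t)]
    A = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(A, trace, rcond=None)
    return trace - A @ coeffs

# Synthetic check: 60 Hz hum plus a reflection-like spike.
dt = 0.002                                   # 500 Hz sampling
t = np.arange(1000) * dt
trace = 0.5 * np.sin(2 * np.pi * 60 * t + 0.3)
trace[500] += 1.0                            # the "signal"
clean = subtract_harmonics(trace, dt)
print(np.abs(clean[:400]).max())  # residual hum is tiny; the spike survives
```

    Because the sinusoid is fitted over the entire trace rather than notched out in frequency, broadband signal such as the spike is barely affected, which is the advantage over notch filtering noted in the abstract.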

  13. Standardization of Seismic Microzonification and Probabilistic Seismic Hazard Study Considering Site Effect for Metropolitan Areas in the State of Veracruz

    Science.gov (United States)

    Torres Morales, G. F.; Leonardo Suárez, M.; Dávalos Sotelo, R.; Castillo Aguilar, S.; Mora González, I.

    2014-12-01

    Preliminary results are presented from the projects "Seismic Hazard in the State of Veracruz and the Xalapa Conurbation" and "Microzonation of geological and hydrometeorological hazards for the conurbations of Orizaba and Veracruz, and major sites located in the lower sub-basins La Antigua and Jamapa." These projects were sponsored, respectively, by the PROMEP program and the Joint Funds of CONACyT and the Veracruz state government. The study consists of evaluating the probabilistic seismic hazard considering the site effect (SE) in the urban zones of the cities of Xalapa and Orizaba; in this preliminary stage, site effects were incorporated through a standard format proposed in microzonation studies and implemented in computer systems, which makes it possible to optimize and condense the microzonation studies of a city. This study stems from the need to know the seismic hazard (SH) in the State of Veracruz and its major cities, defining SH as the probabilistic description of the exceedance of a given level of ground-motion intensity (generally expressed as the peak ground acceleration or the maximum ordinate in the pseudo-acceleration response spectrum, PGA and Sa, respectively) as a result of the action of an earthquake in the area of influence over a specified period of time. The evaluation results are presented through seismic hazard maps, exceedance-rate curves, and uniform hazard spectra (UHS) for different spectral ordinates and return periods, respectively.
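
    The exceedance-rate description above has a standard Poissonian reading: for an annual exceedance rate lam, the probability of at least one exceedance in T years is 1 − exp(−lam·T), and the return period is 1/lam. A minimal sketch of these standard relations (not project-specific values):

```python
import math

def exceedance_probability(annual_rate, years):
    """Probability of at least one exceedance in the given window (Poisson)."""
    return 1.0 - math.exp(-annual_rate * years)

def annual_rate_from_poe(poe, years):
    """Annual rate matching, e.g., 10% probability of exceedance in 50 years."""
    return -math.log(1.0 - poe) / years

# The common design level: 10% in 50 years.
lam = annual_rate_from_poe(0.10, 50.0)
print(round(1.0 / lam))  # → 475 (the familiar ~475-year return period)
```

    A uniform hazard spectrum is then built by reading, at each spectral ordinate, the Sa value whose exceedance-rate curve matches the chosen rate lam.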

  14. Characterization of gas hydrate distribution using conventional 3D seismic data in the Pearl River Mouth Basin, South China Sea

    Science.gov (United States)

    Wang, Xiujuan; Qiang, Jin; Collett, Timothy S.; Shi, Hesheng; Yang, Shengxiong; Yan, Chengzhi; Li, Yuanping; Wang, Zhenzhen; Chen, Duanxin

    2016-01-01

    A new 3D seismic reflection data volume acquired in 2012 has allowed detailed mapping and characterization of the gas hydrate distribution in the Pearl River Mouth Basin in the South China Sea. Previous studies of core and logging data showed that gas hydrate occurrence at high concentrations is controlled by the presence of relatively coarse-grained sediment and by the upward migration of thermogenic gas from the deeper sediment section into the overlying gas hydrate stability zone; however, the spatial distribution of the gas hydrate remained poorly defined. We used a constrained sparse-spike inversion technique to generate acoustic-impedance images of the hydrate-bearing sedimentary section from the newly acquired 3D seismic data volume. High-amplitude reflections just above the bottom-simulating reflectors (BSRs) were interpreted to be associated with accumulations of gas hydrate at elevated saturations. Enhanced seismic reflections below the BSRs were interpreted to indicate the presence of free gas. The base of the gas hydrate stability zone (BGHSZ) was established from the occurrence of BSRs. In areas without well-developed BSRs, the BGHSZ was calculated from a model using the inverted P-wave velocity and subsurface temperature data. Seismic attributes were also extracted along the BGHSZ that indicate variations in reservoir properties and inferred hydrocarbon accumulations at each site. Gas hydrate saturations were estimated from the acoustic impedance obtained by inversion of the conventional 3D seismic data, together with well-log-derived rock-physics models. Our analysis determined that the gas hydrate petroleum system varies significantly across the Pearl River Mouth Basin and that variability in sedimentary properties, as a product of depositional processes, together with the upward migration of gas from deeper thermogenic sources, controls the distribution of gas hydrates in this basin.

  15. MASW on the standard seismic prospective scale using full spread recording

    Science.gov (United States)

    Białas, Sebastian; Majdański, Mariusz; Trzeciak, Maciej; Gałczyński, Edward; Maksym, Andrzej

    2015-04-01

    Multichannel Analysis of Surface Waves (MASW) is a seismic survey method that uses the dispersion of surface waves to characterize the stiffness of the near surface. It is used mainly at geotechnical engineering scale, with a total spread length of 5-450 m, a spread offset of 1-100 m, and a hammer as the seismic source. The standard MASW procedure is: data acquisition, dispersion analysis, and inversion of the extracted dispersion curve to obtain the closest theoretical curve. The final result includes shear-wave velocity (Vs) values at different depths along the surveyed lines. The main goal of this work is to expand this engineering method to a larger scale, with a standard prospecting spread length of 20 km, using 4.5 Hz vertical-component geophones. Standard vibroseis and explosive sources are used, and recording is carried out on the full spread during each shot. The seismic data used for this analysis were acquired during the Braniewo 2014 project in northern Poland. The results achieved with the standard MASW procedure show that the method can be used at a much larger scale as well; the modified methodology only requires a much stronger seismic source.
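
    The dispersion measurement at the heart of MASW can be illustrated with just two receivers: at each frequency, phase velocity follows from the cross-spectral phase difference over a known offset, v = 2·pi·f·dx / dphi. A minimal synthetic sketch (single frequency, no phase unwrapping, illustrative values only):

```python
import numpy as np

def phase_velocity(sig1, sig2, dt, dx, freq):
    """Phase velocity at one frequency from the cross-spectral phase lag
    between two receivers dx apart (assumes |dphi| < pi, i.e. no wrapping)."""
    n = len(sig1)
    freqs = np.fft.rfftfreq(n, dt)
    idx = int(np.argmin(np.abs(freqs - freq)))
    cross = np.fft.rfft(sig1)[idx] * np.conj(np.fft.rfft(sig2)[idx])
    dphi = np.angle(cross)  # phase lag of sig2 relative to sig1
    return 2 * np.pi * freq * dx / dphi

# Synthetic 10 Hz wave travelling at 400 m/s between receivers 10 m apart:
dt, dx, f, v_true = 0.001, 10.0, 10.0, 400.0
t = np.arange(2000) * dt
s1 = np.sin(2 * np.pi * f * t)
s2 = np.sin(2 * np.pi * f * (t - dx / v_true))  # arrives dx/v_true later
print(round(phase_velocity(s1, s2, dt, dx, f), 1))  # → 400.0
```

    Repeating this over many frequencies (and, in practice, many receivers via a slant-stack or phase-shift transform) yields the dispersion curve that is then inverted for the Vs profile.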

  16. 76 FR 59173 - Standard Format and Content of License Applications for Conventional Uranium Mills

    Science.gov (United States)

    2011-09-23

    ... From the Federal Register Online via the Government Publishing Office NUCLEAR REGULATORY COMMISSION Standard Format and Content of License Applications for Conventional Uranium Mills AGENCY: Nuclear... Conventional Uranium Mills.'' DG- 3024 was a proposed Revision 2 of Regulatory Guide (RG) 3.5. However,...

  17. 75 FR 59281 - Recognition of Foreign Certificates Under the International Convention on Standards of Training...

    Science.gov (United States)

    2010-09-27

    ... Convention on Standards of Training, Certification and Watchkeeping for Seafarers, 1978, as amended, (STCW) requires Parties to the Convention to establish procedures to recognize STCW certificates issued by or... these mariners, and the countries that issue their STCW certificates. DATES: Comments and...

  18. Comparison of seismic actions and structural design requirements in Chinese Code GB 50011 and International Standard ISO 3010

    Institute of Scientific and Technical Information of China (English)

    王亚勇

    2004-01-01

    This paper presents a comparison between the Chinese Code GB50011-2001 and the International Standard ISO3010: 2001(E), emphasizing the similarities and differences related to design requirements, seismic actions and analytical approaches. Similarities include: earthquake return period, conceptual design, site classification, structural strength and ductility requirements, deformation limits, response spectra, seismic analysis procedures, isolation and energy dissipation, and nonstructural elements. Differences exist in the following areas: seismic levels, earthquake loading, mode damping factors and structural control.

  19. Inventory of standards and conventions used for the generation of IAG/GGOS products

    Science.gov (United States)

    Angermann, D.; Gruber, T.; Gerstl, M.; Hugentobler, U.; Sanchez, L.; Heinkelmann, R.; Steigenberger, P.

    2014-12-01

    The Bureau of Products and Standards (BPS), a redefinition of the former Bureau for Standards and Conventions (BSC), supports the Global Geodetic Observing System (GGOS) in its goal of obtaining geodetic products of the highest accuracy and consistency. In order to fully benefit from the ongoing technological improvements of the observing systems contributing to GGOS, it is essential that the analysis of the precise space-geodetic observations be based on common standards and conventions and on a unique representation and parameterization of the relevant quantities. This is of crucial importance for the establishment of highly accurate and consistent geodetic reference frames, as the basis for reliable monitoring of the time-varying shape, rotation, and gravity field of the Earth. A major focus was the compilation of an inventory based on an evaluation of the standards and conventions currently in use by the IAG Services and their contributing analysis centres for the generation of geometric and gravimetric products, such as geodetic reference frames, Earth orientation parameters, gravity field models, and satellite orbits. This product-based inventory presents the current status of standards and conventions and indicates several inconsistencies. As a major outcome of this inventory, the BPS will provide recommendations on how to resolve inconsistencies and gaps. In this presentation we briefly report on these activities and summarize the most important findings.

  20. The Relationship between Students' Performance on Conventional Standardized Mathematics Assessments and Complex Mathematical Modeling Problems

    Science.gov (United States)

    Kartal, Ozgul; Dunya, Beyza Aksu; Diefes-Dux, Heidi A.; Zawojewski, Judith S.

    2016-01-01

    Critical to many science, technology, engineering, and mathematics (STEM) career paths is mathematical modeling--specifically, the creation and adaptation of mathematical models to solve problems in complex settings. Conventional standardized measures of mathematics achievement are not structured to directly assess this type of mathematical…

  1. 77 FR 62434 - Policy Letters on the International Convention on Standards of Training, Certification and...

    Science.gov (United States)

    2012-10-15

    ... Standards of Training, Certification and Watchkeeping for Seafarers, 1978, as amended (STCW). These letters provide guidance on: The hours of rest requirements of the 2010 amendments to the STCW Convention and Code... guidance to affected parties until regulations implementing amendments to the STCW are promulgated....

  2. 77 FR 232 - Implementation of the 2010 Amendments to the International Convention on Standards of Training...

    Science.gov (United States)

    2012-01-04

    ... Convention on Standards of Training, Certification and Watchkeeping for Seafarers, 1978, as amended, (STCW... 2010 amendments to the STCW will not be published before the 1 January 2012 entry into force date... vessels subject to STCW under current regulations. DATES: This policy is effective January 1,...

  3. Comparison of Size Modulation Standard Automated Perimetry and Conventional Standard Automated Perimetry with a 10-2 Test Program in Glaucoma Patients.

    Science.gov (United States)

    Hirasawa, Kazunori; Takahashi, Natsumi; Satou, Tsukasa; Kasahara, Masayuki; Matsumura, Kazuhiro; Shoji, Nobuyuki

    2017-08-01

    This prospective observational study compared the performance of size modulation standard automated perimetry on the Octopus 600 10-2 test program, in which stimulus size is modulated during testing based on stimulus intensity, with that of conventional standard automated perimetry on the Humphrey 10-2 test program in glaucoma patients. Eighty-seven eyes of 87 glaucoma patients underwent size modulation standard automated perimetry with the Dynamic strategy and conventional standard automated perimetry using the SITA standard strategy. The main outcome measures were global indices, point-wise threshold, visual-field defect size and depth, reliability indices, and test duration; these were compared between size modulation standard automated perimetry and conventional standard automated perimetry. Global indices and point-wise threshold values of the two modalities were moderately to strongly correlated. Thresholds were higher with size modulation standard automated perimetry than with conventional standard automated perimetry, but the visual-field defect was smaller and deeper on size modulation standard automated perimetry than on conventional standard automated perimetry. The reliability indices, particularly the false-negative response, of size modulation standard automated perimetry were worse than those of conventional standard automated perimetry, and test duration was longer with size modulation standard automated perimetry than with conventional standard automated perimetry (p = 0.02). Global indices and the point-wise threshold values of the two testing modalities correlated well. However, the large stimulus presented at areas of decreased sensitivity with size modulation standard automated perimetry could underestimate the actual threshold in the 10-2 test protocol, as compared with conventional standard automated perimetry.

  4. Characterizing an unconventional reservoir with conventional seismic data: A case study using seismic inversion for the Vaca Muerta Formation, Neuquen Basin, Argentina

    Science.gov (United States)

    Fernandez-Concheso, Jorge E.

    Reservoir characterization for unconventional shale plays ideally requires multi-component, wide-azimuth, long-offset surface seismic data. These data are generally not available, especially in exploration or pre-development stages. Furthermore, it is common to have only a few wells over a large area, along with non-existent or scarce microseismic, engineering and production data. This thesis presents a methodology and workflow for dealing with these circumstances of limited data availability. By using a narrow-azimuth, regional P-wave seismic volume and integrating it with wireline logs, cuttings and PLT data, the variability in the geomechanical properties of the Vaca Muerta Formation in Argentina's Neuquen Basin, and its relationships with lithology, stress state and total organic content, were analyzed. Post-stack and pre-stack inversions were performed on the seismic volume. The uncertainties that limited well control introduces into the estimation of elastic properties were investigated using blind well testing. Sensitivity and error analyses were conducted on post-stack versus pre-stack derived P-impedance, on the choice of inversion algorithm (model-based versus sparse-spike), and on the definition of the low-frequency model (a simple kriging model versus a complex model derived from multi-attribute stepwise regression). Also, the use of isotropic AVA equations to approximate the anisotropic (VTI) behaviour of the reservoir was evaluated, using estimates of Thomsen parameters and simple AVA modelling. The integration of the inversion results with the petrophysical analysis and the mechanical stratigraphy work of Bishop (2015) suggests that rock composition has the largest influence on the geomechanical behaviour of the reservoir. Overpressure is also a major driving factor in that it controls changes in elastic properties. Bishop's cluster analysis was used to identify good-quality rock classes. The probabilistic interpretation of these rock classes from seismic
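
    The post-stack inversion mentioned above rests on the normal-incidence relation between reflectivity and acoustic impedance. As a hedged illustration (not the thesis's actual workflow, which also needs a wavelet and a low-frequency model), the core recursion can be sketched as:

```python
# Recursive (trace-integration) post-stack impedance inversion sketch:
# r_i = (Z_{i+1} - Z_i) / (Z_{i+1} + Z_i)  =>  Z_{i+1} = Z_i (1 + r_i) / (1 - r_i)

def reflectivity_from_impedance(z):
    """Forward model: normal-incidence reflection coefficient at each interface."""
    return [(z[i + 1] - z[i]) / (z[i + 1] + z[i]) for i in range(len(z) - 1)]

def impedance_from_reflectivity(z0, reflectivity):
    """Recover acoustic impedance from a reflectivity series and a start value."""
    z = [z0]
    for r in reflectivity:
        z.append(z[-1] * (1.0 + r) / (1.0 - r))
    return z

# Round trip on a toy four-layer model (impedance in (m/s)*(kg/m^3)).
true_z = [4.5e6, 6.0e6, 5.2e6, 7.8e6]
r = reflectivity_from_impedance(true_z)
recovered = impedance_from_reflectivity(true_z[0], r)
```

In practice the recorded reflectivity is band-limited, which is why a low-frequency model (the subject of one of the sensitivity tests above) must supply the missing trend.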

  5. Clarifying atomic weights: A 2016 four-figure table of standard and conventional atomic weights

    Science.gov (United States)

    Coplen, Tyler B.; Meyers, Fabienne; Holden, Norman E.

    2017-01-01

    To indicate that atomic weights of many elements are not constants of nature, in 2009 and 2011 the Commission on Isotopic Abundances and Atomic Weights (CIAAW) of the International Union of Pure and Applied Chemistry (IUPAC) replaced single-value standard atomic weight values with atomic weight intervals for 12 elements (hydrogen, lithium, boron, carbon, nitrogen, oxygen, magnesium, silicon, sulfur, chlorine, bromine, and thallium); for example, the standard atomic weight of nitrogen became the interval [14.00643, 14.00728]. CIAAW recognized that some users of atomic weight data only need representative values for these 12 elements, such as for trade and commerce. For this purpose, CIAAW provided conventional atomic weight values, such as 14.007 for nitrogen, and these values can serve in education when a single representative value is needed, such as for molecular weight calculations. Because atomic weight values abridged to four figures are preferred by many educational users and are no longer provided by CIAAW as of 2015, we provide a table containing both standard atomic weight values and conventional atomic weight values abridged to four figures for the chemical elements. A retrospective review of changes in four-digit atomic weights since 1961 indicates that changes in these values are due to more accurate measurements over time or to the recognition of the impact of natural isotopic fractionation in normal terrestrial materials upon atomic weight values of many elements. Use of the unit “u” (unified atomic mass unit on the carbon mass scale) with atomic weight is incorrect because the quantity atomic weight is dimensionless, and the unit “amu” (atomic mass unit on the oxygen scale) is an obsolete term: Both should be avoided.
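
    The four-figure abridgment discussed above amounts to rounding to four significant digits; a small illustrative sketch (just the arithmetic, not CIAAW's procedure) using Python's decimal module:

```python
# Abridge a value to a given number of significant figures using decimal
# arithmetic, avoiding binary floating-point rounding surprises.
from decimal import Decimal, ROUND_HALF_EVEN

def abridge(value, figures=4):
    d = Decimal(str(value))
    # Exponent of the last digit kept so that `figures` significant digits remain.
    shift = d.adjusted() - (figures - 1)
    return float(d.quantize(Decimal(1).scaleb(shift), rounding=ROUND_HALF_EVEN))

abridge(14.007)   # conventional atomic weight of nitrogen -> 14.01
abridge(1.00784)  # lower bound of the hydrogen interval -> 1.008
```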

  6. 75 FR 13715 - Implementation of the 1995 Amendments to the International Convention on Standards of Training...

    Science.gov (United States)

    2010-03-23

    ... Seafarers, 1978 (STCW Convention), on June 10, 1991. On November 17, 2009, the Coast Guard published a NPRM on the Implementation of the 1995 Amendments to the STCW Convention. The Coast Guard held five public... International Maritime Organization (IMO) is currently developing amendments to the STCW Convention that...

  7. The Caspar microsurgical discectomy and comparison with a conventional standard lumbar disc procedure.

    Science.gov (United States)

    Caspar, W; Campbell, B; Barbier, D D; Kretschmmer, R; Gotfried, Y

    1991-01-01

    The outcome in 119 patients who were operated on with a conventional standard lumbar discectomy procedure was retrospectively compared with that in 299 patients who were operated on with a microsurgical discectomy technique developed in Homburg/Saar, Federal Republic of Germany, by the senior author (W.C.). All patients in this consecutive series had "virgin" lumbar radiculopathy evaluated and operated upon by two experienced surgeons at one institution. The final outcome was determined objectively by an impartial third party using identical criteria for both groups, and with a patient self-evaluation form. The study looked at various pertinent aspects of the treatment course and at final outcome. The results in the microsurgical group were significantly favorable: fewer levels were explored; there was less operative blood loss and a decreased incidence of deep venous thrombosis, urinary tract infections, pulmonary emboli, and bladder catheterization; the time to full ambulation, discharge, and return to work was shorter; and there was less change of occupation and a greater percentage of satisfactory final outcomes, as measured both objectively and subjectively. A description of the microsurgical technique used in this study, which differs significantly from existing microdiscectomy techniques, is presented. The authors conclude that the microsurgical discectomy technique presented in this study is a safe and effective approach to the treatment of lumbar radiculopathy.

  8. Effectiveness of two conventional methods for seismic retrofit of steel and RC moment resisting frames based on damage control criteria

    Science.gov (United States)

    Beheshti Aval, Seyed Bahram; Kouhestani, Hamed Sadegh; Mottaghi, Lida

    2017-07-01

    This study investigates the efficiency of two rehabilitation methods on the basis of economic justification, enabling a reasoned choice between the retrofitting schemes. Among the various rehabilitation methods, concentric chevron bracing (CCB) and cylindrical friction dampers (CFD) were selected. The performance assessment procedure is divided into two distinct phases. First, the limit state probabilities of the structures before and after rehabilitation are investigated. In the second phase, the seismic risk of the structures in terms of life safety and financial losses (the decision variables) is evaluated using the recently published FEMA P-58 methodology. The results show that both retrofitting methods improve the serviceability and life safety performance levels of steel and RC structures, at different rates, when subjected to earthquake loads. Moreover, the assessments show that financial losses are greatly decreased, more markedly with CFD than with CCB. Although both retrofitting methods reduced the damage state probabilities, incorporating a site-specific seismic hazard curve to evaluate the mean annual frequency of exceeding the collapse prevention limit state produced unexpected results: contrary to CFD, the collapse probability of the structures retrofitted with CCB increased relative to the original structures.

  9. The Language of Seafaring: Standardized Conventions and Discursive Features in Speech Communications

    Directory of Open Access Journals (Sweden)

    Ana Bocanegra Valle

    2011-06-01

    This paper portrays how the English language is constructed and displayed by shipboard crews and shore-based personnel when communicating through radiotelephony. Based on internationally recognized recommendations for implementation when ships communicate with each other or with shore-based stations, as well as on examples of current practice contained in marine communication manuals, this paper explores the message patterns, the standardized conventions, and the general and discursive practices governing speech communications at sea. First, marine communications are defined and the role of Maritime English in the shipping industry in ensuring a safe and efficient passage is discussed. Then, the standardized language of the sea is explained. Next, a move-step model is applied to the analysis of the stages making up communicative exchanges at sea, and the main general and discursive features that prevail in such exchanges are described. Finally, two examples illustrate the model and features presented and discussed.

  10. 76 FR 46217 - Implementation of the Amendments to the International Convention on Standards of Training...

    Science.gov (United States)

    2011-08-02

    ..., and propose to incorporate the 2010 amendments to the STCW Convention that will come into force on... Organization (IMO) embarked on a comprehensive review of the entire STCW Convention and STCW Code. The Coast... what positions U.S. delegations should advocate and to exchange views about amendments to STCW...

  11. Clarifying Atomic Weights: A 2016 Four-Figure Table of Standard and Conventional Atomic Weights

    Science.gov (United States)

    Coplen, Tyler B.; Meyers, Fabienne; Holden, Norman E.

    2017-01-01

    To indicate that atomic weights of many elements are not constants of nature, in 2009 and 2011 the Commission on Isotopic Abundances and Atomic Weights (CIAAW) of the International Union of Pure and Applied Chemistry (IUPAC) replaced single-value standard atomic weight values with atomic weight intervals for 12 elements (hydrogen, lithium, boron,…

  12. ɛ_K in the Standard Model and the kaon phase conventions

    Science.gov (United States)

    Sala F.

    2017-07-01

    The parameter ɛ_K, which quantifies CP violation in kaon mixing, is the observable that sets the strongest constraints on new physics with a generic flavour and CP structure. While its experimental uncertainty is at the half-percent level, the theoretical one is at the level of 15%. One of the largest sources of the latter uncertainty is the poor perturbative behaviour of the short-distance contribution of the box diagram with two charm quarks. In this proceeding, based on Ligeti and Sala arXiv:1602.08494 [hep-ph], I summarise how that contribution can be removed from the imaginary part of the mixing amplitude by a rephasing of the kaon fields. A first outcome is a mild reduction of the total theoretical uncertainty of ɛ_K: while this might look counterintuitive at first sight, if different "pieces" (i.e. short- and long-distance) of an observable are computed with different techniques, then it is possible to choose a phase convention in which the total uncertainty of that observable is optimised. Moreover, it is worth discussing whether and how this freedom of rephasing, which has been somewhat overlooked in the past, can help in making progress in lattice QCD computations of immediate relevance for ɛ_K.

  13. Epsilon_K in the Standard Model and the kaon phase conventions

    CERN Document Server

    Sala, Filippo

    2016-01-01

    The parameter epsilon_K, which quantifies CP violation in kaon mixing, is the observable that sets the strongest constraints on new physics with a generic flavour and CP structure. While its experimental uncertainty is at the half-percent level, the theoretical one is at the level of 15%. One of the largest sources of the latter uncertainty is the poor perturbative behaviour of the short-distance contribution of the box diagram with two charm quarks. In this proceeding, based on arXiv:1602.08494, I summarise how that contribution can be removed from the imaginary part of the mixing amplitude by a rephasing of the kaon fields. A first outcome is a mild reduction of the total theoretical uncertainty of epsilon_K: while this might look counterintuitive at first sight, if different "pieces" (i.e. short- and long-distance) of an observable are computed with different techniques, then it is possible to choose a phase convention in which the total uncertainty of that observable is optimised. Moreover, it is worthy to di...

  14. Comparison of two minimal invasive techniques of splenectomy: Standard laparoscopy versus transumbilical multiport single-site laparoscopy with conventional instruments

    Directory of Open Access Journals (Sweden)

    Baris Bayraktar

    2015-01-01

    Background: Laparoendoscopic single-site (LESS) splenectomy, which has been performed in a small number of patients, has been reported to offer better cosmetic outcome, less postoperative pain, greater patient satisfaction and faster recovery compared to standard laparoscopy. Materials and Methods: Thirty-six patients were included in the study, comparing standard laparoscopic splenectomy (LS; 17 patients) with transumbilical multiport splenectomy performed with conventional laparoscopic instruments (TUMP-LS; 19 patients). The two groups were compared retrospectively in terms of operative time, intra- and postoperative blood loss, perioperative complications, packed red cell and platelet requirements, length of hospitalization, pain scores and patient satisfaction. Results: There was no mortality in either group, and no significant differences were determined in operative time (P = 0.069), intraoperative blood loss (P = 0.641), patient satisfaction (P = 0.506), pain scores (P = 0.173) or the average length of hospital stay (P = 0.257). Umbilical incisions healed uneventfully, and no hernia formation or wound infection was observed during the follow-up period (2-34 months). There were no conversions to open surgery. Conclusions: Transumbilical multiport splenectomy performed with conventional laparoscopic instruments is feasible and could be a logical alternative to classical laparoscopic splenectomy, combining the advantages of single-access techniques and standard laparoscopy.

  15. Performance Based Plastic Design of Concentrically Braced Frame attuned with Indian Standard code and its Seismic Performance Evaluation

    Directory of Open Access Journals (Sweden)

    Sejal Purvang Dalal

    2015-12-01

    In the Performance Based Plastic Design method, the failure mechanism is predetermined, which has made the method well known throughout the world. But owing to a lack of proper guidelines and a simple stepwise methodology, it is not widely used in India. In this paper, a stepwise design procedure for Performance Based Plastic Design of a concentrically braced frame attuned with the Indian Standard code is presented. A comparative seismic performance evaluation of a six-storey concentrically braced frame designed using the displacement-based Performance Based Plastic Design (PBPD) method and the currently used force-based Limit State Design (LSD) method has also been carried out by nonlinear static pushover analysis and time history analysis under three different ground motions. Results show that the Performance Based Plastic Design method is superior to the current design in terms of displacement and acceleration response. Also, total collapse of the frame is prevented in the PBPD frame.

  16. A decade of international cooperation brings a standard seismic point of view

    Science.gov (United States)

    Whitcomb, H. S.

    1971-01-01

    Whether in a castle in Italy, a police station in Iceland, or an abandoned gold mine in Australia, the sensitive instruments in the Worldwide Seismograph Network send a steady flow of standard earthquake records to the geophysical scientific community. They provide the raw data that make possible very precise earthquake studies, precise because the instruments are identical and their product standard. A truly international program, this network of 115 stations in 61 countries and territories has laid the foundation for research in seismology for many years to come.

  17. Hyperfractionated versus conventional radiotherapy followed by chemotherapy in standard-risk medulloblastoma: Results from the randomized multicenter HIT-SIOP PNET 4 trial

    NARCIS (Netherlands)

    B. Lannering (Birgitta); P. Rutkowski (Piotr); F.F. Doz (François); B. Pizer (Barry); G. Gustafsson (Göran); A. Navajas (Aurora); M. Massimino (Maura); R.E. Reddingius (Roel); M. Benesch (Martin); C. Carrie (Christian); R. Taylor; L. Gandola (Lorenza); T. Bjor̈k-Eriksson (Thomas); S. Giralt; F. Oldenburger (Foppe); T. Pietsch (Torsten); D. Figarella-Branger (Dominique); K. Robson (Kathryn); G. Forni (Gianluca); S.C. Clifford (Steven); M. Warmuth-Metz (Monica); D.D. Von Hoff; A. Faldum (Andreas); V. Mosseri (Véronique); B. Kortmann

    2012-01-01

    Purpose: To compare event-free survival (EFS), overall survival (OS), pattern of relapse, and hearing loss in children with standard-risk medulloblastoma treated by postoperative hyperfractionated or conventionally fractionated radiotherapy followed by maintenance chemotherapy. Patients

  18. Seismic texture classification. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Vinther, R.

    1997-12-31

    The seismic texture classification method is a seismic attribute that can both recognize general reflectivity styles and locate variations from them. The seismic texture classification performs a statistical analysis of the seismic section (or volume) aimed at describing the reflectivity. Based on a set of reference reflectivities, the seismic textures are classified. The result of the seismic texture classification is a display of seismic texture categories showing both the styles of reflectivity from the reference set and interpolations and extrapolations from these. The display is interpreted as statistical variations in the seismic data. The seismic texture classification is applied to seismic sections and volumes from the Danish North Sea representing both horizontal stratifications and salt diapirs. The attribute succeeded in recognizing both the general structure of successions and variations from these. Also, the seismic texture classification is not only able to display variations in prospective areas (1-7 sec. TWT) but can also be applied to deep seismic sections. The seismic texture classification is tested on a deep reflection seismic section (13-18 sec. TWT) from the Baltic Sea. Applied to this section, the seismic texture classification succeeded in locating the Moho, which could not be located using conventional interpretation tools. The seismic texture classification is a seismic attribute which can display general reflectivity styles and deviations from these and enhance variations not found by conventional interpretation tools. (LN)
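
    The report's actual statistics are not reproduced here, but the classify-against-reference-reflectivities idea can be illustrated with a toy nearest-reference scheme (the two descriptors below are assumptions chosen for the sketch, not the report's attributes):

```python
# Toy texture classification: describe a trace window by two statistics
# (RMS amplitude and lag-1 autocorrelation) and assign it to the nearest
# labelled reference texture.
import numpy as np

def texture_stats(window):
    w = np.asarray(window, dtype=float)
    rms = np.sqrt(np.mean(w ** 2))
    ac1 = np.corrcoef(w[:-1], w[1:])[0, 1]   # lag-1 autocorrelation
    return np.array([rms, ac1])

def classify(window, references):
    """references: mapping of label -> descriptor vector from texture_stats."""
    s = texture_stats(window)
    return min(references, key=lambda k: np.linalg.norm(s - references[k]))

rng = np.random.default_rng(0)
stratified = np.cumsum(rng.standard_normal(256)) / 10   # smooth, correlated
chaotic = rng.standard_normal(256)                      # rough, uncorrelated
refs = {"stratified": texture_stats(stratified), "chaotic": texture_stats(chaotic)}
```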

  19. Comparison of size modulation and conventional standard automated perimetry with the 24-2 test protocol in glaucoma patients

    Science.gov (United States)

    Hirasawa, Kazunori; Shoji, Nobuyuki; Kasahara, Masayuki; Matsumura, Kazuhiro; Shimizu, Kimiya

    2016-05-01

    This prospective randomized study compared test results of size modulation standard automated perimetry (SM-SAP) performed with the Octopus 600 and conventional SAP (C-SAP) performed with the Humphrey Field Analyzer (HFA) in glaucoma patients. Eighty-eight eyes of 88 glaucoma patients underwent SM-SAP and C-SAP tests with the Octopus 600 24-2 Dynamic and HFA 24-2 SITA-Standard, respectively. Fovea threshold, mean defect, and square loss variance of SM-SAP were significantly correlated with the corresponding C-SAP indices (P < 0.001). The false-positive rate was slightly lower, and false-negative rate slightly higher, with SM-SAP than C-SAP (P = 0.002). Point-wise threshold values obtained with SM-SAP were moderately to strongly correlated with those obtained with C-SAP (P < 0.001). The correlation coefficients of the central zone were significantly lower than those of the middle to peripheral zone (P = 0.031). The size and depth of the visual field (VF) defect were smaller (P = 0.039) and greater (P = 0.043), respectively, on SM-SAP than on C-SAP. Although small differences were observed in VF sensitivity in the central zone, the defect size and depth and the reliability indices between SM-SAP and C-SAP, global indices of the two testing modalities were well correlated.

  20. Standard penetration test-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    Science.gov (United States)

    Cetin, K.O.; Seed, R.B.; Der Kiureghian, A.; Tokimatsu, K.; Harder, L.F.; Kayen, R.E.; Moss, R.E.S.

    2004-01-01

    This paper presents new correlations for assessment of the likelihood of initiation (or triggering) of soil liquefaction. These new correlations eliminate several sources of bias intrinsic to previous, similar correlations, and provide greatly reduced overall uncertainty and variance. Key elements in the development of these new correlations are (1) accumulation of a significantly expanded database of field performance case histories; (2) use of improved knowledge and understanding of factors affecting interpretation of standard penetration test data; (3) incorporation of improved understanding of factors affecting site-specific earthquake ground motions (including directivity effects, site-specific response, etc.); (4) use of improved methods for assessment of in situ cyclic shear stress ratio; (5) screening of field data case histories on a quality/uncertainty basis; and (6) use of high-order probabilistic tools (Bayesian updating). The resulting relationships not only provide greatly reduced uncertainty, they also help to resolve a number of corollary issues that have long been difficult and controversial, including: (1) magnitude-correlated duration weighting factors, (2) adjustments for fines content, and (3) corrections for overburden stress. © ASCE.
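
    Element (4), the in situ cyclic shear stress ratio, is commonly assessed with the simplified Seed-Idriss expression. The sketch below uses a textbook piecewise-linear form of the depth reduction factor rd, not the paper's own relationships:

```python
# Simplified cyclic stress ratio: CSR = 0.65 (a_max/g) (sigma_v / sigma_v') rd,
# with a Liao-Whitman style piecewise-linear depth reduction factor rd.
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    if depth_m <= 9.15:
        rd = 1.0 - 0.00765 * depth_m
    else:
        rd = 1.174 - 0.0267 * depth_m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

# Example: 0.25 g peak acceleration, 100 kPa total and 60 kPa effective
# vertical stress at 5 m depth.
csr = cyclic_stress_ratio(0.25, 100.0, 60.0, 5.0)  # ~0.26
```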

  1. Visualization of seismic tomography on Google Earth: Improvement of KML generator and its web application to accept the data file in European standard format

    Science.gov (United States)

    Yamagishi, Y.; Yanaka, H.; Tsuboi, S.

    2009-12-01

    We have developed a conversion tool for seismic tomography data into KML, called the KML generator, and made it available on the web site (http://www.jamstec.go.jp/pacific21/google_earth). The KML generator enables us to display vertical and horizontal cross sections of a model on Google Earth in a three-dimensional manner, which is useful for understanding the Earth's interior. The previous generator accepts text files of grid-point data having longitude, latitude, and seismic velocity anomaly. Each data file contains the data for one depth. Metadata, such as the bibliographic reference, grid-point interval, and depth, are described in a separate information file. We did not allow users to upload their own tomographic models to the web application, because there was no standard format for representing tomographic models. Recently, the European seismology research project NERIES (Network of Research Infrastructures for European Seismology) advocated that seismic tomography data should be standardized. It proposes a new format based on JSON (JavaScript Object Notation), one of the data-interchange formats, as a standard for tomography. This format consists of two parts: metadata and grid-point data values. The JSON format is well suited to handling and analyzing tomographic models, because the structure of the format is fully defined by JavaScript objects, so the elements are directly accessible by a script. In addition, JSON libraries exist for several programming languages. The International Federation of Digital Seismograph Networks (FDSN) adopted this format as an FDSN standard format for seismic tomographic models. This format might not only be accepted by European seismologists but could also become the world standard. We have therefore improved our KML generator for seismic tomography to accept data files in JSON format as well.
We also improve the web application of the generator so that the
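
    A file in the two-part JSON layout described above (metadata plus grid-point values) might be read as follows; the field names here are illustrative assumptions, since the schema itself is not reproduced in this abstract:

```python
# Parse a hypothetical JSON tomographic model: a metadata object plus a
# list of grid points, each carrying a velocity anomaly.
import json

doc = """{
  "metadata": {"reference": "example model", "depth_km": 100, "grid_step_deg": 1.0},
  "grid": [
    {"lon": 135.0, "lat": 35.0, "dvs_percent": -1.2},
    {"lon": 136.0, "lat": 35.0, "dvs_percent": 0.4}
  ]
}"""

model = json.loads(doc)
depth = model["metadata"]["depth_km"]
anomalies = [p["dvs_percent"] for p in model["grid"]]
```

Because the structure is plain JSON, the same file is equally easy to consume from JavaScript in the web application or from any language with a JSON library.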

  2. Time-Dependent Seismic Tomography

    Science.gov (United States)

    Julian, B. R.

    2008-12-01

    Temporal changes in seismic wave speeds in the Earth's crust have been measured at several locations, notably The Geysers geothermal area in California, in studies that used three-dimensional seismic tomography. These studies have used conventional tomography methods to invert multiple seismic-wave arrival time data sets independently and assumed that any differences in the derived structures reflect real temporal variations. Such an assumption is dangerous because the results of repeated tomography experiments would differ even if the structure did not change, simply because of variation in the seismic ray distribution caused by the natural variation in earthquake locations. This problem can be severe when changes in the seismicity distribution are systematic, as, for example, at the onset of an aftershock sequence. The sudden change in the ray distribution can produce artifacts that mimic changes in the seismic wave speeds at the time of a large earthquake. Even if the source locations did not change (if only explosion data were used, for example), derived structures would inevitably differ because of observational errors. A better approach to determining what temporal changes are truly required by the data is to invert multiple data sets simultaneously, imposing constraints to minimize differences between the models for different epochs. This problem is similar to that of seeking models similar to some a priori initial assumption, and a method similar to "damped least squares" can solve it. The order of the system of normal equations for inverting data from two epochs is twice as large as that for a single epoch, and solving it by standard methods requires eight times the computational labor. We present an algorithm for reducing this factor to two, so that inverting multiple epochs simultaneously is comparable in difficulty to inverting them independently, and illustrate its performance using synthetic arrival times and observed data from several areas in
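
    The simultaneous inversion with a difference-minimizing constraint can be sketched as one stacked damped least-squares system (a toy dense version; the algorithm referred to above exploits the block structure to cut the cost):

```python
# Joint two-epoch inversion: minimize ||G1 m1 - d1||^2 + ||G2 m2 - d2||^2
#                                     + lam^2 ||m1 - m2||^2
# by stacking one least-squares system [[G1, 0], [0, G2], [lam I, -lam I]].
import numpy as np

def joint_invert(G1, d1, G2, d2, lam):
    n = G1.shape[1]
    A = np.vstack([
        np.hstack([G1, np.zeros_like(G1)]),
        np.hstack([np.zeros_like(G2), G2]),
        np.hstack([lam * np.eye(n), -lam * np.eye(n)]),
    ])
    b = np.concatenate([d1, d2, np.zeros(n)])
    m = np.linalg.lstsq(A, b, rcond=None)[0]
    return m[:n], m[n:]

rng = np.random.default_rng(1)
G = rng.standard_normal((20, 4))            # same rays in both epochs here
m_true = np.array([1.0, 2.0, 3.0, 4.0])
d1 = G @ m_true
d2 = G @ (m_true + np.array([0.0, 0.0, 0.5, 0.0]))   # one cell changes

m1, m2 = joint_invert(G, d1, G, d2, lam=0.0)     # no constraint: change recovered
m1c, m2c = joint_invert(G, d1, G, d2, lam=1e3)   # strong constraint: m1c ~ m2c
```

Raising `lam` trades data misfit for similarity between the epoch models, so only temporal changes genuinely demanded by the data survive.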

  3. Linearized inversion frameworks toward high-resolution seismic imaging

    KAUST Repository

    Aldawood, Ali

    2016-09-01

    internally multiply scattered seismic waves to obtain highly resolved images delineating vertical faults that are otherwise not easily imaged by primaries. Seismic interferometry is conventionally based on the cross-correlation and convolution of seismic traces to transform seismic data from one acquisition geometry to another. The conventional interferometric transformation yields virtual data that suffer from low temporal resolution, wavelet distortion, and correlation/convolution artifacts. I therefore incorporate a least-squares datuming technique to interferometrically transform vertical-seismic-profile surface-related multiples to surface-seismic-profile primaries. This yields redatumed data with high temporal resolution and fewer artifacts, which are subsequently imaged to obtain highly resolved subsurface images. Tests on synthetic examples demonstrate the efficiency of the proposed techniques, yielding highly resolved migrated sections compared with images obtained by imaging conventionally redatumed data. I further advance the recently developed cost-effective Generalized Interferometric Multiple Imaging procedure, which aims to image not only first-order but also higher-order multiples. I formulate this procedure as a linearized inversion framework and solve it as a least-squares problem. Tests of the least-squares Generalized Interferometric Multiple Imaging framework on synthetic datasets demonstrate that it can provide highly resolved migrated images and delineate vertical fault planes compared with the standard procedure. The results support the assertion that this linearized inversion framework can illuminate subsurface zones that are mainly illuminated by internally scattered energy.
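
    The cross-correlation operation that conventional interferometry relies on can be illustrated with two traces recording the same event; the correlation peak falls at their traveltime difference, the lag a virtual source-receiver pair would see:

```python
# Cross-correlate two synthetic traces (Ricker wavelets, one delayed) and
# read the traveltime difference off the correlation peak.
import numpy as np

dt = 0.004                     # sample interval, s
n = 512
t = np.arange(n) * dt

def ricker(t, t0, f=25.0):
    a = (np.pi * f * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

trace_a = ricker(t, 0.40)      # event arrives at receiver A at 0.40 s
trace_b = ricker(t, 0.56)      # same event arrives at receiver B at 0.56 s

xcorr = np.correlate(trace_b, trace_a, mode="full")
lag = (np.argmax(xcorr) - (n - 1)) * dt   # peak lag = traveltime difference
```

The least-squares datuming advocated above replaces this single correlation step with an inversion, which is what removes the wavelet distortion and correlation artifacts of the plain cross-correlation product.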

  4. 15/16 ips Operation of the Precision Instrument Company Model P15100 tape recorder to record the standard (30 Hz) NCER seismic data multiplex system

    Science.gov (United States)

    Eaton, Jerry P.

    1976-01-01

    In recent months the need has arisen to record special seismic networks consisting of a dozen or more standard NCER seismic systems telemetered to a central collection point on a reliable, portable, low-power tape recorder. Because of its simplicity and the ease with which it can be adapted for the purpose, the PI 5100 field recorder should be considered for such use. In the tests described here, a PI 5100 was speeded up to run at 15/16 inches per second (ips) and signals from the standard multiplex system test modulator bank were recorded on one tape track by means of a simple, improvised AM record amplifier. The results of these tests are extremely encouraging: the dynamic range of the system when played back on the Bell and Howell Model 3700 B reproduce machine, with subtractive compensation, is nearly as high as for the system employing the B&H 3700 B for recording. These notes indicate the principle employed to speed up the recorder, outline the circuit required to drive the tape heads in the AM record mode, and describe the tests carried out to evaluate the system's performance.

  5. Seismic Disaster Reduction in China

    Institute of Scientific and Technical Information of China (English)

    Ministry of Construction

    2001-01-01

    Great accomplishments have been made in seismic disaster reduction in China's engineering construction and city construction projects during the past decade (1990-2000). A new national map on the division of seismic intensity has been promulgated, and a series of anti-seismic standards and norms have been drafted or revised, which has further improved the country's technical code system for anti-seismic engineering measures.

  6. Sensory profile of breast meat from broilers reared in an organic niche production system and conventional standard broilers

    DEFF Research Database (Denmark)

    Horsted, Klaus; Allesen-Holm, Bodil Helene; Hermansen, John Erik

    2012-01-01

    standard products (A and B) and three organic niche genotypes (I657, L40 and K8) reared in an apple orchard. RESULTS: Thirteen out of 22 sensory attributes differed significantly between the products. The aroma attributes ‘chicken’, ‘bouillon’ and ‘fat’ scored highest and the ‘iron/liver’ aroma lowest...... for the niche products. The meat was more ‘tender’, ‘short’ and ‘crumbly’ and less ‘hard’ and ‘stringy’ in the standard products than in one or more of the niche products. Product ‘I 657’ was less ‘juicy’ than the rest. Products ‘I 657’ and ‘L 40’ were more ‘cohesive’ and tasted more ‘sourish’ and less...... of ‘sweet/maize’ than the standard products. The ‘overall liking’ score was significantly higher for the ‘K 8’ product than for the ‘Standard A’ and ‘L 40’ products. The ‘overall liking’ score was significantly correlated with the scores for aroma and taste of ‘chicken’, ‘umami/bouillon’, ‘iron...

  7. Evaluation of seismic design spectrum based on UHS implementing fourth-generation seismic hazard maps of Canada

    Science.gov (United States)

    Ahmed, Ali; Hasan, Rafiq; Pekau, Oscar A.

    2016-12-01

    Two recent developments have come into the forefront with reference to updating the seismic design provisions of codes: (1) publication of new seismic hazard maps for Canada by the Geological Survey of Canada (GSC), and (2) emergence of a new spectral format outdating the conventional standardized spectral format. The fourth-generation seismic hazard maps are based on enriched seismic data, enhanced knowledge of regional seismicity and improved seismic hazard modeling techniques. The new maps are therefore more accurate and need to be incorporated into the next edition of the Canadian Highway Bridge Design Code (CHBDC), as was done for its building counterpart, the National Building Code of Canada (NBCC); indeed, the code writers expressed similar intentions in the commentary of CHBDC 2006. In their updates, the NBCC and the AASHTO Guide Specifications for LRFD Seismic Bridge Design (American Association of State Highway and Transportation Officials, Washington, 2009) lowered the probability level from 10% to 2% and from 10% to 5%, respectively. This study investigates five sets of hazard maps developed by the GSC, corresponding to 2%, 5% and 10% probabilities of exceedance in 50 years. To obtain sound statistical inference, 389 Canadian cities are selected. This study shows the implications of the new hazard maps for the design process (i.e., the extent of magnification or reduction of the design forces).
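The probability levels quoted above map to return periods through the standard Poisson relation T = -t / ln(1 - p); a quick illustrative check reproduces the familiar 475-, 975- and 2475-year values:

```python
import math

def return_period(p_exceedance, t_years=50.0):
    """Return period (years) of a Poisson hazard with probability
    p_exceedance of at least one exceedance in t_years."""
    return -t_years / math.log(1.0 - p_exceedance)

# 10%, 5% and 2% in 50 years -> ~475-, ~975- and ~2475-year return periods
periods = {p: round(return_period(p)) for p in (0.10, 0.05, 0.02)}
```

This is why lowering the probability level from 10% to 2% in 50 years corresponds to moving from a 475-year to a 2475-year design event.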

  8. Seismic risk perception test

    Science.gov (United States)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro

    2013-04-01

    The perception of risks involves the process of collecting, selecting and interpreting signals about uncertain impacts of events, activities or technologies. In the natural sciences the term risk seems to be clearly defined: it means the probability distribution of adverse effects. The everyday use of risk, however, has different connotations (Renn, 2008), and the two terms, hazard and risk, are often used interchangeably by the public. Knowledge, experience, values, attitudes and feelings all influence people's thinking and judgement about the seriousness and acceptability of risks. Within the social sciences, however, the terminology of 'risk perception' has become the conventional standard (Slovic, 1987). The mental models and other psychological mechanisms which people use to judge risks (such as cognitive heuristics and risk images) are internalized through social and cultural learning and constantly moderated (reinforced, modified, amplified or attenuated) by media reports, peer influences and other communication processes (Morgan et al., 2001). Yet a theory of risk perception that offers an integrative, as well as empirically valid, approach to understanding and explaining risk perception is still missing. To understand the perception of risk, it is necessary to consider several areas: social, psychological and cultural, and their interactions. Among the various international research efforts on the perception of natural hazards, the semantic differential method (Osgood, C.E., Suci, G., & Tannenbaum, P., 1957, The measurement of meaning. Urbana, IL: University of Illinois Press) seemed promising. The seismic risk perception test has therefore been constructed using the semantic differential method; opposite adjectives or terms are compared on a seven-point Likert-type scale.
The test consists of an informative part and six sections dedicated, respectively, to: hazard; vulnerability (home and workplace); exposed value (with reference to
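A semantic differential instrument is typically scored by averaging the bipolar seven-point ratings after reverse-keying some items; the sketch below is illustrative of that general idea, not the authors' exact scoring procedure:

```python
def semantic_differential_score(ratings, reverse=frozenset()):
    """Mean score over 7-point bipolar adjective scales; indices listed in
    `reverse` are reverse-keyed (r -> 8 - r) so that a higher score always
    points to the same pole, e.g. higher perceived risk. Illustrative only."""
    adjusted = [8 - r if i in reverse else r for i, r in enumerate(ratings)]
    return sum(adjusted) / len(adjusted)

# Four hypothetical adjective pairs, with the second pair reverse-keyed
score = semantic_differential_score([6, 2, 5, 7], reverse={1})
```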

  9. Microwave-assisted versus conventional decomposition procedures applied to a ceramic potsherd standard reference material by inductively coupled plasma atomic emission spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Papadopoulou, D.N.; Zachariadis, G.A.; Anthemidis, A.N.; Tsirliganis, N.C.; Stratis, J.A.

    2004-03-03

    Inductively coupled plasma atomic emission spectrometry (ICP-AES) is a powerful, sensitive analytical technique with numerous applications in chemical characterization, including that of ancient pottery, mainly due to its multi-element character and the relatively short time required for the analysis. A critical step in characterization studies of ancient pottery is the selection of a suitable decomposition procedure for the ceramic matrix. The current work presents the results of a comparative study of six decomposition procedures applied to a standard ceramic potsherd reference material, SARM 69. The investigated decomposition procedures included three microwave-assisted decomposition procedures, one wet decomposition (WD) procedure by conventional heating, one combined microwave-assisted and conventional heating WD procedure, and one fusion procedure. Chemical analysis was carried out by ICP-AES. Five major (Si, Al, Fe, Ca, Mg), three minor (Mn, Ba, Ti) and two trace (Cu, Co) elements were determined and compared with their certified values. Quantitation was performed at two different spectral lines for each element, and multi-element matrix-matched calibration standards were used. The recovery values for the six decomposition procedures ranged between 75 and 110% with a few notable exceptions. Data were processed statistically in order to evaluate the investigated decomposition procedures in terms of recovery, accuracy and precision, and eventually select the most appropriate one for ancient pottery analysis.
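Recovery and precision as used above are simple ratios against the certified value and across replicates; a minimal sketch with hypothetical replicate values, not the paper's data:

```python
import statistics

def recovery_pct(measured, certified):
    """Percent recovery of a certified reference value."""
    return 100.0 * measured / certified

def rsd_pct(replicates):
    """Relative standard deviation (a precision measure) of replicates."""
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

# Hypothetical replicate determinations against a certified value of 60.0
reps = [58.1, 58.9, 57.6]
rec = recovery_pct(statistics.mean(reps), 60.0)  # ~97% recovery
```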

  10. Comparison of seismic sources for shallow seismic: sledgehammer and pyrotechnics

    Directory of Open Access Journals (Sweden)

    Brom Aleksander

    2015-10-01

    Pyrotechnic materials are a type of explosive material that produces thermal, luminous or sound effects, gas, smoke or a combination of these as a result of a self-sustaining chemical reaction. Pyrotechnics can therefore be used as a seismic source, designed to release accumulated energy in the form of a seismic wave recorded by tremor sensors (geophones) after its passage through the rock mass. The aim of this paper was to determine the utility of pyrotechnics for shallow engineering seismics. The work compares the conventional means of seismic wave excitation for the seismic refraction method, a plate and hammer, with firecrackers activated on the surface. The energy released and the frequency spectra were compared for the two types of sources. The results did not establish which source gives better results, but they showed very interesting aspects of using pyrotechnics in seismic measurements, for example the use of pyrotechnic materials in MASW.
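Comparing sources by their frequency spectra, as done in the paper, amounts to comparing amplitude spectra of the recorded traces; a minimal NumPy sketch on a synthetic trace (a 50 Hz burst standing in for a geophone record, not the paper's data):

```python
import numpy as np

def amplitude_spectrum(trace, dt):
    """One-sided amplitude spectrum of a trace sampled every dt seconds."""
    freqs = np.fft.rfftfreq(len(trace), d=dt)
    amps = np.abs(np.fft.rfft(trace)) / len(trace)
    return freqs, amps

# Synthetic geophone record: a windowed 50 Hz burst
dt = 0.001                                   # 1 ms sampling
t = np.arange(0.0, 0.5, dt)
trace = np.sin(2 * np.pi * 50.0 * t) * np.exp(-((t - 0.1) / 0.03) ** 2)
freqs, amps = amplitude_spectrum(trace, dt)
dominant_hz = freqs[np.argmax(amps)]         # ~50 Hz for this trace
```

The same spectrum comparison, applied to hammer-plate versus firecracker shots, shows which source puts more energy into the low frequencies useful for refraction and MASW work.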

  11. Standardization of seismic tomographic models and earthquake focal mechanisms data sets based on web technologies, visualization with keyhole markup language

    Science.gov (United States)

    Postpischl, Luca; Danecek, Peter; Morelli, Andrea; Pondrelli, Silvia

    2011-01-01

    We present two projects in seismology that have been ported to web technologies and that provide results as Keyhole Markup Language (KML) visualization layers. These use the Google Earth geo-browser as a flexible platform that can substitute for specialized graphical tools in performing qualitative visual data analyses and comparisons. The Network of Research Infrastructures for European Seismology (NERIES) Tomographic Earth Model Repository contains data sets from over 20 models from the literature. A hierarchical structure of folders representing the set of depths for each model is implemented in KML, and this immediately results in an intuitive interface for users to navigate freely and compare tomographic plots. The KML layer for the European-Mediterranean Regional Centroid-Moment Tensor Catalog displays the focal mechanism solutions of moderate-magnitude earthquakes from 1997 to the present. In both projects, our aim was also to propose standard representations of scientific data sets. Here the general semantic approach of an XML framework has an important impact that must be further explored, although we find that the KML syntax places more emphasis on aspects of detailed visualization. We have thus used, and propose the use of, JavaScript Object Notation (JSON), another semantic notation stemming from the web-development community, which provides a compact, general-purpose data-exchange format.
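Such layers boil down to emitting one KML Placemark per event, with a parallel JSON record for data exchange; a minimal sketch with a hypothetical event (simplified markup, whereas a real CMT layer would also carry styles and beach-ball icons):

```python
import json

def placemark_kml(name, lon, lat, depth_km, description=""):
    """Minimal KML Placemark; KML altitude is metres, negative below ground."""
    alt_m = -1000.0 * depth_km
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>{description}</description>"
        f"<Point><coordinates>{lon},{lat},{alt_m}</coordinates></Point>"
        "</Placemark>"
    )

# Hypothetical event, once as KML for the geo-browser, once as JSON
event = {"name": "M5.1 example", "lon": 13.3, "lat": 42.4, "depth_km": 8.0}
kml = placemark_kml(event["name"], event["lon"], event["lat"], event["depth_km"])
json_record = json.dumps(event)
```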

  12. Comparison of validity of DIAGNOdent with conventional methods for detection of occlusal caries in primary molars using the histological gold standard: An in vivo study

    Directory of Open Access Journals (Sweden)

    Goel A

    2009-01-01

    Aim: This study was conducted to compare the in vivo effectiveness of DIAGNOdent with other conventional methods (visual, tactile and bitewing radiographs) for the detection of occlusal caries in primary molars. Another objective of the study was to calculate new cut-off limits for the detection of caries by DIAGNOdent in primary teeth. Materials and Methods: Eighty-four primary molars in 52 children (aged 8-12 years), which were indicated for extraction, were selected and evaluated for dental caries using DIAGNOdent, visual and tactile examination, and bitewing radiographs. Histological examination of sections prepared subsequent to extraction of the teeth served as the gold standard for comparison of the above-mentioned methods. Results: For enamel caries, the values obtained for sensitivity, specificity and accuracy were 48.15, 100.00 and 49.40% for visual examination; 48.15, 100.00 and 49.40% for tactile examination; 49.38, 50.00 and 49.40% for bitewing radiographs; 85.19, 50.00 and 84.34% for DIAGNOdent scores interpreted according to the manufacturer's cut-off limits; and 81.48, 100.00 and 81.93% for DIAGNOdent scores interpreted according to the newly formulated cut-off limits, respectively. At dentin caries cut-off levels, the values of sensitivity, specificity and accuracy were 52.78, 89.36 and 73.49% for visual examination; 50.00, 91.49 and 73.49% for tactile examination; 30.56, 82.98 and 60.24% for bitewing radiographs; 72.22, 76.60 and 74.70% for DIAGNOdent scores interpreted according to the manufacturer's cut-off limits; and 77.48, 74.47 and 75.90%, respectively, for DIAGNOdent scores interpreted according to the newly formulated cut-off limits. Conclusions: DIAGNOdent showed higher sensitivity and accuracy than the other conventional methods for detection of enamel caries, whereas for detection of dentinal caries, even though the sensitivity was high, the accuracy of the DIAGNOdent device was similar to other
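The reported figures follow directly from 2x2 confusion matrices against the histological gold standard. For illustration, counts of TP=19, FN=17, TN=42, FP=5 (back-computed here as one set consistent with the visual-examination figures at the dentin cut-off and n=83; not taken from the paper) reproduce 52.78/89.36/73.49%:

```python
def diagnostic_stats(tp, fp, tn, fn):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

sens, spec, acc = diagnostic_stats(tp=19, fp=5, tn=42, fn=17)
```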

  13. Combining Standard Conventional Measures and Ecological Momentary Assessment of Depression, Anxiety and Coping Using Smartphone Application in Minor Stroke Population: A Longitudinal Study Protocol

    Directory of Open Access Journals (Sweden)

    Camille Vansimaeys

    2017-07-01

    Context: Stroke has several consequences for survivors' daily life, even for those who experience short-lasting neurological symptoms with no functional disability. Depression and anxiety are common psychological disorders occurring after a stroke. They affect long-term outcomes and quality of life, but they are difficult to diagnose because of the neurobiological consequences of brain lesions. Current research priority is given to improving the detection and prevention of these post-stroke psychological disorders. Although previous studies have brought promising perspectives, their designs, based on retrospective tools, involve some limits regarding their ecological validity. Ecological Momentary Assessment (EMA) is an alternative to conventional instruments that could be key for research into understanding the processes that underlie post-stroke depression and anxiety onset. We aim to evaluate the feasibility and validity of anxiety, depression and coping EMA for minor stroke patients. Methods: Patients hospitalized in an Intensive Neuro-vascular Care Unit between April 2016 and January 2017 for a minor stroke are involved in a study based on an EMA methodology. We use a smartphone application to assess anxiety and depression symptoms and coping strategies four times a day during 1 week at three different times after stroke (hospital discharge, 2 and 4 months). Participants' self-reports and clinician ratings of anxiety, depression and coping are collected simultaneously using conventional and standard instruments. Feasibility of the EMA method will be assessed considering the participation and compliance rates. Validity will be assessed by comparing EMA and conventional self-report and clinician-rated measures. Discussion: We expect this study to contribute to the development of smartphone-based EMA in the minor stroke population. The EMA method offers promising research perspectives in the assessment and understanding of post

  14. Seismic Creep

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Seismic creep is the constant or periodic movement on a fault as contrasted with the sudden erupture associated with an earthquake. It is a usually slow deformation...

  15. Seismic seiches

    Science.gov (United States)

    McGarr, Arthur; Gupta, Harsh K.

    2011-01-01

    Seismic seiche is a term first used by Kvale (1955) to discuss oscillations of lake levels in Norway and England caused by the Assam earthquake of August 15, 1950. This definition has since been generalized to apply to standing waves set up in closed, or partially closed, bodies of water including rivers, shipping channels, lakes, swimming pools and tanks due to the passage of seismic waves from an earthquake.

  16. Seismic Data Gathering and Validation

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    Three earthquakes in the last seven years have exceeded their design basis earthquake values (so it is implied that damage to SSCs should have occurred). These seismic events were recorded at North Anna (August 2011; detailed information provided in [Virginia Electric and Power Company Memo]), Fukushima Daiichi and Daini (March 2011 [TEPCO 1]), and Kashiwazaki-Kariwa (2007 [TEPCO 2]). However, seismic walkdowns at some of these plants indicate that very little damage occurred to safety-class systems and components due to the seismic motion. This report presents seismic data gathered for two of the three events mentioned above and recommends a path for using those data for two purposes. One purpose is to determine what margins exist in current industry-standard seismic soil-structure interaction (SSI) tools. The second purpose is to use the data to validate seismic site-response tools and SSI tools. The gathered data comprise free-field soil and in-structure acceleration time histories, along with elastic and dynamic soil properties and structural drawings. Gathering data and comparing them with existing models has the potential to identify areas of uncertainty that should be removed from current seismic analysis and SPRA approaches. Removing uncertainty (to the extent possible) from SPRAs will allow NPP owners to make decisions on where to reduce risk. Once a realistic understanding of seismic response is established for a nuclear power plant (NPP), decisions on needed protective measures, such as SI, can be made.

  17. seismic-py: Reading seismic data with Python

    Directory of Open Access Journals (Sweden)

    2008-08-01

    The field of seismic exploration of the Earth has changed
    dramatically over the last half a century. The Society of Exploration
    Geophysicists (SEG) has worked to create standards to store the vast
    amounts of seismic data in a way that will be portable across computer
    architectures. However, it has been impossible to predict the needs of the
    immense range of seismic data acquisition systems. As a result, vendors have
    had to bend the rules to accommodate the needs of new instruments and
    experiment types. For low level access to seismic data, there is need for a
    standard open source library to allow access to a wide range of vendor data
    files that can handle all of the variations. A new seismic software package,
    seismic-py, provides an infrastructure for creating and managing drivers for
    each particular format. Drivers can be derived from one of the known formats
    and altered to handle any slight variations. Alternatively drivers can be
    developed from scratch for formats that are very different from any previously
    defined format. Python has been the key to making driver development easy
    and efficient to implement. The goal of seismic-py is to be the base system
    that will power a wide range of experimentation with seismic data and at the
    same time provide clear documentation for the historical record of seismic
    data formats.
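The driver architecture described above, deriving from a known format and overriding only the vendor quirks, is essentially a registry of format classes. The sketch below illustrates that pattern with hypothetical names; it is not the actual seismic-py API:

```python
class DriverRegistry:
    """Maps a format name to a driver class (illustrative sketch only)."""

    def __init__(self):
        self._drivers = {}

    def register(self, name):
        def decorator(cls):
            self._drivers[name] = cls
            return cls
        return decorator

    def get(self, name):
        return self._drivers[name]()

registry = DriverRegistry()

@registry.register("segy")
class SegYDriver:
    def read_trace_header(self, raw):
        return {"format": "segy", "nbytes": len(raw)}

@registry.register("segy-vendorX")
class VendorXDriver(SegYDriver):
    """Derived from a known format, altered for one vendor's variation."""

    def read_trace_header(self, raw):
        header = super().read_trace_header(raw)
        header["format"] = "segy-vendorX"
        return header
```

New instrument variations then become small subclasses rather than forks of the reader.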

  18. Setting norms in the United Nations system: the draft convention on the protection of the rights of all migrant workers and their families in relation to ILO standards on migrant workers.

    Science.gov (United States)

    Hasenau, M

    1990-06-01

    The author reviews the U.N.'s draft proposal concerning the rights of migrant workers and their families. "This article examines the nature and scope of obligations under the United Nations Convention and contrasts them with existing international standards. In the light of the elaboration of the U.N. Convention, the conditions of future normative activities to limit negative consequences of a proliferation of instruments and supervisory mechanisms are outlined." Consideration is given to human and trade union rights, employment, social security, living and working conditions, workers' families, expulsion, and conditions of international migration. (SUMMARY IN FRE AND SPA)

  19. Seismic Studies

    Energy Technology Data Exchange (ETDEWEB)

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces the uncertainties and conservatism used in previous analyses and modeling; these activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. The technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites. Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground

  20. Seismic Symphonies

    Science.gov (United States)

    Strinna, Elisa; Ferrari, Graziano

    2015-04-01

    The project started in 2008 as a sound installation, a collaboration between an artist, a barrel organ builder and a seismologist. The work differs from other attempts at sound transposition of seismic records: in this case seismic frequencies are not converted automatically into the "sound of the earthquake." Instead, a musical translation system has been devised that, based on the organ's tonal scale, generates a totally unexpected sequence of sounds intended to evoke the emotions aroused by the earthquake. The symphonies proposed in the project have somewhat peculiar origins: they come to life from the translation of graphic tracks into a sound track. The graphic tracks in question are copies of seismograms recorded during earthquakes that have taken place around the world. Seismograms are translated into music by a sculpture-instrument, half seismograph and half barrel organ. The organ plays through holes punched in paper; to adapt the documents to the instrument's score, holes have been drilled at the waves' peaks. The organ covers about three tonal scales, ranging from heavy, deep sounds up to high, jarring notes. The translation of the seismic records is based on a criterion that matches higher sounds to larger amplitudes and lower sounds to smaller ones. In translating the seismogram into the organ score, the larger the amplitude of the recorded waves, the more the seismogram covers the full tonal scale played by the barrel organ, and the notes arouse an intense emotional response in the listener. Elisa Strinna's Seismic Symphonies installation becomes an unprecedented tool for emotional involvement, through which the memory of the greatest disasters of over a century of the Earth's seismic history can be revived: a bridge between art and science. Seismic Symphonies is also a symbolic inversion: the organ is most commonly used in churches, and its sounds are derived from the heavens and

  1. Seismic Isolation Working Meeting Gap Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Sabharwall, Piyush [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant (NPP) operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacity of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for light water reactor (LWR) facilities using commercially available technology. However, SI has seen little application in the nuclear industry, and there is uncertainty in implementing the procedures outlined in ASCE 4. Opportunity exists to determine the barriers to implementation of the current ASCE 4 standard language.
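The hazard-fragility convolution mentioned above can be sketched numerically: the annual failure rate is the integral of P(fail | a) times |dH/da| over ground acceleration a. The curves below (exponential hazard, lognormal fragility) are hypothetical placeholders, not data from this report:

```python
import math
import numpy as np

def seismic_risk(pga, hazard_rate, fragility):
    """Annual failure rate from a hazard curve H(a) (annual exceedance rate)
    and a fragility curve P(fail | a): risk = integral P(fail|a) |dH/da| da,
    evaluated with the trapezoid rule on a discrete PGA grid."""
    density = -np.gradient(hazard_rate, pga)     # |dH/da|; H is decreasing
    y = fragility * density
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(pga)))

a = np.linspace(0.01, 2.0, 400)                  # PGA grid (g), hypothetical
hazard = 1e-2 * np.exp(-a / 0.15)                # hypothetical hazard curve
beta, a_median = 0.4, 0.6                        # hypothetical fragility params
fragility = 0.5 * (1.0 + np.vectorize(math.erf)(
    np.log(a / a_median) / (beta * math.sqrt(2.0))))
risk = seismic_risk(a, hazard, fragility)        # small positive annual rate
```

Seismic isolation acts on this integral by shifting the fragility median upward, which sharply reduces the overlap with the hazard density.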

  2. Estimation of slip rates and seismic hazard parameters using conventional techniques of structural geology in a slow-moving fault: Alhama de Murcia - Alcantarilla segment of the Alhama de Murcia Fault (Murcia, SE Spain)

    Science.gov (United States)

    Herrero-Barbero, Paula; Álvarez-Gómez, José Antonio; Jesús Martínez-Díaz, Jose

    2017-04-01

    The convergence between the Nubian and Eurasian plates in the Western Mediterranean is being accommodated by the Eastern Betic Shear Zone, located in southeastern Iberia. This is a low-strain region whose faults show low slip rates and long recurrence periods of their maximum earthquakes, so they do not provide clear evidence of their seismogenic activity. The Alhama de Murcia - Alcantarilla segment, defined as the NE end of the Alhama de Murcia Fault, is one of the structures of the Eastern Betic Shear Zone, and there are few in-depth studies of its seismic potential. In order to assess the seismogenic potential and slip rate of this segment we have carried out a structural analysis. We have built a 3D geological model of the area where the fault currently bounds the Neogene Fortuna basin. The structural model is based on seismic reflection profiles, which were later input into MOVE, a structural modelling and analysis software package. The analysis of the model has revealed several structural features related to positive inversion tectonics in the Fortuna basin, specifically a typical "harpoon" structure whose deformation is estimated to have begun in the Upper Miocene (Messinian). Geometric models and area-balance methods (e.g. the depth-to-detachment method) applied to this structure have allowed us to estimate the heave of the fault, representing the amount of shortening accommodated by the fault section during its recent activity. The horizontal shortening rate is estimated at between 0.09 and 0.26 mm/yr over the last 5.3 - 2.6 Ma. Projecting the obtained shortening onto the fault plane and considering the present regional tectonic shortening, it has been possible to obtain a net slip rate between 0.13 and 0.37 mm/yr. Such parameters suggest that the Alhama de Murcia - Alcantarilla segment is less active than other segments of the fault.
The result obtained is consistent with the fact that the Carrascoy Fault, oriented parallel and located to the south of the
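Projecting a horizontal shortening (heave) rate onto the fault plane is a one-line trigonometric step. Assuming pure dip slip and an illustrative 45 degree dip (the paper combines the heave with the regional shortening direction, so this is only a simplified sketch), the quoted 0.09-0.26 mm/yr heave range maps onto roughly 0.13-0.37 mm/yr of slip:

```python
import math

def slip_rate_from_heave(heave_rate_mm_yr, dip_deg):
    """Dip-slip rate on a plane of given dip from the horizontal heave rate,
    assuming shortening is fully accommodated as dip slip (a simplification)."""
    return heave_rate_mm_yr / math.cos(math.radians(dip_deg))

# Illustrative 45-degree dip applied to the quoted heave-rate bounds
lo, hi = (slip_rate_from_heave(h, 45.0) for h in (0.09, 0.26))
```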

  3. Static behaviour of induced seismicity

    CERN Document Server

    Mignan, Arnaud

    2015-01-01

    The standard paradigm to describe seismicity induced by fluid injection is to apply nonlinear diffusion dynamics in a poroelastic medium. I show that the spatiotemporal behaviour and rate evolution of induced seismicity can, instead, be expressed by geometric operations on a static stress field produced by volume change at depth. I obtain laws similar in form to the ones derived from poroelasticity while requiring a lower description length. Although fluid flow is known to occur in the ground, it is not pertinent to the behaviour of induced seismicity. The proposed model is equivalent to the static stress model for tectonic foreshocks generated by the Non-Critical Precursory Accelerating Seismicity Theory. This study hence verifies the explanatory power of this theory outside of its original scope.
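For reference, the poroelastic description that the paper argues against predicts a parabolic triggering front, r(t) = sqrt(4*pi*D*t), for hydraulic diffusivity D (a Shapiro-type relation; the diffusivity value below is illustrative):

```python
import math

def triggering_front_m(t_seconds, diffusivity_m2_s):
    """Radius (m) of the seismicity triggering front r = sqrt(4*pi*D*t)
    under pore-pressure diffusion; the model this paper proposes to replace
    with geometric operations on a static stress field."""
    return math.sqrt(4.0 * math.pi * diffusivity_m2_s * t_seconds)

# e.g. an illustrative D = 0.5 m^2/s, one day after injection starts
r = triggering_front_m(86400.0, 0.5)   # a few hundred metres
```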

  4. Additive Complex Ayurvedic Treatment in Patients with Fibromyalgia Syndrome Compared to Conventional Standard Care Alone: A Nonrandomized Controlled Clinical Pilot Study (KAFA Trial)

    Directory of Open Access Journals (Sweden)

    Christian S. Kessler

    2013-01-01

    Background. Fibromyalgia syndrome (FMS) is a challenging condition for health care systems worldwide. Only limited trial data are available for FMS on outcomes of complex treatment interventions of complementary and integrative medicine (CIM) approaches. Methods. We conducted a controlled, nonrandomized feasibility study that compared outcomes in 21 patients treated with Ayurveda with those of 11 patients treated with a conventional approach at the end of a two-week inpatient hospital stay. The primary outcome was the impact of fibromyalgia on patients as assessed by the FIQ. Secondary outcomes included scores of pain intensity, pain perception, depression, anxiety, and quality of sleep. Follow-up assessments were done after 6 months. Results. At 2 weeks, there were comparable and significant improvements in the FIQ and in most secondary outcomes in both groups, with no significant between-group differences. The beneficial effects for both treatment groups were partly maintained for the main outcome and a number of secondary outcomes at the 6-month follow-up, again with no significant between-group differences. Discussion. The findings of this feasibility study suggest that Ayurvedic therapy is noninferior to conventional treatment in patients with severe FMS. Since Ayurveda was only used as an add-on treatment, RCTs on Ayurveda alone are warranted to increase model validity. This trial is registered with NCT01389336.

  5. Advanced Seismic While Drilling System

    Energy Technology Data Exchange (ETDEWEB)

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical sources, hydraulic sources, air guns, and explosives, by their very nature produce high frequencies. This runs counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER™ methodology has been developed for controlling otherwise high-frequency seismic sources so that they generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for drill-bit seismic while drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock, with the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to achieve the range required for deep, 15,000+ ft, high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker so that it could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER™ drill string tool was designed and manufactured by TII

  6. 7 CFR 1792.104 - Seismic acknowledgments.

    Science.gov (United States)

    2010-01-01

    ... registered architect or engineer responsible for the building design stating that seismic provisions pursuant to § 1792.103 of this subpart will be used in the design of the building. (a) For projects in which... include the identification and date of the model code or standard that is used in the seismic design...

  7. Seismic displacement of gravity retaining walls

    Directory of Open Access Journals (Sweden)

    Kamal Mohamed Hafez Ismail Ibrahim

    2015-08-01

    Full Text Available Seismic displacement of gravity walls has been studied using conventional static methods for controlled-displacement design. In this study, plane-strain numerical analysis is performed using the Plaxis dynamic program, where a prescribed displacement is applied at the bottom boundary of the soil to simulate the applied seismic load. Constrained absorbent side boundaries are introduced to prevent any wave reflection. The soil studied is dense granular sand, modeled as an elasto-plastic material according to the Mohr–Coulomb criterion, while the gravity wall is assumed elastic. By comparing the seismic wall displacements calculated by numerical analysis for six historical ground motions with those calculated by the pseudo-static method, it is found that the numerical seismic displacements are either equal to or greater than the corresponding pseudo-static values. The permissible seismic wall displacement given by AASHTO can be used for empirical estimation of seismic displacement. It is also found that seismic wall displacement is directly proportional to the positive angle of inclination of the back surface of the wall, to soil flexibility, and to the maximum ground acceleration of the earthquake. Seismic wall sliding is dominant and rotation is negligible for rigid walls when the ratio between wall height and foundation width is less than 1.4; for greater ratios the wall becomes more flexible and rotation (rocking) increases until the ratio reaches 1.8, where overturning is likely to take place. Cumulative seismic wall rotation increases with dynamic time and tends toward a constant value by the end of the earthquake.
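    The displacement comparison in the abstract can be illustrated with a Newmark-type sliding-block estimate, which underlies many empirical seismic wall displacement rules. The following is a minimal, hypothetical sketch (not the Plaxis model used in the study): it assumes a yield acceleration for the wall-soil system and accumulates one-way sliding whenever the ground acceleration exceeds it.

```python
import numpy as np

def newmark_sliding(acc, dt, a_yield):
    """Minimal one-way Newmark sliding-block estimate of permanent
    displacement: relative velocity builds up while the ground
    acceleration exceeds the yield acceleration, and the block is
    assumed unable to slide backward."""
    vel = 0.0
    disp = 0.0
    for a in acc:
        excess = a - a_yield
        if vel > 0.0 or excess > 0.0:
            vel = max(vel + excess * dt, 0.0)
            disp += vel * dt
    return disp

# Hypothetical input: one 1-Hz sine pulse with 0.4 g peak against a
# 0.2 g yield acceleration (illustrative values only)
g = 9.81
t = np.arange(0.0, 1.0, 0.001)
acc = 0.4 * g * np.sin(2.0 * np.pi * 1.0 * t)
d = newmark_sliding(acc, 0.001, 0.2 * g)
print(f"permanent displacement ~{d * 100:.1f} cm")
```

    The sketch reproduces the qualitative behaviour discussed in the abstract: displacement accumulates only while the excitation exceeds the system's resistance.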

  8. Updated Colombian Seismic Hazard Map

    Science.gov (United States)

    Eraso, J.; Arcila, M.; Romero, J.; Dimate, C.; Bermúdez, M. L.; Alvarado, C.

    2013-05-01

    The Colombian seismic hazard map used by the National Building Code (NSR-98), in effect until 2009, was developed in 1996. Since then, the National Seismological Network of Colombia has improved in both coverage and technology, providing fifteen years of additional seismic records. These improvements have allowed a better understanding of the regional geology and tectonics, which, together with the destructive seismic activity in Colombia, has motivated the interest in and the need for a new seismic hazard assessment of this country. Taking advantage of new instrumental information sources, such as new broadband stations of the National Seismological Network, new historical seismicity data, and the availability of standardized global databases, and, in general, of advances in models and techniques, a new Colombian seismic hazard map was developed. A PSHA model was applied because it incorporates the effects of all seismic sources that may affect a particular site, addressing the uncertainties in the parameters and assumptions adopted in this kind of study. First, the seismic source geometries and a complete, homogeneous seismic catalog were defined, and the occurrence-rate parameters of each seismic source were calculated, establishing a national seismotectonic model. Several attenuation-distance relationships were selected depending on the type of seismicity considered. The seismic hazard was estimated using the CRISIS2007 software created by the Engineering Institute of the Universidad Nacional Autónoma de México (UNAM, National Autonomous University of Mexico). A uniform grid with 0.1° spacing was used to calculate the peak ground acceleration (PGA) and response spectral values at 0.1, 0.2, 0.3, 0.5, 0.75, 1, 1.5, 2, 2.5 and 3.0 seconds with return periods of 75, 225, 475, 975 and 2475 years. For each site, a uniform hazard spectrum and exceedance rate curves were calculated. With the results, it is

  9. Risk factors associated with bulk tank standard plate count, bulk tank coliform count, and the presence of Staphylococcus aureus on organic and conventional dairy farms in the United States.

    Science.gov (United States)

    Cicconi-Hogan, K M; Gamroth, M; Richert, R; Ruegg, P L; Stiglbauer, K E; Schukken, Y H

    2013-01-01

    The purpose of this study was to assess the association of bulk tank milk standard plate counts, bulk tank coliform counts (CC), and the presence of Staphylococcus aureus in bulk tank milk with various management and farm characteristics on organic and conventional dairy farms throughout New York, Wisconsin, and Oregon. Data from size-matched organic farms (n=192), conventional nongrazing farms (n=64), and conventional grazing farms (n=36) were collected at a single visit for each farm. Of the 292 farms visited, 290 bulk tank milk samples were collected. Statistical models were created using data from all herds in the study, as well as exclusively for the organic subset of herds. Because of incomplete data, 267 of 290 herds were analyzed for total herd modeling, and 173 of 190 organic herds were analyzed for the organic herd modeling. Overall, more bulk tanks from organic farms had Staph. aureus cultured from them (62% of organic herds, 42% of conventional nongrazing herds, and 43% of conventional grazing herds), whereas fewer organic herds had a high CC, defined as ≥50 cfu/mL, than conventional farms in the study. A high standard plate count (×1,000 cfu/mL) was associated with decreased body condition score of adult cows and decreased milk production in both models. Several variables were significant only in the model created using all herds or only in organic herds. The presence of Staph. aureus in the bulk tank milk was associated with fewer people treating mastitis, increased age of housing, and a higher percentage of cows with 3 or fewer teats in both the organic and total herd models. The Staph. aureus total herd model also showed a relationship with fewer first-lactation animals, higher hock scores, and less use of automatic takeoffs at milking. A high bulk tank CC was related to feeding a total mixed ration and using natural service in nonlactating heifers in both models. Overall, attentive management and use of outside resources were useful with regard to CC

  10. Quality of Survival and Growth in Children and Young Adults in the PNET4 European Controlled Trial of Hyperfractionated Versus Conventional Radiation Therapy for Standard-Risk Medulloblastoma

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, Colin, E-mail: crk1@soton.ac.uk [University of Southampton Faculty of Medicine and University Hospital Southampton National Health Service Foundation Trust, Southampton (United Kingdom); Bull, Kim [University of Southampton Faculty of Medicine and University Hospital Southampton National Health Service Foundation Trust, Southampton (United Kingdom); Chevignard, Mathilde [Hôpitaux de Saint Maurice, Saint Maurice (France); Neurophysiology, University of Pierre et Marie-Curie Paris 6, Paris (France); Culliford, David [University of Southampton Faculty of Medicine and University Hospital Southampton National Health Service Foundation Trust, Southampton (United Kingdom); Dörr, Helmuth G. [Kinder- und Jugendklinik der Universität Erlangen, Erlangen (Germany); Doz, François [Institut Curie and University Paris Descartes, Sorbonne Paris Cité (France); Kortmann, Rolf-Dieter [Department of Radiation Therapy, University of Leipzig, Leipzig (Germany); Lannering, Birgitta [Department of Pediatrics, The Sahlgren Academy, University of Gothenburg, Gothenburg (Sweden); Massimino, Maura [Fondazione Istituto di Ricovero e Cura a Carattere Scientifico IRCCS Istituto Nazionale dei Tumori, Milan (Italy); Navajas Gutiérrez, Aurora [Hospital Universitario Cruces, Baracaldo-Vizcaya (Spain); Rutkowski, Stefan [University Medical Center Hamburg-Eppendorf, Hamburg (Germany); Spoudeas, Helen A. [Center for Pediatric Endocrinology, University College London, London (United Kingdom); Calaminus, Gabriele [Pediatric Oncology, University of Muenster, Muenster (Germany)

    2014-02-01

    Purpose: To compare quality of survival in “standard-risk” medulloblastoma after hyperfractionated radiation therapy of the central nervous system with that after standard radiation therapy, combined with a chemotherapy regimen common to both treatment arms, in the PNET4 randomised controlled trial. Methods and Materials: Participants in the PNET4 trial and their parents/caregivers in 7 participating countries completed anonymized standardized questionnaires in their own language on executive function, health status, behavior, health-related quality of life, and medical, educational, employment, and social information. Pre- and postoperative neurologic status and serial heights and weights were also recorded. Results: Data were provided by 151 of 244 eligible survivors (62%) at a median age at assessment of 15.2 years and a median interval from diagnosis of 5.8 years. Compared with standard radiation therapy, hyperfractionated radiation therapy was associated with lower (ie, better) z-scores for executive function in all participants (mean intergroup difference 0.48 SDs, 95% confidence interval 0.16-0.81, P=.004), but health status, behavioral difficulties, and health-related quality of life z-scores were similar in the 2 treatment arms. Data on hearing impairment were equivocal. Hyperfractionated radiation therapy was also associated with a greater decrement in height z-scores (mean intergroup difference 0.43 SDs, 95% confidence interval 0.10-0.76, P=.011). Conclusions: Hyperfractionated radiation therapy was associated with better executive function and worse growth but without accompanying change in health status, behavior, or quality of life.

  11. Assessment of rock burst hazards by means of seismic methods

    Energy Technology Data Exchange (ETDEWEB)

    Proskuryakov, V.M.

    1984-10-01

    Use of seismic methods for assessment of stress distribution in coal seams and in rock strata adjacent to coal seams is discussed. Analysis of information on stress distribution permits rock burst hazards to be forecast. Schemes of seismic logging used in coal mining are compared. Recommendations developed by the VNIMI Institute for optimization of seismic logging are analyzed: selecting a seismic method considering tectonics, stratification and rock properties, arrangement of seismic sources and seismic detectors, selecting the optimum parameters of seismic waves (wave frequency recommended for rocks ranges from 400 to 1000 Hz; recommended wave frequency for coal ranges from 200 to 600 Hz), measuring instruments (e.g. the ShchTsS-2 system), and calculation methods used for evaluations of seismic logging. A standardized procedure for seismic logging is recommended.

  12. Large, moderate, small earthquakes and seismic fortification criterion

    Institute of Scientific and Technical Information of China (English)

    沈建文; 石树中

    2004-01-01

    This paper discusses the relation between two-step seismic design and the standard of probability of exceedance, and the relation between the three-level seismic ground motion parameters given by the probability method and the comprehensive probability method. The relative magnitudes of the ground motions with 2%, 10%, and 63% probability of exceedance in 50 years, namely the "large earthquake", "moderate earthquake", and "small earthquake", are discussed through a practical example of seismic hazard analysis. Methods to determine the seismic fortification criterion are also discussed.
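    The three exceedance levels can be converted to return periods under the usual Poisson occurrence assumption; a small sketch (the function name is ours, the probabilities are those quoted in the abstract):

```python
import math

def return_period(p_exceed: float, t_years: float = 50.0) -> float:
    """Return period implied by a probability of exceedance over
    t_years, assuming a Poisson (memoryless) occurrence model:
    p = 1 - exp(-t / RP)  =>  RP = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

# "Large", "moderate", and "small" earthquake levels:
for p in (0.02, 0.10, 0.63):
    print(f"{p:.0%} in 50 yr -> ~{return_period(p):.0f}-yr return period")
# 2% -> ~2475 yr, 10% -> ~475 yr, 63% -> ~50 yr
```

    These values match the familiar 2475- and 475-year design levels used in probabilistic seismic hazard maps.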

  13. Comparison between Chinese and American design standards of anti-seismic calculation for large-scale storage tank

    Institute of Scientific and Technical Information of China (English)

    刘佳; 袁玲; 唐悦影; 卢向红

    2013-01-01

    Against the background of oil storage tank designs becoming increasingly large and free-standing, a comparative analysis is conducted of the seismic-calculation provisions of the current Chinese and American design standards for large-scale storage tanks, GB 50341-2003 and API 650-2012, summarizing their differences in seismic fortification basis, design criteria, mathematical model and its parameters, and calculation methods. The methods specified in the two standards produce different seismic calculations for a 5 000 m3 internal floating roof tank in a sample project. In particular, for the critical allowable stress of the tank wall, the calculation results of the two standards differ by nearly a factor of 3; this is the parameter with the largest difference among all results and is the numerical reflection of the different fortification objectives of the two standards. GB 50341-2003 specifies the top free space of the storage tank to be the sloshing wave height, while API 650-2012 specifies that the top free space shall be determined in full consideration of the tank's seismic use group and other factors, defining more specifically the relationship between the top free space and the sloshing wave height.

  14. SEISMIC DETERMINATION OF RESERVOIR HETEROGENEITY; APPLICATION TO THE CHARACTERIZATION OF HEAVY OIL RESERVOIRS

    Energy Technology Data Exchange (ETDEWEB)

    Matthias G. Imhof; James W. Castle

    2003-11-01

    The objective of the project is to examine how seismic and geologic data can be used to improve characterization of small-scale heterogeneities and their parameterization in reservoir models. The study is performed at West Coalinga Field in California. We continued our investigation of the nature of seismic responses from heterogeneous reservoirs. We began testing our algorithm to infer parameters of object-based reservoir models from seismic data. We began integration of seismic and geologic data to determine the deterministic limits of conventional seismic data interpretation. Lastly, we began integration of seismic and geologic heterogeneity using stochastic models conditioned on both wireline and seismic data.

  15. EMERALD: A Flexible Framework for Managing Seismic Data

    Science.gov (United States)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2010-12-01

    The seismological community is challenged by the vast quantity of new broadband seismic data provided by large-scale seismic arrays such as EarthScope’s USArray. While this bonanza of new data enables transformative scientific studies of the Earth’s interior, it also illuminates limitations in the methods used to prepare and preprocess those data. At a recent seismic data processing focus group workshop, many participants expressed the need for better systems to minimize the time and tedium spent on data preparation in order to increase the efficiency of scientific research. Another challenge related to data from all large-scale transportable seismic experiments is that there currently exists no system for discovering and tracking changes in station metadata. This critical information, such as station location, sensor orientation, instrument response, and clock timing data, may change over the life of an experiment and/or be subject to post-experiment correction. Yet nearly all researchers utilize metadata acquired with the downloaded data, even though subsequent metadata updates might alter or invalidate results produced with older metadata. A third long-standing issue for the seismic community is the lack of easily exchangeable seismic processing codes. This problem stems directly from the storage of seismic data as individual time series files, and the history of each researcher developing his or her preferred data file naming convention and directory organization. Because most processing codes rely on the underlying data organization structure, such codes are not easily exchanged between investigators. To address these issues, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The goal of the EMERALD project is to provide seismic researchers with a unified, user-friendly, extensible system for managing seismic event data, thereby increasing the efficiency of scientific enquiry. EMERALD stores seismic data and metadata in a

  16. SEISMIC GEOLOGY

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    20091465 Cai Xuelin (College of Earth Sciences, Chengdu University of Technology, Chengdu 610059, China); Cao Jiamin Preliminary Study on the 3-D Crust Structure for the Longmen Lithosphere and the Genesis of the Huge Wenchuan Earthquake, Sichuan Province, China (Journal of Chengdu University of Technology, ISSN 1671-9727, CN 51-1634/N, 35(4), 2008, p.357-365, 8 illus., 39 refs.) Key words: deep-seated structures, large earthquakes, Longmenshan Fracture Zone. Based on a structural analysis of many seismic sounding profiles, there are two fault systems in the Longmen collisional orogenic belt, Sichuan Province, China. The two systems are clearly distinct yet closely related. One is a shallow fault system composed mainly of brittle shear zones in the upper crust; the other is a deep fault system composed mainly of crust-mantle ductile shear zones cutting the Moho discontinuity. Based on the results of researching the geological structure and seismic sounding profiles,

  17. USAGE OF RESERVIOR INFORMATION TO IMPROVE RESOLUTION OF SEISMIC DATA

    Institute of Scientific and Technical Information of China (English)

    SONG; Jian-guo; DU; Shi-tong; SUN; Xi-ping

    2001-01-01

    The poor resolution of conventional seismic data makes it unsuitable for reservoir description. Since only seismic data can provide 3-D information about a reservoir, it is very important to improve the resolution of seismic data. Here a method is put forward, using inversion techniques, to produce seismic data of higher resolution and lower noise. The specific character of this method is the use of geologic rules in processing seismic data, which is quite different from the hypotheses underlying some deconvolution methods; the assumptions about the wavelet or the reflectivity series made in conventional deconvolution are often not appropriate. Application of this method to seismic data from several oilfields shows its effectiveness and efficiency.

  18. Simultaneous assessment of the median annual seismicity rates and their dispersions for Taiwan earthquakes in different depth ranges

    Science.gov (United States)

    Chang, Wen-Yen; Chen, Kuei-Pao; Tsai, Yi-Ben

    2017-03-01

    The main purpose of this study is to apply an innovative approach to assess simultaneously the median annual seismicity rates and their dispersions for Taiwan earthquakes in different depth ranges. In this approach an alternative Gutenberg-Richter (G-R) relation is explicitly expressed in terms of both the logarithmic mean annual seismicity rate and its standard deviation, instead of only the arithmetic mean as in the conventional G-R relation. Seismicity data from 1975 to 2014 in a Taiwan earthquake catalog with homogenized Mw moment magnitudes are used in this study. This catalog consists of high-quality earthquake data originally obtained by the Institute of Earth Sciences (IES) and the Central Weather Bureau (CWB). The selected seismicity data set is shown to be complete for Mw ⩾ 3.0. The logarithmic mean annual seismicity rate and its standard deviation from the observed annual seismicity rates of individual years are obtained initially for different Mw ranges. It is shown subsequently that the logarithmic annual seismicity rates indeed possess a well-behaved lognormal distribution. It is further shown that our new approach has the added merit that it tends to suppress the influence of anomalously high annual seismicity rates due to large numbers of aftershocks from major earthquake sequences. Finally, the observed logarithmic mean annual seismicity rates with their standard deviations for 3.0 ⩽ Mw ⩽ 5.0 are used to obtain the alternative Gutenberg-Richter relations for different depth ranges. The results are as follows: log10 N = 5.75 - 0.90Mw ± (0.25 - 0.01Mw) for focal depth 0-300 km; log10 N = 5.78 - 0.94Mw ± (0.20 + 0.01Mw) for focal depth 0-35 km; log10 N = 4.72 - 0.89Mw ± (-0.08 + 0.08Mw) for focal depth 35-70 km; log10 N = 4.69 - 0.88Mw ± (-0.47 + 0.16Mw) for focal depth 70-300 km. In the above equations, log10 N represents the logarithmic annual seismicity rate. These G-R relations give distinctly different values of the parameters a and b for
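    The alternative G-R relations above can be evaluated directly. A small sketch, using the whole-depth-range coefficients quoted in the abstract (the function name and its parameterization are ours):

```python
def gr_rate(mw, a, b, c, d):
    """Median annual seismicity rate N and its one-sigma band from the
    alternative G-R relation: log10 N = a - b*Mw +/- (c + d*Mw)."""
    log_n = a - b * mw
    sigma = c + d * mw
    return 10 ** log_n, 10 ** (log_n - sigma), 10 ** (log_n + sigma)

# Focal depth 0-300 km: log10 N = 5.75 - 0.90 Mw +/- (0.25 - 0.01 Mw)
median, lo, hi = gr_rate(4.0, 5.75, 0.90, 0.25, -0.01)
print(f"Mw 4.0: ~{median:.0f} events/yr (1-sigma band {lo:.0f}-{hi:.0f})")
```

    For Mw 4.0 this gives a median of roughly 141 events per year, with the one-sigma band spanning about a factor of 10^0.21 on either side.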

  19. Comparative study of codes for the seismic design of structures

    Directory of Open Access Journals (Sweden)

    S. H. C. Santos

    Full Text Available A general evaluation of some points of the South American seismic codes is presented herein, comparing them with one another, with the American standard ASCE/SEI 7-10, and with the European standard Eurocode 8. The study focuses on design criteria for buildings. The western border of South America is one of the most seismically active regions of the world. It corresponds to the confluence of the South American and Nazca plates, roughly in the vicinity of the Andes Mountains. This seismicity diminishes toward the comparatively quieter eastern South American areas. The South American countries located on the western border have had standards for seismic design for several decades, while the Brazilian standard for seismic design was published only recently. This study focuses on some critical topics: definition of the recurrence periods for establishing the seismic input; definition of the seismic zonation and design ground motion values; definition of the shape of the design response spectra; consideration of soil amplification, soil liquefaction, and soil-structure interaction; classification of structures into different importance levels; definition of the seismic force-resisting systems and respective response modification coefficients; consideration of structural irregularities; and definition of the allowable procedures for seismic analyses. A simple building structure is analyzed considering the criteria of the several standards, and the results obtained are compared.

  20. Quantitative Seismic Amplitude Analysis

    OpenAIRE

    Dey, A. K.

    2011-01-01

    The Seismic Value Chain quantifies the cyclic interaction between seismic acquisition, imaging and reservoir characterization. Modern seismic innovation to address the global imbalance in hydrocarbon supply and demand requires such cyclic interaction of both feed-forward and feed-back processes. Currently, the seismic value chain paradigm is in a feed-forward mode. Modern seismic data now have the potential to yield the best images in terms of spatial resolution, amplitude accuracy, and incre...

  1. Application of seismic interferometry by multidimensional deconvolution to ambient seismic noise recorded in Malargüe, Argentina

    NARCIS (Netherlands)

    Weemstra, Cornelis; Draganov, Deyan; Ruigrok, Elmer N.; Hunziker, Jürg; Gomez, Martin; Wapenaar, Kees

    Obtaining new seismic responses from existing recordings is generally referred to as seismic interferometry (SI). Conventionally, the SI responses are retrieved by simple crosscorrelation of recordings made by separate receivers: one of the receivers acts as a 'virtual source' whose response is
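    The crosscorrelation retrieval can be illustrated with a toy synthetic of our own construction (not the Malargüe data): two receivers record the same random noise with different traveltimes, and their crosscorrelation peaks at the inter-receiver lag, as if one receiver were a virtual source.

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.standard_normal(2048)   # common ambient-noise source
lag_a, lag_b = 50, 80               # traveltimes (in samples) to receivers A and B

rec_a = np.roll(noise, lag_a)       # recording at receiver A
rec_b = np.roll(noise, lag_b)       # recording at receiver B

# Crosscorrelate: receiver A acts as the "virtual source" for receiver B
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-len(noise) + 1, len(noise))
peak_lag = lags[np.argmax(xcorr)]
print(peak_lag)  # peaks at lag_b - lag_a = 30 samples
```

    The retrieved lag is the differential traveltime between the two receivers, which is the essence of the interferometric response; multidimensional deconvolution, as in the abstract, replaces the simple crosscorrelation to correct for irregular source distributions.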

  2. Improving Reservoir Simulation using Seismic Data

    Science.gov (United States)

    Shamsa, Amir

    The principal premise of this thesis is that the ambiguities of reservoir simulation can be and should be reduced by using time-lapse seismic data. Such data can be considered as a sort of reservoir dynamic data, with distinctive features compared to typical reservoir production data. While well production data are sparse in space and dense in time, 4D time-lapse seismic can be utilized to fill the spatial data gaps between wells. This provides an opportunity to constrain reservoir dynamic behaviour not only at well locations but also between them by honoring the time-lapse response of the reservoir. This means that seismic-assisted history matching should involve a simultaneous minimization of the mismatch between all types of measured and simulated data, including seismic data. This thesis is an effort to discuss critical aspects of integrating 4D time-lapse data in reservoir simulation and history matching. I have illustrated a detailed scheme of seismic-assisted history matching with applications to real data, to emphasize the extra value that seismic data can bring to conventional reservoir history matching. This goal was followed by developing a software application to assess the feasibility of the theory at industrial scales. In addition to conventional oils, a significant effort has been devoted to extending the scope of the work to viscoelastic heavy oils and their fluid substitution models in thermal cases. I also studied the impact of production/injection-induced stresses on anisotropic velocity variations, using coupled geomechanical-flow simulations. (Abstract shortened by UMI.).
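    The simultaneous minimization described above can be sketched as a combined objective (a schematic of our own, not the thesis implementation; real workflows weight and normalize each data type carefully):

```python
import numpy as np

def history_match_misfit(prod_obs, prod_sim, seis_obs, seis_sim, w_seis=0.5):
    """Combined history-matching objective: production mismatch (sparse
    in space, dense in time) plus weighted 4D-seismic mismatch (dense in
    space, sparse in time). w_seis is a hypothetical weighting factor."""
    j_prod = np.sum((prod_obs - prod_sim) ** 2)
    j_seis = np.sum((seis_obs - seis_sim) ** 2)
    return j_prod + w_seis * j_seis

# Toy usage: a perfect match gives zero misfit
obs = np.array([1.0, 2.0, 3.0])
print(history_match_misfit(obs, obs, obs, obs))  # 0.0
```

    Minimizing this sum over reservoir-model parameters, rather than the production term alone, is what constrains the model between wells.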

  3. National Seismic Network of Georgia

    Science.gov (United States)

    Tumanova, N.; Kakhoberashvili, S.; Omarashvili, V.; Tserodze, M.; Akubardia, D.

    2016-12-01

    Georgia, as part of the Southern Caucasus, is a tectonically active and structurally complex region. It is one of the most active segments of the Alpine-Himalayan collision belt. The deformation and the associated seismicity are due to the continent-continent collision between the Arabian and Eurasian plates. Seismic monitoring of the country and the quality of seismic data are the major tools for rapid-response policy, population safety, and basic scientific research, and ultimately for the sustainable development of the country. The National Seismic Network of Georgia has been developing since the end of the 19th century. The digital era of the network started in 2003. Currently, continuous data streams from 25 stations are acquired and analyzed in real time. The data are combined to calculate a rapid location and magnitude for each earthquake. Information on larger events (Ml>=3.5) is simultaneously transferred to the website of the monitoring center and to the relevant governmental agencies. To improve rapid earthquake location and magnitude estimation, the seismic network was enhanced by installing 7 additional new stations. Each new station is equipped with coupled broadband and strong-motion seismometers as well as a permanent GPS system. To select the sites for the 7 new base stations, we used standard network optimization techniques, taking into account the geometry of the existing seismic network and the topographic conditions of each site. For each site we studied the local geology (Vs30 was mandatory for each site), the local noise level, and the seismic vault construction parameters. Because of the country's elevation, stations were installed in high mountains that are not accessible in winter due to heavy snow conditions. To secure online data transmission, we used satellite data transmission as well as cellular data network coverage from different local companies. As a result, we already have improved earthquake locations and event magnitudes. We

  4. Research on performance-based seismic design criteria

    Institute of Scientific and Technical Information of China (English)

    谢礼立; 马玉宏

    2002-01-01

    The seismic design criterion adopted in existing seismic design codes is reviewed. It is pointed out that the presently used seismic design criterion does not satisfy the requirements of today's social and economic development. A new performance-based seismic design criterion composed of three components is presented in this paper. It can not only effectively control economic losses and casualties, but also ensure that buildings remain in proper operation during earthquakes. The three components are: classification of seismic design for buildings; determination of seismic design intensity and/or seismic design ground motion for controlling seismic economic losses and casualties; and determination of importance factors in terms of the service periods of buildings. For controlling seismic human losses, the idea of a socially acceptable casualty level is presented, and an "Optimal Economic Decision Model" and an "Optimal Safe Decision Model" are established. Finally, a new method is recommended for calculating the importance factors of structures by adjusting the structure's service period, on the basis that a more important structure has a longer service period than conventional ones. Therefore, a more important structure with a longer service period will be designed for higher seismic loads, provided the exceedance probability of the seismic hazard over different service periods is the same.

  5. Linearized inversion of multiple scattering seismic energy

    Science.gov (United States)

    Aldawood, Ali; Hoteit, Ibrahim; Zuberi, Mohammad

    2014-05-01

    Internal multiples deteriorate the quality of the migrated image obtained conventionally by imaging single-scattering energy, so imaging seismic data under the single-scattering assumption does not locate multiple-bounce events in their actual subsurface positions. However, imaging internal multiples properly has the potential to enhance the migrated image because they illuminate zones in the subsurface that are poorly illuminated by single-scattering energy, such as nearly vertical faults. Standard migration of these multiples provides subsurface reflectivity distributions with low spatial resolution and migration artifacts due to the limited recording aperture, coarse source and receiver sampling, and the band-limited nature of the source wavelet. The resultant image obtained by the adjoint operator is a smoothed depiction of the true subsurface reflectivity model and is heavily masked by migration artifacts and the source-wavelet fingerprint, which needs to be properly deconvolved. Hence, we proposed a linearized least-squares inversion scheme to mitigate the effect of the migration artifacts, enhance the spatial resolution, and provide more accurate amplitude information when imaging internal multiples. The proposed algorithm uses the least-squares image based on the single-scattering assumption as a constraint to invert for the part of the image that is illuminated by internal scattering energy. We then posed the problem of imaging double-scattering energy as a least-squares minimization problem that requires solving a normal equation of the form: G^T G v = G^T d, (1) where G is a linearized forward modeling operator that predicts double-scattered seismic data and G^T is the linearized adjoint operator that images double-scattered seismic data. Gradient-based optimization algorithms solve this linear system; hence, we used a quasi-Newton optimization technique to find the least-squares minimizer.
In this approach, an estimate of the Hessian matrix that contains
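    As a toy illustration of solving the normal equation G^T G v = G^T d, here is a conjugate-gradient solver that touches G only through forward and adjoint products, the way a demigration/migration operator pair is used in practice (CG stands in here for the quasi-Newton solver of the abstract; the example matrix and "reflectivity" vector are our own):

```python
import numpy as np

def solve_normal_equations(G, d, n_iter=100, tol=1e-10):
    """Conjugate-gradient solve of G^T G v = G^T d, accessing G only
    through forward (G @ p) and adjoint (G.T @ r) products."""
    v = np.zeros(G.shape[1])
    r = G.T @ d                      # initial normal-equations residual (v = 0)
    p = r.copy()
    rs = r @ r
    for _ in range(n_iter):
        Gp = G.T @ (G @ p)           # one forward plus one adjoint application
        alpha = rs / (p @ Gp)
        v += alpha * p
        r -= alpha * Gp
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return v

# Tiny overdetermined example: recover a spike "reflectivity" vector
rng = np.random.default_rng(1)
G = rng.standard_normal((50, 10))
v_true = np.zeros(10)
v_true[3] = 1.0
d = G @ v_true
v_est = solve_normal_equations(G, d)
```

    Because G^T G is symmetric positive definite for a full-rank G, CG converges to the least-squares minimizer; in seismic imaging the same iteration deblurs the adjoint (migrated) image toward the true reflectivity.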

  6. Dataset on the mean, standard deviation, broad-sense heritability and stability of wheat quality bred in three different ways and grown under organic and low-input conventional systems

    Directory of Open Access Journals (Sweden)

    Marianna Rakszegi

    2016-06-01

    Full Text Available An assessment was previously made of the effects of organic and low-input field management systems on the physical, grain compositional and processing quality of wheat and on the performance of varieties developed using different breeding methods (“Comparison of quality parameters of wheat varieties with different breeding origin under organic and low-input conventional conditions” [1]. Here, accompanying data are provided on the performance and stability analysis of the genotypes using the coefficient of variation and the ‘ranking’ and ‘which-won-where’ plots of GGE biplot analysis for the most important quality traits. Broad-sense heritability was also evaluated and is given for the most important physical and quality properties of the seed in organic and low-input management systems, while mean values and standard deviation of the studied properties are presented separately for organic and low-input fields.

  7. Dataset on the mean, standard deviation, broad-sense heritability and stability of wheat quality bred in three different ways and grown under organic and low-input conventional systems.

    Science.gov (United States)

    Rakszegi, Marianna; Löschenberger, Franziska; Hiltbrunner, Jürg; Vida, Gyula; Mikó, Péter

    2016-06-01

    An assessment was previously made of the effects of organic and low-input field management systems on the physical, grain compositional and processing quality of wheat and on the performance of varieties developed using different breeding methods ("Comparison of quality parameters of wheat varieties with different breeding origin under organic and low-input conventional conditions" [1]). Here, accompanying data are provided on the performance and stability analysis of the genotypes using the coefficient of variation and the 'ranking' and 'which-won-where' plots of GGE biplot analysis for the most important quality traits. Broad-sense heritability was also evaluated and is given for the most important physical and quality properties of the seed in organic and low-input management systems, while mean values and standard deviation of the studied properties are presented separately for organic and low-input fields.

  8. Studies on seismic waves

    Institute of Scientific and Technical Information of China (English)

    张海明; 陈晓非

    2003-01-01

    The development of seismic wave studies in China over the past four years is reviewed. The discussion covers several aspects, including seismic wave propagation in laterally homogeneous media, laterally heterogeneous media, and anisotropic and porous media; surface waves and seismic wave inversion; and seismic wave studies in prospecting and logging problems. Important directions suggested for current seismic wave research are the development of highly efficient numerical methods and their application to the excitation and propagation of seismic waves in complex media and to strong ground motion, which will form a foundation for refined earthquake hazard analysis and prediction.

  9. Techniques for Surveying Urban Active Faults by Seismic Methods

    Institute of Scientific and Technical Information of China (English)

    Xu Mingcai; Gao Jinghua; Liu Jianxun; Rong Lixin

    2005-01-01

    Using the seismic method to detect active faults directly below cities is an irreplaceable prospecting technique. The seismic method can precisely determine fault positions, but by itself it can hardly determine the geological age of a fault. However, by combining borehole data with the standard geological cross-section of the surveyed area, the geological age of a reflected wave group can be qualitatively (or semi-quantitatively) determined from the seismic depth profile. To determine the upper terminal point of an active fault directly below a city, it is necessary to use the high-resolution seismic reflection technique. To effectively determine the geometry of deep faults, and especially the relation between deep and shallow fracture structures, the seismic reflection method is better suited than the seismic refraction method.

  10. Quantitative Seismic Amplitude Analysis

    NARCIS (Netherlands)

    Dey, A.K.

    2011-01-01

    The Seismic Value Chain quantifies the cyclic interaction between seismic acquisition, imaging and reservoir characterization. Modern seismic innovation to address the global imbalance in hydrocarbon supply and demand requires such cyclic interaction of both feed-forward and feed-back processes.

  12. Robotization in Seismic Acquisition

    NARCIS (Netherlands)

    Blacquière, G.; Berkhout, A.J.

    2013-01-01

    The amount of sources and detectors in the seismic method follows "Moore’s Law of seismic data acquisition", i.e., it increases approximately by a factor of 10 every 10 years. Therefore automation is unavoidable, leading to robotization of seismic data acquisition. Recently, we introduced a new

  13. Advanced Seismic While Drilling System

    Energy Technology Data Exchange (ETDEWEB)

    Robert Radtke; John Fontenot; David Glowka; Robert Stokes; Jeffery Sutherland; Ron Evans; Jim Musser

    2008-06-30

    A breakthrough has been discovered for controlling seismic sources to generate selectable low frequencies. Conventional seismic sources, including sparkers, rotary mechanical sources, hydraulic sources, air guns, and explosives, by their very nature produce high frequencies, which runs counter to the need for long signal transmission through rock. The patent-pending SeismicPULSER{trademark} methodology has been developed for controlling otherwise high-frequency seismic sources to generate selectable low-frequency peak spectra applicable to many seismic applications. Specifically, we have demonstrated the application of a low-frequency sparker source which can be incorporated into a drill bit for Drill Bit Seismic While Drilling (SWD). To create the methodology of a controllable low-frequency sparker seismic source, it was necessary to learn how to maximize sparker efficiencies to couple to, and transmit through, rock through the study of sparker designs and mechanisms for (a) coupling the sparker-generated gas bubble expansion and contraction to the rock, (b) the effects of fluid properties and dynamics, (c) linear and non-linear acoustics, and (d) imparted force directionality. After extensive seismic modeling, the design of high-efficiency sparkers, laboratory high-frequency sparker testing, and field tests were performed at the University of Texas Devine seismic test site. The conclusion of the field test was that extremely high power levels would be required to achieve the range needed for deep (15,000+ ft), high-temperature, high-pressure (HTHP) wells. Thereafter, more modeling and laboratory testing led to the discovery of a method to control a sparker so that it could generate the low frequencies required for deep wells. The low-frequency sparker was successfully tested at the Department of Energy Rocky Mountain Oilfield Test Center (DOE RMOTC) field test site in Casper, Wyoming. An 8-in diameter by 26-ft long SeismicPULSER{trademark} drill string tool was designed and manufactured by TII

  14. Seismic spatial wavefield gradient and rotational rate measurements as new observables in land seismic exploration

    Science.gov (United States)

    Schmelzbach, Cedric; Sollberger, David; Van Renterghem, Cédéric; Häusler, Mauro; Robertsson, Johan; Greenhalgh, Stewart

    2016-04-01

    Traditionally, land-seismic data acquisition is conducted using vertical-component sensors. A more complete representation of the seismic wavefield can be obtained by employing multicomponent sensors recording the full vector wavefield. If groups of multicomponent sensors are deployed, then spatial seismic wavefield gradients and rotational rates can be estimated by differencing the outputs of closely spaced sensors. Such data capture all six degrees of freedom of a rigid body (three components of translation and three components of rotation), and hence allow an even more complete representation of the seismic wavefield compared to single station triaxial data. Seismic gradient and rotation data open up new possibilities to process land-seismic data. Potential benefits and applications of wavefield gradient data include local slowness estimation, improved arrival identification, wavefield separation and noise suppression. Using synthetic and field data, we explored the reliability and sensitivity of various multicomponent sensor layouts to estimate seismic wavefield gradients and rotational rates. Due to the wavelength and incidence-angle dependence of sensor-group reception patterns as a function of the number of sensors, station spacing and layout, one has to counterbalance the impacts of truncation errors, random noise attenuation, and sensitivity to perturbations such as amplitude variations and positioning errors when searching for optimum receiver configurations. Field experiments with special rotational rate sensors were used to verify array-based rotational-rate estimates. Seismic wavefield gradient estimates and inferred wavefield attributes such as instantaneous slowness enable improved arrival identification, e.g. wave type and path. Under favorable conditions, seismic-wavefield gradient attributes can be extracted from conventional vertical-component data and used to, for example, enhance the identification of shear waves. A further promising
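The local slowness estimation mentioned above can be illustrated on a synthetic plane wave: differencing two closely spaced sensors gives the spatial wavefield gradient, and for u(x, t) = s(t - p*x) the identity du/dx = -p * du/dt recovers the local slowness p. All numbers below (wavelet, sensor spacing, slowness) are hypothetical.

```python
import numpy as np

p_true = 2.0e-4        # slowness [s/m], hypothetical
dx = 5.0               # sensor spacing [m], hypothetical
dt = 1.0e-3            # sample interval [s]
t = np.arange(0.0, 1.0, dt)

def ricker(t, f0=25.0, t0=0.3):
    """Ricker wavelet with peak frequency f0 centered at t0."""
    a = (np.pi * f0 * (t - t0)) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Plane wave u(x, t) = s(t - p*x) recorded at x = -dx/2 and x = +dx/2
u_left = ricker(t - p_true * (-dx / 2))
u_right = ricker(t - p_true * (+dx / 2))

du_dx = (u_right - u_left) / dx          # spatial gradient by differencing
u_mid = 0.5 * (u_left + u_right)         # wavefield at the array center
du_dt = np.gradient(u_mid, dt)           # temporal derivative

# Least-squares fit of du/dx = -p * du/dt over the whole trace
p_est = -np.sum(du_dx * du_dt) / np.sum(du_dt * du_dt)
print(f"true slowness {p_true:.2e}  estimated {p_est:.2e}")
```

The same differencing idea, applied across more sensors and components, yields the rotational-rate estimates discussed in the abstract; truncation error grows with station spacing relative to wavelength, which is the trade-off the authors analyze.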

  15. Angola Seismicity MAP

    Science.gov (United States)

    Neto, F. A. P.; Franca, G.

    2014-12-01

    The purpose of this work was to study and document the natural seismicity of Angola and to establish the first seismic database for the country, facilitating consultation and searches for information on seismic activity. The study was based on reports produced by the National Institute of Meteorology and Geophysics (INAMET) from 1968 to 2014, with emphasis on the work of Moreira (1968), who defined six seismogenic zones from macroseismic data. The most important is the Sá da Bandeira (Lubango)-Chibemba-Oncócua-Iona zone, which covers the epicentral Quihita and Iona regions and is geologically characterized by a transcontinental tectono-magmatic structure activated in the Mesozoic, with the emplacement of a wide variety of intrusive rocks of ultrabasic-alkaline, basic and alkaline composition, kimberlites and carbonatites, strongly marked by intense tectonism and cut by several faults and fractures (locally called the Lucapa corridor). The earthquake of May 9, 1948 reached intensity VI on the Mercalli-Sieberg scale (MCS) in the locality of Quihita, and in the seismically active Iona zone the main shock of January 15, 1964 reached grade VI-VII. The other five zones, whose lower seismicity rates nevertheless cannot be neglected, are: Cassongue-Ganda-Massano de Amorim; Lola-Quilengues-Caluquembe; Gago Coutinho; Cuima-Cachingues-Cambândua; and the Upper Zambezi zone. We also analyzed technical reports on the seismicity of the middle Kwanza region produced by Hidroproekt (GAMEK), as well as international seismic bulletins of the International Seismological Centre (ISC) and the United States Geological Survey (USGS); these data served for instrumental location of the epicenters. All compiled information made possible the creation of the first seismic database for Angola and the preparation of the seismicity map, with reconfirmation of the main seismic zones defined by Moreira (1968) and the identification of a new seismic

  16. DISPLACEMENT BASED SEISMIC DESIGN CRITERIA

    Energy Technology Data Exchange (ETDEWEB)

    HOFMAYER,C.H.

    1999-03-29

    The USNRC has initiated a project to determine if any of the likely revisions to traditional earthquake engineering practice are relevant to seismic design of the specialized structures, systems and components of nuclear power plants and of such significance to suggest that a change in design practice might be warranted. As part of the initial phase of this study, a literature survey was conducted on the recent changes in seismic design codes/standards, on-going activities of code-writing organizations/communities, and published documents on displacement-based design methods. This paper provides a summary of recent changes in building codes and on-going activities for future codes. It also discusses some technical issues for further consideration.

  17. Displacement Based Seismic Design Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Costello, J.F.; Hofmayer, C.; Park, Y.J.

    1999-03-29

    The USNRC has initiated a project to determine if any of the likely revisions to traditional earthquake engineering practice are relevant to seismic design of the specialized structures, systems and components of nuclear power plants and of such significance to suggest that a change in design practice might be warranted. As part of the initial phase of this study, a literature survey was conducted on the recent changes in seismic design codes/standards, on-going activities of code-writing organizations/communities, and published documents on displacement-based design methods. This paper provides a summary of recent changes in building codes and on-going activities for future codes. It also discusses some technical issues for further consideration.

  18. Pliocene paleoenvironment evolution as interpreted from 3D-seismic data in the southern North Sea, Dutch offshore sector

    NARCIS (Netherlands)

    Kuhlmann, G.; Wong, T.E.

    2008-01-01

    A high-resolution 3D-seismic survey from the Dutch offshore sector has been interpreted and subsequently correlated with existing regional seismo-stratigraphic concepts derived from conventional 2D-seismic data sets. The interpreted 13 seismic units have been related to a newly established chrono-st

  19. Seismic base isolation technologies for Korea advanced liquid metal reactor

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, B.; Lee, J.-H.; Koo, G.-H.; Lee, H.-Y.; Kim, J.-B. [Korea Atomic Energy Research Inst., Taejon (Korea, Republic of)

    2000-06-01

    This paper describes the status and prospects of the seismic base isolation technologies for Korea Advanced Liquid Metal Reactor (KALIMER). The research and development program on the seismic base isolation for KALIMER began in 1993 by KAERI under the national long-term R and D program. The objective of this program is to enhance the seismic safety, to accomplish the economic design, and to standardize the plant design through the establishment of technologies on seismic base isolation for liquid metal reactors. In this paper, tests and analyses performed in the program are presented. (orig.)

  20. Seismic explosion sources on an ice cap

    DEFF Research Database (Denmark)

    Shulgin, Alexey; Thybo, Hans

    2015-01-01

    Controlled source seismic investigation of crustal structure below ice covers is an emerging technique. We have recently conducted an explosive refraction/wide-angle reflection seismic experiment on the ice cap in east-central Greenland. The data-quality is high for all shot points and a full...... crustal model can be modelled. A crucial challenge for applying the technique is to control the sources. Here, we present data that describe the efficiency of explosive sources in the ice cover. Analysis of the data shows, that the ice cap traps a significant amount of energy, which is observed...... as a strong ice wave. The ice cap leads to low transmission of energy into the crust such that charges need be larger than in conventional onshore experiments to obtain reliable seismic signals. The strong reflection coefficient at the base of the ice generates strong multiples which may mask for secondary...

  1. Automating Shallow Seismic Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Steeples, Don W.

    2004-12-09

    analogs. Near-surface seismology is in the vanguard of non-intrusive approaches to increase knowledge of the shallow subsurface; our work is a significant departure from conventional seismic-survey field procedures.

  2. Simplified seismic risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pellissetti, Manuel; Klapp, Ulrich [AREVA NP GmbH, Erlangen (Germany)

    2011-07-01

    Within the context of probabilistic safety analysis (PSA) for nuclear power plants (NPP's), seismic risk assessment has the purpose to demonstrate that the contribution of seismic events to overall risk is not excessive. The most suitable vehicle for seismic risk assessment is a full scope seismic PSA (SPSA), in which the frequency of core damage due to seismic events is estimated. An alternative method is represented by seismic margin assessment (SMA), which aims at showing sufficient margin between the site-specific safe shutdown earthquake (SSE) and the actual capacity of the plant. Both methods are based on system analysis (fault-trees and event-trees) and hence require fragility estimates for safety relevant systems, structures and components (SSC's). If the seismic conditions at a specific site of a plant are not very demanding, then it is reasonable to expect that the risk due to seismic events is low. In such cases, the cost-benefit ratio for performing a full scale, site-specific SPSA or SMA will be excessive, considering the ultimate objective of seismic risk analysis. Rather, it will be more rational to rely on a less comprehensive analysis, used as a basis for demonstrating that the risk due to seismic events is not excessive. The present paper addresses such a simplified approach to seismic risk assessment which is used in AREVA to: - estimate seismic risk in early design stages, - identify needs to extend the design basis, - define a reasonable level of seismic risk analysis Starting from a conservative estimate of the overall plant capacity, in terms of the HCLPF (High Confidence of Low Probability of Failure), and utilizing a generic value for the variability, the seismic risk is estimated by convolution of the hazard and the fragility curve. Critical importance is attached to the selection of the plant capacity in terms of the HCLPF, without performing extensive fragility calculations of seismically relevant SSC's. A suitable basis
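The convolution of the hazard and the fragility curve described above can be sketched as follows. The hazard-curve parameters, HCLPF value and composite variability below are hypothetical placeholders, not values from the paper; the anchoring relation HCLPF = Am * exp(-2.326 * beta_c) is the common composite-lognormal convention (about 1% failure probability).

```python
import numpy as np
from math import erf, sqrt, exp

# Lognormal fragility anchored at a plant-level HCLPF (hypothetical values)
hclpf = 0.5                 # [g] High Confidence of Low Probability of Failure
beta_c = 0.4                # generic composite variability (assumption)
Am = hclpf * exp(2.326 * beta_c)   # median capacity implied by the HCLPF

# Power-law hazard curve H(a) = k * a^(-m): annual frequency of exceeding
# peak ground acceleration a (hypothetical site parameters)
k, m = 1e-4, 2.5
a = np.linspace(0.01, 3.0, 3000)
H = k * a ** (-m)

# P(failure | PGA = a): standard normal CDF of log(a/Am)/beta_c
fragility = np.array([0.5 * (1.0 + erf(x / sqrt(2.0)))
                      for x in np.log(a / Am) / beta_c])

# Annual failure frequency: integrate fragility against the hazard density
dH_da = np.gradient(H, a)
annual_pf = float(np.sum(fragility * (-dH_da)) * (a[1] - a[0]))
print(f"annual failure frequency ~ {annual_pf:.2e}")
```

This is exactly the simplification the abstract describes: a single plant-level capacity and a generic variability replace component-by-component fragility calculations, at the cost of conservatism.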

  3. Multi-waveform classification for seismic facies analysis

    Science.gov (United States)

    Song, Chengyun; Liu, Zhining; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-04-01

    Seismic facies analysis provides an effective way to delineate the heterogeneity and compartments within a reservoir. The traditional method classifies seismic facies using single waveforms, which ignores stratigraphic continuity, so the final facies map may be affected by noise. Therefore, by defining the waveforms in a 3D window as a multi-waveform, we developed a new seismic facies analysis algorithm, multi-waveform classification (MWFC), that combines multilinear subspace learning with self-organizing map (SOM) clustering techniques. In addition, we utilize a multi-window dip search algorithm to extract multi-waveforms, which reduces the uncertainty of facies maps at the boundaries. Testing the proposed method on synthetic data with different S/N, we confirm that our MWFC approach is more robust to noise than the conventional waveform classification (WFC) method. Application to real seismic data from the F3 block in the Netherlands proves our approach is an effective tool for seismic facies analysis.

  4. Seismic Catalogue and Seismic Network in Haiti

    Science.gov (United States)

    Belizaire, D.; Benito, B.; Carreño, E.; Meneses, C.; Huerfano, V.; Polanco, E.; McCormack, D.

    2013-05-01

    The destructive earthquake that occurred on January 10, 2010 in Haiti highlighted the country's lack of preparedness to address seismic phenomena. At the moment of the earthquake there was no seismic network operating in the country, and only partial knowledge of past seismicity was available, due to the absence of a national catalogue. After the 2010 earthquake, advances began towards the installation of a national network and the elaboration of a seismic catalogue providing the necessary input for seismic hazard studies. This paper presents the state of the work carried out on both aspects. First, a seismic catalogue has been built, compiling data on historical and instrumental events in the Hispaniola Island and surroundings, within the frame of the SISMO-HAITI project, supported by the Technical University of Madrid (UPM) and developed in cooperation with the Observatoire National de l'Environnement et de la Vulnérabilité of Haiti (ONEV). Data from different agencies all over the world were gathered, with a particularly relevant role played by the Dominican Republic and Puerto Rico seismological services, which provided local data from their national networks. Almost 30000 events recorded in the area from 1551 to 2011 were compiled in a first catalogue, among them 7700 events with Mw ranging between 4.0 and 8.3. Since different magnitude scales were used by the different agencies (Ms, mb, MD, ML), this first catalogue was affected by important heterogeneity in the size parameter; it was therefore homogenized to moment magnitude Mw using the empirical equations developed by Bonzoni et al (2011) for the eastern Caribbean. At present this is the most exhaustive catalogue of the country, although it is difficult to assess its degree of completeness. Regarding the seismic network, 3 stations were installed just after the 2010 earthquake by the Canadian Government, with data sent by telemetry through the Canadian system CARINA. In 2012, the Spanish IGN together

  5. Seismic microzonation of Bangalore, India

    Indian Academy of Sciences (India)

    P Anbazhagan; T G Sitharam

    2008-11-01

    In the present study, an attempt has been made to evaluate the seismic hazard considering local site effects by carrying out detailed geotechnical and geophysical site characterization in Bangalore, India to develop microzonation maps. An area of 220 km2, encompassing Bangalore Mahanagara Palike (BMP) has been chosen as the study area. Seismic hazard analysis and microzonation of Bangalore are addressed in three parts: in the first part, estimation of seismic hazard is done using seismotectonic and geological information. Second part deals with site characterization using geotechnical and shallow geophysical techniques. In the last part, local site effects are assessed by carrying out one-dimensional (1-D) ground response analysis (using the program SHAKE 2000) using both standard penetration test (SPT) data and shear wave velocity data from multichannel analysis of surface wave (MASW) survey. Further, field experiments using microtremor studies have also been carried out for evaluation of predominant frequency of the soil columns. The same has been assessed using 1-D ground response analysis and compared with microtremor results. Further, the Seed and Idriss simplified approach has been adopted to evaluate the soil liquefaction susceptibility and liquefaction resistance assessment. Microzonation maps have been prepared with a scale of 1:20,000. The detailed methodology, along with experimental details, collated data, results and maps are presented in this paper.
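The Seed and Idriss simplified approach referenced above evaluates liquefaction demand through the cyclic stress ratio (CSR); a minimal sketch with hypothetical soil-column numbers follows (the depth-reduction factor rd uses the common linear approximation for shallow depths).

```python
def cyclic_stress_ratio(a_max_g, sigma_v, sigma_v_eff, depth_m):
    """Seed-Idriss simplified CSR = 0.65 * (a_max/g) * (sigma_v/sigma'_v) * rd."""
    if depth_m <= 9.15:
        rd = 1.0 - 0.00765 * depth_m          # depth reduction factor
    else:
        rd = 1.174 - 0.0267 * depth_m         # valid to roughly 23 m
    return 0.65 * a_max_g * (sigma_v / sigma_v_eff) * rd

# Hypothetical soil column: 6 m depth, unit weight 18 kN/m^3,
# water table at 2 m, peak ground acceleration 0.16 g
sigma_v = 18.0 * 6.0                  # total vertical stress [kPa]
sigma_v_eff = sigma_v - 9.81 * 4.0    # effective stress (4 m below water table)
csr = cyclic_stress_ratio(0.16, sigma_v, sigma_v_eff, 6.0)
print(f"CSR = {csr:.3f}")
```

Comparing this demand against the cyclic resistance ratio inferred from SPT blow counts or shear-wave velocity gives the liquefaction factor of safety used in the microzonation maps.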

  6. Seismic isolation of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Whittaker, Andrew S.; Kuma, Manish [Dept. of Civil, Structural and Environmental Engineering, State University of New York, Buffalo (United States)

    2014-10-15

    Seismic isolation is a viable strategy for protecting safety-related nuclear structures from the effects of moderate to severe earthquake shaking. Although seismic isolation has been deployed in nuclear structures in France and South Africa, it has not seen widespread use because of limited new build nuclear construction in the past 30 years and a lack of guidelines, codes and standards for the analysis, design and construction of isolation systems specific to nuclear structures. The funding by the United States Nuclear Regulatory Commission of a research project to the Lawrence Berkeley National Laboratory and MCEER/University at Buffalo facilitated the writing of a soon-to-be-published NUREG on seismic isolation. Funding of MCEER by the National Science Foundation led to research products that provide the technical basis for a new section in ASCE Standard 4 on the seismic isolation of safety-related nuclear facilities. The performance expectations identified in the NUREG and ASCE 4 for seismic isolation systems, and superstructures and substructures are described in the paper. Robust numerical models capable of capturing isolator behaviors under extreme loadings, which have been verified and validated following ASME protocols, and implemented in the open source code OpenSees, are introduced.

  7. Imaging seismic reflections

    NARCIS (Netherlands)

    Op 't Root, Timotheus Johannes Petrus Maria

    2011-01-01

    The goal of reflection seismic imaging is making images of the Earth subsurface using surface measurements of reflected seismic waves. Besides the position and orientation of subsurface reflecting interfaces it is a challenge to recover the size or amplitude of the discontinuities. We investigate tw

  8. Seismic Design Guidelines For Port Structures

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Bernal, Alberto; Blazquez, Rafael

    In order to mitigate hazards and losses due to earthquakes, seismic design methodologies have been developed and implemented in design practice in many regions since the early twentieth century, often in the form of codes and standards. Most of these methodologies are based on a force-balance app...

  9. The Hague Judgments Convention

    DEFF Research Database (Denmark)

    Nielsen, Peter Arnt

    2011-01-01

    The Hague Judgments Convention of 2005 is the first global convention on international jurisdiction and recognition and enforcement of judgments in civil and commercial matters. The author explains the political and legal background of the Convention, its content and certain crucial issues during...

  10. SOAR Telescope seismic performance II: seismic mitigation

    Science.gov (United States)

    Elias, Jonathan H.; Muñoz, Freddy; Warner, Michael; Rivera, Rossano; Martínez, Manuel

    2016-07-01

    We describe design modifications to the SOAR telescope intended to reduce the impact of future major earthquakes, based on the facility's experience during recent events, most notably the September 2015 Illapel earthquake. Specific modifications include a redesign of the encoder systems for both azimuth and elevation, seismic trigger for the emergency stop system, and additional protections for the telescope secondary mirror system. The secondary mirror protection may combine measures to reduce amplification of seismic vibration and "fail-safe" components within the assembly. The status of these upgrades is presented.

  11. Seismic hazard assessment of Syria using seismicity, DEM, slope, active tectonic and GIS

    Science.gov (United States)

    Ahmad, Raed; Adris, Ahmad; Singh, Ramesh

    2016-07-01

    In the present work, we discuss the use of integrated remote sensing and Geographical Information System (GIS) techniques for the evaluation of seismic hazard areas in Syria. The present study is the first effort to create a seismic hazard map of the country with the help of GIS. In the proposed approach, we have used Aster satellite data, digital elevation data (30 m resolution), earthquake data, and active tectonic maps. Important factors for the evaluation of seismic hazard were identified and corresponding thematic data layers (past earthquake epicenters, active faults, digital elevation model, and slope) were generated. A numerical rating scheme was developed for spatial data analysis using GIS to rank the parameters included in the evaluation of seismic hazard. The resulting earthquake potential map delineates the area into relative susceptibility classes: high, moderate, low and very low. The potential earthquake map was validated by correlating the obtained classes with the local probabilities produced using conventional analysis of observed earthquakes. Earthquake data for Syria and peak ground acceleration (PGA) data were then introduced into the model to develop the final seismic hazard map, based on Gutenberg-Richter parameters (a and b values) and the concepts of local probability and recurrence time. Application of the proposed technique in the Syrian region indicates that this method provides a good estimate of the seismic hazard map compared to those developed with traditional techniques (deterministic (DSHA) and probabilistic (PSHA) seismic hazard analysis). For the first time, numerous parameters derived from remote sensing and GIS have been used in the preparation of a seismic hazard map, which is found to be very realistic.
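The Gutenberg-Richter b-value fed into hazard models like the one above is commonly estimated with Aki's maximum-likelihood formula; a sketch on a synthetic catalogue (all values hypothetical, and the magnitude-binning correction is omitted for brevity):

```python
import numpy as np

def gutenberg_richter_b(mags, m_c):
    """Aki (1965) maximum-likelihood b-value for a catalogue complete
    above magnitude m_c: b = log10(e) / (mean(M) - m_c)."""
    m = np.asarray(mags)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c), m.size

# Synthetic catalogue: above m_c, G-R magnitudes are exponentially
# distributed with rate beta = b * ln(10)
rng = np.random.default_rng(42)
m_c = 4.0
b_true = 1.0
mags = m_c + rng.exponential(1.0 / (b_true * np.log(10)), size=20000)

b_est, n = gutenberg_richter_b(mags, m_c)
print(f"b ~ {b_est:.2f} from {n} events")
```

The a-value then follows from the total event count and observation period, and together they fix the recurrence times used in the hazard map.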

  12. Swept Impact Seismic Technique (SIST)

    Science.gov (United States)

    Park, C.B.; Miller, R.D.; Steeples, D.W.; Black, R.A.

    1996-01-01

    A coded seismic technique is developed that can result in a higher signal-to-noise ratio than a conventional single-pulse method does. The technique is cost-effective and time-efficient and therefore well suited for shallow-reflection surveys where high resolution and cost-effectiveness are critical. A low-power impact source transmits a few to several hundred high-frequency broad-band seismic pulses during several seconds of recording time according to a deterministic coding scheme. The coding scheme consists of a time-encoded impact sequence in which the rate of impact (cycles/s) changes linearly with time providing a broad range of impact rates. Impact times used during the decoding process are recorded on one channel of the seismograph. The coding concept combines the vibroseis swept-frequency and the Mini-Sosie random impact concepts. The swept-frequency concept greatly improves the suppression of correlation noise with much fewer impacts than normally used in the Mini-Sosie technique. The impact concept makes the technique simple and efficient in generating high-resolution seismic data especially in the presence of noise. The transfer function of the impact sequence simulates a low-cut filter with the cutoff frequency the same as the lowest impact rate. This property can be used to attenuate low-frequency ground-roll noise without using an analog low-cut filter or a spatial source (or receiver) array as is necessary with a conventional single-pulse method. Because of the discontinuous coding scheme, the decoding process is accomplished by a "shift-and-stacking" method that is much simpler and quicker than cross-correlation. The simplicity of the coding allows the mechanical design of the source to remain simple. Several different types of mechanical systems could be adapted to generate a linear impact sweep. In addition, the simplicity of the coding also allows the technique to be used with conventional acquisition systems, with only minor modifications.
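The shift-and-stacking decoding described above can be sketched directly: windows of the long coded record are aligned on the known impact times and averaged, so the coherent earth response stacks while random noise cancels roughly as the square root of the number of impacts. The reflectivity, impact sweep and noise level below are toy values, not SIST field parameters.

```python
import numpy as np

def shift_and_stack(trace, impact_samples, out_len):
    """Decode a coded-impact record by aligning a window after each
    known impact time and stacking."""
    out = np.zeros(out_len)
    for s in impact_samples:
        out += trace[s:s + out_len]
    return out / len(impact_samples)

rng = np.random.default_rng(1)
reflectivity = np.zeros(200)
reflectivity[[40, 90, 150]] = [1.0, -0.6, 0.4]     # toy earth response

# Impact intervals shrinking linearly: a miniature linear "impact sweep"
impacts = np.cumsum(np.linspace(400, 250, 60)).astype(int)
n = impacts[-1] + 200
trace = np.zeros(n)
for s in impacts:                                  # encode: superpose responses
    trace[s:s + 200] += reflectivity
trace += rng.standard_normal(n) * 0.5              # strong ambient noise

decoded = shift_and_stack(trace, impacts, 200)
print(np.argmax(decoded))                          # strongest reflector sample
```

Because each shift uses recorded impact times, decoding needs no cross-correlation, which is the simplicity advantage the abstract emphasizes.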

  13. Diagnostic accuracy of a standardized scheme for identification of Streptococcus uberis in quarter milk samples: A comparison between conventional bacteriological examination, modified Rambach agar medium culturing, and 16S rRNA gene sequencing.

    Science.gov (United States)

    Wald, Regina; Baumgartner, Martina; Urbantke, Verena; Stessl, Beatrix; Wittek, Thomas

    2017-02-01

    Bacteriological examination of milk samples is a prerequisite for pathogen-specific therapy and aids in limiting antimicrobial resistance. The aims of this study were to establish a standardized scheme for reliable Streptococcus uberis identification in routine diagnosis and to evaluate the accuracy of conventional tests and growing patterns of Strep. uberis on a selective medium (modified Rambach agar medium, MRAM) using 16S rRNA gene sequencing analysis as a reference method. We obtained isolates of presumptive Strep. uberis (n = 336) from quarter milk samples of dairy cows with intramammary infections and classified the isolates into 2 clusters using biochemical characterization. In cluster 1 (n = 280), cocci grew as non-hemolytic colonies, hydrolyzing esculin, carrying no Lancefield antigen (A/B/C/D/G) or Christie Atkins Munch-Petersen factor, and their growth was inhibited on an Enterococcus agar. Production of β-d-galactosidase on MRAM was shown by 257 of the cluster 1 isolates (91.79%), and 16S rRNA gene sequencing verified 271 (96.79%) of the isolates to be Strep. uberis. In 264 isolates (94.29%), MRAM agreed with the sequencing results. In cluster 2 (n = 56), isolates showed different characteristics: 37 (66.07%) were β-d-galactosidase-positive, and based on 16S sequencing results, 36 (64.29%) were identified correctly as Strep. uberis using biochemical methods. Identification success in this group differed significantly between routine diagnosis and MRAM application: MRAM agreed with sequencing results in 47 isolates (83.93%). To identify Strep. uberis and differentiate it from other lactic acid bacteria in routine diagnosis, we suggest using catalase reaction, hemolysis, esculin hydrolysis, and growth on enterococci agar. Isolates that show a typical biochemical profile can be identified satisfactorily with these tests. For Strep. uberis isolates with divergent patterns, application of MRAM as a follow-up test increased the diagnostic accuracy to 94
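The agreement figures quoted in the abstract are simple proportions of the two cluster sizes (n = 280 and n = 56); a quick arithmetic check reproduces them:

```python
# (count, cluster size, percentage quoted in the abstract)
checks = [
    (257, 280, 91.79),   # MRAM beta-D-galactosidase positive, cluster 1
    (271, 280, 96.79),   # 16S-confirmed Strep. uberis, cluster 1
    (264, 280, 94.29),   # MRAM agreement with sequencing, cluster 1
    (37, 56, 66.07),     # beta-D-galactosidase positive, cluster 2
    (36, 56, 64.29),     # correctly identified biochemically, cluster 2
    (47, 56, 83.93),     # MRAM agreement with sequencing, cluster 2
]
for num, den, pct in checks:
    assert round(100 * num / den, 2) == pct
print("all quoted percentages consistent with the counts")
```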

  14. Functional and numerical responses of ovenbirds (Seiurus aurocapilla) to changing seismic exploration practices in Alberta's boreal forest

    Energy Technology Data Exchange (ETDEWEB)

    Bayne, E.M.; Boutin, S.; Tracz, B.; Charest, K. [Alberta Univ., Edmonton, AB (Canada). Dept. of Biological Sciences

    2005-07-01

    Seismic lines created by energy developers are a major source of anthropogenic fragmentation in boreal forests. Conventional seismic lines are traditionally made by a bulldozer that creates a series of 8 metre-wide linear features on a loose grid system, with a typical spacing of 300 to 500 metres between seismic lines. Concerns over regeneration rates and impacts on wildlife have led some companies to create significantly narrower seismic lines, 2 to 3 metres wide. This paper presented the results of a study assessing the impact of narrower seismic lines on wildlife. The functional and numerical response of male ovenbirds (Seiurus aurocapilla) to both conventional and low-impact seismic lines in mature aspen forest in northeastern Alberta was investigated. Information on the territorial behaviour of the birds was collected using radio-telemetry. Spot-mapping was used to determine which individual birds were closest to the seismic line; to locate birds using unsolicited singing behaviour prior to capture; and to capture singing birds within 10 metres of the edge of the line. The singing locations of all males within the plots were also recorded. In total, 12 plots were mapped: 4 had no seismic lines, 4 had a conventional seismic line, and 4 had approximately 5 low-impact lines. Results showed that the ovenbirds perceived conventional seismic lines as creating a gap in the forest and used them as territory boundaries. However, the ovenbirds incorporated low-impact seismic lines within their territories. Spot-mapping data suggested no differences in ovenbird density among stands with a single conventional seismic line, multiple low-impact lines, or reference plots with no seismic lines. It was concluded that energy companies should consider using low-impact approaches in seismic operations to minimize ecological risks. 26 refs., 3 figs.

  15. New national seismic zoning map of China

    Institute of Scientific and Technical Information of China (English)

    高孟潭

    2003-01-01

    A new set of seismic zoning maps was published on August 1, 2001. It includes two maps: one is the seismic zoning map of peak acceleration, and the other is the zoning map of the characteristic period of the response spectrum. The exceedance probability of the maps is 10% within 50 years. The scale of the maps is 1:4 000 000. These maps serve as the national standard. The background of this project, the technical approach and key scientific measures, the basic features of the maps, and their application are introduced in this paper.

  16. Considerations on seismic microzonation in areas with two-dimensional hills

    Indian Academy of Sciences (India)

    Mohsen Kamalian; Abdollah Sohrabi-Bidar; Arash Razmkhah; Amirata Taghavi; Iraj Rahmani

    2008-11-01

    This paper presents the results of an extensive numerical parametric study on the seismic behavior of 2D homogeneous hills subjected to vertically propagating incident SV waves. It is shown that the amplification potential of these hills is strongly influenced by the wavelength, the shape ratio, and the shape of the hill, and, to a lesser degree, by the Poisson ratio of the medium. The 2D topography effect can be ignored only if the hill has a shape ratio of less than 0.1 or if it is subjected to incident waves with predominant dimensionless periods greater than 13 times the shape ratio. For incident waves with wavelengths longer than the width of the hill, the amplification curve usually reaches its maximum at the crest and decreases towards the base of the hill. Otherwise, some de-amplification zones occur along the hill. Among hills with similar shape ratios, those with intermediate cross-section areas also show intermediate seismic behavior. Estimated seismic site coefficients for the crest of a 2D rocky hill depend on its shape ratio and can reach 1.7, which encourages one to classify it, according to standard site categorization procedures, as soil profile type SC or SD instead of the conventional SB type.
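The ignorability rule quoted above reduces to a one-line check. The function name and argument convention below are ours, not the authors' (a sketch of the stated thresholds):

```python
def topography_effect_negligible(shape_ratio: float, dimensionless_period: float) -> bool:
    """Apply the rule quoted above: 2D topography effects can be ignored
    for very flat hills (shape ratio < 0.1) or for incident waves whose
    predominant dimensionless period exceeds 13 times the shape ratio."""
    return shape_ratio < 0.1 or dimensionless_period > 13.0 * shape_ratio
```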

  17. A seismic design of nuclear reactor building structures applying seismic isolation system in a seismicity region-a feasibility case study in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Kubo, Tetsuo [The University of Tokyo, Tokyo (Japan); Yamamoto, Tomofumi; Sato, Kunihiko [Mitsubishi Heavy Industries, Ltd., Kobe (Japan); Jimbo, Masakazu [Toshiba Corporation, Yokohama (Japan); Imaoka, Tetsuo [Hitachi-GE Nuclear Energy, Ltd., Hitachi (Japan); Umeki, Yoshito [Chubu Electric Power Co. Inc., Nagoya (Japan)

    2014-10-15

    A feasibility study on the seismic design of nuclear reactor buildings with a seismic isolation system is introduced. After the 1995 Hyogo-ken Nanbu earthquake in Japan, seismic isolation technologies have been widely employed for commercial buildings. Having become a mature technology, seismic isolation systems can be applied to NPP facilities in areas of high seismicity. Two reactor buildings, representing the PWR and BWR buildings in Japan, are discussed, along with the application of seismic isolation systems. An isolation system employing lead-rubber bearings (LRB) is examined. Through a series of seismic response analyses using the so-called standard design earthquake motions, which cover the design-basis earthquake motions obtained for NPP sites in Japan, the responses of the seismically isolated reactor buildings are evaluated. It is revealed that, for the building structures examined herein: (1) the responses of both the isolated buildings and the isolating LRBs fulfill the specified design criteria; (2) the responses of the isolating LRBs first reach the ultimate condition when the intensity of motion is 2.0 to 2.5 times that of the design basis; and (3) the responses of the isolated reactor buildings fall below the prescribed criteria.

  18. The Seismic Wavefield

    Science.gov (United States)

    Kennett, B. L. N.

    2002-12-01

    The two volumes of The Seismic Wavefield are a comprehensive guide to the understanding of seismograms in terms of physical propagation processes within the Earth. The focus is on the observation of earthquakes and man-made sources on all scales, for both body waves and surface waves. Volume I provides a general introduction and a development of the theoretical background for seismic waves. Volume II looks at the way in which observed seismograms relate to the propagation processes. Volume II also discusses local and regional seismic events, global wave propagation, and the three-dimensional Earth.

  19. Seismic migration in generalized coordinates

    Science.gov (United States)

    Arias, C.; Duque, L. F.

    2017-06-01

    Reverse time migration (RTM) is a technique widely used nowadays to obtain images of the Earth's subsurface using artificially produced seismic waves. The technique was developed for zones with a flat surface, and when applied to zones with rugged topography some corrections must be introduced to adapt it; these can produce defects in the final image called artifacts. We introduce a simple mathematical map that transforms a scenario with rugged topography into a flat one. The three steps of RTM can then be applied much as in the conventional case, simply by changing the Laplacian in the acoustic wave equation for a generalized one. We present a test of this technique using the Canadian Foothills SEG velocity model.
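The abstract does not give the map explicitly; as an illustration, one simple choice is the shear map that flattens a topography z = h(x). Under this assumed map, the chain rule turns the flat-surface Laplacian into a generalized operator containing cross and first-order terms:

```latex
% Assumed flattening map: \xi = x, \quad \eta = z - h(x).
\[
\frac{\partial^{2} p}{\partial x^{2}} + \frac{\partial^{2} p}{\partial z^{2}}
  = \frac{\partial^{2} p}{\partial \xi^{2}}
  - 2\,h'(\xi)\,\frac{\partial^{2} p}{\partial \xi\,\partial \eta}
  + \bigl(1 + h'(\xi)^{2}\bigr)\,\frac{\partial^{2} p}{\partial \eta^{2}}
  - h''(\xi)\,\frac{\partial p}{\partial \eta}
\]
```

Solving the acoustic wave equation with this generalized operator on the flat (ξ, η) domain is then equivalent to solving it on the original rugged domain.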

  20. Research on seismic stress triggering

    Institute of Scientific and Technical Information of China (English)

    万永革; 吴忠良; 周公威; 黄静; 秦立新

    2002-01-01

    This paper briefly reviews the basic theory of seismic stress triggering. Recent developments in seismic stress triggering are reviewed from the perspectives of static and dynamic stress triggering, the application of viscoelastic models, the relation between earthquake triggering and volcanic eruptions or explosions, and other explanations of earthquake triggering. Some suggestions for further study of seismic stress triggering in the near future are also given.

  1. Preliminary consideration on the seismic actions recorded during the 2016 Central Italy seismic sequence

    Science.gov (United States)

    Carlo Ponzo, Felice; Ditommaso, Rocco; Nigro, Antonella; Nigro, Domenico S.; Iacovino, Chiara

    2017-04-01

    After the Mw 6.0 mainshock of August 24, 2016 at 03:36 a.m. (local time), with the epicenter located between the towns of Accumoli (province of Rieti), Amatrice (province of Rieti) and Arquata del Tronto (province of Ascoli Piceno), several activities were started in order to perform preliminary evaluations of the characteristics of the recent seismic sequence in the areas affected by the earthquake. Ambient vibration acquisitions were performed using two three-directional velocimetric synchronized stations, with a natural frequency of 0.5 Hz and a digitizer resolution of 24 bit. The activities continued after the events of the seismic sequence of October 26 and October 30, 2016. In this paper, in order to compare recorded values with code provisions in terms of peak (PGA, PGV and PGD), spectral, and integral (Housner intensity) seismic parameters, several preliminary analyses were performed on accelerometric time histories acquired by three near-fault stations of the RAN (Italian Accelerometric Network): Amatrice (station code AMT), Norcia (station code NRC) and Castelsantangelo sul Nera (station code CNE). Several comparisons between the elastic response spectra derived from the accelerometric recordings and the elastic demand spectra provided by the Italian seismic code (NTC 2008) were performed. Preliminary results from these analyses highlight several apparent differences between the experimental data and the conventional code provisions. The ongoing seismic sequence appears compatible with the historical seismicity in terms of integral parameters, but not in terms of peak and spectral values. It seems appropriate to reconsider the necessity of revising the simplified design approach based on the conventional spectral values. Acknowledgements This study was partially funded by the Italian Department of Civil Protection within the project DPC-RELUIS 2016 - RS4 ''Seismic observatory of structures and
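The peak parameters compared above (PGA, PGV, PGD) can be obtained from an acceleration record by successive integration. The sketch below is a minimal illustration (the function name is ours; real processing would also apply baseline correction and filtering):

```python
import numpy as np

def peak_ground_params(acc, dt):
    """PGA, PGV, PGD from an acceleration record (assumed m/s^2, step dt s).
    Velocity and displacement are obtained by cumulative trapezoidal
    integration; a sketch without the baseline correction and filtering
    that real strong-motion processing requires."""
    vel = np.concatenate(([0.0], np.cumsum((acc[1:] + acc[:-1]) * 0.5 * dt)))
    disp = np.concatenate(([0.0], np.cumsum((vel[1:] + vel[:-1]) * 0.5 * dt)))
    return np.abs(acc).max(), np.abs(vel).max(), np.abs(disp).max()
```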

  2. Varieties of conventional implicature

    Directory of Open Access Journals (Sweden)

    Eric Scott McCready

    2010-07-01

    This paper provides a system capable of analyzing the combinatorics of a wide range of conventionally implicated and expressive constructions in natural language via an extension of Potts's (2005) L_CI logic for supplementary conventional implicatures. In particular, the system is capable of analyzing objects of mixed conventionally implicated/expressive and at-issue type, and objects with conventionally implicated or expressive meanings that provide the main content of their utterances. The logic is applied to a range of constructions and lexical items in several languages. doi:10.3765/sp.3.8

  3. Applying the seismic interferometry method to vertical seismic profile data using tunnel excavation noise as source

    Science.gov (United States)

    Jurado, Maria Jose; Teixido, Teresa; Martin, Elena; Segarra, Miguel; Segura, Carlos

    2013-04-01

    In the framework of research conducted to develop efficient strategies for investigating rock properties and fluids ahead of tunnel excavations, the seismic interferometry method was applied to analyze data acquired in boreholes instrumented with geophone strings. The results obtained confirmed that seismic interferometry provided improved resolution of petrophysical properties, allowing identification of heterogeneities and geological structures ahead of the excavation. These features are beyond the resolution of other conventional geophysical methods but can cause severe problems in the excavation of tunnels. Geophone strings were used to record different types of seismic noise generated at the tunnel head during excavation with a tunnelling machine and also during the placement of the rings covering the tunnel excavation. In this study we show how tunnel construction activities were characterized as a source of seismic signal and used as the seismic source for generating a 3D reflection seismic survey. The data were recorded in a vertical, water-filled borehole with a borehole seismic string at a distance of 60 m from the tunnel trace. A reference pilot signal was obtained from seismograms acquired close to the tunnel face in order to obtain the best signal-to-noise ratio for the interferometry processing (Poletto et al., 2010). The seismic interferometry method (Claerbout, 1968) was successfully applied to image the subsurface geological structure using the seismic wavefield generated by tunneling (tunnelling machine and construction activities) recorded with geophone strings. The technique was applied by simulating virtual shot records, one per receiver in the borehole, from the transmitted seismic events and processing the data as a reflection seismic survey. The pseudo-reflective wavefield was obtained by cross-correlation of the transmitted wave data. We applied the relationship between the transmission
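The core interferometric step described above, cross-correlating each geophone trace with the pilot signal to build virtual shot records, can be sketched as follows. This is an illustrative implementation, not the authors' processing code; the function name and array layout are assumptions:

```python
import numpy as np

def virtual_shot_gather(pilot, traces, dt):
    """Cross-correlate a pilot trace with each receiver trace to build
    a virtual shot gather (the basic interferometry step).

    pilot  : (nt,) reference signal recorded near the source (tunnel face)
    traces : (nrec, nt) geophone-string recordings
    dt     : sample interval in seconds
    Returns (nrec, 2*nt-1) correlograms and the lag axis in seconds.
    """
    nt = pilot.size
    nfft = 2 * nt - 1
    P = np.conj(np.fft.rfft(pilot, nfft))      # conjugation -> correlation
    gather = np.array([
        np.fft.irfft(np.fft.rfft(tr, nfft) * P, nfft)
        for tr in traces
    ])
    gather = np.fft.fftshift(gather, axes=-1)  # center zero lag
    lags = (np.arange(nfft) - (nt - 1)) * dt
    return gather, lags
```

A trace that is a delayed copy of the pilot correlates to a peak at the delay time, which is what lets the transmitted wavefield be reprocessed as if it came from a virtual source.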

  4. A review of shallow seismic methods

    Directory of Open Access Journals (Sweden)

    D. W. Steeples

    2000-06-01

    Shallow seismic methods have historical roots dating to the 1930s, when limited shallow refraction work was performed using the Intercept-Time (IT) method. Because of high costs and the general lack of appropriate equipment - particularly data-processing equipment and software - the shallow-reflection and surface-wave techniques did not catch on as quickly as the refraction techniques. However, since 1980 substantial progress has been made in the development of all of the shallow seismic approaches. The seismic-reflection method has been used increasingly in applications at depths of less than 30 m, incorporating both the standard Common-Midpoint (CMP) method of the petroleum industry and the Common-Offset (CO) method, which was developed specifically as a low-cost technique for use in shallow surveying. In refraction studies, the Generalized Reciprocal Method (GRM) has largely replaced the classical intercept-time method, and tomographic approaches are rapidly gaining popularity. The Spectral Analysis of Surface Waves (SASW) method has been developed by civil engineers, and surface-wave analysis involving many seismograph channels (MASW) has recently shown promise. With any of the shallow seismic methods, however, selecting the appropriate seismic recording equipment, energy sources, and data-acquisition parameters, along with processing and interpretation strategies, is often critical to the success of a project.

  5. Reproducibility in Seismic Imaging

    Directory of Open Access Journals (Sweden)

    González-Verdejo O.

    2012-04-01

    Within the field of exploration seismology, there is interest at the national level in integrating reproducibility into applied, educational and research activities related to seismic processing and imaging. This reproducibility implies the description and organization of the elements involved in numerical experiments, so that a researcher, teacher or student can study, verify, repeat, and modify them independently. In this work, we document and adapt reproducibility in seismic processing and imaging to spread this concept and its benefits, and to encourage the use of open-source software in this area within our academic and professional environment. We present an enhanced seismic imaging example, of interest in both academic and professional environments, using Mexican seismic data. As a result of this research, we show that it is possible to assimilate, adapt and transfer technology at low cost, using open-source software and following a reproducible research scheme.

  6. Seismic Fault Preserving Diffusion

    CERN Document Server

    Lavialle, Olivier; Germain, Christian; Donias, Marc; Guillon, Sebastien; Keskes, Naamen; Berthoumieu, Yannick

    2007-01-01

    This paper focuses on the denoising and enhancing of 3-D reflection seismic data. We propose a pre-processing step based on a non-linear diffusion filtering leading to a better detection of seismic faults. The non-linear diffusion approaches are based on the definition of a partial differential equation that allows us to simplify the images without blurring relevant details or discontinuities. Computing the structure tensor which provides information on the local orientation of the geological layers, we propose to drive the diffusion along these layers using a new approach called SFPD (Seismic Fault Preserving Diffusion). In SFPD, the eigenvalues of the tensor are fixed according to a confidence measure that takes into account the regularity of the local seismic structure. Results on both synthesized and real 3-D blocks show the efficiency of the proposed approach.

  7. Seismic fault preserving diffusion

    Science.gov (United States)

    Lavialle, Olivier; Pop, Sorin; Germain, Christian; Donias, Marc; Guillon, Sebastien; Keskes, Naamen; Berthoumieu, Yannick

    2007-02-01

    This paper focuses on the denoising and enhancing of 3-D reflection seismic data. We propose a pre-processing step based on a non-linear diffusion filtering leading to a better detection of seismic faults. The non-linear diffusion approaches are based on the definition of a partial differential equation that allows us to simplify the images without blurring relevant details or discontinuities. Computing the structure tensor which provides information on the local orientation of the geological layers, we propose to drive the diffusion along these layers using a new approach called SFPD (Seismic Fault Preserving Diffusion). In SFPD, the eigenvalues of the tensor are fixed according to a confidence measure that takes into account the regularity of the local seismic structure. Results on both synthesized and real 3-D blocks show the efficiency of the proposed approach.
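As an illustration of the structure-tensor step that SFPD relies on, the 2-D sketch below estimates local layer orientation from smoothed gradient products. It is a simplified analogue of the 3-D tensor described above; the function name and scale parameters are ours:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def structure_tensor_orientation(img, sigma_g=1.0, sigma_t=3.0):
    """2-D analogue of the structure-tensor step in SFPD: estimate the
    local orientation of (seismic) layers.

    img: 2-D array (a seismic section); sigma_g smooths before
    differentiation, sigma_t smooths the tensor components.
    Returns the orientation angle (radians) of the dominant gradient
    direction at every pixel.
    """
    smoothed = gaussian_filter(img, sigma_g)
    gy, gx = np.gradient(smoothed)
    # Tensor components, each smoothed at the integration scale.
    Jxx = gaussian_filter(gx * gx, sigma_t)
    Jxy = gaussian_filter(gx * gy, sigma_t)
    Jyy = gaussian_filter(gy * gy, sigma_t)
    # Eigen-direction of the largest eigenvalue of [[Jxx, Jxy], [Jxy, Jyy]].
    return 0.5 * np.arctan2(2.0 * Jxy, Jxx - Jyy)
```

In SFPD proper, the eigenvalues of this tensor would then be reassigned from a confidence measure so that diffusion runs along the layers rather than across faults.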

  8. BUILDING 341 Seismic Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Halle, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-06-15

    The Seismic Evaluation of Building 341 located at Lawrence Livermore National Laboratory in Livermore, California has been completed. The subject building consists of a main building, Increment 1, and two smaller additions; Increments 2 and 3.

  9. Seismic facies; Facies sismicas

    Energy Technology Data Exchange (ETDEWEB)

    Johann, Paulo Roberto Schroeder [PETROBRAS, Rio de Janeiro, RJ (Brazil). Exploracao e Producao Corporativo. Gerencia de Reservas e Reservatorios]. E-mail: johann@petrobras.com.br

    2004-11-01

    The method presented herein describes seismic facies as representations of curves and vertical matrixes of lithotype proportions. The seismic facies approach is of great interest for capturing the spatial distributions (3D) of regionalized variables (for example, lithotypes, sedimentary facies groups, porosity, and/or other reservoir properties) and integrating them into 3D geological modeling (Johann, 1997). Thus, when interpreted as curves or vertical matrixes of proportions, seismic facies allow us to build a very important tool for the structural analysis of regionalized variables. The matrixes have an important application in geostatistical modeling. In addition, this approach provides results at the depth and scale of the well profiles; that is, seismic data are integrated into the characterization of reservoirs in depth maps and in high-resolution maps. The link between the different technical phases involved in the classification of segments of seismic traces into groups of predefined traces is described herein for two approaches: a) unsupervised and b) supervised by the geological knowledge available on the studied reservoir. The multivariate statistical methods used to obtain the maps of the seismic facies units are interesting tools for providing a lithostratigraphic and petrophysical understanding of a petroleum reservoir. In the case studied, these seismic facies units are interpreted as representative of the depositional system as a part of the Namorado Turbiditic System, Namorado Field, Campos Basin. Within the scope of PRAVAP 19 (Programa Estrategico de Recuperacao Avancada de Petroleo - Strategic Program of Advanced Petroleum Recovery), some research work on algorithms is underway to select new optimized attributes for seismic facies classification. One example is the extraction of attributes based on the wavelet transform and on time-frequency analysis methodology. PRAVAP is also carrying out research work on an

  10. Seismic Consequence Abstraction

    Energy Technology Data Exchange (ETDEWEB)

    M. Gross

    2004-10-25

    The primary purpose of this model report is to develop abstractions for the response of engineered barrier system (EBS) components to seismic hazards at a geologic repository at Yucca Mountain, Nevada, and to define the methodology for using these abstractions in a seismic scenario class for the Total System Performance Assessment - License Application (TSPA-LA). A secondary purpose of this model report is to provide information for criticality studies related to seismic hazards. The seismic hazards addressed herein are vibratory ground motion, fault displacement, and rockfall due to ground motion. The EBS components are the drip shield, the waste package, and the fuel cladding. The requirements for development of the abstractions and the associated algorithms for the seismic scenario class are defined in ''Technical Work Plan For: Regulatory Integration Modeling of Drift Degradation, Waste Package and Drip Shield Vibratory Motion and Seismic Consequences'' (BSC 2004 [DIRS 171520]). The development of these abstractions will provide a more complete representation of flow into and transport from the EBS under disruptive events. The results from this development will also address portions of integrated subissue ENG2, Mechanical Disruption of Engineered Barriers, including the acceptance criteria for this subissue defined in Section 2.2.1.3.2.3 of the ''Yucca Mountain Review Plan, Final Report'' (NRC 2003 [DIRS 163274]).

  11. Evaluating the Use of Declustering for Induced Seismicity Hazard Assessment

    Science.gov (United States)

    Llenos, A. L.; Michael, A. J.

    2016-12-01

    The recent dramatic seismicity rate increase in the central and eastern US (CEUS) has motivated the development of seismic hazard assessments for induced seismicity (e.g., Petersen et al., 2016). Standard probabilistic seismic hazard assessment (PSHA) relies fundamentally on the assumption that seismicity is Poissonian (Cornell, BSSA, 1968); therefore, the earthquake catalogs used in PSHA are typically declustered (e.g., Petersen et al., 2014), even though this can remove earthquakes that cause damage or concern (Petersen et al., 2015; 2016). In some induced earthquake sequences in the CEUS, the standard declustering can remove up to 90% of the sequence, reducing the estimated seismicity rate by a factor of 10 compared to estimates from the complete catalog; in tectonic regions the reduction is often only about a factor of 2. We investigate how three declustering methods treat induced seismicity: the window-based Gardner-Knopoff (GK) algorithm, often used for PSHA (Gardner and Knopoff, BSSA, 1974); the link-based Reasenberg algorithm (Reasenberg, JGR, 1985); and a stochastic declustering method based on a space-time Epidemic-Type Aftershock Sequence model (Ogata, JASA, 1988; Zhuang et al., JASA, 2002). We apply these methods to three catalogs that likely contain some induced seismicity. For the Guy-Greenbrier, AR earthquake swarm from 2010-2013, declustering reduces the seismicity rate by factors of 6-14, depending on the algorithm. In northern Oklahoma and southern Kansas from 2010-2015, the reduction varies from factors of 1.5-20. In the Salton Trough of southern California from 1975-2013, the rate is reduced by factors of 3-20. Stochastic declustering tends to remove the most events, followed by the GK method, while the Reasenberg method removes the fewest.
Given that declustering and choice of algorithm have such a large impact on the resulting seismicity rate estimates, we suggest that more accurate hazard assessments may be found using the complete catalog.
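For concreteness, a minimal aftershock-only variant of the window-based Gardner-Knopoff declustering discussed above can be sketched as follows. The window sizes use the commonly quoted fitted parameterization of the Gardner and Knopoff (1974) tables; the function names and the simplified removal rule are our assumptions, not the exact algorithm of any of the cited studies:

```python
import numpy as np

def gk_window(mag):
    """Gardner-Knopoff space-time window for a given magnitude, using
    the commonly used fitted parameterization (distance in km, time in
    days)."""
    d = 10 ** (0.1238 * mag + 0.983)
    if mag >= 6.5:
        t = 10 ** (0.032 * mag + 2.7389)
    else:
        t = 10 ** (0.5409 * mag - 0.547)
    return d, t

def decluster_gk(times, mags, dists_fn):
    """Flag mainshocks: events not inside the window of a larger event.
    times in days (any order of events is handled via magnitude sorting);
    dists_fn(i, j) -> epicentral distance in km. Returns a boolean
    mainshock mask. Simplified sketch: removes only later (aftershock)
    events, not foreshocks."""
    n = len(times)
    main = np.ones(n, dtype=bool)
    order = np.argsort(mags)[::-1]          # largest magnitude first
    for i in order:
        if not main[i]:
            continue
        d, t = gk_window(mags[i])
        for j in range(n):
            if j == i or not main[j]:
                continue
            if mags[j] <= mags[i] and 0 <= times[j] - times[i] <= t \
               and dists_fn(i, j) <= d:
                main[j] = False              # inside the window: dependent
    return main
```

Comparing the seismicity rate computed from `main` against the rate from the full catalog reproduces, in miniature, the rate-reduction comparison described in the abstract.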

  12. Seismicity in Northern Germany

    Science.gov (United States)

    Bischoff, Monika; Gestermann, Nicolai; Plenefisch, Thomas; Bönnemann, Christian

    2013-04-01

    Northern Germany is a region of low tectonic activity, where only few, low-magnitude earthquakes occur. The driving tectonic processes are not yet well understood. In addition, seismic events during the last decade have concentrated at the borders of the natural gas fields. The source depths of these events are shallow, in the depth range of the gas reservoirs. Based on these observations, a causal relationship between seismicity near gas fields and gas production is likely. The strongest of these earthquakes had a magnitude of 4.5 and occurred near Rotenburg in 2004. Smaller seismic events were also widely felt by the public and stimulated the discussion on the underlying processes. The latest seismic event occurred near Langwedel on 22 November 2012 and had a magnitude of 2.8. Understanding the causes of the seismicity in Northern Germany is crucial for a thorough evaluation. Therefore, the Seismological Service of Lower Saxony (NED) was established at the State Office for Mining, Energy and Geology (LBEG) of Lower Saxony in January 2013. Its main task is the monitoring and evaluation of the seismicity in Lower Saxony and adjacent areas. Scientific and technical questions are addressed in close cooperation with the Seismological Central Observatory (SZO) at the Federal Institute for Geosciences and Natural Resources (BGR). The seismological situation of Northern Germany will be presented, and possible causes of seismicity are introduced. Rare seismic events at greater depths are distributed over the whole region and are probably purely tectonic, whereas events in the vicinity of natural gas fields are probably related to gas production. Improving the detection threshold for seismic events in Northern Germany is necessary to provide a better statistical basis for further analyses answering these questions. As a first step, the existing seismic network will be densified over the next few years.
    The first borehole station was installed near Rethem by BGR

  13. Least-Squares Seismic Inversion with Stochastic Conjugate Gradient Method

    Institute of Scientific and Technical Information of China (English)

    Wei Huang; Hua-Wei Zhou

    2015-01-01

    With the development of computational power, there has been an increased focus on data-fitting seismic inversion techniques for high-fidelity seismic velocity models and images, such as full-waveform inversion and least-squares migration. However, though more advanced than conventional methods, these data-fitting methods can be very expensive in terms of computational cost. Recently, various techniques to optimize these data-fitting seismic inversion problems have been implemented to cater to the industrial need for much improved efficiency. In this study, we propose a general stochastic conjugate gradient method for these data-fitting inverse problems. We first describe the basic theory of our method and then give synthetic examples. Our numerical experiments illustrate the potential of this method for large-scale seismic inversion applications.
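The data-subsampling idea behind a stochastic conjugate gradient solver can be illustrated on a linear least-squares toy problem. This is a sketch of the general concept, not the authors' exact scheme; the subset size, the Polak-Ribiere update, and all names below are our assumptions:

```python
import numpy as np

def stochastic_cg(A, b, n_iter=50, batch=0.5, seed=0):
    """Toy stochastic conjugate-gradient solver for min ||Ax - b||^2.
    Each iteration forms the gradient from a random subset of the rows
    (the data), mimicking the data-subsampling idea used to cut the cost
    of data-fitting seismic inversion. batch=1.0 recovers a full-data
    nonlinear CG."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    k = max(1, int(batch * m))
    x = np.zeros(n)
    d = np.zeros(n)
    g_old = None
    for _ in range(n_iter):
        rows = rng.choice(m, size=k, replace=False)
        As, bs = A[rows], b[rows]
        g = As.T @ (As @ x - bs)            # subsampled gradient
        if g_old is None:
            d = -g
        else:
            gg = g_old @ g_old
            # Polak-Ribiere update with restart (beta clipped at 0).
            beta = max(0.0, g @ (g - g_old) / gg) if gg > 0 else 0.0
            d = -g + beta * d
        Ad = As @ d
        denom = Ad @ Ad
        if denom == 0.0:
            break                           # converged on this subset
        alpha = -(g @ d) / denom            # exact line search on the subset
        x = x + alpha * d
        g_old = g
    return x
```

With a consistent, well-conditioned system, the subsampled iterations still drive the full-data misfit down, which is the efficiency gain the abstract appeals to.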

  14. Seismic Base Isolation Analysis for PASCAR Liquid Metal Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kuk Hee; Yoo, Bong; Kim, Yun Jae [Korea Univ., Seoul (Korea, Republic of)

    2008-10-15

    This paper presents a study for developing a seismic isolation system for the PASCAR (Proliferation resistant, Accident-tolerant, Self-supported, Capsular and Assured Reactor) liquid metal reactor design. PASCAR uses lead-bismuth eutectic (LBE) as coolant. Because the density of LBE coolant (10,000 kg/m{sup 3}) is much higher than that of sodium coolant or water, it presents a challenge to designers of the seismic isolation systems that will be used with these heavy liquid metal reactors. Finite element analysis is adopted to determine the characteristics of the isolator device. Results are presented from a study on the application of three-dimensional seismic isolation devices to the full-scale reactor. The seismic analysis responses of the two-dimensional and three-dimensional isolation systems for the PASCAR are compared with that of the conventional fixed-base system.

  15. Achieving cost savings through collaborative seismic testing

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, G. [Union Electric Co., Fulton, MO (United States); Richards, J. [Duke Power Company (United States); Loflin, L. [EPRI, Plant Support Engineering (PSE) (United States)

    1998-05-01

    Seismic qualification of nuclear power plant equipment through testing has been perceived by utility personnel as a costly and complicated process. Certainly, some equipment types may only be qualified by test, owing to the complexity of the item and/or the inability to represent the item in a quantitative analysis. Other factors also contribute to the reluctance of some to resort to seismic testing, including cost, dealing with test failures, lack of understanding of the testing process, and greater reliance on new analytical techniques. A group of utilities has allied to address these issues and has formed the Seismic Qualification Reporting and Testing Standardization (SQURTS) group. SQURTS has implemented a program whereby testing is performed at low cost, often lower than that of a comparable analytical solution. Testing is conducted at generic seismic levels using generic test procedures, which broadens the applicability of the results. Test reports are published in a standard format, which shortens the process of review and approval. Testing is no longer just a requirement, but a cost-effective option. (orig.)

  16. Community Seismic Network (CSN)

    Science.gov (United States)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.

    2012-12-01

    We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell, and temporal-spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false-positive rate. We report on two data fusion algorithms: one that tessellates the surface so as to fuse data from a large region around Pasadena, and another that uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that connects directly to the Internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings, we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events.
Visualization models of instrumented buildings ranging
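As a toy illustration of the geocell fusion idea described above (the function, threshold, and data layout here are hypothetical, not CSN's actual algorithm): a geocell is flagged when the fraction of its sensors reporting an acceleration anomaly in the current time window exceeds a cutoff.

```python
from collections import defaultdict

def fuse_geocell_picks(picks, sensors_per_cell, min_fraction=0.3):
    """Toy fusion: a geocell is 'anomalous' when the fraction of its
    sensors reporting an acceleration anomaly in the current window
    reaches min_fraction. picks: iterable of (cell_id, sensor_id)."""
    reporting = defaultdict(set)
    for cell_id, sensor_id in picks:
        reporting[cell_id].add(sensor_id)
    return {
        cell: len(ids) / sensors_per_cell[cell]
        for cell, ids in reporting.items()
        if len(ids) / sensors_per_cell[cell] >= min_fraction
    }

# Three of ten sensors in cell A1 report; only one of ten in B2.
picks = [("A1", "s1"), ("A1", "s2"), ("A1", "s3"), ("B2", "s9")]
print(fuse_geocell_picks(picks, {"A1": 10, "B2": 10}))
```

In a real system the per-cell fractions would themselves feed the next telescoping level, so that weak but spatially consistent shaking across neighboring geocells can still trigger a regional detection.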

  17. The Geometry of Conventionality

    CERN Document Server

    Weatherall, James Owen

    2013-01-01

    Hans Reichenbach famously argued that the geometry of spacetime is conventional in relativity theory, in the sense that one can freely choose the spacetime metric so long as one is willing to postulate a "universal force field". Here we make precise a sense in which the field Reichenbach defines fails to be a "force". We then argue that there is an interesting and perhaps tenable sense in which geometry is conventional in classical spacetimes. We conclude with a no-go result showing that the variety of conventionalism available in classical spacetimes does not extend to relativistic spacetimes.

  18. Workshop on induced Seismicity due to fluid injection/production from Energy-Related Applications

    Energy Technology Data Exchange (ETDEWEB)

    Majer, E.L.; Asanuma, Hiroshi; Rueter, Horst; Stump, Brian; Segall, Paul; Zoback, Mark; Nelson, Jim; Frohlich, Cliff; Rutledge, Jim; Gritto, Roland; Baria, Roy; Hickman, Steve; McGarr, Art; Ellsworth, Bill; Lockner, Dave; Oppenheimer, David; Henning, Peter; Rosca, Anca; Hornby, Brian; Wang, Herb; Beeler, Nick; Ghassemi, Ahmad; Walters, Mark; Robertson-Tait, Ann; Dracos, Peter; Fehler, Mike; Abou-Sayed, Ahmed; Ake, Jon; Vorobiev, Oleg; Julian, Bruce

    2011-04-01

Geothermal energy, carbon sequestration, and enhanced oil and gas recovery have a clear role in U.S. energy policy, both in securing cost-effective energy and reducing atmospheric CO{sub 2} accumulations. Recent publicity surrounding induced seismicity at several geothermal and oil and gas sites points out the need to develop improved standards and practices to avoid issues that may unduly inhibit or stop the above technologies from fulfilling their full potential. It is critical that policy makers and the general community be assured that EGS, CO{sub 2} sequestration, enhanced oil/gas recovery, and other technologies relying on fluid injections, will be designed to reduce induced seismicity to an acceptable level, and be developed in a safe and cost-effective manner. Induced seismicity is not new - it has occurred as part of many different energy and industrial applications (reservoir impoundment, mining, oil recovery, construction, waste disposal, conventional geothermal). With proper study/research and engineering controls, induced seismicity should eventually allow safe and cost-effective implementation of any of these technologies. In addition, microseismicity is now being used as a remote sensing tool for understanding and measuring the success of injecting fluid into the subsurface in a variety of applications, including the enhancement of formation permeability through fracture creation/reactivation, tracking fluid migration and storage, and physics associated with stress redistribution. This potential problem was envisaged in 2004 following observed seismicity at several EGS sites; a study was implemented by DOE to produce a white paper and a protocol (Majer et al., 2008) to help potential investors. Recently, however, there have been a significant number of adverse comments by the press regarding induced seismicity which could adversely affect the development of the energy sector in the USA.
Therefore, in order to identify critical technology and research

  19. Conventional Spinal Anaesthesia

    African Journals Online (AJOL)

patients scheduled for elective unilateral lower limb surgery. ... the conventional group were turned supine immediately after injection. Blood pressure, heart rate, respiratory rate and oxygen .... Characteristic Type of spinal anaesthesia P-value.

  20. Landslide seismic magnitude

    Science.gov (United States)

    Lin, C. H.; Jan, J. C.; Pu, H. C.; Tu, Y.; Chen, C. C.; Wu, Y. M.

    2015-11-01

Landslides have become one of the most deadly natural disasters on earth, not only due to a significant increase in extreme climate change caused by global warming, but also rapid economic development in topographic relief areas. How to detect landslides using a real-time system has become an important question for reducing possible landslide impacts on human society. However, traditional detection of landslides, either through direct surveys in the field or remote sensing images obtained via aircraft or satellites, is highly time consuming. Here we analyze very long period seismic signals (20-50 s) generated by large landslides, such as those triggered by Typhoon Morakot, which passed through Taiwan in August 2009. In addition to successfully locating 109 large landslides, we define a landslide seismic magnitude based on an empirical formula: Lm = log(A) + 0.55 log(Δ) + 2.44, where A is the maximum displacement (μm) recorded at one seismic station and Δ is its distance (km) from the landslide. We conclude that both the location and seismic magnitude of large landslides can be rapidly estimated from broadband seismic networks for both academic and applied purposes, similar to earthquake monitoring. We suggest a real-time algorithm be set up for routine monitoring of landslides in places where they pose a frequent threat.
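Interpreting log as base-10, as is conventional for magnitude scales, the empirical relation above can be applied directly (an illustrative sketch, not the authors' code):

```python
import math

def landslide_magnitude(A_um: float, delta_km: float) -> float:
    """Empirical landslide seismic magnitude from the abstract:
    Lm = log10(A) + 0.55*log10(delta) + 2.44, where A is the maximum
    displacement (micrometers) at one station and delta is the
    station-to-landslide distance (km)."""
    return math.log10(A_um) + 0.55 * math.log10(delta_km) + 2.44

# Example: 10 um maximum displacement recorded 100 km away
print(round(landslide_magnitude(10.0, 100.0), 2))  # -> 4.54
```

In practice one would average estimates over several stations, exactly as is done for earthquake magnitudes.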

  1. Application of suppressing random noise in seismic data based on TrivaShrink and DTCWT

    Institute of Scientific and Technical Information of China (English)

    WEI Yajie

    2014-01-01

In the process of seismic exploration, noise seriously interferes with seismic signals. Conventional wavelet threshold denoising methods cannot fully exploit the characteristics of seismic signals due to their limitations; there is always a certain degree of deviation between estimated and actual values. In this study, a method of seismic data denoising is proposed in which the authors use the current coefficients, the parent coefficients, and the neighborhood coefficients, based on the dual-tree complex wavelet transform (DTCWT) and a related sub-band denoising model (TrivaShrink), to achieve the optimal estimate of the shrinkage factor and denoise the seismic records. The method is found to be better than conventional wavelet threshold denoising at removing random noise.
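For contrast, the conventional wavelet-threshold baseline that the paper improves on can be sketched with a simple soft-threshold operator (a generic illustration with the universal threshold; TrivaShrink itself additionally exploits parent and neighborhood coefficients in the DTCWT domain):

```python
import numpy as np

def soft_threshold(coeffs: np.ndarray, sigma: float) -> np.ndarray:
    """Classic soft thresholding of wavelet coefficients with the
    universal threshold T = sigma * sqrt(2 ln N). Coefficients whose
    magnitude is below T are zeroed; the rest are shrunk toward zero."""
    T = sigma * np.sqrt(2.0 * np.log(coeffs.size))
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - T, 0.0)

# Two strong "signal" coefficients survive; two weak ones are zeroed.
c = np.array([5.0, -0.5, 0.2, -6.0])
print(soft_threshold(c, sigma=0.5))
```

The deviation mentioned in the abstract is visible here: even surviving coefficients are biased toward zero by T, which is one motivation for neighborhood-aware shrinkage rules.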

  2. Seismic Hazard Prediction Using Seismic Bumps: A Data Mining Approach

    Directory of Open Access Journals (Sweden)

    Musa Peker

    2016-04-01

Full Text Available Due to the large number of influencing factors, it is difficult to predict earthquakes, which are natural disasters. Researchers are working intensively on earthquake prediction, since loss of life and property can be minimized with it. In this study, a system is proposed for earthquake prediction with data mining techniques. In the study, in which the Cross Industry Standard Process for Data Mining (CRISP-DM) approach has been used as the data mining methodology, seismic bumps data obtained from mines have been analyzed. The extreme learning machine (ELM), an effective and rapid classification algorithm, has been used in the modeling phase. In the evaluation stage, different performance evaluation criteria such as classification accuracy, sensitivity, specificity, and kappa value have been used. The results are promising for earthquake prediction.
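A minimal ELM sketch, assuming the standard formulation (random fixed hidden layer, output weights by least squares via pseudo-inverse); the study's actual features, hidden-layer size, and data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden=50):
    """Extreme learning machine: random input weights W and biases b
    are fixed; only the output weights beta are fit, in closed form,
    via the pseudo-inverse of the hidden-layer activations H."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # sigmoid hidden layer
    beta = np.linalg.pinv(H) @ y
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy binary problem: label = 1 when the feature sum is positive.
X = rng.normal(size=(200, 4))
y = (X.sum(axis=1) > 0).astype(float)
W, b, beta = elm_train(X, y)
acc = np.mean((elm_predict(X, W, b, beta) > 0.5) == (y > 0.5))
print(acc >= 0.8)
```

Because training reduces to one linear solve, ELM is fast, which is presumably why the authors chose it for the modeling phase.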

  3. Studies on seismic source

    Institute of Scientific and Technical Information of China (English)

Li Shiyu; Chen Yuntai

    2003-01-01

During the period 1999-2002, Chinese seismologists made a series of advances in the study of seismic sources, including observations, experiments, and theory. In the field of observation, methods for accurate location of earthquake sources, inversion of the seismic moment tensor, and determination of earthquake source mechanisms were improved and developed. Many important earthquake events were studied using these methods; their rupture processes were inverted and investigated in combination with the local stress fields and the tectonic moment, using measurements of surface deformation. In the fields of experiment and theory, progress was made on the causes of earthquake generation, stress and tectonic conditions, the dynamics of earthquake rupture, rock fracture, and the nucleation of strong earthquakes.

  4. Signal-to-noise ratio application to seismic marker analysis and fracture detection

    Institute of Scientific and Technical Information of China (English)

Xu Hui-Qun; Gui Zhi-Xian

    2014-01-01

Seismic data with high signal-to-noise ratios (SNRs) are useful in reservoir exploration. To obtain high-SNR seismic data, significant effort is required to achieve noise attenuation in seismic data processing, which is costly in material, human, and financial resources. We introduce a method for improving the SNR of seismic data. The SNR is calculated using a frequency-domain method. Furthermore, we optimize and discuss the critical parameters and calculation procedure. We applied the proposed method to real data and found that the SNR is high at the seismic marker and low in the fracture zone. Consequently, this can be used to extract detailed information about fracture zones that are inferred by structural analysis but not observed in conventional seismic data.
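The abstract does not give the exact estimator, so the following is only a generic frequency-domain SNR sketch, assuming a known signal band; the band limits and scaling are illustrative assumptions:

```python
import numpy as np

def band_snr_db(trace, dt, f_lo, f_hi):
    """Rough frequency-domain SNR: spectral power inside an assumed
    signal band [f_lo, f_hi] Hz divided by the power outside it,
    reported in dB."""
    spec = np.abs(np.fft.rfft(trace)) ** 2
    freqs = np.fft.rfftfreq(trace.size, d=dt)
    in_band = (freqs >= f_lo) & (freqs <= f_hi)
    p_sig = spec[in_band].sum()
    p_noise = spec[~in_band].sum()
    return 10.0 * np.log10(p_sig / p_noise)

# 30 Hz "reflection" plus weak broadband noise, sampled at 500 Hz.
t = np.arange(0, 1, 0.002)
trace = np.sin(2 * np.pi * 30 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(band_snr_db(trace, 0.002, 20, 40) > 10)
```

Mapping such an estimate trace by trace along a horizon is one way to obtain the lateral SNR variation the authors use to flag fracture zones.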

  5. B341 Seismic Evaluation

    Energy Technology Data Exchange (ETDEWEB)

Halle, J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2014-01-02

The Seismic Evaluation of Building 341 located at Lawrence Livermore National Laboratory in Livermore, California has been completed. The subject building consists of a main building, Increment 1, and two smaller additions, Increments 2 and 3. Based on our evaluation, the building does not meet a Life Safety performance level for the BSE-1E earthquake ground shaking hazard. The BSE-1E is the recommended seismic hazard level for evaluation of existing structures and is based on a 20% probability of exceedance in 50 years.

  6. Induced Seismicity Monitoring System

    Science.gov (United States)

    Taylor, S. R.; Jarpe, S.; Harben, P.

    2014-12-01

There are many seismological aspects associated with monitoring of permanent storage of carbon dioxide (CO2) in geologic formations. These include monitoring underground gas migration through detailed tomographic studies of rock properties, integrity of the cap rock, and microseismicity over time. These types of studies require expensive deployments of surface and borehole sensors in the vicinity of the CO2 injection wells. Another problem that may exist in CO2 sequestration fields is the potential for damaging induced seismicity associated with fluid injection into the geologic reservoir. Seismic hazard monitoring in CO2 sequestration fields requires a seismic network over a spatially larger region, possibly having stations in remote settings. Expensive observatory-grade seismic systems are not necessary for seismic hazard deployments or small-scale tomographic studies. Hazard monitoring requires accurate location of induced seismicity to magnitude levels only slightly less than that which can be felt at the surface (e.g. magnitude 1), and the frequencies of interest for tomographic analysis are ~1 Hz and greater. We have developed a seismo/acoustic smart sensor system that can achieve the goals necessary for induced seismicity monitoring in CO2 sequestration fields. The unit is inexpensive, lightweight, easy to deploy, can operate remotely under harsh conditions and features 9 channels of recording (currently 3C 4.5 Hz geophone, MEMS accelerometer and microphone). An on-board processor allows for satellite transmission of parameter data to a processing center. Continuous or event-detected data is kept on two removable flash SD cards of up to 64+ Gbytes each. If available, data can be transmitted via cell phone modem or picked up via site visits. Low-power consumption allows for autonomous operation using only a 10 watt solar panel and a gel-cell battery.
The system has been successfully tested for long-term (> 6 months) remote operations over a wide range

  7. Aspects of the Iea-R1 research reactor seismic evaluation; Aspectos da avaliacao sismica do reator de pesquisa IEA-R1

    Energy Technology Data Exchange (ETDEWEB)

    Mattar Neto, Miguel [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)

    1996-07-01

Codes and standards for the seismic evaluation of the research reactor IEA-R1 are presented. An approach to define the design basis earthquake based on the local seismic map and on simplified analysis methods is proposed. The site seismic evaluation indicates that the design earthquake intensity is IV MM. Therefore, according to the codes and standards used, no seismic analysis of buildings, systems, or components is required. (author)

  8. Mammoth Mountain, California broadband seismic experiment

    Science.gov (United States)

    Dawson, P. B.; Pitt, A. M.; Wilkinson, S. K.; Chouet, B. A.; Hill, D. P.; Mangan, M.; Prejean, S. G.; Read, C.; Shelly, D. R.

    2013-12-01

Mammoth Mountain is a young cumulo-volcano located on the southwest rim of Long Valley caldera, California. Current volcanic processes beneath Mammoth Mountain are manifested in a wide range of seismic signals, including swarms of shallow volcano-tectonic earthquakes, upper and mid-crustal long-period earthquakes, swarms of brittle-failure earthquakes in the lower crust, and shallow (3-km depth) very-long-period earthquakes. Diffuse emissions of CO2 began after a magmatic dike injection beneath the volcano in 1989, and continue to the present time. These indications of volcanic unrest drive an extensive monitoring effort of the volcano by the USGS Volcano Hazards Program. As part of this effort, eleven broadband seismometers were deployed on Mammoth Mountain in November 2011. This temporary deployment is expected to run through the fall of 2013. These stations supplement the local short-period and broadband seismic stations of the Northern California Seismic Network (NCSN) and provide a combined network of eighteen broadband stations operating within 4 km of the summit of Mammoth Mountain. Data from the temporary stations are not available in real-time, requiring the merging of the data from the temporary and permanent networks, timing of phases, and relocation of seismic events to be accomplished outside of the standard NCSN processing scheme. The timing of phases is accomplished through an interactive Java-based phase-picking routine, and the relocation of seismicity is achieved using the probabilistic non-linear software package NonLinLoc, distributed under the GNU General Public License by Alomax Scientific. Several swarms of shallow volcano-tectonic earthquakes, spasmodic bursts of high-frequency earthquakes, a few long-period events located within or below the edifice of Mammoth Mountain and numerous mid-crustal long-period events have been recorded by the network.
To date, about 900 of the ~2400 events occurring beneath Mammoth Mountain since November 2011 have

  9. Seismic Search Engine: A distributed database for mining large scale seismic data

    Science.gov (United States)

    Liu, Y.; Vaidya, S.; Kuzma, H. A.

    2009-12-01

The International Monitoring System (IMS) of the CTBTO collects terabytes' worth of seismic measurements from many receiver stations situated around the earth with the goal of detecting underground nuclear testing events and distinguishing them from other benign, but more common events such as earthquakes and mine blasts. The International Data Center (IDC) processes and analyzes these measurements, as they are collected by the IMS, to summarize event detections in daily bulletins. Thereafter, the data measurements are archived into a large format database. Our proposed Seismic Search Engine (SSE) will facilitate a framework for data exploration of the seismic database as well as the development of seismic data mining algorithms. Analogous to GenBank, the annotated genetic sequence database maintained by NIH, SSE is intended to provide public access to seismic data and a set of processing and analysis tools, along with community-generated annotations and statistical models to help interpret the data. SSE will implement queries as user-defined functions composed from standard tools and models. Each query is compiled and executed over the database internally before reporting results back to the user. Since queries are expressed with standard tools and models, users can easily reproduce published results within this framework for peer-review and making metric comparisons. As an illustration, an example query is “what are the best receiver stations in East Asia for detecting events in the Middle East?” Evaluating this query involves listing all receiver stations in East Asia, characterizing known seismic events in that region, and constructing a profile for each receiver station to determine how effective its measurements are at predicting each event. The results of this query can be used to help prioritize how data is collected, identify defective instruments, and guide future sensor placements.

  10. Nonstructural seismic restraint guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Butler, D.M.; Czapinski, R.H.; Firneno, M.J.; Feemster, H.C.; Fornaciari, N.R.; Hillaire, R.G.; Kinzel, R.L.; Kirk, D.; McMahon, T.T.

    1993-08-01

    The Nonstructural Seismic Restraint Guidelines provide general information about how to secure or restrain items (such as material, equipment, furniture, and tools) in order to prevent injury and property, environmental, or programmatic damage during or following an earthquake. All SNL sites may experience earthquakes of magnitude 6.0 or higher on the Richter scale. Therefore, these guidelines are written for all SNL sites.

  11. Understanding induced seismicity

    NARCIS (Netherlands)

    Elsworth, Derek; Spiers, Christopher J.; Niemeijer, Andre R.

    2016-01-01

    Fluid injection–induced seismicity has become increasingly widespread in oil- and gas-producing areas of the United States (1–3) and western Canada. It has shelved deep geothermal energy projects in Switzerland and the United States (4), and its effects are especially acute in Oklahoma, where seismi

  13. Mobile seismic exploration

    Science.gov (United States)

    Dräbenstedt, A.; Cao, X.; Polom, U.; Pätzold, F.; Zeller, T.; Hecker, P.; Seyfried, V.; Rembe, C.

    2016-06-01

Laser-Doppler-Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm. High-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions in the range below 1 nm/s/√Hz. Thermal displacements and air turbulence critically influence LDV measurements in this low-frequency range, leading to noise levels of several hundred nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or Micro Electro-Mechanical Sensors (MEMS)). Developing a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table impinges on the ground below the car through the hole. A reference geophone detected residual vibrations on the table. We present the results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as laser geophone.
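The quoted numbers can be checked directly: converting a ~100 nm/√Hz low-frequency displacement noise into velocity noise via v = 2πf·x shows why the sub-1 nm/s/√Hz requirement is so demanding (a back-of-envelope check on the abstract's figures, not the authors' analysis):

```python
import math

# Displacement noise floor quoted for the low-frequency range:
x_noise = 100e-9  # m/sqrt(Hz)

# Velocity-equivalent noise v = 2*pi*f*x at the band edges and mid-band,
# in nm/s/sqrt(Hz); the stated requirement is below 1 nm/s/sqrt(Hz).
for f in (1.0, 10.0, 250.0):
    v_noise = 2 * math.pi * f * x_noise   # m/s/sqrt(Hz)
    print(f, round(v_noise * 1e9))        # -> 628, 6283, 157080 nm/s/sqrt(Hz)
```

The gap of roughly three to five orders of magnitude between this noise floor and the requirement is what motivates the actively vibration-isolated table and the reference geophone in the experiment.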

  14. Mobile seismic exploration

    Energy Technology Data Exchange (ETDEWEB)

    Dräbenstedt, A., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de; Seyfried, V. [Research & Development, Polytec GmbH, Waldbronn (Germany); Cao, X.; Rembe, C., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de [Institute of Electrical Information Technology, TU Clausthal, Clausthal-Zellerfeld (Germany); Polom, U., E-mail: a.draebenstedt@polytec.de, E-mail: rembe@iei.tu-clausthal.de, E-mail: ulrich.polom@liag-hannover.de [Leibniz Institute of Applied Geophysics, Hannover (Germany); Pätzold, F.; Hecker, P. [Institute of Flight Guidance, TU Braunschweig, Braunschweig (Germany); Zeller, T. [Clausthaler Umwelttechnik Institut CUTEC, Clausthal-Zellerfeld (Germany)

    2016-06-28

Laser-Doppler-Vibrometry (LDV) is an established technique to measure vibrations in technical systems with picometer vibration-amplitude resolution. Especially good sensitivity and resolution can be achieved at an infrared wavelength of 1550 nm. High-resolution vibration measurements are possible over more than 100 m distance. This advancement of the LDV technique enables new applications. The detection of seismic waves is an application which has not been investigated so far because seismic waves outside laboratory scales are usually analyzed at low frequencies between approximately 1 Hz and 250 Hz and require velocity resolutions in the range below 1 nm/s/√Hz. Thermal displacements and air turbulence critically influence LDV measurements in this low-frequency range, leading to noise levels of several hundred nm/√Hz. Commonly, seismic waves are measured with highly sensitive inertial sensors (geophones or Micro Electro-Mechanical Sensors (MEMS)). Developing a laser geophone based on the LDV technique is the topic of this paper. We have assembled an actively vibration-isolated optical table in a minivan which provides a hole in its underbody. The laser beam of an infrared LDV assembled on the optical table impinges on the ground below the car through the hole. A reference geophone detected residual vibrations on the table. We present the results from the first successful experimental demonstration of contactless detection of seismic waves from a movable vehicle with an LDV as laser geophone.

  15. Geophysics and Seismic Hazard Reduction

    Institute of Scientific and Technical Information of China (English)

    YuGuihua; ZhouYuanze; YuSheng

    2003-01-01

Earthquakes are natural phenomena that often bring serious hazard to human life and property. An earthquake is a physical process that releases the interior energy of the earth, caused by interior and exterior forces in special tectonic environments, especially within the lithosphere. An earthquake causes casualties and losses only where people live. Seismic hazard reduction is composed of four parts: seismic prediction, hazard prevention and seismic engineering, seismic response and rescue, and rebuilding.

  16. Validation of seismic probabilistic risk assessments of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
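The HCLPF capacities discussed above are commonly derived from a lognormal fragility model; the following sketch uses the standard approximation HCLPF = Am·exp(−1.645(βR + βU)), which is the generic formulation rather than necessarily the exact models used in this study:

```python
import math

def hclpf(Am: float, beta_r: float, beta_u: float) -> float:
    """High-confidence, low-probability-of-failure (HCLPF) capacity
    for a lognormal fragility: 95% confidence of less than 5% failure
    probability. Am is the median ground-motion capacity (e.g. in g);
    beta_r and beta_u are the logarithmic standard deviations for
    inherent randomness and modeling uncertainty."""
    return Am * math.exp(-1.645 * (beta_r + beta_u))

# Illustrative (hypothetical) component: median capacity 1.2 g,
# beta_R = 0.25, beta_U = 0.35.
print(round(hclpf(1.2, 0.25, 0.35), 3))  # -> 0.447
```

The sensitivity result in the abstract can be read against this formula: changes in βR (the randomness term) shift the HCLPF smoothly, which is consistent with mean accident sequence probabilities being only weakly sensitive to it.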

  17. Seismic Noise Characterization in the Northern Mississippi Embayment

    Science.gov (United States)

    Wiley, S.; Deshon, H. R.; Boyd, O. S.

    2009-12-01

We present a study of seismic noise sources present within the northern Mississippi embayment near the New Madrid Seismic Zone (NMSZ). The northern embayment contains up to 1 km of unconsolidated coastal plain sediments overlying bedrock, making it an inherently noisy environment for seismic stations. The area is known to display high levels of cultural noise caused by agricultural activity, passing cars, trains, etc. We characterize continuous broadband seismic noise data recorded for the months of March through June 2009 at six stations operated by the Cooperative New Madrid Seismic Network. We looked at a single horizontal component of data during nighttime hours, defined as 6:15 PM to 5:45 AM Central Standard Time, which we determined to be the lowest amplitude period of noise for the region. Hourly median amplitudes were compared to daily average wind speeds downloaded from the National Oceanic and Atmospheric Administration. We find a correlation between time periods of increased noise and days with high wind speeds, suggesting that wind is likely a prevalent source of seismic noise in the area. The effects of wind on seismic recordings may result from wind induced tree root movement which causes ground motion to be recorded at the vaults located ~3 m below ground. Automated studies utilizing the local network or the EarthScope Transportable Array, scheduled to arrive in the area in 2010-11, should expect to encounter wind induced noise fluctuations and must account for this in their analysis.

  18. Background noise model development for seismic stations of KMA

    Science.gov (United States)

    Jeon, Youngsoo

    2010-05-01

Background noise is present in any seismic signal recorded by a seismometer, owing to natural phenomena in the medium the signal passes through. Reducing seismic noise is very important for improving data quality in seismic studies, but the most important aspect of reducing seismic noise is finding an appropriate site before installing the seismometer. For this reason, NIMR (National Institute of Meteorological Research) began developing a standard background noise model for the broadband seismic stations of the KMA (Korea Meteorological Administration), using a continuous data set obtained from 13 broadband stations during 2007 and 2008. We also developed a model using short-period seismic data from 10 stations in 2009. The method of McNamara and Buland (2004) is applied to analyze the background noise of the Korean Peninsula. The fact that borehole seismometer records show low noise levels at frequencies greater than 1 Hz compared with records at the surface indicates that cultural noise in the inland Korean Peninsula must be considered when processing the seismic data set. The double-frequency microseism peak must also be taken into account, because the Korean Peninsula is surrounded by seas to the east, west, and south. The development of the KMA background model shows that the Peterson model (1993) is not applicable to fitting the background noise signal generated in the Korean Peninsula.
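The first step of a McNamara-and-Buland-style analysis is a power spectral density estimate for each data segment; a numpy-only averaged-periodogram sketch follows (segment length, windowing, and scaling here are illustrative assumptions, and the full method goes on to stack many such PSDs into per-station probability density functions):

```python
import numpy as np

def avg_psd_db(acc, fs, nperseg=1024):
    """Welch-like averaged periodogram over non-overlapping Hann
    segments, returned in dB relative to 1 (input unit)^2/Hz."""
    win = np.hanning(nperseg)
    norm = fs * (win ** 2).sum()
    segs = [acc[i:i + nperseg] for i in range(0, len(acc) - nperseg + 1, nperseg)]
    pxx = np.mean([np.abs(np.fft.rfft(s * win)) ** 2 / norm for s in segs], axis=0)
    freqs = np.fft.rfftfreq(nperseg, d=1.0 / fs)
    return freqs[1:], 10.0 * np.log10(pxx[1:])  # drop the DC bin

# Synthetic "quiet station": white noise at the micro-g level, 100 Hz.
fs = 100.0
acc = np.random.default_rng(0).normal(scale=1e-6, size=8192)
freqs, psd_db = avg_psd_db(acc, fs)
print(psd_db.shape == freqs.shape)
```

Comparing such PSD curves against the Peterson (1993) new low/high noise models is how a station-specific background model like KMA's is judged.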

  19. The marriage of conventional cancer treatments and alternative cancer therapies.

    Science.gov (United States)

    Decker, Georgia M

    2008-06-01

    The terms "alternative" or "unconventional" have been used to describe any therapy used instead of conventional approaches. Conventional approaches, known as "standard" or "traditional" or "biomedical" approaches, have had broad application in Western medicine. Complementary and alternative medicine has been referred to as "integrative," "integrated," or "complementary" when therapies are combined with conventional approaches, such as those for cancer.

  20. Design and development of digital seismic amplifier recorder

    Energy Technology Data Exchange (ETDEWEB)

Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan, E-mail: gunawanhandayani@gmail.com [Department of Physics, ITB (Indonesia)]

    2015-04-16

Digital seismic recording is a technique for recording seismic data in digital systems. This method is convenient because it is more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors/geophones with a natural frequency of 4.5 Hz. The signal is amplified by means of 12 non-inverting amplifier units, each using a 741 op-amp with resistor values of 1 kΩ and 1 MΩ. The resulting amplification was roughly 1,000 times. The amplified signal was converted to digital using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was used to control the ADC. The results of qualitative analysis showed that the seismic signal conditioning can produce a large output, so that the data obtained are better than conventional data. This application can be used for geophysical methods that have low input voltage, such as microtremor applications.

  1. Design and development of digital seismic amplifier recorder

    Science.gov (United States)

    Samsidar, Siti Alaa; Afuar, Waldy; Handayani, Gunawan

    2015-04-01

Digital seismic recording is a technique for recording seismic data in digital systems. This method is convenient because it is more accurate than other seismic recording methods. To improve the quality of seismic measurements, the signal needs to be amplified to obtain better subsurface images. The purpose of this study is to improve measurement accuracy by amplifying the input signal. We use seismic sensors/geophones with a natural frequency of 4.5 Hz. The signal is amplified by means of 12 non-inverting amplifier units, each using a 741 op-amp with resistor values of 1 kΩ and 1 MΩ. The resulting amplification was roughly 1,000 times. The amplified signal was converted to digital using an analog-to-digital converter (ADC). Quantitative analysis in this study was performed using LabVIEW 8.6, which was used to control the ADC. The results of qualitative analysis showed that the seismic signal conditioning can produce a large output, so that the data obtained are better than conventional data. This application can be used for geophysical methods that have low input voltage, such as microtremor applications.
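The stated component values can be checked against the generic non-inverting op-amp gain relation G = 1 + Rf/Ri (a textbook formula, not code from the paper):

```python
# Non-inverting amplifier stage as described in the abstract:
# feedback resistor Rf = 1 MOhm, input resistor Ri = 1 kOhm.
Rf, Ri = 1_000_000.0, 1_000.0
gain = 1.0 + Rf / Ri
print(gain)  # -> 1001.0, consistent with the stated ~1,000x amplification
```

The extra "+1" term is why the result is 1001 rather than exactly 1000; for a 4.5 Hz geophone signal the difference is negligible.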

  2. A robust polynomial principal component analysis for seismic noise attenuation

    Science.gov (United States)

    Wang, Yuchen; Lu, Wenkai; Wang, Benfeng; Liu, Lei

    2016-12-01

    Random and coherent noise attenuation is a significant aspect of seismic data processing, especially for pre-stack seismic data flattened by normal moveout correction or migration. Signal extraction is widely used for pre-stack seismic noise attenuation. Principal component analysis (PCA), one of the multi-channel filters, is a common tool for extracting seismic signals and can be realized by singular value decomposition (SVD). However, when the traditional PCA filter is applied to seismic signal extraction, the result is unsatisfactory, with artifacts, when the seismic data are contaminated by random and coherent noise. In order to extract the desired signal directly and remove those artifacts at the same time, we take the amplitude variation with offset (AVO) property into consideration and propose a robust polynomial PCA algorithm, in which a polynomial constraint is used to optimize the coefficient matrix. To simplify this complicated problem, a series of sub-optimal problems is designed and solved iteratively, after which random and coherent noise can be attenuated simultaneously and effectively. Applications to synthetic and real data sets show that the proposed algorithm suppresses random and coherent noise better, and better preserves the desired signals, than local polynomial fitting, conventional PCA and an L1-norm-based PCA method.
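
The conventional SVD-based PCA filter that the authors take as a baseline can be sketched as follows; this is a generic illustration (not the proposed polynomial algorithm), and the synthetic gather and rank choice are assumptions:

```python
import numpy as np

def pca_signal_extraction(gather, rank):
    """Conventional PCA (SVD) filter: keep the `rank` largest singular values
    of a flattened pre-stack gather (traces x samples), so laterally coherent
    energy is retained while random noise is attenuated."""
    u, s, vt = np.linalg.svd(gather, full_matrices=False)
    s_filt = np.zeros_like(s)
    s_filt[:rank] = s[:rank]
    return (u * s_filt) @ vt

# Synthetic example: 20 identical (flattened) traces plus random noise.
rng = np.random.default_rng(0)
event = np.sin(2 * np.pi * 25 * np.linspace(0.0, 0.2, 101))  # 25 Hz wavelet
clean = np.tile(event, (20, 1))
gather = clean + 0.3 * rng.standard_normal((20, 101))
filtered = pca_signal_extraction(gather, rank=1)

noise_before = np.linalg.norm(gather - clean)
noise_after = np.linalg.norm(filtered - clean)
print(noise_after < noise_before)  # rank-1 reconstruction suppresses the noise
```

On data with AVO variation or coherent noise, this plain rank truncation is exactly where the artifacts discussed in the abstract arise, which motivates the polynomial constraint.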

  3. High Voltage Seismic Generator

    Science.gov (United States)

    Bogacz, Adrian; Pala, Damian; Knafel, Marcin

    2015-04-01

    This contribution describes the preliminary result of a year of cooperation among three student research groups from AGH UST in Krakow, Poland, whose aim was to develop and construct a high-voltage seismic wave generator. The constructed device uses a high-energy electrical discharge to generate a seismic wave in the ground. This type of device can be applied in several different seismic measurement methods, but because of its limited power it is mainly dedicated to engineering geophysics. The source operates on basic physical principles: energy is stored in a capacitor bank, which is charged by a two-stage low-to-high-voltage converter, and the stored energy is then released in a very short time through a high-voltage thyristor into a spark gap. The whole appliance is powered from a Li-ion battery and controlled by an ATmega microcontroller. It is possible to construct a larger and more powerful device. In this contribution the structure of the device, with technical specifications, is presented. As part of the investigation a prototype was built and a series of experiments conducted. System parameters were measured, and on this basis the specifications of the elements for the final device were chosen. The first stage of the project was successful: it was possible to generate seismic waves efficiently with the constructed device. A field test was then conducted. The spark gap was placed in a shallow borehole (0.5 m) filled with salt water, and geophones were placed on the ground in a straight line. The signal registered with a hammer source was compared with that from the sparker source; the results of the test measurements are presented and discussed. Analysis of the collected data shows that the characteristics of the generated seismic signal are very promising, confirming the possibility of practical application of the new high-voltage generator. The biggest advantage of the presented device, after its signal characteristics, is its size, 0.5 x 0.25 x 0.2 m, and weight of approximately 7 kg. These features, together with the small Li-ion battery, make the unit easily portable.
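
The energy available per shot from such a source is governed by the capacitor-bank relation E = ½CV². A short sketch; the capacitance and charging voltage below are hypothetical, since the abstract does not state them:

```python
# Energy stored in a capacitor bank: E = 0.5 * C * V^2.
def stored_energy_joules(capacitance_farads, voltage_volts):
    """Energy (J) stored in a capacitor bank charged to a given voltage."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# Hypothetical example values: a 100 uF bank charged to 4 kV.
print(stored_energy_joules(100e-6, 4000.0))  # 800.0 J released into the spark gap
```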

  4. Calibration of Seismic Attributes for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Wayne D. Pennington

    2002-09-29

    The project, "Calibration of Seismic Attributes for Reservoir Characterization," is now complete. Our original proposed scope of work included detailed analysis of seismic and other data from two to three hydrocarbon fields; we have analyzed data from four fields at this level of detail, two additional fields in less detail, and one other 2D seismic line used for experimentation. We also included time-lapse seismic data with ocean-bottom cable recordings in addition to the originally proposed static field data. A large number of publications and presentations have resulted from this work, including several that are in the final stages of preparation or printing; one of these is a chapter on "Reservoir Geophysics" for the new Petroleum Engineering Handbook from the Society of Petroleum Engineers. Major results from this project include a new approach to evaluating seismic attributes in time-lapse monitoring studies, an evaluation of pitfalls in the use of point-based measurements and facies classifications, novel applications of inversion results, improved methods of tying seismic data to the wellbore, and a comparison of methods used to detect pressure compartments. Some of the data sets used are in the public domain, allowing other investigators to test our techniques or to improve upon them using the same data. From the public-domain Stratton data set we have demonstrated that apparent correlations between attributes derived along 'phantom' horizons are artifacts of isopach changes; only if the interpreter understands that the interpretation is based on this correlation with bed thickening or thinning can reliable interpretations of channel horizons and facies be made. From the public-domain Boonsville data set we developed techniques to use conventional seismic attributes, including seismic facies generated under various neural network procedures, to subdivide regional facies determined from logs into productive and non-productive subfacies, and we

  5. CALIBRATION OF SEISMIC ATTRIBUTES FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Wayne D. Pennington; Horacio Acevedo; Aaron Green; Joshua Haataja; Shawn Len; Anastasia Minaeva; Deyi Xie

    2002-10-01

    The project, "Calibration of Seismic Attributes for Reservoir Characterization," is now complete. Our original proposed scope of work included detailed analysis of seismic and other data from two to three hydrocarbon fields; we have analyzed data from four fields at this level of detail, two additional fields in less detail, and one other 2D seismic line used for experimentation. We also included time-lapse seismic data with ocean-bottom cable recordings in addition to the originally proposed static field data. A large number of publications and presentations have resulted from this work, including several that are in the final stages of preparation or printing; one of these is a chapter on "Reservoir Geophysics" for the new Petroleum Engineering Handbook from the Society of Petroleum Engineers. Major results from this project include a new approach to evaluating seismic attributes in time-lapse monitoring studies, an evaluation of pitfalls in the use of point-based measurements and facies classifications, novel applications of inversion results, improved methods of tying seismic data to the wellbore, and a comparison of methods used to detect pressure compartments. Some of the data sets used are in the public domain, allowing other investigators to test our techniques or to improve upon them using the same data. From the public-domain Stratton data set we have demonstrated that apparent correlations between attributes derived along "phantom" horizons are artifacts of isopach changes; only if the interpreter understands that the interpretation is based on this correlation with bed thickening or thinning can reliable interpretations of channel horizons and facies be made. From the public-domain Boonsville data set we developed techniques to use conventional seismic attributes, including seismic facies generated under various neural network procedures, to subdivide regional facies determined from logs into

  6. Standardized implementation and evaluation of conventional open radical resection for distal gastric cancer

    Institute of Scientific and Technical Information of China (English)

    戴冬秋; 张春东

    2014-01-01

    The incidence of gastric cancer in China is high, and most patients are diagnosed at an advanced stage. Surgery is one of the key therapies for patients with advanced gastric cancer, and gastrectomy with D2 lymphadenectomy is the standard operation. However, many problems remain in its standardized implementation. Surgeons should master the key points of standardized implementation: the significance of preoperative staging; the preoperative and, in particular, intraoperative determination of surgical indications; the standardized sequence of the operation; the method of lymph node dissection; the determination of the extent of gastrectomy; tumor-free technique; and the treatment and prevention of intraoperative injury and serious postoperative complications.

  7. Imaging fault zones using 3D seismic image processing techniques

    Science.gov (United States)

    Iacopini, David; Butler, Rob; Purves, Steve

    2013-04-01

    Significant advances in the structural analysis of deep-water structures, salt tectonics and extensional rift basins come from descriptions of fault-system geometries imaged in 3D seismic data. However, even where seismic data are excellent, in most cases the trajectory of thrust faults is highly conjectural, and significant uncertainty remains about the patterns of deformation that develop between the main fault segments, and even about the fault architectures themselves. Moreover, structural interpretations that conventionally define faults by breaks and apparent offsets of seismic reflectors are commonly conditioned by a narrow range of theoretical models of fault behavior. For example, almost all interpretations of thrust geometries on seismic data rely on theoretical "end-member" behaviors in which concepts such as strain localization or multilayer mechanics are simply avoided. Yet analogue outcrop studies confirm that such descriptions are commonly unsatisfactory and incomplete. In order to fill these gaps and improve the 3D visualization of deformation in the subsurface, seismic attribute methods are developed here in conjunction with conventional mapping of reflector amplitudes (Marfurt & Chopra, 2007). These recently developed signal processing techniques, applied especially by the oil industry, use variations in the amplitude and phase of the seismic wavelet. The attributes improve signal interpretation and are calculated over the entire 3D seismic dataset. In this contribution we show 3D seismic examples of fault structures from gravity-driven deep-water thrust structures and extensional basin systems to demonstrate how 3D seismic image processing methods can not only build better geometrical interpretations of the faults but also begin to map both strain and damage through the amplitude/phase properties of the seismic signal. This is done by quantifying and delineating short-range anomalies in the intensity of reflector amplitudes
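
One common family of such amplitude/phase attributes is the complex-trace (Hilbert) attributes; a minimal numpy-only sketch, offered as a generic illustration rather than the authors' exact workflow:

```python
import numpy as np

def analytic_signal(trace):
    """FFT-based analytic signal (the numpy equivalent of scipy.signal.hilbert):
    zero the negative frequencies and double the positive ones."""
    n = len(trace)
    spec = np.fft.fft(trace)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def instantaneous_attributes(trace):
    """Envelope (reflection strength) and instantaneous phase of a trace."""
    analytic = analytic_signal(trace)
    return np.abs(analytic), np.unwrap(np.angle(analytic))

# A 30 Hz cosine in a Gaussian envelope: the computed envelope should recover
# the Gaussian, bounding the oscillating trace from above.
t = np.linspace(0.0, 1.0, 501)
trace = np.exp(-((t - 0.5) ** 2) / 0.005) * np.cos(2 * np.pi * 30 * t)
env, phase = instantaneous_attributes(trace)
print(np.all(env + 1e-9 >= np.abs(trace)))  # True: envelope bounds the trace
```

Mapping such attributes across a 3D volume is what lets amplitude/phase anomalies delineate faults that offsets alone miss.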

  8. Reduction of seismic risk for immovable cultural property

    Directory of Open Access Journals (Sweden)

    Kuzović Duško

    2015-01-01

    Full Text Available The existing legislation for determining seismic design parameters in Serbia is defined by the "Code on technical norms for construction of buildings in seismic areas" ("Sl. list SFRJ" no. 31/81, with amendments in "Sl. list SFRJ" no. 49/82, 29/83, 21/88 and 52/90), as well as the Code on technical standards for remediation, strengthening and reconstruction of building structures damaged in earthquakes and for reconstruction and revitalization of building structures ("Sl. list SFRJ" no. 52/85). These norms address seismic risk prevention for newly constructed buildings or their revitalization, and all of them follow the no-collapse requirement. Within them, all structures are grouped into categories that define the seismic risk allowed over their service life. Given the uniqueness of cultural property and the irreparable loss its destruction would cause, all necessary actions must be taken to protect it in the event of an earthquake. New provisions in seismic construction regulations should be linked with the Law on Cultural Property ("Official Gazette of the Republic of Serbia" no. 71/94). These legislative changes would create an obligation to prevent the seismic hazards to which historical buildings are exposed, through standardized legal studies and interventions on buildings.

  9. Calibration of Seismic Attributes for Reservoir Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Pennington, Wayne D.; Acevedo, Horacio; Green, Aaron; Len, Shawn; Minavea, Anastasia; Wood, James; Xie, Deyi

    2002-01-29

    This project has completed the initially scheduled third year of the contract and is beginning a fourth year, designed to expand upon the technology transfer aspects of the project. From the Stratton data set, we demonstrated that apparent correlations between attributes derived along 'phantom' horizons are artifacts of isopach changes; only if the interpreter understands that the interpretation is based on this correlation with bed thickening or thinning can reliable interpretations of channel horizons and facies be made. From the Boonsville data set, we developed techniques to use conventional seismic attributes, including seismic facies generated under various neural network procedures, to subdivide regional facies determined from logs into productive and non-productive subfacies, and we developed a method involving cross-correlation of seismic waveforms to provide a reliable map of the various facies present in the area. The Teal South data set provided a surprising set of data, leading us to develop a pressure-dependent velocity relationship and to conclude that nearby reservoirs are undergoing a pressure drop in response to the production of the main reservoir, implying that oil is being lost through their spill points, never to be produced. The Wamsutter data set led to the use of unconventional attributes, including lateral incoherence and horizon-dependent impedance variations, to indicate regions of former sand bars and current high pressure, respectively, and to the evaluation of various upscaling routines.

  10. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of natural phenomena hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the facility. DOE has developed Standards for site characterization and hazard assessments to ensure that probabilistic seismic hazard is used consistently at each DOE site. The criteria included in the DOE Standards are described and compared to those being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop these results, important application issues are summarized, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  11. Understanding seismic design criteria for Japanese Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, Y.J.; Hofmayer, C.H. [Brookhaven National Lab., Upton, NY (United States); Costello, J.F. [Nuclear Regulatory Commission, Washington, DC (United States)

    1995-04-01

    This paper summarizes the results of recent survey studies on seismic design practice for nuclear power plants in Japan. The seismic design codes and standards for both nuclear and non-nuclear structures have been reviewed and summarized, and some key documents for understanding Japanese seismic design criteria are listed with brief descriptions. The paper highlights the design criteria used to determine seismic demand and component capacity in comparison with U.S. criteria, the background studies that led to the current Japanese design criteria, and a survey of current research activities. More detailed technical descriptions are presented of the development of the Japanese shear-wall equations, design requirements for containment structures, and ductility requirements.

  12. Evidence Standards and Litigation

    DEFF Research Database (Denmark)

    Guerra, Alice; Luppi, Barbara; Parisi, Francesco

    aspect of the legal system: the evidence standard. We recast the conventional rent-seeking model to consider how alternative evidence standards affect litigation choices. We analyze the interrelation between different evidence standards, the effectiveness of the parties’ efforts, and the merits...

  13. The reliability of radon as seismic precursor

    Science.gov (United States)

    Emilian Toader, Victorin; Moldovan, Iren Adelina; Ionescu, Constantin; Marmureanu, Alexandru

    2016-04-01

    Our multidisciplinary network (AeroSolSys), located in Vrancea (the Curvature Carpathian Mountains), includes radon concentration monitoring at five stations. We focus on phenomena of the lithosphere and the near-surface lower atmosphere, using real-time information about seismicity, +/- ions, clouds, solar radiation, temperature (air, ground), humidity, atmospheric pressure, wind speed and direction, telluric currents, variations of the local magnetic field, infrasound, variations of the atmospheric electrostatic field, crustal deformation measured with inclinometers, electromagnetic activity, CO2 concentration, ULF radio wave propagation, seismo-acoustic emission, and animal behavior. The main purpose is to inform the authorities about risk situations and to update hazard scenarios. Radon concentration is monitored continuously, with a 1-hour or 3-hour sample rate, at locations near faults in an active seismic zone characterized by intermediate-depth earthquakes. Trigger algorithms include standard deviation, mean and derivative methods. We correlate the radon concentration measurements with humidity, temperature and atmospheric pressure from the same equipment; at a few stations we also have meteorological information. Sometimes the radon concentration shows very large variations (a maximum of 4535 Bq/m3 from a background of 106 Bq/m3) over a short time (1-2 days) without being accompanied by an important earthquake; generally the cause is high humidity, which could be generated by tectonic stress. Correlation with seismicity required, in our case, at least 6 months of data. Over 10605 hours, with 618 earthquakes of maximum magnitude 4.9, we obtained at one station a radon average of 38 Bq/m3 and an exposure of 408111 Bqh/m3. In two cases we found correlation between seismicity and radon concentration; in another we recorded large variations because the station was located in an area with multiple faults and a river. Radon can be a seismic precursor, but only within a multidisciplinary network. The anomalies for short or long period of
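
The standard-deviation trigger mentioned among the detection methods can be sketched as a rolling-window test; the window length and threshold k below are illustrative, not the authors' operational settings:

```python
import statistics

def anomaly_trigger(series, window, k=3.0):
    """Standard-deviation trigger: flag each sample that deviates from the
    trailing-window mean by more than k standard deviations."""
    flags = []
    for i in range(window, len(series)):
        past = series[i - window:i]
        mu = statistics.fmean(past)
        sigma = statistics.pstdev(past)
        flags.append(sigma > 0 and abs(series[i] - mu) > k * sigma)
    return flags

# Hourly radon readings around the ~38 Bq/m3 background reported in the
# abstract, with one injected spike of the kind described (4535 Bq/m3).
readings = [36, 39, 37, 40, 38, 37, 39, 38, 4535, 38]
print(anomaly_trigger(readings, window=8))  # [True, False]
```

Note the second flag is False: once the spike enters the window it inflates the trailing standard deviation, which is why the abstract stresses corroborating radon triggers with humidity, pressure and seismicity data.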

  14. Seismic hazard assessment in Aswan, Egypt

    Science.gov (United States)

    Deif, A.; Hamed, H.; Ibrahim, H. A.; Abou Elenean, K.; El-Amin, E.

    2011-12-01

    The study of earthquake activity and seismic hazard assessment around Aswan is very important due to the proximity of the Aswan High Dam. The Aswan High Dam is based on hard Precambrian bedrock and is considered to be the most important project in Egypt from the social, agricultural and electrical energy production points of view. The seismotectonic settings around Aswan strongly suggest that medium to large earthquakes are possible, particularly along the Kalabsha, Seiyal and Khor El-Ramla faults. The seismic hazard for Aswan is calculated utilizing the probabilistic approach within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for eight ground motion spectral periods and for a return period of 475 years, which is deemed appropriate for structural design standards in the Egyptian building codes. The results were also displayed in terms of uniform hazard spectra for rock sites at the Aswan High Dam for return periods of 475 and 2475 years. In addition, the ground-motion levels are also deaggregated at the dam site, in order to provide insight into which events are the most important for hazard estimation. The peak ground acceleration ranges between 36 and 152 cm s-2 for return periods of 475 years (equivalent to 90% probability of non-exceedance in 50 years). Spectral hazard values clearly indicate that compared with countries of high seismic risk, the seismicity in the Aswan region can be described as low at most sites to moderate in the area between the Kalabsha and Seyial faults.
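
The stated equivalence between a 475-year return period and 90% non-exceedance (10% exceedance) in 50 years follows from the Poisson occurrence model used in probabilistic seismic hazard analysis, P = 1 - exp(-T_exp/T_ret); a quick check:

```python
import math

def exceedance_probability(return_period_years, exposure_years=50.0):
    """Probability of at least one exceedance during the exposure time,
    under the Poisson assumption: P = 1 - exp(-T_exp / T_ret)."""
    return 1.0 - math.exp(-exposure_years / return_period_years)

print(round(exceedance_probability(475), 3))   # 0.1  -> 10% in 50 yr (90% non-exceedance)
print(round(exceedance_probability(2475), 3))  # 0.02 -> 2% in 50 yr
```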

  15. Tests and prospects of new physics at very high energy. Beyond the standard basic principles, and beyond conventional matter and space-time. On the possible origin of Quantum Mechanics.

    Directory of Open Access Journals (Sweden)

    Gonzalez-Mestres Luis

    2015-01-01

    Full Text Available Recent results and announcements by Planck and BICEP2 have led to important controversies in the fields of Cosmology and Particle Physics. As new ideas and alternative approaches can since then more easily emerge, the link between the Mathematical Physics aspects of theories and the interpretation of experimental results becomes more direct. This evolution is also relevant for Particle Physics experiments at very high energy, where the interpretation of data on the highest-energy cosmic rays remains a major theoretical and phenomenological challenge. Alternative particle physics and cosmology can raise fundamental questions such as that of the structure of vacuum and space-time. In particular, the simplified description of the physical vacuum contained in standard quantum field theory does not necessarily correspond to reality at a deeper level, and similarly for the relativistic space-time based on four real variables. In a more general approach, the definition itself of vacuum can be a difficult task. The spinorial space-time (SST) we suggested in 1996-97 automatically incorporates a local privileged space direction (PSD) for each comoving observer, possibly leading to a locally anisotropic vacuum structure. As the existence of the PSD may have been confirmed by Planck, and a possible discovery of primordial B-modes in the polarization of the cosmic microwave background radiation (CMB) may turn out to contain new evidence for the SST, we explore other possible implications of this approach to space-time. The SST structure can naturally be at the origin of Quantum Mechanics at distance scales larger than the fundamental one if standard particles are dealt with as vacuum excitations. We also discuss possible implications of our lack of knowledge of the structure of vacuum, as well as related theoretical, phenomenological and cosmological uncertainties. Pre-Big Bang scenarios and new ultimate constituents of matter (including superbradyons are

  17. Conventional cerebrospinal fluid scanning

    Energy Technology Data Exchange (ETDEWEB)

    Schicha, H.

    1985-06-01

    Conventional cerebrospinal fluid scanning (CSF scanning) today is mainly carried out in addition to computerized tomography to obtain information about liquor flow kinetics. Especially in patients with communicating obstructive hydrocephalus, CSF scanning is clinically useful for the decision for shunt surgery. In patients with intracranial cysts, CSF scanning can provide information about liquor circulation. Further indications for CSF scanning include the assessment of shunt patency especially in children, as well as the detection and localization of cerebrospinal fluid leaks.

  18. Next Generation Seismic Imaging; High Fidelity Algorithms and High-End Computing

    Science.gov (United States)

    Bevc, D.; Ortigosa, F.; Guitton, A.; Kaelin, B.

    2007-05-01

    uniquely powerful computing power of the MareNostrum supercomputer in Barcelona to realize the promise of RTM, incorporate it into daily processing flows, and to help solve exploration problems in a highly cost-effective way. Uniquely, the Kaleidoscope Project is simultaneously integrating software (algorithms) and hardware (Cell BE), steps that are traditionally taken sequentially. This unique integration of software and hardware will accelerate seismic imaging by several orders of magnitude compared to conventional solutions running on standard Linux Clusters.

  19. REGULATION OF SEISMIC LOAD ON BUILDINGS SEISMIC DEVICES

    Directory of Open Access Journals (Sweden)

    Kh. N. Mazhiev

    2013-01-01

    Full Text Available This paper addresses the regulation of seismic loads on structures using kinematic supports made of high-strength concrete with impregnated coarse aggregate, together with Belleville seismic isolation bearings. Results of experimental studies related to obtaining the new coarse aggregate and constructing the seismic isolation bearings are presented, and the interaction of forces in the hemispherical supports during vibration is addressed.

  20. United States Pharmacopeial Convention

    Science.gov (United States)

  1. Stutter seismic source

    Energy Technology Data Exchange (ETDEWEB)

    Gumma, W. H.; Hughes, D. R.; Zimmerman, N. S.

    1980-08-12

    An improved seismic prospecting system comprising the use of a closely spaced sequence of source initiations at essentially the same location to provide shorter objective-level wavelets than are obtainable with a single pulse. In a preferred form, three dynamite charges are detonated in the same or three closely spaced shot holes to generate a downward traveling wavelet having increased high frequency content and reduced content at a peak frequency determined by initial testing.

  2. Principle and Program of Evaluating Diffuse Seismicity

    Institute of Scientific and Technical Information of China (English)

    Chang Xiangdong

    2001-01-01

    The concept and origin of the term "diffuse seismicity" are explained. Different viewpoints regarding diffuse seismicity, and the ways it influences the determination of the seismic design basis of engineering projects, are analyzed. Principles and a program for evaluating diffuse seismicity are studied and discussed on the basis of the above.

  3. Seismic safety in nuclear-waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Carpenter, D.W.; Towse, D.

    1979-04-26

    Seismic safety is one of the factors that must be considered in the disposal of nuclear waste in deep geologic media. This report reviews the data on damage to underground equipment and structures from earthquakes, the record of associated motions, and the conventional methods of seismic safety-analysis and engineering. Safety considerations may be divided into two classes: those during the operational life of a disposal facility, and those pertinent to the post-decommissioning life of the facility. Operational hazards may be mitigated by conventional construction practices and site selection criteria. Events that would materially affect the long-term integrity of a decommissioned facility appear to be highly unlikely and can be substantially avoided by conservative site selection and facility design. These events include substantial fault movement within the disposal facility and severe ground shaking in an earthquake epicentral region. Techniques need to be developed to address the question of long-term earthquake probability in relatively aseismic regions, and for discriminating between active and extinct faults in regions where earthquake activity does not result in surface ruptures.

  4. Double-difference adjoint seismic tomography

    Science.gov (United States)

    Yuan, Yanhua O.; Simons, Frederik J.; Tromp, Jeroen

    2016-09-01

    We introduce a `double-difference' method for the inversion for seismic wave speed structure based on adjoint tomography. Differences between seismic observations and model predictions at individual stations may arise from factors other than structural heterogeneity, such as errors in the assumed source-time function, inaccurate timings and systematic uncertainties. To alleviate the corresponding non-uniqueness in the inverse problem, we construct differential measurements between stations, thereby reducing the influence of the source signature and systematic errors. We minimize the discrepancy between observations and simulations in terms of the differential measurements made on station pairs. We show how to implement the double-difference concept in adjoint tomography, both theoretically and practically. We compare the sensitivities of absolute and differential measurements. The former provide absolute information on structure along the ray paths between stations and sources, whereas the latter explain relative (and thus higher resolution) structural variations in areas close to the stations. Whereas in conventional tomography a measurement made on a single earthquake-station pair provides very limited structural information, in double-difference tomography one earthquake can actually resolve significant details of the structure. The double-difference methodology can be incorporated into the usual adjoint tomography workflow by simply pairing up all conventional measurements; the computational cost of the necessary adjoint simulations is largely unaffected. Rather than adding to the computational burden, the inversion of double-difference measurements merely modifies the construction of the adjoint sources for data assimilation.
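
The cancellation of common errors by differential measurements can be sketched for traveltime residuals; this is a simplified illustration of the idea, with the adjoint-source machinery omitted and the traveltimes invented for the example:

```python
import numpy as np

def double_difference_residuals(t_obs, t_syn):
    """For every station pair (i, j), form the differential residual
        dd_ij = (t_obs_i - t_obs_j) - (t_syn_i - t_syn_j),
    which cancels any shift common to all stations (e.g. an origin-time or
    source-signature error)."""
    res = np.asarray(t_obs) - np.asarray(t_syn)
    return res[:, None] - res[None, :]

# A constant 0.3 s source-time error contaminates every absolute residual
# but vanishes from all double differences.
t_syn = np.array([4.0, 5.2, 6.1])   # simulated arrival times at 3 stations
t_obs = t_syn + 0.3                 # observed times with a common shift
dd = double_difference_residuals(t_obs, t_syn)
print(np.allclose(dd, 0.0))  # True: the systematic error is removed
```

What survives in `dd` when the shift is not common to all stations is exactly the relative structural information the method inverts for.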

  5. Seismic hazard from induced seismicity: effect of time-dependent hazard variables

    Science.gov (United States)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2012-12-01

    Geothermal systems are drawing large attention worldwide as an alternative source of energy. Although geothermal energy is beneficial, field operations can produce induced seismicity whose effects range from light, unfelt shaking to severe damage. In a recent paper by Convertito et al. (2012), we investigated the effect of time-dependent seismicity parameters on seismic hazard from induced seismicity. The analysis considered the time-variation of the b-value of the Gutenberg-Richter relationship and of the seismicity rate, and assumed a non-homogeneous Poisson model to solve the hazard integral. The procedure was tested in The Geysers geothermal area in Northern California, where commercial exploitation started in the 1960s. The analyzed dataset consists of earthquakes recorded during the period 2007 through 2010 by the LBNL Geysers/Calpine network. To test the reliability of the analysis, we applied a simple forecasting procedure which compares the estimated hazard values, expressed as ground-motion values with a fixed probability of exceedance, with the observed ground-motion values. The procedure is feasible for monitoring purposes and for calibrating the production/extraction rate to avoid adverse consequences. However, one of our main assumptions is that both the median predictions and the standard deviation of the ground-motion prediction equation (GMPE) are stationary. Particularly for geothermal areas, where the number of recorded earthquakes can change rapidly with time, we want to investigate how variations of the GMPE coefficients and of its standard deviation influence the hazard estimates. Basically, we hypothesize that the physical-mechanical properties of a highly fractured medium that is continuously perturbed by field operations can produce variations of both source and medium properties that cannot be captured by a stationary GMPE. 
We assume a standard GMPE which accounts for the main effects which modify the scaling
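
    The non-homogeneous Poisson exceedance probability described above can be sketched as follows (all parameter values are invented for illustration and are not those of The Geysers analysis):

```python
import numpy as np

# Time-dependent Gutenberg-Richter rate: rate(m, t) = 10**(a(t) - b(t)*m)
# events per day with magnitude >= m, with hypothetical seasonal variations.
t = np.linspace(0.0, 365.0, 3651)                  # one year (days)
a_t = 2.0 + 0.3 * np.sin(2.0 * np.pi * t / 365.0)  # hypothetical productivity variation
b_t = 1.0 - 0.1 * np.sin(2.0 * np.pi * t / 365.0)  # hypothetical b-value variation
m = 3.0
rate = 10.0 ** (a_t - b_t * m)                     # instantaneous rate of M >= 3 (per day)

# Non-homogeneous Poisson: P(at least one M >= m event in [0, T])
#   = 1 - exp(-integral of rate(t) over [0, T])   (trapezoidal quadrature)
expected_count = float(np.sum(0.5 * (rate[:-1] + rate[1:]) * np.diff(t)))
p_exceed = 1.0 - np.exp(-expected_count)
print(expected_count, p_exceed)
```

Replacing the constant rate of the homogeneous model with this time integral is exactly what allows the hazard to track changes in field operations.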

  6. Establishing seismic design criteria to achieve an acceptable seismic margin

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, R.P. [RPK Structural Mechanics Consulting, Inc., Yorba Linda, CA (United States)

    1997-01-01

    In order to develop risk-based seismic design criteria, the following four issues must be addressed: (1) What target annual probability of seismically induced unacceptable performance is acceptable? (2) What minimum seismic margin is acceptable? (3) Given the decisions made under Issues 1 and 2, at what annual frequency of exceedance should the Safe Shutdown Earthquake ground motion be defined? (4) What seismic design criteria should be established to reasonably achieve the seismic margin defined under Issue 2? The first issue is purely a policy decision and is not addressed in this paper. Each of the other three issues is addressed. Issues 2 and 3 are integrally tied together, so that a very large number of possible combinations of responses to these two issues can be used to achieve the target goal defined under Issue 1. Section 2 lays out a combined approach to these two issues and presents three potentially attractive combined resolutions that reasonably achieve the target goal. The remainder of the paper discusses an approach that can be used to develop seismic design criteria aimed at achieving the desired seismic margin defined in the resolution of Issue 2. Suggestions for revising existing seismic design criteria to more consistently achieve the desired seismic margin are presented.
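
    The link between a hazard curve, a fragility curve, and the target annual probability of unacceptable performance (Issue 1) can be illustrated with a standard risk convolution; the hazard and fragility parameters below are assumed for illustration, not taken from the paper:

```python
import math

# Power-law hazard curve H(a) = K * a**(-k) (annual frequency of exceeding
# acceleration a, in g) convolved with a lognormal fragility curve.
K, k = 1e-4, 2.5          # assumed hazard-curve constants
am, beta = 0.9, 0.4       # assumed median capacity (g) and log std dev

def hazard(a):            # annual frequency of exceeding acceleration a
    return K * a ** (-k)

def fragility(a):         # conditional probability of failure given a
    z = math.log(a / am) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Numerical convolution: P_f = -integral of fragility(a) dH(a)
a_grid = [0.05 + 0.001 * i for i in range(3000)]
p_fail = 0.0
for a0, a1 in zip(a_grid[:-1], a_grid[1:]):
    p_fail += fragility(0.5 * (a0 + a1)) * (hazard(a0) - hazard(a1))
print(p_fail)  # annual probability of unacceptable performance
```

With these numbers the annual failure probability comes out near 2e-4; raising the margin (median capacity relative to the design motion) pushes this figure toward any chosen target.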

  7. Seismic basement in Poland

    Science.gov (United States)

    Grad, Marek; Polkowski, Marcin

    2016-06-01

    The area of contact between Precambrian and Phanerozoic Europe in Poland has a complicated structure of sedimentary cover and basement. The thinnest sedimentary cover, in the Mazury-Belarus anteclise, is only 0.3-1 km thick; it increases to 7-8 km along the East European Craton margin and to 9-12 km in the Trans-European Suture Zone (TESZ). The Variscan domain is characterized by a 1- to 2-km-thick sedimentary cover, while the Carpathians are characterized by very thick sediments, up to c. 20 km. The map of the basement depth is created by combining data from geological boreholes with a set of regional seismic refraction profiles. These data do not constrain the basement depth in the central part of the TESZ and in the Carpathians. Therefore, the data set is supplemented by 32 models from deep seismic sounding profiles and by a map of a high-resistivity (low-conductivity) layer from magnetotelluric soundings, identified as the basement. Together, these data constrain the basement depth and the P-wave velocities of the crystalline and consolidated basement for the whole area of Poland. Finally, the differentiation of basement depth and velocity is discussed with respect to geophysical fields and the tectonic division of the area.

  8. NSR&D Program Fiscal Year (FY) 2015 Call for Proposals Mitigation of Seismic Risk at Nuclear Facilities using Seismic Isolation

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-02-01

    Seismic isolation (SI) has the potential to drastically reduce the seismic response of structures, systems, or components (SSCs) and therefore the risk associated with large seismic events (a large seismic event could be defined as the design basis earthquake (DBE) and/or the beyond design basis earthquake (BDBE), depending on the site location). This corresponds to a potential increase in nuclear safety by minimizing the structural response and thus the risk of material release during large seismic events, whose magnitude and frequency carry uncertainty. The national consensus standard, American Society of Civil Engineers (ASCE) Standard 4, Seismic Analysis of Safety Related Nuclear Structures, recently incorporated language and commentary for seismically isolating a large light water reactor or similar large nuclear structure. Some potential benefits of SI are: 1) substantially decoupling the SSC from the earthquake hazard, thus decreasing the risk of material release during large earthquakes; 2) cost savings for the facility and/or equipment; and 3) applicability to both nuclear (current and next generation) and high-hazard non-nuclear facilities. Issue: To date, no one has evaluated how the benefit of seismic risk reduction reduces the cost to construct a nuclear facility. Objective: Use seismic probabilistic risk assessment (SPRA) to evaluate the reduction in seismic risk and estimate the potential cost savings of seismic isolation of a generic nuclear facility. This project would leverage ongoing Idaho National Laboratory (INL) activities that are developing advanced SPRA methods using Nonlinear Soil-Structure Interaction (NLSSI) analysis. Technical Approach: The proposed study is intended to obtain an estimate of the reduction in seismic risk and construction cost that might be achieved by seismically isolating a nuclear facility. The nuclear facility is a representative pressurized water reactor building nuclear power plant (NPP) structure

  9. Seismic hazard assessment of Iran

    Directory of Open Access Journals (Sweden)

    M. Ghafory-Ashtiany

    1999-06-01

    The development of the new seismic hazard map of Iran is based on probabilistic seismic hazard computation using historical earthquake data, geology, tectonics, fault activity, and seismic source models in Iran. These maps have been prepared to indicate the earthquake hazard of Iran in the form of iso-acceleration contour lines and seismic hazard zoning, using current probabilistic procedures. They display probabilistic estimates of Peak Ground Acceleration (PGA) for return periods of 75 and 475 years. The maps have been divided into intervals of 0.25 degrees in both latitudinal and longitudinal directions to calculate the peak ground acceleration values at each grid point and draw the seismic hazard curves. The results presented in this study will provide the basis for the preparation of seismic risk maps, the estimation of earthquake insurance premiums, and the preliminary site evaluation of critical facilities.
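
    The two return periods quoted above correspond, under the usual Poisson assumption, to familiar exceedance probabilities over a 50-year exposure; the conversion is standard and not specific to this study:

```python
import math

# Poisson link between a return period T and the probability of at least
# one exceedance during an exposure time t: P = 1 - exp(-t / T).
def prob_of_exceedance(return_period_yr, exposure_yr):
    return 1.0 - math.exp(-exposure_yr / return_period_yr)

# The 475-year map corresponds to ~10% in 50 years,
# and the 75-year map to roughly 50% in 50 years.
print(round(prob_of_exceedance(475.0, 50.0), 3))   # ≈ 0.1
print(round(prob_of_exceedance(75.0, 50.0), 3))    # ≈ 0.487
```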

  10. SEISMIC RANDOM VIBRATION ANALYSIS OF STOCHASTIC STRUCTURES USING RANDOM FACTOR METHOD

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Seismic random vibration analysis of stochastic truss structures is presented. A new method, called the random factor method, is used for dynamic analysis of structures with uncertain parameters arising from variability in their material properties and geometry. Using the random factor method, the natural frequencies and mode shapes of a stochastic structure can each be described by the product of two parts: a random factor collecting the structural parameters with uncertainty, and the deterministic value of the natural frequency or mode shape obtained by conventional finite element analysis. The stochastic truss structure is subjected to stationary or non-stationary random earthquake excitation. Computational expressions for the mean and standard deviation of the mean square displacement and mean square stress are developed by means of the random variable's functional moment method and the algebra synthesis method. An antenna and a truss bridge are used as practical engineering examples to illustrate the application of the random factor method to the seismic response analysis of random structures under stationary or non-stationary random earthquake excitation.
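
    The random factor decomposition can be illustrated with a Monte Carlo sketch (assumed frequency and coefficients of variation; the paper itself propagates moments algebraically rather than by simulation):

```python
import numpy as np

# Each uncertain parameter is written as its deterministic value times a
# dimensionless random factor with unit mean, so a natural frequency
# becomes omega = q * omega_det, with q collecting the parameter factors.
rng = np.random.default_rng(1)
omega_det = 12.0             # deterministic natural frequency (rad/s), assumed FE result
cov_E, cov_rho = 0.05, 0.03  # assumed coefficients of variation of stiffness and density

# omega ~ sqrt(E / rho): propagate the unit-mean factors by Monte Carlo
qE = rng.normal(1.0, cov_E, 100000)
qr = rng.normal(1.0, cov_rho, 100000)
omega = omega_det * np.sqrt(qE / qr)

print(omega.mean())   # stays close to the deterministic value
print(omega.std())    # spread induced by the parameter uncertainty
```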

  11. Seismic Hazard analysis of Adjaria Region in Georgia

    Science.gov (United States)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. Calculation of probabilistic seismic hazard was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines, and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes distant sources from the calculation) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
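
    The truncated exponential Gutenberg-Richter distribution used by CRISIS-type codes has a simple closed form; the parameter values below are assumed for illustration:

```python
import math

# Doubly truncated exponential Gutenberg-Richter CDF:
# F(m) = (1 - exp(-beta*(m - m0))) / (1 - exp(-beta*(mu - m0))), m0 <= m <= mu
b = 1.0                      # assumed b-value
beta = b * math.log(10.0)    # natural-log equivalent of b
m0, mu = 4.0, 7.5            # assumed lower and upper magnitude bounds

def cdf(m):
    num = 1.0 - math.exp(-beta * (m - m0))
    den = 1.0 - math.exp(-beta * (mu - m0))
    return num / den

print(cdf(m0), cdf(mu))    # 0 and 1 by construction
print(1.0 - cdf(5.0))      # fraction of events with M >= 5 (about 10% for b = 1)
```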

  12. Seismic Imager Space Telescope

    Science.gov (United States)

    Sidick, Erkin; Coste, Keith; Cunningham, J.; Sievers, Michael W.; Agnes, Gregory S.; Polanco, Otto R.; Green, Joseph J.; Cameron, Bruce A.; Redding, David C.; Avouac, Jean Philippe; Ampuero, Jean Paul; Leprince, Sebastien; Michel, Remi

    2012-01-01

    A concept has been developed for a geostationary seismic imager (GSI), a space telescope in geostationary orbit above the Pacific coast of the Americas that would provide movies of many large earthquakes occurring in the area from Southern Chile to Southern Alaska. The GSI movies would cover a field of view as long as 300 km, at a spatial resolution of 3 to 15 m and a temporal resolution of 1 to 2 Hz, which is sufficient for accurate measurement of surface displacements and photometric changes induced by seismic waves. Computer processing of the movie images would exploit these dynamic changes to accurately measure the rapidly evolving surface waves and surface ruptures as they happen. These measurements would provide key information to advance the understanding of the mechanisms governing earthquake ruptures, and the propagation and arrest of damaging seismic waves. GSI operational strategy is to react to earthquakes detected by ground seismometers, slewing the satellite to point at the epicenters of earthquakes above a certain magnitude. Some of these earthquakes will be foreshocks of larger earthquakes; these will be observed, as the spacecraft would have been pointed in the right direction. This strategy was tested against the historical record for the Pacific coast of the Americas, from 1973 until the present. Based on the seismicity recorded during this time period, a GSI mission with a lifetime of 10 years could have been in position to observe at least 13 (22 on average) earthquakes of magnitude larger than 6, and at least one (2 on average) earthquake of magnitude larger than 7. A GSI would provide data unprecedented in its extent and temporal and spatial resolution. It would provide this data for some of the world's most seismically active regions, and do so better and at a lower cost than could be done with ground-based instrumentation. A GSI would revolutionize the understanding of earthquake dynamics, perhaps leading ultimately to effective warning

  13. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    Science.gov (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
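
    One building block of such an analysis, estimating the b-value of the frequency-magnitude distribution from a catalog, can be sketched with Aki's maximum-likelihood estimator; the catalog here is synthetic, standing in for simulator output such as RSQSIM's, and all values are illustrative:

```python
import math
import random

# Draw a synthetic catalog from an exponential (Gutenberg-Richter) magnitude
# distribution above a completeness magnitude m_min, via inverse-CDF sampling.
random.seed(42)
b_true = 1.1
beta = b_true * math.log(10.0)
m_min = 2.0                      # extended magnitude range, down to ~M2
mags = [m_min - math.log(1.0 - random.random()) / beta for _ in range(20000)]

# Aki (1965) maximum-likelihood estimator: b = log10(e) / (mean(M) - m_min)
b_hat = math.log10(math.e) / (sum(mags) / len(mags) - m_min)
print(b_hat)   # close to b_true
```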

  14. Seismic capacity of a reinforced concrete frame structure without seismic detailing and limited ductility seismic design in moderate seismicity

    Energy Technology Data Exchange (ETDEWEB)

    Kim, J. K.; Kim, I. H. [Seoul National Univ., Seoul (Korea, Republic of)

    1999-10-01

    A four-story reinforced concrete frame building model is designed for gravity loads only. Static nonlinear pushover analyses are performed in two orthogonal horizontal directions. The overall capacity curves are converted into ADRS spectra and compared with demand spectra. At several points the deformed shape and the moment and shear distributions are calculated. Based on these results, a limited-ductility seismic design concept is proposed as an alternative seismic design approach in regions of moderate seismicity.
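
    The conversion of a pushover capacity curve into ADRS (spectral acceleration versus spectral displacement) coordinates follows standard equivalent-SDOF relations; the curve and modal properties below are invented for illustration, not taken from the paper:

```python
# Capacity curve (roof displacement d in m, base shear V in kN) converted to
# an ADRS capacity spectrum: Sd = d / (Gamma1 * phi_roof), Sa = V / (alpha1 * W).
W = 20000.0        # assumed seismic weight (kN)
alpha1 = 0.85      # assumed modal mass coefficient, mode 1
gamma1 = 1.3       # assumed modal participation factor, mode 1
phi_roof = 1.0     # mode shape value at roof (normalized)

capacity_curve = [(0.0, 0.0), (0.05, 2400.0), (0.15, 3400.0), (0.30, 3600.0)]

adrs = [(d / (gamma1 * phi_roof),   # spectral displacement Sd (m)
         v / (alpha1 * W))          # spectral acceleration Sa (in units of g)
        for d, v in capacity_curve]
for sd, sa in adrs:
    print(round(sd, 4), round(sa, 4))
```

Overlaying this capacity spectrum on a demand spectrum in the same coordinates gives the performance point used in the comparison described above.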

  15. Conventional and unconventional superconductivity

    Science.gov (United States)

    Fernandes, R. M.

    2012-02-01

    Superconductivity has been one of the most fruitful areas of research in condensed matter physics, bringing together researchers with distinct interests in a collaborative effort to understand it, from its microscopic basis to its potential for unprecedented technological applications. The concepts, techniques, and methods developed along its centennial history have gone beyond the realm of condensed matter physics and influenced the development of other fascinating areas, such as particle physics and atomic physics. These notes, based on a set of lectures given at the 2011 Advanced Summer School of Cinvestav, aim to motivate young undergraduate students to get involved in the exciting world of conventional and unconventional superconductors.

  16. Strategic interaction and conventions

    Directory of Open Access Journals (Sweden)

    Espinosa, María Paz

    2012-03-01

    The scope of this paper is to review the literature that employs coordination games to study social norms and conventions from the viewpoints of game theory and cognitive psychology. We claim that these two alternative approaches are in fact complementary, as they provide different insights to explain how people converge to a unique system of self-fulfilling expectations in the presence of multiple, equally viable conventions. While game theory explains the emergence of conventions by relying on efficiency and risk considerations, the psychological view is more concerned with frame and labeling effects. The interaction between these alternative (and sometimes competing) effects leads to the result that coordination failures may well occur and, even when coordination takes place, there is no guarantee that the convention eventually established will be the most efficient.


  17. Modelling of NW Himalayan Seismicity

    Science.gov (United States)

    Bansal, A. R.; Dimri, V. P.

    2014-12-01

    The northwest Himalaya is a seismically active region, owing to the collision of the Indian and Eurasian plates, and has experienced many large earthquakes in the past. A systematic analysis of seismicity is useful for seismic hazard estimation of the region. We analyzed the seismicity of the northwestern Himalaya since 1980. The magnitude of completeness of the catalogue was estimated using different methods, which yielded widely differing values; a reliable value of 3.0 was obtained after testing the distribution of magnitudes with time. The region is prone to large earthquakes, and many studies have shown that seismic activation or quiescence takes place before large earthquakes. We studied such behavior of seismicity using the Epidemic Type Aftershock Sequence (ETAS) model and found that a stationary ETAS model is more suitable for modelling the seismicity of this region. The earthquake catalogue was de-clustered using a stochastic approach to study the behavior of background and triggered seismicity. The triggered seismicity is found to have shallower depths compared to the background events.

  18. Flat lens for seismic waves

    CERN Document Server

    Brule, Stephane; Guenneau, Sebastien

    2016-01-01

    A prerequisite for achieving seismic invisibility is to demonstrate the ability of civil engineers to control seismic waves with artificially structured soils. We carry out large-scale field tests with a structured soil made of a grid consisting of cylindrical and vertical holes in the ground and a low frequency artificial source (< 10 Hz). This allows the identification of a distribution of energy inside the grid, which can be interpreted as the consequence of an effective negative refraction index. Such a flat lens reminiscent of what Veselago and Pendry envisioned for light opens avenues in seismic metamaterials to counteract the most devastating components of seismic signals.

  19. Neural networks in seismic discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Dowla, F.U.

    1995-01-01

    Neural networks are powerful and elegant computational tools that can be used in the analysis of geophysical signals. At Lawrence Livermore National Laboratory, we have developed neural networks to solve problems in seismic discrimination, event classification, and seismic and hydrodynamic yield estimation. Other researchers have used neural networks for seismic phase identification. We are currently developing neural networks to estimate depths of seismic events using regional seismograms. In this paper different types of network architecture and representation techniques are discussed. We address the important problem of designing neural networks with good generalization capabilities. Examples of neural networks for treaty verification applications are also described.

  20. Seismic hazard estimation of northern Iran using smoothed seismicity

    Science.gov (United States)

    Khoshnevis, Naeem; Taborda, Ricardo; Azizzadeh-Roodpish, Shima; Cramer, Chris H.

    2017-07-01

    This article presents a seismic hazard assessment for northern Iran, where a smoothed seismicity approach has been used in combination with an updated seismic catalog and a ground motion prediction equation recently found to yield good fit with data. We evaluate the hazard over a geographical area including the seismic zones of Azerbaijan, the Alborz Mountain Range, and Kopeh-Dagh, as well as parts of other neighboring seismic zones that fall within our region of interest. In the chosen approach, seismic events are not assigned to specific faults but assumed to be potential seismogenic sources distributed within regular grid cells. After performing the corresponding magnitude conversions, we decluster both historical and instrumental seismicity catalogs to obtain earthquake rates based on the number of events within each cell, and smooth the results to account for the uncertainty in the spatial distribution of future earthquakes. Seismicity parameters are computed for each seismic zone separately, and for the entire region of interest as a single uniform seismotectonic region. In the analysis, we consider uncertainties in the ground motion prediction equation, the seismicity parameters, and combine the resulting models using a logic tree. The results are presented in terms of expected peak ground acceleration (PGA) maps and hazard curves at selected locations, considering exceedance probabilities of 2 and 10% in 50 years for rock site conditions. According to our results, the highest levels of hazard are observed west of the North Tabriz and east of the North Alborz faults, where expected PGA values are between about 0.5 and 1 g for 10 and 2% probability of exceedance in 50 years, respectively. We analyze our results in light of similar estimates available in the literature and offer our perspective on the differences observed. 
We find our results to be helpful in understanding seismic hazard for northern Iran, but recognize that additional efforts are necessary to
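
    The smoothing step of such a smoothed-seismicity model can be sketched with a Frankel-style Gaussian kernel; the grid, cell size, and correlation distance here are assumed, not those of the study:

```python
import numpy as np

# Counts of declustered events per grid cell are smoothed with a Gaussian
# kernel to spread the rate into neighbouring cells, accounting for the
# uncertainty in the location of future earthquakes.
counts = np.zeros((9, 9))
counts[4, 4] = 10.0            # ten events in the central cell
cell_km = 10.0                 # assumed cell size
c_km = 15.0                    # assumed smoothing (correlation) distance

y, x = np.mgrid[-4:5, -4:5]
dist2 = (x * cell_km) ** 2 + (y * cell_km) ** 2
kernel = np.exp(-dist2 / c_km ** 2)
kernel /= kernel.sum()         # normalize so the total rate is preserved

# For a single occupied cell the convolution is just the scaled kernel.
smoothed = counts[4, 4] * kernel
print(smoothed.sum())          # total rate preserved: 10
print(smoothed[4, 4])          # central rate spread into the neighbours
```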

  2. The influence of backfill on seismicity

    CSIR Research Space (South Africa)

    Hemp, DA

    1990-09-01

    …that the seismicity has been reduced in areas where backfill had been placed. A factor complicating the evaluation of the effect of backfill on seismicity is the influence of geological structures on seismicity…

  3. High resolution seismic refraction method (development and applications); Koseido kussetsuho jishin tansa no kaihatsu to tekiyorei

    Energy Technology Data Exchange (ETDEWEB)

    Hayashi, K.; Saito, H. [OYO Corp., Tokyo (Japan)

    1998-10-01

    Described herein are the measurement and analysis procedures of the high-resolution seismic refraction method. Recently, the use of explosives has been restricted for many exploration activities. The measurement systems and waveform processing procedures described herein can minimize the use of explosives and widen the applicability of non-explosive seismic sources. The seismic refraction method has advanced, e.g., to process large quantities of high-quality data, to use a tomographic algorithm, and to include analysis of receiver points in boreholes, and it is applicable to grounds with complicated structures, for which the conventional method is difficult to use. The new method is aided by a personal computer that gives the analysis results almost automatically, thereby establishing the objectivity of the exploration results and securing data quality. The high-resolution seismic refraction method, aided by these new measurement and analysis techniques, can now give results in a much shorter time than the conventional one. 40 refs., 22 figs.
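
    The basic relations behind refraction interpretation, on which both the conventional and the high-resolution methods rest, can be illustrated for a two-layer case (velocities and depth are assumed):

```python
import math

# Two-layer refraction traveltimes: direct wave t = x/v1; head wave
# t = x/v2 + 2*h*cos(theta_c)/v1 with critical angle theta_c = asin(v1/v2).
v1, v2, h = 1500.0, 3000.0, 20.0     # assumed layer velocities (m/s) and depth (m)
theta_c = math.asin(v1 / v2)
intercept = 2.0 * h * math.cos(theta_c) / v1

def t_direct(x):
    return x / v1

def t_refracted(x):
    return x / v2 + intercept

# Crossover distance where the refracted arrival overtakes the direct wave:
x_cross = 2.0 * h * math.sqrt((v2 + v1) / (v2 - v1))
print(round(x_cross, 2))             # the two traveltime branches meet here
```

Picking first arrivals on both sides of the crossover is what lets the method recover layer velocities and refractor depth; tomographic inversion generalizes the same traveltime physics to complicated structures.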

  4. Multicomponent seismic noise attenuation with multivariate order statistic filters

    Science.gov (United States)

    Wang, Chao; Wang, Yun; Wang, Xiaokai; Xun, Chao

    2016-10-01

    The vector relationship between multicomponent seismic data is highly important for multicomponent processing and interpretation, but this relationship can be damaged when each component is processed individually. To overcome the drawback of standard component-by-component filtering, multivariate order statistic filters are introduced and extended to attenuate the noise of multicomponent seismic data by treating such a dataset as a vector wavefield rather than a set of scalar fields. According to the characteristics of seismic signals, we implement this type of multivariate filtering along local events. First, the optimal local events are recognized according to the similarity between the vector signals, which are windowed from neighbouring seismic traces with a sliding time window along each trial trajectory. An efficient strategy is used to reduce the computational cost of similarity measurement for vector signals. Next, one vector sample from each of the neighbouring traces is extracted along the optimal local event as the input data for a multivariate filter. Different multivariate filters are optimal for different noise. The multichannel modified trimmed mean (MTM) filter, as one of the multivariate order statistic filters, is applied to synthetic and field multicomponent seismic data to test its performance in attenuating white Gaussian noise. The results indicate that the multichannel MTM filter can attenuate noise while preserving the relative amplitude information of multicomponent seismic data more effectively than a single-channel filter.
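
    A minimal sketch of a multivariate modified trimmed mean on vector (multicomponent) samples might look as follows; the trimming rule and threshold are assumptions for illustration, not the authors' exact implementation:

```python
import numpy as np

def vector_mtm(samples, q=1.5):
    """Modified trimmed mean of vector samples: find the vector median
    (minimum total distance to the others), trim samples farther than an
    assumed threshold from it, and average the rest."""
    samples = np.asarray(samples, dtype=float)        # shape (n_traces, n_components)
    d = np.linalg.norm(samples[:, None, :] - samples[None, :, :], axis=2)
    median_vec = samples[np.argmin(d.sum(axis=1))]    # vector median
    dist = np.linalg.norm(samples - median_vec, axis=1)
    keep = dist <= q * np.median(dist)                # trim outlying vectors
    return samples[keep].mean(axis=0)

# Nine noisy but consistent 3C samples plus one spike on the vertical component:
rng = np.random.default_rng(3)
clean = np.array([1.0, 0.5, -0.2]) + rng.normal(0.0, 0.05, (9, 3))
data = np.vstack([clean, [1.0, 0.5, 8.0]])
out = vector_mtm(data)
print(out)   # close to (1.0, 0.5, -0.2); the spike is rejected as a whole vector
```

Because the sample is trimmed or kept as a whole vector, the inter-component relationship survives, which is the point of filtering the wavefield multivariately rather than component by component.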

  5. Aging evaluation of class 1E batteries: Seismic testing

    Energy Technology Data Exchange (ETDEWEB)

    Edson, J.L. (EG and G Idaho, Inc., Idaho Falls, ID (USA))

    1990-08-01

    This report presents the results of a seismic testing program on naturally aged class 1E batteries obtained from a nuclear plant. The testing program is a Phase 2 activity resulting from a Phase 1 aging evaluation of class 1E batteries in safety systems of nuclear power plants, performed previously as a part of the US Nuclear Regulatory Commission's Nuclear Plant Aging Research Program and reported in NUREG/CR-4457. The primary purpose of the program was to evaluate the seismic ruggedness of naturally aged batteries to determine if aged batteries could have adequate electrical capacity, as determined by tests recommended by IEEE Standards, and yet have inadequate seismic ruggedness to provide needed electrical power during and after a safe shutdown earthquake (SSE) event. A secondary purpose of the program was to evaluate selected advanced surveillance methods to determine if they were likely to be more sensitive to the aging degradation that reduces seismic ruggedness. The program used twelve batteries naturally aged to about 14 years of age in a nuclear facility and tested them at four different seismic levels representative of the levels of possible earthquakes specified for nuclear plants in the United States. Seismic testing of the batteries did not cause any loss of electrical capacity. 19 refs., 29 figs., 7 tabs.

  6. A new passive seismic method based on seismic interferometry and multichannel analysis of surface waves

    Science.gov (United States)

    Cheng, Feng; Xia, Jianghai; Xu, Yixian; Xu, Zongbo; Pan, Yudi

    2015-06-01

    We proposed a new passive seismic method (PSM) based on seismic interferometry and multichannel analysis of surface waves (MASW) to meet the demand for increased investigation depth by acquiring surface-wave data in a low-frequency range (1 Hz ≤ f ≤ 10 Hz). We utilize seismic interferometry to sort common virtual source gathers (CVSGs) from ambient noise and analyze the obtained CVSGs to construct a 2D shear-wave velocity (Vs) map using MASW. Standard ambient noise processing procedures were applied to the computation of cross-correlations. To enhance the signal-to-noise ratio (SNR) of the empirical Green's functions, a new weighted stacking method was implemented. In addition, we proposed a bidirectional shot mode based on the virtual source method to sort CVSGs repeatedly. The PSM was applied to two field data examples. For the test along the Han River levee, the results of PSM were compared with the improved roadside passive MASW and the spatial autocorrelation method (SPAC). For the test in the Western Junggar Basin, PSM was applied to a 70-km-long linear survey array with a prominent directional urban noise source, and a 60-km-long Vs profile to a depth of 1.5 km was mapped. Further, the dispersion measurements of PSM were compared with those of the frequency-time analysis (FTAN) technique to assess the accuracy of PSM. These examples and comparisons demonstrate that this new method is efficient, flexible, and capable of studying near-surface velocity structures from seismic ambient noise.
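
    The interferometric step above rests on a simple operation: cross-correlating two ambient-noise records turns one receiver into a virtual source for the other. A minimal sketch, omitting the whitening, normalisation and weighted stacking the full processing chain would apply (the shift and noise below are synthetic assumptions):

```python
import numpy as np

def crosscorrelate(rec_a, rec_b):
    """Cross-correlate two ambient-noise records over all lags.
    The peak lag approximates the inter-receiver traveltime in the
    empirical Green's function (interferometry sketch only)."""
    cc = np.correlate(rec_b, rec_a, mode="full")
    lags = np.arange(-len(rec_a) + 1, len(rec_b))
    return lags, cc

# Synthetic test: the same noise wavefield arrives at receiver B
# 25 samples after receiver A (one plane-wave noise source).
rng = np.random.default_rng(0)
noise = rng.standard_normal(500)
shift = 25
rec_a = noise
rec_b = np.roll(noise, shift)
lags, cc = crosscorrelate(rec_a, rec_b)
best_lag = lags[np.argmax(cc)]   # recovers the 25-sample delay
```

Repeating this for every receiver pair and gathering the results by virtual source position yields the CVSGs that MASW then inverts for dispersion.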

  7. Enhanced seismic depth imaging of complex fault-fold structures

    Science.gov (United States)

    Kirtland Grech, Maria Graziella

    Synthetic seismic data were acquired over numerical and physical models, representing fault-fold structures encountered in the Canadian Rocky Mountain Foothills, to investigate which migration algorithm produces the best image in such complex environments. Results showed that pre-stack depth migration from topography with the known velocity model yielded the optimum migrated image. Errors in the positioning of a target underneath a dipping anisotropic overburden were also studied using multicomponent data. The largest error was observed on P-wave data, where anisotropy was highest at 18%. For an overburden thickness of 1500 m, the target was imaged 300 m updip from its true location. Field data from a two-dimensional surface seismic line and a multioffset vertical seismic profile (VSP) from the Foothills of southern Alberta, Canada, were processed using a flow designed to yield an optimum depth image. Traveltime inversion of the first arrivals from all the shots of the multioffset VSP revealed that the Mesozoic shale strata in the area exhibit seismic velocity anisotropy. The anisotropy parameters ε and δ were calculated to be 0.1 and 0.05, respectively. Anisotropic pre-stack depth migration code for VSP and surface seismic data, which uses a modified version of a raytracer developed in this thesis for the computation of traveltime tables, was also developed. The algorithm was then used in a new method for integrated VSP and surface seismic depth imaging. Results from the migration of synthetic and field data show that the resulting integrated image is superior to that obtained from the migration of either data set alone or from the conventional "splicing" approach. The combination of borehole and surface seismic data for anisotropy analysis, velocity model building, and depth migration yielded a robust image even where the geology was complex, thus permitting a more accurate interpretation of the exploration target.
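
    The anisotropy parameters quoted above are the Thomsen parameters of weak VTI anisotropy, under which the P-wave phase velocity varies with propagation angle θ from the symmetry axis as v(θ) ≈ v0(1 + δ sin²θ cos²θ + ε sin⁴θ). A sketch with the thesis values ε = 0.1 and δ = 0.05 (the vertical velocity v0 is an assumed illustration, not from the abstract):

```python
import numpy as np

def vp_weak_vti(v0, eps, delta, theta):
    """P-wave phase velocity in a weakly anisotropic VTI medium
    (Thomsen's weak-anisotropy approximation)."""
    s2 = np.sin(theta) ** 2
    return v0 * (1.0 + delta * s2 * (1.0 - s2) + eps * s2 ** 2)

v0 = 3000.0                                    # assumed vertical velocity, m/s
v_vert = vp_weak_vti(v0, 0.1, 0.05, 0.0)       # along symmetry axis -> v0
v_horz = vp_weak_vti(v0, 0.1, 0.05, np.pi/2)   # horizontal -> v0 * (1 + eps)
```

Even these modest parameter values change the horizontal velocity by 10%, which is why ignoring anisotropy in the overburden mispositions a deep target by hundreds of metres.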

  8. Seismic Prediction While Drilling (SPWD): Seismic exploration ahead of the drill bit using phased array sources

    Science.gov (United States)

    Jaksch, Katrin; Giese, Rüdiger; Kopf, Matthias

    2010-05-01

    When drilling for deep reservoirs, prior exploration is indispensable. In recent years the focus has shifted to geological structures such as thin layers or hydrothermal fault systems. Surface 2D or 3D seismics and borehole measurements such as Vertical Seismic Profiling (VSP) or Seismic While Drilling (SWD) cannot always resolve these structures; resolution worsens as the sought-after structures become deeper and smaller. Thus, potential horizons such as thin layers in oil exploration, or fault zones usable for geothermal energy production, may be missed or left unidentified while drilling. A device that explores the geology at high resolution ahead of the drill bit, in the direction of drilling, would therefore be highly valuable. Such a device would allow the drilling path to be adjusted to the actual geology and would minimize the exploration risk and hence the drilling costs. Within the SPWD project a device for seismic exploration ahead of the drill bit is being developed. It should predict areas about 50 to 100 meters ahead of the drill bit with a resolution of one meter. At the GFZ a first prototype consisting of different units for seismic sources, receivers and data loggers has been designed and manufactured. Four standard magnetostrictive actuators serve as seismic sources and four 3-component geophones as receivers. Every unit, actuator or geophone, can be rotated in steps of 15° around the longitudinal axis of the prototype to test different measurement configurations. The SPWD prototype emits signal frequencies of about 500 to 5000 Hz, significantly higher than in VSP and SWD. Increased radiation of seismic wave energy along the borehole axis allows imaging of the region to be drilled. Therefore, every actuator must be controlled independently with respect to the amplitude and phase of the source signal to
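
    The per-actuator phase control described above is the classic delay-and-sum beam-steering idea: delaying each source by its projected position along the desired direction makes the radiated wavelets add in phase ahead of the bit. The array geometry and wave speed below are assumptions for illustration, not values from the project.

```python
import numpy as np

def steering_delays(positions_m, theta_rad, c_mps):
    """Firing delays that align the wavefronts of an in-line source
    array in direction theta (measured from the array axis).
    The actuator farthest from the target direction fires first."""
    d = positions_m * np.cos(theta_rad) / c_mps
    return d - d.min()          # shift so the earliest delay is zero

# Four actuators spaced 0.5 m apart, firing along the borehole axis
# (theta = 0) into rock with an assumed 3000 m/s wave speed.
delays = steering_delays(np.array([0.0, 0.5, 1.0, 1.5]),
                         theta_rad=0.0,
                         c_mps=3000.0)
# delays increase linearly from 0 to 1.5/3000 s across the array
```

Steering toward theta = 0 concentrates energy along the borehole axis, which is exactly the "view ahead of the bit" the prototype aims for; amplitude weighting (not shown) would additionally shape the beam's sidelobes.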

  9. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    Science.gov (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which generally occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high
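
    The time-dependence that breaks the Poissonian assumption can be illustrated with a non-homogeneous Poisson process whose rate tracks an evolving pressure field. The thinning (Lewis-Shedler) sketch below uses an invented rate history (ramp-up during injection, decay after shut-in), not the paper's physics-based simulator:

```python
import numpy as np

def simulate_nhpp(rate_fn, t_max, rate_max, rng):
    """Event times of a non-homogeneous Poisson process by thinning:
    draw candidates from a bounding homogeneous process with rate
    rate_max, then accept each with probability rate_fn(t)/rate_max."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate_max)   # next candidate time
        if t > t_max:
            return np.array(times)
        if rng.random() < rate_fn(t) / rate_max:
            times.append(t)                     # accepted event

# Assumed rate history (events/day): linear ramp over a 50-day
# injection, exponential decay after shut-in.
rate = lambda t: 2.0 * (t / 50.0) if t < 50.0 else 2.0 * np.exp(-(t - 50.0) / 20.0)
rng = np.random.default_rng(3)
events = simulate_nhpp(rate, t_max=100.0, rate_max=2.0, rng=rng)
```

Event counts in successive windows of such a catalog are not identically distributed, which is precisely why a stationary-rate PSHA recurrence model needs adaptation for induced seismicity.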

  10. Intelligent seismic sensor with double three component MEMS accelerometers

    Science.gov (United States)

    Fu, Jihua; Wang, Jianjun; Li, Zhitao; Liu, Xiaoxi; Wang, Zhongyu

    2010-08-01

    To better understand the response and damage characteristics of structures under earthquakes, a great number of high-performance intelligent seismic sensors need to be installed, distributed across the whole country. The intelligent seismic sensor is a cost-sensitive application because of the large number of units required. For this reason, a low-cost intelligent seismic sensor is put forward in this paper. This sensor cuts cost without sacrificing performance by introducing two three-component MEMS accelerometers. It is composed of a microprocessor, two three-component MEMS accelerometers, an A/D converter, a flash memory, etc. A MEMS accelerometer has better structural and frequency response characteristics than conventional geophones, but a single MEMS accelerometer tends to be unreliable and lacks sufficient dynamic range for precision measurement. Therefore, two three-component MEMS accelerometers are symmetrically mounted on both sides of the circuit board, and their measured values are combined to describe the ground motion or structure response. The combined value is the in-phase stack of the two accelerometers' measurements, which enhances the signal-to-noise ratio of the sensor and broadens its dynamic range. Preliminary theoretical and experimental analysis shows that the low-cost intelligent seismic sensor can measure acceleration accurately.
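
    The SNR benefit of the dual-sensor in-phase stack follows from averaging two records that share the signal but carry independent noise: the noise standard deviation drops by a factor of √2 (about 3 dB). A numerical sketch with an invented 5 Hz ground motion and noise level:

```python
import numpy as np

# In-phase stacking of two co-located accelerometer records (sketch).
rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 2000)
signal = np.sin(2 * np.pi * 5.0 * t)             # common ground motion
rec1 = signal + 0.3 * rng.standard_normal(t.size)  # sensor 1 + its noise
rec2 = signal + 0.3 * rng.standard_normal(t.size)  # sensor 2 + its noise
stacked = 0.5 * (rec1 + rec2)                    # in-phase stack

noise_single = np.std(rec1 - signal)             # residual noise, one sensor
noise_stacked = np.std(stacked - signal)         # residual noise, stack
# noise_stacked is roughly noise_single / sqrt(2)
```

The same averaging also masks a transient fault in one sensor, which is the reliability argument for mounting the two accelerometers symmetrically on the board.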

  11. Data Set From Molisan Regional Seismic Network Events

    CERN Document Server

    De Gasperis, Giovanni

    2016-01-01

    After the earthquake that occurred in Molise (Central Italy) on 31st October 2002 (Ml 5.4, 29 people dead), the local Servizio Regionale per la Protezione Civile, to ensure better analysis of local seismic data, promoted the design of the Regional Seismic Network (RMSM) through a convention with the Istituto Nazionale di Geofisica e Vulcanologia (INGV) and funded its implementation. The 5 stations of RMSM operated from 2007 to 2013, collecting a large amount of seismic data and making an important contribution to the study of seismic sources in the region and the surrounding territory. This work describes the dataset containing all triggers collected by RMSM from July 2007 to March 2009, including actual seismic events; among them, all earthquake events recorded in coincidence with the Rete Sismica Nazionale Centralizzata (RSNC) of INGV have been marked with S and P arrival timestamps. Every trigger has been associated with a spectrogram defined in the recorded time vs. frequency domain. The main aim of this...
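
    The trigger-to-spectrogram association mentioned above amounts to a short-time Fourier transform: slice each record into overlapping tapered windows and take the power spectrum of each, giving a time-frequency image. A self-contained sketch on a synthetic "trigger" (all parameters are illustrative assumptions, not RMSM settings):

```python
import numpy as np

def stft_spectrogram(x, fs, win=128, hop=64):
    """Power spectrogram via a windowed short-time Fourier transform.
    Returns frame times, frequencies, and a (freq, time) power image."""
    n_frames = 1 + (len(x) - win) // hop
    taper = np.hanning(win)
    frames = np.stack([x[i*hop:i*hop+win] * taper for i in range(n_frames)])
    spec = np.abs(np.fft.rfft(frames, axis=1)) ** 2   # power per frame
    freqs = np.fft.rfftfreq(win, d=1.0/fs)
    times = (np.arange(n_frames) * hop + win / 2) / fs
    return times, freqs, spec.T

# Synthetic trigger: a 20 Hz burst buried in noise, sampled at 200 Hz.
fs = 200.0
t = np.arange(0, 4.0, 1/fs)
x = 0.1 * np.random.default_rng(4).standard_normal(t.size)
x[400:500] += np.sin(2 * np.pi * 20.0 * t[400:500])
times, freqs, spec = stft_spectrogram(x, fs)
```

In such an image a genuine earthquake trigger shows band-limited energy confined in time, which is what makes spectrograms a convenient input for separating real events from noise triggers.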

  12. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
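
    The computational appeal described above comes from the Gaussian linear structure: for data d = Gm + e with Gaussian prior and noise, the posterior mean and covariance have closed forms. A generic sketch of that update (the tiny 2×2 example is invented; the real method builds G from the convolutional, linearized Zoeppritz model):

```python
import numpy as np

def gaussian_posterior(G, d, mu_m, Sigma_m, Sigma_e):
    """Posterior of a Gaussian linear model d = G m + e.

    Prior m ~ N(mu_m, Sigma_m), noise e ~ N(0, Sigma_e).
    Returns the explicit posterior mean and covariance -- the same
    structure that makes linearized Bayesian AVO inversion fast."""
    S = G @ Sigma_m @ G.T + Sigma_e               # data covariance
    K = Sigma_m @ G.T @ np.linalg.inv(S)          # gain matrix
    mu_post = mu_m + K @ (d - G @ mu_m)
    Sigma_post = Sigma_m - K @ G @ Sigma_m
    return mu_post, Sigma_post

# Tiny example: two observations of a two-parameter model, near-zero
# noise -- the parameters should be almost perfectly retrieved.
G = np.array([[1.0, 0.5],
              [0.0, 1.0]])
m_true = np.array([2.0, -1.0])
d = G @ m_true                                    # noise-free data
mu_post, Sigma_post = gaussian_posterior(
    G, d, mu_m=np.zeros(2), Sigma_m=np.eye(2),
    Sigma_e=1e-6 * np.eye(2))
```

The shrinking posterior covariance directly yields the exact prediction intervals the abstract refers to; with larger noise the posterior mean pulls back toward the prior, mirroring the reported loss of resolution for density.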

  13. Seismic failure modes and seismic safety of Hardfill dam

    Institute of Scientific and Technical Information of China (English)

    Kun XIONG; Yong-hong WENG; Yun-long HE

    2013-01-01

    Based on microscopic damage theory and the finite element method, and using the Weibull distribution to characterize the random distribution of the mechanical properties of materials, the seismic response of a typical Hardfill dam was analyzed through numerical simulation for earthquakes with intensities of 8 degrees and greater. The seismic failure modes and failure mechanism of the dam were explored as well. Numerical results show that the Hardfill dam remains at a low stress level and undamaged or only slightly damaged during an earthquake with an intensity of 8 degrees. During overload earthquakes, tensile cracks occur at the dam surfaces and extend into the dam body, and the upstream dam body experiences more serious damage than the downstream dam body. Under seismic conditions, therefore, the failure pattern of the Hardfill dam is tensile fracture of the upstream regions and the dam toe. Compared with traditional gravity dams, Hardfill dams have better seismic performance and greater seismic safety.
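
    The Weibull randomization step above can be sketched directly: each finite element is assigned a strength drawn from a Weibull distribution whose shape parameter m expresses material homogeneity (larger m, tighter clustering around the scale value). The parameter values below are illustrative assumptions, not those of the study.

```python
import numpy as np

# Assign spatially random element strengths (microscopic-damage setup).
rng = np.random.default_rng(2)
m, s0 = 6.0, 10.0                  # assumed shape (homogeneity) and scale, MPa
n_elements = 100_000
strengths = s0 * rng.weibull(m, n_elements)   # one strength per element

mean_strength = strengths.mean()
# Theoretical mean is s0 * Gamma(1 + 1/m) ≈ 9.28 MPa for m = 6
```

In the damage simulation an element fails once its local tensile stress exceeds its drawn strength, so the scatter introduced here is what lets cracks nucleate at weak elements and propagate realistically rather than all at once.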

  14. Biodiesel from conventional feedstocks.

    Science.gov (United States)

    Du, Wei; Liu, De-Hua

    2012-01-01

    At present, traditional fossil fuels are used predominantly in China, presenting the country with challenges that include sustainable energy supply, energy efficiency improvement, and reduction of greenhouse gas emissions. In 2007, China issued The Strategic Plan of the Mid-and-Long Term Development of Renewable Energy, which aims to increase the share of clean energy in the country's energy consumption to 15% by 2020 from only 7.5% in 2005. Biodiesel, an important renewable fuel with significant advantages over fossil diesel, has attracted great attention in the USA and European countries. However, biodiesel is still in its infancy in China, although its future is promising. This chapter reviews biodiesel production from conventional feedstocks in the country, including feedstock supply and state of the art technologies for the transesterification reaction through which biodiesel is made, particularly the enzymatic catalytic process developed by Chinese scientists. Finally, the constraints and perspectives for China's biodiesel development are highlighted.

  15. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should therefore involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on learning seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) simultaneous exploration of images, tables, graphs and object animations; 3) attribution of mathematical properties expressed in the models to animated objects; and finally 4) computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.
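
    A classroom-sized version of the tomography problem such modelling activities can build on is straight-ray traveltime inversion: traveltimes t = L s, where L holds the ray-path lengths through each cell and s the cell slownesses. With enough crossing rays, least squares recovers the model. The two-cell geometry below is an invented teaching example:

```python
import numpy as np

# Minimal straight-ray traveltime tomography exercise.
L = np.array([[2.0, 0.0],     # ray 1 crosses only cell 1 (2 km path)
              [0.0, 2.0],     # ray 2 crosses only cell 2
              [1.0, 1.0]])    # ray 3 crosses both cells
s_true = np.array([0.5, 0.25])          # slowness (s/km) per cell
t_obs = L @ s_true                      # "observed" traveltimes

# Invert for slowness by least squares.
s_est, *_ = np.linalg.lstsq(L, t_obs, rcond=None)
```

Removing ray 3 makes the point that resolution comes from crossing ray coverage; adding noise to t_obs shows why real tomography needs regularization, both natural follow-up exercises.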

  16. Elastic-Wavefield Seismic Stratigraphy: A New Seismic Imaging Technology

    Energy Technology Data Exchange (ETDEWEB)

    Bob A. Hardage; Milo M. Backus; Michael V. DeAngelo; Sergey Fomel; Khaled Fouad; Robert J. Graebner; Paul E. Murray; Randy Remington; Diana Sava

    2006-07-31

    The purpose of our research has been to develop and demonstrate a seismic technology that will provide the oil and gas industry a better methodology for understanding reservoir and seal architectures and for improving interpretations of hydrocarbon systems. Our research goal was to expand the valuable science of seismic stratigraphy beyond the constraints of compressional (P-P) seismic data by using all modes (P-P, P-SV, SH-SH, SV-SV, SV-P) of a seismic elastic wavefield to define depositional sequences and facies. Our objective was to demonstrate that one or more modes of an elastic wavefield may image stratal surfaces across some stratigraphic intervals that are not seen by companion wave modes and thus provide different, but equally valid, information regarding depositional sequences and sedimentary facies within that interval. We use the term elastic wavefield stratigraphy to describe the methodology we use to integrate seismic sequences and seismic facies from all modes of an elastic wavefield into a seismic interpretation. We interpreted both onshore and marine multicomponent seismic surveys to select the data examples that we use to document the principles of elastic wavefield stratigraphy. We have also used examples from published papers that illustrate some concepts better than did the multicomponent seismic data that were available for our analysis. In each interpretation study, we used rock physics modeling to explain how and why certain geological conditions caused differences in P and S reflectivities that resulted in P-wave seismic sequences and facies being different from depth-equivalent S-wave sequences and facies across the targets we studied.

  17. Passive seismic experiment.

    Science.gov (United States)

    Latham, G V; Ewing, M; Press, F; Sutton, G; Dorman, J; Nakamura, Y; Toksöz, N; Wiggins, R; Derr, J; Duennebier, F

    1970-01-30

    Seismometer operation for 21 days at Tranquillity Base revealed, among strong signals produced by the Apollo 11 lunar module descent stage, a small proportion of probable natural seismic signals. The latter are long-duration, emergent oscillations which lack the discrete phases and coherence of earthquake signals. From similarity with the impact signal of the Apollo 12 ascent stage, they are thought to be produced by meteoroid impacts or shallow moonquakes. This signal character may imply transmission with high Q and intense wave scattering, conditions which are mutually exclusive on earth. Natural background noise is very much smaller than on earth, and lunar tectonism may be very low.

  18. Recent developments in seismically isolated buildings in Japan

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The Building Standard Law of Japan and the related Enforcement Order and Notifications have been substantially revised since the year 2000 to introduce a performance-based regulatory and deregulation system for building control. Up to then, time-history analyses were mandatory for isolated buildings and had to be specially approved by the Minister of the Ministry of Construction (MOC). Simplified design procedures based on the equivalent linear method for seismically isolated buildings have been issued as "Notification 2009 - Structural calculation procedure for buildings with seismic isolation" from the MOC, now integrated into the Ministry of Land, Infrastructure, and Transportation (MLIT). Along with Notification 2009, "Notification 1446 of year 2000 - Standard for specifications and test methods for seismic isolation devices" was also issued. Buildings with heights of 60 m or less that are designed according to these Notifications, including base-isolated buildings, only need approval from local building officials and no longer require the special approval of the Minister of MLIT. This paper summarizes: 1) some statistics on buildings with seismic isolation completed up to the end of 2001; 2) the simplified design procedures required by Notification 2009 of year 2000; and 3) the performance of seismic isolation devices required by Notification 1446 of year 2000.

  19. What constitutes a convention? : implications for the coexistence of conventions

    OpenAIRE

    Kolstad, Ivar

    2002-01-01

    A model of repeated play of a coordination game, where stage games have a location in social space, and players receive noisy signals of the true location of their games, is reviewed. Sugden (1995) suggests that in such a model, there can be a stationary state of convention coexistence only if interaction is non-uniform across social space. This paper shows that an alternative definition of conventions, which links conventions to actions rather than expectations, permits convention coexistenc...

  20. Procedures for computing site seismicity

    Science.gov (United States)

    Ferritto, John

    1994-02-01

    This report was prepared as part of the Navy's Seismic Hazard Mitigation Program. The Navy has numerous bases located in seismically active regions throughout the world. Safe, effective design of waterfront structures requires determining expected earthquake ground motion. The Navy's problem is further complicated by the presence of soft, saturated marginal soils that can significantly amplify the levels of seismic shaking, as evidenced in the 1989 Loma Prieta earthquake. The Naval Facilities Engineering Command's seismic design manual, NAVFAC P355.1, requires a probabilistic assessment of ground motion for the design of essential structures. This report presents the basis for the Navy's Seismic Hazard Analysis procedure that was developed and is intended to be used with the Seismic Hazard Analysis computer program and user's manual. This report also presents data on geology and seismology to establish the background for the seismic hazard model developed. The procedure uses the historical epicenter database and available geologic data, together with source models, recurrence models, and attenuation relationships, to compute the probability distribution of site acceleration and an appropriate spectrum. This report discusses the stochastic model developed for seismic hazard evaluation and the associated research.
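
    One elementary step such a procedure chains together is converting a recurrence model into an annual exceedance probability: Gutenberg-Richter gives the annual rate of events at or above magnitude m, and a Poisson assumption turns that rate into a probability. The a and b values below are illustrative, not from the report.

```python
import math

def annual_rate(m, a=4.0, b=1.0):
    """Gutenberg-Richter: annual number of events with magnitude >= m,
    N(m) = 10**(a - b*m). (a, b are assumed regional constants.)"""
    return 10.0 ** (a - b * m)

def annual_exceedance_prob(m):
    """P(at least one event >= m in a year) under a Poisson model."""
    return 1.0 - math.exp(-annual_rate(m))

rate_m5 = annual_rate(5.0)             # 0.1 events/yr for these a, b
p_m5 = annual_exceedance_prob(5.0)     # slightly below 0.1
```

A full site analysis repeats this over all sources and magnitudes, maps each scenario to site acceleration through an attenuation relationship, and accumulates the exceedance rates into the hazard curve and design spectrum the report describes.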

  1. Passive seismic monitoring at the ketzin CCS site -Magnitude estimation

    NARCIS (Netherlands)

    Paap, B.F.; Steeghs, T.P.H.

    2014-01-01

    In order to allow quantification of the strength of local micro-seismic events recorded at the CCS pilot site in Ketzin in terms of local magnitude, earthquake data recorded by standardized seismometers were used. Earthquakes were selected that occurred in Poland and the Czech Republic and that were det
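
    Local magnitude estimation of the kind pursued here combines the logarithm of the recorded amplitude with a distance correction. One widely used form is the Hutton-Boore (1987) formulation for southern California, shown below purely as an illustration; a network like this one would calibrate its own region-specific coefficients.

```python
import math

def local_magnitude(amp_mm, dist_km):
    """Local (Richter) magnitude from a peak Wood-Anderson amplitude.
    Hutton-Boore (1987) coefficients -- region-specific, illustrative
    only; anchored so that 1 mm at 100 km gives ML 3.0."""
    return (math.log10(amp_mm)
            + 1.11 * math.log10(dist_km / 100.0)
            + 0.00189 * (dist_km - 100.0)
            + 3.0)

ml = local_magnitude(1.0, 100.0)   # the anchoring case: ML = 3.0
```

The distance term is what the regional calibration in this study effectively supplies: without it, attenuation between the Polish/Czech reference events and Ketzin would bias every magnitude estimate.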

  2. Seismic investigations in downtown Copenhagen, Denmark

    Science.gov (United States)

    Martinez, K.; Mendoza, J. A.; Olsen, H.

    2009-12-01

    Near-surface geophysics is gaining widespread use in major infrastructure projects for geotechnical and engineering applications. The development of data acquisition, processing tools and interpretation methods has optimized survey production, reduced logistics costs and increased the reliability of seismic survey results in recent decades. However, the use of geophysical methods in urban environments continues to face challenges due to the multiple noise sources and obstacles inherent to cities. A seismic investigation was conducted in Copenhagen to produce information needed for hydrological, geotechnical and groundwater modeling assessments related to the planned Cityringen underground metro project. The particular objectives were a) to map variations in subsurface Quaternary and limestone properties and b) to map near-surface structural features. The geological setting in the Copenhagen region is characterized by several interlaced layers of glacial till and meltwater sand deposits. These layers, which are unevenly distributed throughout the city and present in varying thicknesses, overlie limestone of different generations. Incised valley structures containing localized instances of weathered or fractured limestone are common. The surveys consisted of combined seismic reflection and refraction profiles covering approximately 13 km along sections of the projected metro line. Data acquisition was carried out using standard 192-channel arrays, receiver groups at 5 m spacing and a Vibroseis source at 5 m spacing. To improve the resolution of the data, 29 walkaway vertical seismic profiles were performed at selected wells along the surface seismic lines. The refraction data were processed with travel-time tomography and the reflection data underwent standard interpretation. The refraction data inversion was performed in two ways: with surface refraction data alone and combined with the VSP data. Three

  3. Key aspects governing induced seismicity

    Science.gov (United States)

    Buijze, Loes; Wassing, Brecht; Fokker, Peter

    2013-04-01

    In the past decades numerous examples of earthquakes induced by anthropogenic changes in subsurface fluid pressures have been reported. This poses a major threat to the future development of some of these operations and calls for an understanding and quantification of the seismicity generated. From geomechanical considerations and insights from laboratory experiments, the factors controlling induced seismicity may be grouped into four categories: the magnitude of the stress disturbance, the pre-existing stress conditions, the reservoir/fault rock properties and the local geometry. We investigated whether the (relative) contributions of these factors and their influence on the magnitudes generated could be recognized by looking at the entire dataset of reported cases of induced seismicity as a whole, and what this might imply for future developments. An extensive database has been built of over 160 known cases of induced seismicity worldwide, incorporating the relevant geological, seismological and fluid-related parameters. The cases studied include hydrocarbon depletion and secondary recovery, waste water injection, (enhanced) geothermal systems and hydraulic fracturing, with observed magnitudes ranging from less than -1.5 to 7. The parameters taken into account were based on the theoretical background of the mechanisms of induced seismicity and include the injection/depletion-related parameters, (spatial) characteristics of seismicity, lithological properties and the local stress situation. Correlations between the seismic response and the geological/geomechanical characteristics of the various sites were investigated. The injected/depleted volumes and the scale of the activities are major controlling factors on the maximum magnitudes generated. Spatial signatures of seismicity, such as the depth and lateral spread of the seismicity, were observed to be distinct for different activities, which is useful when considering future operations. Where available the local

  4. Advances in Rotational Seismic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, Robert [Applied Technology Associates, Albuquerque, NM (United States); Laughlin, Darren [Applied Technology Associates, Albuquerque, NM (United States); Brune, Robert [Applied Technology Associates, Albuquerque, NM (United States)

    2016-10-19

    Rotational motion is increasingly understood to be a significant part of seismic wave motion. Rotations can be important in earthquake strong motion and in Induced Seismicity Monitoring. Rotational seismic data can also enable shear selectivity and improve wavefield sampling for vertical geophones in 3D surveys, among other applications. However, sensor technology has been a limiting factor to date. The US Department of Energy (DOE) and Applied Technology Associates (ATA) are funding a multi-year project that is now entering Phase 2 to develop and deploy a new generation of rotational sensors for validation of rotational seismic applications. Initial focus is on induced seismicity monitoring, particularly for Enhanced Geothermal Systems (EGS) with fracturing. The sensors employ Magnetohydrodynamic (MHD) principles with broadband response, improved noise floors, robustness, and repeatability. This paper presents a summary of Phase 1 results and Phase 2 status.

  5. Seismic moulin tremor

    Science.gov (United States)

    Roeoesli, Claudia; Walter, Fabian; Ampuero, Jean-Paul; Kissling, Edi

    2016-08-01

    Through glacial moulins, meltwater is routed from the glacier surface to its base. Moulins are a main feature feeding subglacial drainage systems and thus influence basal motion and ice dynamics, but their geometry remains poorly known. Here we show that analysis of the seismic wavefield generated by water falling into a moulin can help constrain its geometry. We present modeling results of hour-long seismic tremors emitted from a vertical moulin shaft, observed with a seismometer array installed at the surface of the Greenland Ice Sheet. The tremor was triggered when the moulin water level exceeded a certain height, which we associate with the threshold for the waterfall to directly hit the surface of the moulin water column. The amplitude of the tremor signal changed over each tremor episode, in close relation to the amount of inflowing water. The tremor spectrum features multiple prominent peaks whose characteristic frequencies are distributed like the resonant modes of a semi-open organ pipe and were found to depend on the moulin water level, consistent with a source composed of resonant tube waves (water pressure waves coupled to elastic deformation of the moulin walls) along the water-filled moulin pipe. Analysis of surface particle motions lends further support to this interpretation. The seismic wavefield was modeled as a superposition of sustained wave radiation by pressure sources on the side walls and at the bottom of the moulin. The former was found to dominate the wavefield at close distances and the latter at large distances from the moulin.
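
    The semi-open (closed at one end, open at the other) pipe analogy invoked above admits only odd harmonics, with resonant frequencies f_n = (2n - 1)c/(4L) for wave speed c and column length L. A sketch with invented round numbers (the actual tube-wave speed and moulin depth are site-dependent):

```python
def semi_open_pipe_modes(c, length, n_modes):
    """Resonant frequencies of a semi-open pipe: only odd harmonics,
    f_n = (2n - 1) * c / (4 L), n = 1, 2, 3, ...
    c: tube-wave speed (m/s), length: water-column length (m)."""
    return [(2 * n - 1) * c / (4.0 * length) for n in range(1, n_modes + 1)]

# Illustrative numbers only: a 1000 m/s tube-wave speed in a 250 m
# water column gives a 1 Hz fundamental with odd overtones.
modes = semi_open_pipe_modes(c=1000.0, length=250.0, n_modes=3)
# modes -> [1.0, 3.0, 5.0] Hz
```

Because L is set by the water level, the observed drift of the spectral peaks with moulin water level follows directly from this formula, which is what makes the tremor spectrum a depth gauge for the shaft.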

  6. Seismic fragility analysis of seismically isolated nuclear power plants piping system

    Energy Technology Data Exchange (ETDEWEB)

    Salimi Firoozabad, Ehsan, E-mail: e.salimi@pusan.ac.kr [Department of Civil and Environmental Engineering, Pusan National University, 30 Jangjeon-dong, Geumjeong-gu, Busan 609-735 (Korea, Republic of); Jeon, Bub-Gyu, E-mail: bkjeon79@pusan.ac.kr [KOCED Seismic Simulation Test Center, Pusan National University, Yangsan Campus Mulgeum, Yangsan, Kyungsangnam (Korea, Republic of); Choi, Hyoung-Suk, E-mail: engineer@pusan.ac.kr [KOCED Seismic Simulation Test Center, Pusan National University, Yangsan Campus Mulgeum, Yangsan, Kyungsangnam (Korea, Republic of); Kim, Nam-Sik, E-mail: nskim@pusan.ac.kr [Department of Civil and Environmental Engineering, Pusan National University, 30 Jangjeon-dong, Geumjeong-gu, Busan 609-735 (Korea, Republic of)

    2015-04-01

    Highlights: • The critical points of a seismically isolated NPP piping system are identified. • The simulation results are validated through monotonic and cyclic tests of the critical points. • The conditional mean spectrum method is used to scale the selected records. • The fragility curves of the NPP piping system are estimated. • Computation of the fragility parameters is addressed. - Abstract: Nuclear power plants are high-risk facilities with respect to sudden seismic events, because any failure could initiate catastrophic radioactive contamination. Seismic fragility analysis of NPPs and related equipment (such as piping systems) is a proven method to determine their performance against any possible earthquake. In this study the Brookhaven National Laboratory benchmark model of a piping system was considered for the fragility analysis. A tensile test was conducted to define the material properties. An initial seismic analysis of the piping system was performed to identify its critical sections. The numerical analysis was validated through monotonic and cyclic loading experiments on two identified critical points of the piping system. The tests were conducted at the Korea Construction Engineering Development (KOCED) Seismic Simulation Test Center, Pusan National University, Korea. Fragility curves were expressed for critical points of the system as a function of the spectral acceleration of the records and the maximum relative displacement. The standard deviations of the response and capacity were calculated using mathematical formulas, assuming that both follow a log-normal distribution. We determined that the fragility curve of a pipe elbow must be derived for both the opening and closing modes, given the difference between the capacities of the elbow in those modes. The high confidence of low probability of failure for the considered fragility functions in a straight section in any direction is
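
The log-normal fragility model mentioned in the abstract has a standard closed form: the failure probability at demand level a is the standard normal CDF of ln(a/Am)/β, where Am is the median capacity and β the combined log-standard deviation. The dispersion values below are illustrative assumptions, not the study's numbers.

```python
import math

def fragility(a, median_capacity, beta):
    """Lognormal fragility: P(failure | demand a) = Phi(ln(a / Am) / beta),
    where Phi is the standard normal CDF (written here via erf)."""
    z = math.log(a / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Combined dispersion from independent log-normal response and capacity
# variabilities (values are illustrative assumptions):
beta = math.sqrt(0.3 ** 2 + 0.4 ** 2)
p_at_median = fragility(1.0, 1.0, beta)   # demand == median capacity -> 0.5
```

By construction the curve passes through 50% at the median capacity, which is a quick sanity check on any implementation.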

  7. Conventional mechanical ventilation

    Directory of Open Access Journals (Sweden)

    Tobias Joseph

    2010-01-01

    Full Text Available The provision of mechanical ventilation for the support of infants and children with respiratory failure or insufficiency is one of the most common techniques that are performed in the Pediatric Intensive Care Unit (PICU. Despite its widespread application in the PICUs of the 21st century, before the 1930s, respiratory failure was uniformly fatal due to the lack of equipment and techniques for airway management and ventilatory support. The operating rooms of the 1950s and 1960s provided the arena for the development of the manual skills and the refinement of the equipment needed for airway management, which subsequently led to the more widespread use of endotracheal intubation thereby ushering in the era of positive pressure ventilation. Although there seems to be an ever increasing complexity in the techniques of mechanical ventilation, its successful use in the PICU should be guided by the basic principles of gas exchange and the physiology of respiratory function. With an understanding of these key concepts and the use of basic concepts of mechanical ventilation, this technique can be successfully applied in both the PICU and the operating room. This article reviews the basic physiology of gas exchange, principles of pulmonary physiology, and the concepts of mechanical ventilation to provide an overview of the knowledge required for the provision of conventional mechanical ventilation in various clinical arenas.

  8. ESD and the Rio Conventions

    Science.gov (United States)

    Sarabhai, Kartikeya V.; Ravindranath, Shailaja; Schwarz, Rixa; Vyas, Purvi

    2012-01-01

    Chapter 36 of Agenda 21, a key document of the 1992 Earth Summit, emphasised reorienting education towards sustainable development. While two of the Rio conventions, the Convention on Biological Diversity (CBD) and the United Nations Framework Convention on Climate Change (UNFCCC), developed communication, education and public awareness (CEPA)…

  9. GSAC - Generic Seismic Application Computing

    Science.gov (United States)

    Herrmann, R. B.; Ammon, C. J.; Koper, K. D.

    2004-12-01

    With the success of the IRIS data management center, the use of large data sets in seismological research has become common. Such data sets, and especially the significantly larger data sets expected from EarthScope, present challenges for analysis with existing tools developed over the last 30 years. For much of the community, the primary format for data analysis is the Seismic Analysis Code (SAC) format developed by Lawrence Livermore National Laboratory. Although somewhat restrictive in meta-data storage, the simplicity and stability of the format have established it as an important component of seismological research. Tools for working with SAC files fall into two categories: custom research-quality processing codes, and shared display/processing tools such as SAC2000, MatSeis, etc., which were developed primarily for the needs of individual seismic research groups. While the current graphics display and platform dependence of SAC2000 may be resolved if the source code is released, the code complexity and the lack of large-data-set analysis or even introductory tutorials could preclude code improvements and development of expertise in its use. We believe that there is a place for new, especially open source, tools. The GSAC effort is an approach that focuses on ease of use, computational speed, transportability, rapid addition of new features and openness, so that new and advanced students, researchers and instructors can quickly browse and process large data sets. We highlight several approaches toward data processing under this model. gsac - part of the Computer Programs in Seismology 3.30 distribution - has much of the functionality of SAC2000 and works on UNIX/LINUX/MacOS-X/Windows (CYGWIN). It is completely programmed in C from scratch, is small, fast, and easy to maintain and extend. It is command line based and is easily included within shell processing scripts. PySAC is a set of Python functions that allow easy access to SAC files and enable efficient
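
The kind of command-line processing chain such tools provide can be sketched compactly. The function names below mirror typical SAC-style commands (a linear-trend removal and a cosine end taper), but the implementations are illustrative assumptions written here from scratch, not gsac or PySAC source code.

```python
import numpy as np

def rtr(trace):
    """Remove a best-fit linear trend from a trace (SAC-style 'rtr')."""
    n = len(trace)
    t = np.arange(n, dtype=float)
    slope, intercept = np.polyfit(t, trace, 1)
    return trace - (slope * t + intercept)

def taper(trace, width=0.05):
    """Apply a symmetric cosine (Hann) taper over `width` of each end,
    so the trace starts and ends at zero before filtering or FFTs."""
    n = len(trace)
    m = max(1, int(width * n))
    w = np.ones(n)
    ramp = 0.5 * (1.0 - np.cos(np.pi * np.arange(m) / m))
    w[:m] = ramp
    w[-m:] = ramp[::-1]
    return trace * w

# Toy trace: a linear drift plus noise, then the chained commands
x = np.arange(200, dtype=float) + np.random.default_rng(0).normal(size=200)
y = taper(rtr(x))
```

Chaining small verbs over whole traces like this is exactly what makes such tools easy to drive from shell scripts over thousands of files.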

  10. Simultaneous use of multiple seismic arrays

    Science.gov (United States)

    Stipčević, J.; Kennett, B. L. N.; Tkalčić, H.

    2017-05-01

    Seismic arrays provide an important means of enhancing seismic signals and determining the directional properties of the wavefield by beamforming. When multiple arrays are to be used together, the viewpoint needs to be modified from looking outwards from each array to focusing on a specific target area and so constraining the portions of the waveforms to be analysed. Beamforming for each array is supplemented by the relative time constraints for propagation from the target to each array to provide tight spatial control. Simultaneous multiple array analysis provides a powerful tool for source characterization and for structural analysis of scatterers as virtual sources. The multiple array concept allows us to illuminate a specific point in the Earth from many different directions and thus to map detailed patterns of heterogeneity in the Earth. Furthermore, illumination of the structure from multiple directions using data from the same event minimizes source effects to provide clearer images of heterogeneity. The analysis is based on a similar concept to the backprojection technique, where a part of the seismic wave train is mapped to a specific point in space by ray tracing. In contrast to classic backprojection, where the incoming energy is mapped onto a horizontal plane with limited vertical resolution, the multiarray method controls depth response by combining relative time constraints between the arrays and conventional beamforming. We illustrate this approach with application to two earthquakes at moderate depth. The results show that the use of simultaneous multiple arrays can provide improvement both in signal quality and resolution, with the additional benefit of being able to accurately locate the source of the incoming energy and map large areas with only a limited number of such arrays.
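
The per-array delay-and-sum beamforming step underlying this analysis can be sketched on a toy linear array: shift each trace back by the moveout predicted for a trial slowness and pick the slowness that maximizes stacked power. The array geometry, true slowness, and Gaussian wavelet below are illustrative assumptions.

```python
import numpy as np

dt = 0.01                                      # sample interval (s)
x = np.array([0.0, 500.0, 1000.0, 1500.0])     # station offsets (m)
s_true = 2.0e-4                                # slowness, s/m (v = 5 km/s)

# Build synthetic traces: one Gaussian pulse with plane-wave moveout
n = 400
pulse = np.exp(-0.5 * ((np.arange(n) - 100) / 3.0) ** 2)
traces = np.empty((len(x), n))
for i, xi in enumerate(x):
    traces[i] = np.roll(pulse, int(round(s_true * xi / dt)))

def beam_power(traces, x, dt, s):
    """Delay-and-sum: undo the moveout for trial slowness s, stack,
    and return the peak stacked power."""
    stack = np.zeros(traces.shape[1])
    for i, xi in enumerate(x):
        stack += np.roll(traces[i], -int(round(s * xi / dt)))
    return np.max(stack ** 2)

# Grid search over trial slownesses; the true value aligns the pulses
trials = np.arange(0.0, 4.05e-4, 0.5e-4)
best = trials[np.argmax([beam_power(traces, x, dt, s) for s in trials])]
```

In the multiple-array scheme of the abstract, each array's beam would additionally be constrained by the relative travel times from the common target to every array.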

  11. Simultaneous use of multiple seismic arrays

    Science.gov (United States)

    Stipčević, J.; Kennett, B. L. N.; Tkalčić, H.

    2017-01-01

    Seismic arrays provide an important means of enhancing seismic signals and determining the directional properties of the wavefield by beam-forming. When multiple arrays are to be used together, the viewpoint needs to be modified from looking outwards from each array to focusing on a specific target area and so constraining the portions of the waveforms to be analysed. Beam-forming for each array is supplemented by the relative time constraints for propagation from the target to each array to provide tight spatial control. Simultaneous multiple array analysis provides a powerful tool for source characterisation, and for structural analysis of scatterers as virtual sources. The multiple array concept allows us to illuminate a specific point in the Earth from many different directions and thus map detailed patterns of heterogeneity in the Earth. Furthermore, illumination of the structure from multiple directions using data from the same event minimizes source effects to provide clearer images of heterogeneity. The analysis is based on a similar concept to the back-projection technique, where a part of the seismic wavetrain is mapped to a specific point in space by ray-tracing. In contrast to classic back-projection where the incoming energy is mapped onto a horizontal plane with limited vertical resolution, the multi-array method controls depth response by combining relative time constraints between the arrays and conventional beam-forming. We illustrate this approach with application to two earthquakes at moderate depth. The results show that the use of simultaneous multiple arrays can provide improvement both in signal quality and resolution, with the additional benefit of being able to accurately locate the source of the incoming energy and map large areas with only a limited number of such arrays.

  12. Considerations on Seismic Design of Installations using Natural Gas Fuel

    Directory of Open Access Journals (Sweden)

    Adriana Tokar

    2016-10-01

    Full Text Available The paper presents issues relating to the existing standards underlying seismic design restrictions for non-structural components (NSC) of constructions. Measures are presented that can be implemented to maintain a high level of safety, in case of earthquake, for natural gas installations, which, due to the flammability of the fuel, carry some risk of fire or explosion. The purpose of this paper is to highlight the need for seismic design of installations using natural gas fuel in new buildings, but also to review the existing installations in buildings by taking mandatory measures.

  13. First level seismic microzonation map of Chennai city – a GIS approach

    Directory of Open Access Journals (Sweden)

    G. P. Ganapathy

    2011-02-01

    Full Text Available Chennai city, the fourth largest metropolis in India, is the focus of economic, social and cultural development and the capital of the State of Tamil Nadu. The city has seen multi-dimensional growth in the development of its infrastructure and population. The area of Chennai has experienced moderate earthquakes in the historical past. The Bureau of Indian Standards also upgraded the seismic status of Chennai from Low Seismic Hazard (Zone II) to Moderate Seismic Hazard (Zone III) (BIS: 1893 (2001)). In this connection, a first level seismic microzonation map of Chennai city has been produced on a GIS platform using the themes, viz, Peak Ground Acceleration (PGA), shear wave velocity at 3 m, geology, groundwater fluctuation and bedrock depth. The nearby potential seismic sources were identified from a remote-sensing study and seismo-tectonic details from published literature. The peak ground acceleration for these seismic sources was estimated based on the attenuation relationship, and the maximum PGA for Chennai is 0.176 g. The groundwater fluctuation of the city varies from 0-4 m below ground level. The depth-to-bedrock configuration shows troughs and ridges in the bedrock topography all over the city. The seismic microzonation analysis involved grid datasets (the discrete datasets from different themes were converted to grids) to compute the final seismic hazard grid through integration and weightage analysis of the source themes. Chennai city has been classified into three broad zones, viz, High, Moderate and Low Seismic Hazard. The High Seismic Hazard is concentrated in a few places in the western central part of the city. The moderate hazard areas are oriented in a NW-SE direction in the western part. The southern and eastern parts have low seismic hazard. The result of the study may be used as first-hand information in selecting appropriate earthquake-resistant features in designing forthcoming new buildings against seismic

  14. First level seismic microzonation map of Chennai city - a GIS approach

    Science.gov (United States)

    Ganapathy, G. P.

    2011-02-01

    Chennai city, the fourth largest metropolis in India, is the focus of economic, social and cultural development and the capital of the State of Tamil Nadu. The city has seen multi-dimensional growth in the development of its infrastructure and population. The area of Chennai has experienced moderate earthquakes in the historical past. The Bureau of Indian Standards also upgraded the seismic status of Chennai from Low Seismic Hazard (Zone II) to Moderate Seismic Hazard (Zone III) (BIS: 1893 (2001)). In this connection, a first level seismic microzonation map of Chennai city has been produced on a GIS platform using the themes, viz, Peak Ground Acceleration (PGA), shear wave velocity at 3 m, geology, groundwater fluctuation and bedrock depth. The nearby potential seismic sources were identified from a remote-sensing study and seismo-tectonic details from published literature. The peak ground acceleration for these seismic sources was estimated based on the attenuation relationship, and the maximum PGA for Chennai is 0.176 g. The groundwater fluctuation of the city varies from 0-4 m below ground level. The depth-to-bedrock configuration shows troughs and ridges in the bedrock topography all over the city. The seismic microzonation analysis involved grid datasets (the discrete datasets from different themes were converted to grids) to compute the final seismic hazard grid through integration and weightage analysis of the source themes. Chennai city has been classified into three broad zones, viz, High, Moderate and Low Seismic Hazard. The High Seismic Hazard is concentrated in a few places in the western central part of the city. The moderate hazard areas are oriented in a NW-SE direction in the western part. The southern and eastern parts have low seismic hazard.
    The result of the study may be used as first-hand information in selecting appropriate earthquake-resistant features in designing forthcoming new buildings against seismic ground motion of the
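
The "integration and weightage analysis" step common to microzonation studies like this one amounts to a weighted overlay of normalized theme grids. The theme names follow the abstract, but the grids, weights, and class breaks below are illustrative assumptions, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(42)
shape = (50, 50)   # toy raster over the study area

# Illustrative theme grids, each pre-normalized to [0, 1]
# (higher = more hazardous; e.g. shear-wave velocity would be inverted)
themes = {
    "pga":           rng.random(shape),
    "vs_3m":         rng.random(shape),
    "geology":       rng.random(shape),
    "groundwater":   rng.random(shape),
    "bedrock_depth": rng.random(shape),
}
# Assumed weights (must sum to 1) reflecting relative importance
weights = {"pga": 0.35, "vs_3m": 0.25, "geology": 0.2,
           "groundwater": 0.1, "bedrock_depth": 0.1}

# Weighted overlay: cell-by-cell weighted sum of the theme grids
hazard = sum(weights[k] * themes[k] for k in themes)

# Classify into the three broad zones used in the study
zone = np.digitize(hazard, bins=[1 / 3, 2 / 3])   # 0=Low, 1=Moderate, 2=High
```

In a real GIS workflow each theme would first be reclassified onto a common hazard scale by expert judgment rather than used raw, but the overlay arithmetic is the same.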

  15. Retrieving impulse response function amplitudes from the ambient seismic field

    Science.gov (United States)

    Viens, Loïc; Denolle, Marine; Miyake, Hiroe; Sakai, Shin'ichi; Nakagawa, Shigeki

    2017-07-01

    Seismic interferometry is now widely used to retrieve the impulse response function of the Earth between two distant seismometers. The phase information has been the focus of most passive imaging studies, as conventional seismic tomography uses traveltime measurements. The amplitude information, however, is harder to interpret because it strongly depends on the distribution of ambient seismic field sources and on the multitude of processing methods. Our study focuses on the latter by comparing the amplitudes of the impulse response functions calculated between seismic stations in the Kanto sedimentary basin, Japan, using several processing techniques. This region provides a unique natural laboratory to test the reliability of the amplitudes with complex wave propagation through the basin, and dense observations from the Metropolitan Seismic Observation network. We compute the impulse response functions using the cross correlation, coherency and deconvolution techniques of the raw ambient seismic field and the cross correlation of 1-bit normalized data. To validate the amplitudes of the impulse response functions, we use a shallow Mw 5.8 earthquake that occurred on the eastern edge of Kanto Basin and close to a station that is used as the virtual source. Both S and surface waves are retrieved in the causal part of the impulse response functions computed with all the different techniques. However, the amplitudes obtained from the deconvolution method agree better with those of the earthquake. Despite the expected wave attenuation due to the soft sediments of the Kanto Basin, seismic amplification caused by the basin geometry dominates the amplitudes of S and surface waves and is captured by the ambient seismic field. To test whether or not the anticausal part of the impulse response functions from deconvolution also contains reliable amplitude information, we use another virtual source located on the western edge of the basin. 
We show that the surface wave amplitudes
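
The processing choices compared in this record can be illustrated on synthetic data. The sketch below contrasts the cross-correlation and water-level deconvolution estimates of the impulse response for the simplest possible case, a delayed copy of a random wavefield; the delay, record length, and regularization level are assumptions. Deconvolution divides out the source spectrum, which is why its amplitudes are less sensitive to the spectrum of the ambient field.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1024
lag = 37                                   # true propagation delay (samples)
ua = rng.normal(size=n)                    # record at the "virtual source"
ub = np.roll(ua, lag)                      # receiver record: delayed copy

Ua, Ub = np.fft.rfft(ua), np.fft.rfft(ub)

# Cross-correlation estimate of the impulse response (frequency domain)
cc = np.fft.irfft(Ub * np.conj(Ua), n)

# Deconvolution with water-level regularization of the source spectrum
eps = 1e-3 * np.mean(np.abs(Ua) ** 2)
dec = np.fft.irfft(Ub * np.conj(Ua) / (np.abs(Ua) ** 2 + eps), n)

lag_cc, lag_dec = int(np.argmax(cc)), int(np.argmax(dec))
```

Both estimates peak at the true delay here; with a colored (non-white) ambient field, the cross-correlation peak would be smeared by the field's autocorrelation while the deconvolution stays spiky, at the cost of noise amplification controlled by the water level.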

  16. Midget Seismic in Sandbox Models

    Science.gov (United States)

    Krawczyk, C. M.; Buddensiek, M. L.; Philipp, J.; Kukowski, N.; Oncken, O.

    2008-12-01

    Analog sandbox simulation has been applied to study geological processes to provide qualitative and quantitative insights into specific geological problems. In nature, the structures that are simulated in those sandbox models are often inferred from seismic data. With the study introduced here, we want to combine analog sandbox simulation techniques with seismic physical modeling of those sandbox models. The long-term objectives of this approach are (1) imaging of seismic and seismological events of actively deforming and static 3D analog models, and (2) assessment of the transferability of the model data to field data in order to improve field data acquisition and interpretation according to the addressed geological problem. To achieve this objective, a new midget-seismic facility for laboratory use was designed and developed, comprising a seismic tank, a PC control unit including piezo-electric transducers, and a positioning system. The first experiments are aimed at studying the wavefield properties of the piezo-transducers in order to investigate their feasibility for seismic profiling. The properties investigated are their directionality and the change of waveform due to their size (5-12 mm) compared to the wavelengths. Material properties and the effects of wave propagation in an-/isotropic media will then be investigated by physical studies, before we finally start using different seismic imaging and processing techniques on static and actively deforming 3D analog models.

  17. [Conventional treatment of gout].

    Science.gov (United States)

    Curković, Bozidar

    2012-01-01

    Gout is a severely disabling disorder, leading to poor quality of life and functional impairment with repercussions on physical activity, social functioning and emotional health. On the other hand, gout is probably the best understood and most manageable of all common systemic rheumatic diseases. The treatment of gout is appropriately divided into treatment of the acute attack and prevention of further attacks and of joint damage. Standard management of acute attacks of gout consists of rest, application of ice to the affected joint, and prescription of non-steroidal anti-inflammatory drugs or glucocorticoids, which should be started immediately to be most effective. Colchicine and interleukin-1 inhibitors can be used as alternatives when indicated and available. Urate-lowering therapy (usually allopurinol) is indicated to treat recurrent gout attacks, chronic arthropathy, tophi and uric acid renal lithiasis.

  18. Cosmology and Convention

    Science.gov (United States)

    Merritt, David

    2017-02-01

    I argue that some important elements of the current cosmological model are "conventionalist" in the sense defined by Karl Popper. These elements include dark matter and dark energy; both are auxiliary hypotheses that were invoked in response to observations that falsified the standard model as it existed at the time. The use of conventionalist stratagems in response to unexpected observations implies that the field of cosmology is in a state of 'degenerating problemshift' in the language of Imre Lakatos. I show that the 'concordance' argument, often put forward by cosmologists in support of the current paradigm, is weaker than the convergence arguments that were made in the past in support of the atomic theory of matter or the quantization of energy.

  19. Seismic hazard assessment: Issues and alternatives

    Science.gov (United States)

    Wang, Z.

    2011-01-01

    Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implication to society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences to society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
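
The probabilistic machinery under debate in this record reduces, in its simplest form, to summing exceedance probabilities over an earthquake recurrence model. The sketch below builds a toy hazard curve from a single source with Gutenberg-Richter magnitude rates and a simple lognormal ground-motion model; every coefficient here is an illustrative assumption, not a published relation.

```python
import math

# Gutenberg-Richter occurrence rates in 0.1-magnitude bins, M 5.0-7.5
mags = [5.0 + 0.1 * i for i in range(26)]
a_gr, b_gr = 4.0, 1.0                     # assumed G-R: log10 N = a - b*M
rates = [10 ** (a_gr - b_gr * m) * (1 - 10 ** (-b_gr * 0.1)) for m in mags]

def median_pga(m, r_km):
    """Median PGA (g) from an assumed, purely illustrative GMPE."""
    return math.exp(-4.0 + 1.0 * m - 1.2 * math.log(r_km + 10.0))

def p_exceed(a, m, r_km, sigma_ln=0.6):
    """P(PGA > a | M=m, R=r), lognormal ground-motion variability."""
    z = (math.log(a) - math.log(median_pga(m, r_km))) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_rate(a, r_km=30.0):
    """Annual rate of exceeding PGA level a: sum over magnitude bins."""
    return sum(nu * p_exceed(a, m, r_km) for nu, m in zip(rates, mags))

curve = [(a, annual_rate(a)) for a in (0.05, 0.1, 0.2, 0.4)]
```

A DSHA-style answer, by contrast, would simply report the ground motion from the single controlling scenario (e.g. the maximum magnitude at the closest distance) without the rate integration.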

  20. Seismic modeling of carbonate outcrops

    Energy Technology Data Exchange (ETDEWEB)

    Stafleu, J.; Schlager, W.; Campbell, E.; Everts, A.J. (Vrije Universiteit, Amsterdam (Netherlands))

    1993-09-01

    Traditionally, seismic modeling has concentrated on one-dimensional borehole modeling and two-dimensional forward modeling of basic structural-stratigraphic schemes, which are directly compared with real seismic data. Two-dimensional seismic models based on outcrop observations may aid in bridging the gap between the detail of the outcrop and the low resolution of seismic lines. Examples include the Dolomites (north Italy), the High Atlas (Morocco), the Vercors (southeast France) and Last Chance Canyon (New Mexico). The seismic models generally are constructed using the following procedure: (1) construction of a detailed lithological model based on direct outcrop observations; (2) division of the lithological model into lithostratigraphic units, using master bedding planes and important facies transitions as boundaries; (3) assignment of petrophysical properties to these lithostratigraphic units; (4) computation of time sections of reflectivity, using different modeling techniques; and (5) convolution with source wavelets of different frequencies. The lithological detail modeled in the case studies led to some striking results, particularly the discovery of pseudo-unconformities. Pseudo-unconformities appear as unconformities in seismic data but correspond to rapid changes of dip and facies in outcrop. None of the outcrop geometries studied were correctly portrayed seismically at 25 Hz frequency. However, in some instances the true relationship would emerge gradually at frequencies of 50 to 100 Hz. These results demonstrate that detailed, outcrop-derived seismic models can reveal what stratigraphic relationships and features are likely to be resolved under ideal or less ideal conditions, and what pitfalls may befall the interpreter of real seismic data.
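
Steps (3)-(5) of the procedure above are the classic convolutional model: impedances from assigned petrophysical properties, normal-incidence reflection coefficients at unit boundaries, and convolution with wavelets of different peak frequencies. The velocities, densities, reflector times, and Ricker wavelet below are illustrative assumptions.

```python
import numpy as np

def ricker(f, dt, length=0.2):
    """Zero-phase Ricker wavelet with peak frequency f (Hz)."""
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

# Step (3): acoustic impedance of a stack of lithostratigraphic units
vel = np.array([3000.0, 4500.0, 3500.0, 5500.0])   # m/s (assumed)
rho = np.array([2400.0, 2600.0, 2500.0, 2700.0])   # kg/m^3 (assumed)
z = vel * rho

# Step (4): normal-incidence reflectivity at each interface
rc = (z[1:] - z[:-1]) / (z[1:] + z[:-1])

# Step (5): place reflectors in two-way time and convolve with wavelets
dt = 0.002
series = np.zeros(500)
series[[100, 220, 360]] = rc                        # assumed reflector times
trace_25hz = np.convolve(series, ricker(25.0, dt), mode="same")
trace_75hz = np.convolve(series, ricker(75.0, dt), mode="same")
```

Comparing the 25 Hz and 75 Hz traces reproduces the paper's central point in miniature: closely spaced or low-contrast interfaces that interfere at 25 Hz separate into distinct reflections as the wavelet shortens.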

  1. Integrated system for seismic evaluations

    Energy Technology Data Exchange (ETDEWEB)

    Xu, J.; Philippacopoulos, A.J.; Miller, C.A.; Costantino, C.J.; Graves, H.

    1989-01-01

    This paper describes the various features of the Seismic Module of the CARES system (Computer Analysis for Rapid Evaluation of Structures). This system was developed by Brookhaven National Laboratory (BNL) for the US Nuclear Regulatory Commission to perform rapid evaluations of structural behavior and capability of nuclear power plant facilities. CARES is structured in a modular format. Each module performs a specific type of analysis, i.e., static or dynamic, linear or nonlinear, etc. This paper describes the features of the Seismic Module in particular. The development of the Seismic Module of the CARES system is based on an approach which incorporates all major aspects of seismic analysis currently employed by the industry into an integrated system that allows for interactively carrying out computations of structural response to seismic motions. The code operates on a PC and has multi-graphics capabilities. It has been designed with user-friendly features and allows for interactive manipulation of various analysis phases during the seismic design process. The capabilities of the Seismic Module include (a) generation of artificial time histories compatible with given design ground response spectra, (b) development of Power Spectral Density (PSD) functions associated with the seismic input, (c) deconvolution analysis using vertically propagating shear waves through a given soil profile, and (d) development of in-structure response spectra or corresponding PSDs. It should be pointed out that these types of analyses can also be performed individually by using available computer codes such as FLUSH, SAP, etc. The uniqueness of CARES, however, lies in its ability to perform all required phases of the seismic analysis in an integrated manner. 5 refs., 6 figs.
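
Capability (d), computing a response spectrum from a time history, can be sketched with a single-degree-of-freedom oscillator integrated by explicit central differences. This is an illustrative sketch, not CARES code; the damping ratio, toy ground motion, and time step are assumptions (central differences require dt well below the oscillator period for stability).

```python
import numpy as np

def pseudo_accel(ag, dt, f_hz, zeta=0.05):
    """Pseudo-spectral acceleration Sa = w^2 * max|u| of a damped SDOF
    oscillator (u'' + 2*zeta*w*u' + w^2*u = -ag), central differences."""
    w = 2.0 * np.pi * f_hz
    u_prev, u, umax = 0.0, 0.0, 0.0
    c1 = 1.0 + zeta * w * dt
    for a_g in ag[:-1]:
        u_next = (-a_g * dt ** 2 + (2.0 - (w * dt) ** 2) * u
                  + (zeta * w * dt - 1.0) * u_prev) / c1
        u_prev, u = u, u_next
        umax = max(umax, abs(u))
    return w * w * umax

dt = 0.001
t = np.arange(0.0, 4.0, dt)
ag = np.sin(2.0 * np.pi * 1.0 * t)           # toy 1 Hz ground motion, peak 1

sa_rigid = pseudo_accel(ag, dt, f_hz=50.0)   # stiff oscillator -> ~max|ag|
sa_res = pseudo_accel(ag, dt, f_hz=1.0)      # resonant amplification
```

Sweeping f_hz over a grid of oscillator frequencies and plotting Sa against period gives the response spectrum; the high-frequency limit recovering the peak ground acceleration is the standard sanity check.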

  2. Exploring Shakespeare: Dynamic Drama Conventions in Teaching "Romeo and Juliet."

    Science.gov (United States)

    Robinson, Sophie

    1999-01-01

    Outlines a Year 10 unit on teaching "Romeo and Juliet" based on standard experiential conventions which include the following: (1) Teacher in Role, (2) Soundscaping, (3) Freeze Frames, (4) Alter Egos, (5) Hot Seating, and (6) Role Playing. Suggests that these conventions can be applied to the study of any Shakespearean play. (NH)

  3. Seismic calm predictors of rockburst

    Science.gov (United States)

    Zmushko, Tatjana; Turuntaev, Sergey; Kulikov, Vladimir

    2013-04-01

    The method of "seismic calm" is widely used for forecasting strong natural earthquakes (Sobolev G.A., Ponomarev A.V., 2003). "Seismic calm" means that during some time period before the main earthquake, smaller events (with energies several orders of magnitude smaller than that of the main earthquake) do not occur. In this paper the applicability of the seismic calm method to rockburst forecasting is considered. Three deposits (with seismicity induced by mining) are analyzed: the Tashtagol iron deposit (Altai, Russia) and the Vorkuta (North Ural, Russia) and Barentsburg (Spitsbergen, Norway) coalmines. Local seismic monitoring networks are installed at each of them. The catalogues of seismic events were processed and strong events (rockbursts) were studied (Vorkuta M=2.3; Barentsburg M=1.8; Tashtagol M=1.9-2.2). All catalogues cover at least two years (Vorkuta 2008-2011, Barentsburg 2011-2012, Tashtagol 2002-2012). It was found that the number of seismic events with magnitudes M=0.5-1 decreased in the month before the main strong event at the Vorkuta coalmines. This event was not directly related to coal mining; its epicenter was located to the side of the area of coal mining. In the Barentsburg mine the rockburst was not as strong as in Vorkuta. The number of events with energies M=0.5 decreased slightly before the rockburst, but not as clearly as in the Vorkuta case. Seismic events with high energies often occur at the Tashtagol iron deposit. The mining methods used there differ from those at the coal deposits. At coalmines the mining combine runs from edge to edge of the wall, cutting off the coal. The iron deposit considered is developed by a method of block blasting. Not all rockbursts occur immediately after the blasting, so the problem of rockburst prediction is important for mining safety. To find rockburst precursors it is necessary to separate the events occurring due to the block blasting from the seismic events due to relocation of stresses in
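
The rate-drop detection at the heart of the seismic calm method can be sketched as a trailing-window comparison against the long-term background rate. The synthetic catalog, window length, and threshold below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic catalog: Poissonian background of small events (5/day),
# a one-month quiescence, then a "main" rockburst on day 330
days = 365
rate = np.full(days, 5.0)
rate[300:330] = 0.5                 # assumed seismic calm interval
counts = rng.poisson(rate)          # daily event counts

def quiescence_days(counts, window=30, threshold=0.5):
    """Flag days whose trailing-window mean rate falls below
    `threshold` times the catalog's long-term mean rate."""
    background = counts.mean()
    flagged = []
    for d in range(window, len(counts)):
        if counts[d - window:d].mean() < threshold * background:
            flagged.append(d)
    return flagged

alarms = quiescence_days(counts)    # clusters in the pre-event month
```

Real applications must first de-cluster the catalog (here, remove blasting-related events) so that the rate drop reflects stress state rather than mining schedule, which is exactly the separation problem the abstract ends on.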

  4. A probabilistic seismic hazard map of India and adjoining regions

    Directory of Open Access Journals (Sweden)

    H. K. Gupta

    1999-06-01

    Full Text Available This paper presents the results of an exercise carried out under GSHAP over India and adjoining regions bounded by 0°N-40°N and 65°E-100°E. A working catalogue of main shocks was prepared by merging the local catalogues with the NOAA catalogue, and removing duplicates, aftershocks and earthquakes without any magnitude. Eighty-six potential seismic source zones were delineated based on the major tectonic features and seismicity trends. Using the probabilistic hazard assessment approach of McGuire, adopted by GSHAP, the Peak Ground Accelerations (PGA) were computed for 10% probability of exceedance in 50 years, at locations defined by a grid of 0.5° x 0.5°. Since no reliable estimates of attenuation values are available for the Indian region, the attenuation relation of Joyner and Boore (1981) was used. The PGA values over the grid points were contoured to obtain a seismic hazard map. The hazard map depicts that a majority of the northern Indian plate boundary region and the Tibetan plateau region have a hazard level of the order of 0.25 g, with prominent highs of the order of 0.35-0.4 g in the seismically more active zones like the Burmese arc, northeastern India and the Hindukush region. In the Indian shield, the regional seismic hazard, covering a major area, is of the order of 0.05-0.1 g, whereas some areas like Koyna depict hazard at the level of 0.2 g. The present map can be converted into a conventional seismic zoning map having four zones with zone factors of 0.1 g, 0.2 g, 0.3 g and 0.4 g respectively.
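
The Joyner and Boore (1981) attenuation relation used here is often quoted in the form below. The coefficients are written from a commonly cited version and should be treated as assumptions to be verified against the original paper before any real use.

```python
import math

def pga_joyner_boore_1981(m, d_km):
    """Median PGA (g) from the Joyner & Boore (1981) relation as commonly
    quoted: log10 PGA = -1.02 + 0.249*M - log10(r) - 0.00255*r,
    with r = sqrt(d^2 + 7.3^2) and d the closest distance to the
    surface projection of the rupture (km). Coefficients are assumptions
    to be checked against the original publication."""
    r = math.sqrt(d_km ** 2 + 7.3 ** 2)
    log_pga = -1.02 + 0.249 * m - math.log10(r) - 0.00255 * r
    return 10.0 ** log_pga

# Attenuation with distance for an M 6.5 event
curve = [(d, pga_joyner_boore_1981(6.5, d)) for d in (5, 10, 20, 50, 100)]
```

In the GSHAP workflow, a relation like this is evaluated for every source zone and grid point, and the results are aggregated into the 10%-in-50-years exceedance map.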

  5. Reassigned time-frequency peak filtering for seismic random noise attenuation

    Science.gov (United States)

    Lin, H.; Li, Y.; Ma, H.

    2012-12-01

    -Ville distribution (PWVD) used by conventional TFPF. Moreover, the time-frequency reassignment method is adopted to concentrate the SPWVD on the instantaneous frequency (IF) of the encoded signal, yielding a more accurate filtered signal. The coordinates of the reassigned SPWVD are the centre of gravity of the TFD, which is the estimate of the IF. Subsequently, RTFPF takes the peaks of the reassigned SPWVD over frequency to obtain the filtered signal. RTFPF was applied to synthetic seismic data and to a common-shot-point gather, and the results were compared with those of conventional TFPF using the PWVD. The preliminary results appear encouraging in terms of random noise attenuation and signal preservation. An example on a field seismic record is also presented.
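    The core of any TFPF variant is the frequency-modulation encoding: the trace is embedded as the instantaneous frequency (IF) of a unit-amplitude analytic signal, and peak-picking a time-frequency distribution of that signal estimates the IF, i.e. the filtered trace. The sketch below shows only the encode/decode round trip (noise-free, recovering the IF directly from phase increments rather than from a reassigned SPWVD, which requires far more machinery); the scaling parameter mu is an assumed illustration value.

```python
import numpy as np

def fm_encode(x, mu=0.25):
    """Encode a real trace x (scaled so |mu*x| < 0.5) as a unit-amplitude
    FM analytic signal whose instantaneous frequency is mu*x."""
    phase = 2.0 * np.pi * mu * np.cumsum(x)
    return np.exp(1j * phase)

def if_decode(z, mu=0.25):
    """Recover the trace from the wrapped phase increments of the FM signal."""
    dphi = np.angle(z[1:] * np.conj(z[:-1]))
    return dphi / (2.0 * np.pi * mu)

t = np.linspace(0.0, 1.0, 500)
x = np.sin(2.0 * np.pi * 5.0 * t)       # a clean 5 Hz "reflection" signal
x_rec = if_decode(fm_encode(x))         # reproduces x[1:] to high precision
```

    In the full method, the IF would instead be estimated by taking the frequency of the maximum of the (reassigned) SPWVD at each time sample, which is where the noise suppression of TFPF comes from.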

  6. Seismicity of the Earth 1900-2007

    Science.gov (United States)

    Tarr, Arthur C.; Villaseñor, Antonio; Furlong, Kevin P.; Rhea, Susan; Benz, Harley M.

    2010-01-01

    This map illustrates more than one century of global seismicity in the context of global plate tectonics and the Earth's physiography. Primarily designed for use by earth scientists and engineers interested in earthquake hazards of the 20th and early 21st centuries, this map provides a comprehensive overview of strong earthquakes since 1900. The map clearly identifies the location of the 'great' earthquakes (M8.0 and larger) and the rupture area, if known, of the M8.3 or larger earthquakes. The earthquake symbols are scaled proportional to the moment magnitude and therefore to the area of faulting, thus providing a better understanding of the relative sizes and distribution of earthquakes in the magnitude range 5.5 to 9.5. Plotting the known rupture area of the largest earthquakes also provides a better appreciation of the extent of some of the most famous and damaging earthquakes in modern history. All earthquakes shown on the map were carefully relocated using a standard earth reference model and standardized location procedures, thereby eliminating gross errors and biases in locations of historically important earthquakes that are often found in numerous seismicity catalogs.

  7. Reevaluation of the Seismicity and seismic hazards of Northeastern Libya

    Science.gov (United States)

    Ben Suleman, abdunnur; Aousetta, Fawzi

    2014-05-01

    Libya, located at the northern margin of the African continent, underwent many episodes of orogenic activity that affected and shaped the geological setting of the country. This study represents a detailed investigation focusing on the seismicity and its implications for earthquake hazards of Northeastern Libya. At the end of 2005 the Libyan National Seismological Network started functioning with 15 stations. The seismicity of the area under investigation was reevaluated using data recorded by the recently established network. The Al-Maraj earthquake that occurred on May 22nd, 2005 was analyzed. This earthquake was located in a known seismically active area, the site of the well-known 1963 earthquake that killed over 200 people. Earthquakes were plotted and the resulting maps were interpreted and discussed. The level of seismic activity is higher in some areas, such as the city of Al-Maraj, and the offshore areas north of Al-Maraj appear to have higher seismic activity. It is highly recommended that the recent earthquake activity be considered in seismic hazard assessments for the northeastern part of Libya.

  8. Validating induced seismicity forecast models - Induced Seismicity Test Bench

    CERN Document Server

    Kiraly-Proag, Eszter; Gischig, Valentin; Wiemer, Stefan; Karvounis, Dimitrios; Doetsch, Joseph

    2016-01-01

    Induced earthquakes often accompany fluid injection, and the seismic hazard they pose threatens various underground engineering projects. Models to monitor and control induced seismic hazard with traffic light systems should be probabilistic, forward-looking, and updated as new data arrive. In this study, we propose an Induced Seismicity Test Bench to test and rank such models; this test bench can be used for model development, model selection, and ensemble model building. We apply the test bench to data from the Basel 2006 and Soultz-sous-Forêts 2004 geothermal stimulation projects, and we assess forecasts from two models: Shapiro and Smoothed Seismicity (SaSS) and Hydraulics and Seismics (HySei). These models incorporate a different mix of physics-based elements and stochastic representation of the induced sequences. Our results show that neither model is fully superior to the other. Generally, HySei forecasts the seismicity rate better after shut-in, but is only mediocre at forecasting the spatial distri...
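    The idea of ranking forecast models against observed induced seismicity can be illustrated with a toy scoring rule. This is a generic sketch, not the authors' actual test bench: it scores each model's forecast rates per time bin against observed event counts with a Poisson log-likelihood, a common choice in earthquake forecast evaluation. All numbers are synthetic.

```python
import math

def poisson_loglike(forecast_rates, observed_counts):
    """Sum of log Poisson probabilities of the observed counts given the
    forecast rates (all rates must be > 0)."""
    ll = 0.0
    for lam, n in zip(forecast_rates, observed_counts):
        ll += n * math.log(lam) - lam - math.lgamma(n + 1)
    return ll

observed = [4, 7, 2, 0, 1]            # events per bin (synthetic)
model_a  = [3.5, 6.0, 2.5, 0.5, 1.0]  # hypothetical well-matched forecast
model_b  = [1.0, 1.0, 1.0, 1.0, 1.0]  # uninformative flat baseline

score_a = poisson_loglike(model_a, observed)
score_b = poisson_loglike(model_b, observed)
# the forecast closer to the observations receives the higher log-likelihood
```

    A real test bench would additionally score spatial and magnitude distributions and update forecasts as injection proceeds, which is precisely what makes such ranking non-trivial.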

  9. Analysis and models of pre-injection surface seismic array noise recorded at the Aquistore carbon storage site

    Science.gov (United States)

    Birnie, Claire; Chambers, Kit; Angus, Doug; Stork, Anna L.

    2016-08-01

    Noise is a persistent feature in seismic data and so poses challenges in extracting increased accuracy in seismic images and physical interpretation of the subsurface. In this paper, we analyse passive seismic data from the Aquistore carbon capture and storage pilot project permanent seismic array to characterise, classify and model seismic noise. We perform noise analysis for a three-month subset of passive seismic data from the array and provide conclusive evidence that the noise field is not white, stationary, or Gaussian; characteristics commonly yet erroneously assumed in most conventional noise models. We introduce a novel noise modelling method that provides a significantly more accurate characterisation of real seismic noise compared to conventional methods, which is quantified using the Mann-Whitney-White statistical test. This method is based on a statistical covariance modelling approach created through the modelling of individual noise signals. The identification of individual noise signals, broadly classified as stationary, pseudo-stationary and non-stationary, provides a basis on which to build an appropriate spatial and temporal noise field model. Furthermore, we have developed a workflow to incorporate realistic noise models within synthetic seismic data sets providing an opportunity to test and analyse detection and imaging algorithms under realistic noise conditions.
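    The statistical comparison the abstract mentions can be sketched with a simplified, self-contained Mann-Whitney U statistic (the paper's Mann-Whitney-White test builds on this non-parametric test; the implementation below ignores ties and significance computation, and all data are synthetic): a stationary white-noise model window is compared against an amplitude-modulated, non-stationary "recorded" window.

```python
import numpy as np

def mann_whitney_u(a, b):
    """Mann-Whitney U statistic for sample a vs b via rank sums
    (continuous data assumed, so no tie correction)."""
    data = np.concatenate([a, b])
    ranks = np.argsort(np.argsort(data)) + 1.0   # ranks 1..N
    r_a = ranks[: len(a)].sum()                  # rank sum of sample a
    return r_a - len(a) * (len(a) + 1) / 2.0

rng = np.random.default_rng(0)
white = rng.normal(0.0, 1.0, 500)                          # stationary model
real = rng.normal(0.0, 1.0, 500) * np.linspace(1, 3, 500)  # non-stationary

u = mann_whitney_u(np.abs(real), np.abs(white))
# Under H0 (identical distributions) E[U] = 500*500/2 = 125000; a large
# deviation indicates the white model mis-fits the non-stationary noise.
```

    A noise model judged on such a statistic can then be iterated until its amplitude distribution is indistinguishable from the field data, which is the quantification step the paper describes.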

  10. Seismic rupture modelling, strong motion prediction and seismic hazard assessment: fundamental and applied approaches; Modelisation de la rupture sismique, prediction du mouvement fort, et evaluation de l'alea sismique: approches fondamentale et appliquee

    Energy Technology Data Exchange (ETDEWEB)

    Berge-Thierry, C

    2007-05-15

    The defence to obtain the 'Habilitation a Diriger des Recherches' is a synthesis of the research work performed since the end of my PhD thesis in 1997. This synthesis covers the two years spent as a post-doctoral researcher at the Bureau d'Evaluation des Risques Sismiques at the Institut de Protection (BERSSIN), and the seven consecutive years as seismologist and head of the BERSSIN team. This work and the research project are presented in the framework of seismic risk, and particularly of seismic hazard assessment. Seismic risk combines seismic hazard and vulnerability. Vulnerability combines the strength of building structures and the human and economic consequences in case of structural failure. Seismic hazard is usually defined in terms of plausible seismic motion (soil acceleration or velocity) at a site for a given time period. Whether for the regulatory context or for structural specificity (conventional structure or high-risk construction), seismic hazard assessment needs: to identify and locate the seismic sources (zones or faults), to characterize their activity, and to evaluate the seismic motion to which the structure has to resist (including site effects). I specialized in the field of numerical strong-motion prediction using high-frequency seismic source modelling, and joining IRSN allowed me to work rapidly on the different tasks of seismic hazard assessment. Thanks to the practice of expertise and to participation in regulatory evolution (nuclear power plants, conventional and chemical structures), I have been able to work on empirical strong-motion prediction, including site effects. Specific questions related to the interface between seismologists and structural engineers are also presented, especially the quantification of uncertainties. This is part of the research work initiated to improve the selection of input ground motion for designing or verifying the stability of structures. (author)

  11. Multi scale seismic data correlation and integration with regional tectonic framework: example of the Piratininga Dome, SP, Brazil; Correlacao de dados sismicos multiescala e integracao com arcabouco tectonico regional: exemplo da area do Domo de Piratininga, SP

    Energy Technology Data Exchange (ETDEWEB)

    Campos, Adriane Fatima de; Bartoszeck, Marcelo Kulevicz [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Programa de Pos-Graduacao em Geologia. Lab. de Analise de Bacias e Petrofisica]. E-mail: adrianefcampos@yahoo.com.br; Rostirolla, Sidnei Pires; Ferreira, Francisco Jose Fonseca; Romeiro, Marco Antonio Thoaldo [Universidade Federal do Parana (UFPR), Curitiba, PR (Brazil). Dept. de Geologia; Kiang, Chang Hung [UNESP, Rio Claro, SP (Brazil). Dept. de Geologia Aplicada

    2008-06-15

    The study area covers the Piratininga Dome, a structural high composed of a central horst bordered by faults. The main objective of this work was to establish a systematic multi-scale approach in which high-resolution seismic data were compared to conventional seismic data, digital terrain models and potential-field geophysical data. The subsurface data include an 80 km conventional seismic section and the well 1-80 km-1-SP. The Kingdom (Seismic Micro-Technology) software was used to interpret the seismic data in order to map the main horizons and faults. To test the multi-scale hypothesis, a high-resolution seismic line was acquired directly over the regional seismic trace. This detailed line is 1 km long and images down to 360 m depth. The seismic processing was based on a conventional flowchart for the CDP technique using Vista (Gedco) software. Shuttle Radar Topography Mission (SRTM) and aeromagnetic data from the Botucatu and Bauru projects were used for lineament interpretation. Comparison between horizons observed in the high-resolution and conventional seismic lines made it possible to test different alternatives for mapping structural and stratigraphic features. The resulting multi-scale hierarchy of geological elements enlarges the knowledge at reservoir resolution. The results of the interpretation indicate a close relationship between the regional structural framework and features observed in seismic data, and can be applied to enhance and guide studies of analogues of deep reservoirs. (author)

  12. Modernization of the USGS Hawaiian Volcano Observatory Seismic Processing Infrastructure

    Science.gov (United States)

    Antolik, L.; Shiro, B.; Friberg, P. A.

    2016-12-01

    The USGS Hawaiian Volcano Observatory (HVO) operates a Tier 1 Advanced National Seismic System (ANSS) seismic network to monitor, characterize, and report on volcanic and earthquake activity in the State of Hawaii. Upgrades at the observatory since 2009 have improved the digital telemetry network, computing resources, and seismic data processing with the adoption of the ANSS Quake Management System (AQMS). HVO aims to build on these efforts by further modernizing its seismic processing infrastructure and strengthening its ability to meet ANSS performance standards. Most notably, this will also allow HVO to support redundant systems, both onsite and offsite, in order to provide better continuity of operation during intermittent power and network outages. We are in the process of implementing a number of upgrades and improvements to HVO's seismic processing infrastructure, including: 1) virtualization of AQMS physical servers; 2) migration of server operating systems from Solaris to Linux; 3) consolidation of AQMS real-time and post-processing services to a single server; 4) upgrading the database from Oracle 10 to Oracle 12; and 5) upgrading to the latest Earthworm and AQMS software. These improvements will make server administration more efficient, minimize the hardware resources required by AQMS, simplify the Oracle replication setup, and provide better integration with HVO's existing state-of-health monitoring tools and backup system. Ultimately, it will provide HVO with the latest and most secure software available while making the software easier to deploy and support.

  13. Seismic assessment of Technical Area V (TA-V).

    Energy Technology Data Exchange (ETDEWEB)

    Medrano, Carlos S.

    2014-03-01

    The Technical Area V (TA-V) Seismic Assessment Report was commissioned as part of Sandia National Laboratories (SNL) Self Assessment Requirement per DOE O 414.1, Quality Assurance, for seismic impact on existing facilities at Technical Area-V (TA-V). SNL TA-V facilities are located on an existing Uniform Building Code (UBC) Seismic Zone IIB Site within the physical boundary of the Kirtland Air Force Base (KAFB). The document delineates a summary of the existing facilities with their safety-significant structure, system and components, identifies DOE Guidance, conceptual framework, past assessments and the present Geological and Seismic conditions. Building upon the past information and the evolution of the new seismic design criteria, the document discusses the potential impact of the new standards and provides recommendations based upon the current International Building Code (IBC) per DOE O 420.1B, Facility Safety and DOE G 420.1-2, Guide for the Mitigation of Natural Phenomena Hazards for DOE Nuclear Facilities and Non-Nuclear Facilities.

  14. Earthquake Activity - SEISMIC_DATA_IN: Seismic Refraction Data for Indiana (Indiana Geological Survey, Point Shapefile)

    Data.gov (United States)

    NSGIC GIS Inventory (aka Ramona) — SEISMIC_DATA_IN is a point shapefile created from a shapefile named SEISMIC_DATA, which was derived from a Microsoft Excel spreadsheet named SEISMIC_DECODED. The...

  15. Seismic Creep Images

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Seismic creep is the constant or periodic movement on a fault as contrasted with the sudden rupture associated with an earthquake. It is a usually slow deformation...

  16. Worldwide Marine Seismic Reflection Profiles

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — NGDC maintains a large volume of both Analog and Digital seismic reflection data. Currently only a limited number of lines are available online. Digital data include...

  17. Conventional colonoscopy; Konventionelle Kolonoskopie

    Energy Technology Data Exchange (ETDEWEB)

    Haefner, M. [Universitaetsklinik fuer Innere Medizin III, Klinische Abteilung fuer Gastroenterologie und Hepatologie, Wien (Austria)

    2008-02-15

    In the last 40 years colonoscopy has been the gold standard in the diagnosis of conditions affecting the large intestine. Its main disadvantages are the necessity of intestinal preparation and the pain not infrequently experienced by patients who are not sedated. Widespread use of sedation has made it possible to improve patient acceptance in recent years. Complications of colonoscopy are rare, and even the removal of large polyps is regarded as a safe procedure. One of the main problems of colonoscopy is that a considerable number of far from trivial polyps - up to 20% in the literature - are overlooked. New developments, such as higher-resolution video chips and chromoendoscopy, lead to a better diagnostic yield, especially for flat lesions. The rapidly developing sector of interventional colonoscopy in particular will ensure that colonoscopy continues to have an important place in the management of illnesses affecting the large intestine. (orig.)

  18. A study on the seismic fortification level of offshore platform in Bohai Sea of China

    Science.gov (United States)

    Lu, Y.

    2010-12-01

    The Chinese sea areas are important sources of offshore petroleum resources, and at the same time they are seismically active regions. Fixed offshore platforms (OPs) are the fundamental facilities for marine resource exploitation and are usually situated in a complex and severe environment, having to endure many environmental loads over their life span; therefore, damage to their structures may result in serious disasters. Among these environmental loads the seismic load has a tremendous destructive effect and is not predictable. When wind, wave and current are not overly severe, seismic resistance dominates the strength design of platforms. Furthermore, strong earthquakes have occurred recently, or in the historical record, in all the sea areas of oil/gas exploitation in China. Therefore, seismic design of fixed OPs is a very important issue. With the development of marine exploration and earthquake research in the sea areas, extensive studies on the seismotectonic environment and seismicity characteristics of the sea areas of China have been performed; meanwhile, more and more experience and data have been accumulated from OP design practice, which lays a foundation for studying and establishing a seismic design standard for OPs. This paper first gives an overall understanding of the seismic environment of the sea areas of China, then, taking the Bohai Sea seismic risk study as an example, introduces a so-called shape factor K to characterize the seismic risk distribution in sub-regions of the Bohai Sea. Based on the seismic design ground motions for 46 platforms in the Bohai Sea, a statistical analysis was performed for different peak ground acceleration (PGA) ratios at two different probability levels. In accordance with the two-stage design method, a scheme of two seismic design levels is proposed, and two seismic design objectives are established for the strength-level earthquake and the ductility-level earthquake respectively. By analogy with and comparison to the Chinese

  19. Visualization of volumetric seismic data

    Science.gov (United States)

    Spickermann, Dela; Böttinger, Michael; Ashfaq Ahmed, Khawar; Gajewski, Dirk

    2015-04-01

    Mostly driven by demands of high quality subsurface imaging, highly specialized tools and methods have been developed to support the processing, visualization and interpretation of seismic data. 3D seismic data acquisition and 4D time-lapse seismic monitoring are well-established techniques in academia and industry, producing large amounts of data to be processed, visualized and interpreted. In this context, interactive 3D visualization methods proved to be valuable for the analysis of 3D seismic data cubes - especially for sedimentary environments with continuous horizons. In crystalline and hard rock environments, where hydraulic stimulation techniques may be applied to produce geothermal energy, interpretation of the seismic data is a more challenging problem. Instead of continuous reflection horizons, the imaging targets are often steeply dipping faults, causing many diffractions. Without further preprocessing, these geological structures are often hidden behind the noise in the data. In this PICO presentation we will present a workflow consisting of data processing steps, which enhance the signal-to-noise ratio, followed by a visualization step based on the use of the commercially available general-purpose 3D visualization system Avizo. Specifically, we have used Avizo Earth, an extension to Avizo, which supports the import of seismic data in SEG-Y format and offers easy access to state-of-the-art 3D visualization methods at interactive frame rates, even for large seismic data cubes. In seismic interpretation using visualization, interactivity is a key requirement for understanding complex 3D structures. In order to enable easy communication of the insights gained during the interactive visualization process, animations of the visualized data were created which support the spatial understanding of the data.

  20. Seismic properties of polyphase rocks

    Science.gov (United States)

    Wang, Qin

    2005-11-01

    Knowledge about the seismic properties of polyphase rocks is fundamental for interpreting seismic refraction and reflection data and for establishing lithospheric structure and composition models. This study aims to obtain more precise relationships between seismic properties of rocks and controlling factors (e.g., pressure, temperature, mineralogical and chemical compositions, microstructure of rocks), particularly for those rocks imprinted by ultrahigh-pressure (UHP) metamorphism. These relationships will be very helpful to extrapolate calculated and measured seismic properties of rocks to depths of interest and to engender interpretations relevant to petrological composition and tectonic process. An Internet Database of Rock Seismic Properties (DRSP) was set up and a Handbook of Seismic Properties of Minerals, Rocks and Ores was published. They comprise almost all data available in the literature during the past 4 decades and can serve as a convenient, comprehensive and concise information source on physical properties of rocks to the earth sciences and geotechnical communities. Statistical results of the DRSP reveal the dependence of seismic properties on density, porosity, humidity, and mineralogical and chemical compositions. Using 16 different averaging methods, we calculated P-wave velocities of 696 dry samples according to the volume fraction and elastic constants of each constituent mineral. Although only 22 common minerals were taken into account in the computation, the calculated P-wave velocities agree well with laboratory values measured at about 300 MPa, where most microcracks are closed and the mean Vp of a polymineralic rock is exclusively controlled by its modal composition. However, none of these mixture rules can simultaneously fit measured P-wave velocities for all lithologies or at all pressures. Therefore, more prudence is required in selecting an appropriate mixture rule for calculation of seismic velocities of different rock types.
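    One family of the averaging methods the study compares can be sketched concretely: the Voigt-Reuss-Hill (VRH) average of mineral elastic moduli, weighted by modal (volume) fractions, from which an aggregate P-wave velocity follows. The two-mineral "rock" below is purely illustrative (quartz moduli are standard textbook values; the feldspar entry is an assumed generic composition), not one of the 696 samples in the study.

```python
import math

minerals = {
    # name: (volume fraction, bulk modulus K [GPa], shear modulus G [GPa],
    #        density rho [kg/m^3])
    "quartz":   (0.60, 37.0, 44.0, 2650.0),
    "feldspar": (0.40, 62.0, 30.0, 2620.0),   # assumed generic values
}

def vrh_vp(minerals):
    """Aggregate Vp (m/s) from the Voigt-Reuss-Hill average of the moduli."""
    k_voigt = sum(f * k for f, k, g, r in minerals.values())
    g_voigt = sum(f * g for f, k, g, r in minerals.values())
    k_reuss = 1.0 / sum(f / k for f, k, g, r in minerals.values())
    g_reuss = 1.0 / sum(f / g for f, k, g, r in minerals.values())
    k = 0.5 * (k_voigt + k_reuss)     # Hill average of Voigt/Reuss bounds
    g = 0.5 * (g_voigt + g_reuss)
    rho = sum(f * r for f, _k, _g, r in minerals.values())
    return math.sqrt((k + 4.0 * g / 3.0) * 1e9 / rho)

vp = vrh_vp(minerals)   # roughly 6 km/s for this composition
```

    The study's point is precisely that no single such mixture rule fits all lithologies and pressures, so a sketch like this is a starting point, not a universal predictor.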

  1. Newberry Seismic Deployment Fieldwork Report

    Energy Technology Data Exchange (ETDEWEB)

    Wang, J; Templeton, D C

    2012-03-21

    This report summarizes the seismic deployment of Lawrence Livermore National Laboratory (LLNL) Geotech GS-13 short-period seismometers at the Newberry Enhanced Geothermal System (EGS) Demonstration site located in Central Oregon. This Department of Energy (DOE) demonstration project is managed by AltaRock Energy Inc. AltaRock Energy had previously deployed Geospace GS-11D geophones at the Newberry EGS Demonstration site; however, the quality of the seismic data was somewhat low. The purpose of the LLNL deployment was to install more sensitive sensors that would record higher quality seismic data for use in future seismic studies, such as ambient noise correlation, matched field processing earthquake detection studies, and general EGS microearthquake studies. For the LLNL deployment, seven three-component seismic stations were installed around the proposed AltaRock Energy stimulation well. The LLNL seismic sensors were connected to AltaRock Energy Güralp CMG-DM24 digitizers, which are powered by AltaRock Energy solar panels and batteries. The deployment took four days in two phases. In phase I, the sites were identified, a cavity approximately 3 feet deep was dug, and a flat concrete pad oriented to true North was made at each site. In phase II, we installed three single-component GS-13 seismometers at each site, quality controlled the data to ensure that each station was recording properly, and filled in each cavity with native soil.

  2. Seismicity and seismotectonics of Libya

    Science.gov (United States)

    Ben Suleman, abdunnur

    2015-04-01

    Libya, located at the central Mediterranean margin of the African shield, underwent many episodes of orogenic activity that shaped its geological setting. The present-day deformation of Libya is the result of the Eurasia-Africa continental collision. The tectonic evolution of Libya has yielded a complex crustal structure that is composed of a series of basins and uplifts. This study aims to explain in detail the seismicity and seismotectonics of Libya using new data recorded by the recently established Libyan National Seismograph Network (LNSN), incorporating other available geophysical and geological information. Detailed investigations of the Libyan seismicity indicate that Libya has experienced earthquakes of varying magnitudes. The seismic activity of Libya shows dominant trends of seismicity, with most of the activity concentrated along the northern coastal areas. Four major clusters of seismicity were quite noticeable. Fault plane solutions were estimated for 20 earthquakes recorded by the LNSN in northwestern and northeastern Libya. The results suggest that normal faulting was dominant in the westernmost part of Libya and strike-slip faulting in the northern-central part, whereas dip-slip faulting was more prevalent in the northeastern part of the country.

  3. Seismic stratigraphy of the Bahamas

    Energy Technology Data Exchange (ETDEWEB)

    Ladd, J.W.; Sheridan, R.E.

    1987-06-01

    Seismic reflection profiles from the Straits of Florida, Northwest Providence Channel, Tongue of the Ocean, and Exuma Sound reveal a seismic stratigraphy characterized by a series of prograding Upper Cretaceous and Tertiary seismic sequences with seismic velocities generally less than 4 km/sec overlying a Lower Cretaceous section of low-amplitude reflections which are more nearly horizontal than the overlying prograding clinoforms and have seismic velocities greater than 5 km/sec. The prograding units are detrital shallow-water carbonates shed from nearby carbonate banks into deep intrabank basins that were established in the Late Cretaceous. The Lower Cretaceous units are probably shallow-water carbonate banks that were drowned in the middle Cretaceous but which, during the Early Cretaceous, extended from Florida throughout the Bahamas region. The seismic reflection profiles reveal a sharp angular unconformity at 5-sec two-way traveltime in northwest Tongue of the Ocean, suggesting a rift-drift unconformity and deposition on thinned continental crust. No such unconformity is seen in central and southeast Tongue of the Ocean or in Exuma Sound, suggesting that these areas are built on oceanic crust.

  4. Seismic risk perception in Italy

    Science.gov (United States)

    Crescimbene, Massimo; La Longa, Federica; Camassi, Romano; Pino, Nicola Alessandro; Peruzza, Laura

    2014-05-01

    Risk perception is a fundamental element in the definition and adoption of preventive counter-measures. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. This paper presents results of a survey on seismic risk perception in Italy conducted from January 2013 to the present. The research design combines a psychometric and a cultural-theoretic approach. More than 7,000 online tests have been completed. The data collected show that seismic risk perception in Italy is strongly underestimated; 86 out of 100 Italian citizens living in the most dangerous zone (namely Zone 1) do not have a correct perception of seismic hazard. From these observations we deem that extremely urgent measures are required in Italy to communicate seismic risk effectively. Finally, the research presents a comparison of seismic risk perception between two groups: a group involved in campaigns of information and education on seismic risk and a control group.

  5. Seismicity of the Jalisco Block

    Science.gov (United States)

    Nunez-Cornu, F. J.; Rutz, M.; Camarena-Garcia, M.; Trejo-Gomez, E.; Reyes-Davila, G.; Suarez-Plascencia, C.

    2002-12-01

    In April 2002 the stations of the first phase of the Jalisco Telemetric Network, located in the northwest of the Jalisco Block and in the area of Volcan de Fuego (Colima Volcano), began transmitting. In June four additional MarsLite portable stations were deployed in the Bahia de Banderas area, and by the end of August one more portable station at Ceboruco Volcano. The data from these stations, jointly with the data from RESCO (Colima Telemetric Network), give us the minimum seismic station coverage to initiate, in a systematic and permanent way, the study of the seismicity in this very complex tectonic region. A preliminary analysis of the seismicity registered by the networks, using a trigger algorithm, confirms several important features proposed by microseismicity studies carried out between 1996 and 1998. A high level of seismicity inside and below the Rivera plate is observed; this fact suggests a very complex stress pattern acting on this plate. Shallow seismicity at the south and east of Bahia de Banderas also suggests a complex stress pattern in this region of the Jalisco Block. Events at more than 30 km depth are located under the mouth of the bay and in front of it; a feature denominated the Banderas Boundary marks the change of the seismic regime north of this latitude (20.75°N), although some shallow events were located in the Nayarit region.

  6. An Adaptable Seismic Data Format

    Science.gov (United States)

    Krischer, Lion; Smith, James; Lei, Wenjie; Lefebvre, Matthieu; Ruan, Youyi; de Andrade, Elliott Sales; Podhorszki, Norbert; Bozdağ, Ebru; Tromp, Jeroen

    2016-11-01

    We present ASDF, the Adaptable Seismic Data Format, a modern and practical data format for all branches of seismology and beyond. The growing volume of freely available data coupled with ever expanding computational power opens avenues to tackle larger and more complex problems. Current bottlenecks include inefficient resource usage and insufficient data organization. Properly scaling a problem requires the resolution of both these challenges, and existing data formats are no longer up to the task. ASDF stores any number of synthetic, processed or unaltered waveforms in a single file. A key improvement compared to existing formats is the inclusion of comprehensive meta information, such as event or station information, in the same file. Additionally, it is also usable for any non-waveform data, for example, cross-correlations, adjoint sources or receiver functions. Last but not least, full provenance information can be stored alongside each item of data, thereby enhancing reproducibility and accountability. Any data set in our proposed format is self-describing and can be readily exchanged with others, facilitating collaboration. The utilization of the HDF5 container format grants efficient and parallel I/O operations, integrated compression algorithms and checksums to guard against data corruption. To not reinvent the wheel and to build upon past developments, we use existing standards like QuakeML, StationXML, W3C PROV and HDF5 wherever feasible. Usability and tool support are crucial for any new format to gain acceptance. We developed mature C/Fortran and Python based APIs coupling ASDF to the widely used SPECFEM3D_GLOBE and ObsPy toolkits.

  7. Seismic pattern treatment method through calculation of seismic density at grid nodes

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Through analysis of seismic data and seismicity characteristics in China, we present a method for treating seismic patterns by calculating seismic density at grid nodes. The number of earthquakes and the epicenter distribution are considered comprehensively in this method. The effect of data accuracy on parameter determination is stressed. Seismic patterns obtained with this method are stable and reflect seismicity characteristics reliably. These seismic patterns are the basis of quantitative analysis of seismicity and can be applied in seismic tendency analysis, medium- and long-term earthquake prediction, earthquake countermeasures and risk mitigation.
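
    One simple reading of this idea (the paper's exact weighting scheme is not specified here) is to count epicenters within a radius of each grid node:

```python
import math

# Hedged sketch: epicenter density at grid nodes, counting events within
# a radius of each node. Events are (lon, lat) pairs; the grid starts at
# (lon0, lat0) with nlon x nlat nodes spaced `step` degrees apart.
def grid_density(events, lon0, lat0, nlon, nlat, step, radius):
    grid = [[0] * nlon for _ in range(nlat)]
    for j in range(nlat):
        for i in range(nlon):
            node_lon, node_lat = lon0 + i * step, lat0 + j * step
            for lon, lat in events:
                if math.hypot(lon - node_lon, lat - node_lat) <= radius:
                    grid[j][i] += 1
    return grid

events = [(0.1, 0.1), (0.12, 0.11), (0.9, 0.9)]   # toy epicenter catalog
density = grid_density(events, 0.0, 0.0, 3, 3, 0.5, 0.25)
```

    Smoothing the raw epicenter cloud onto nodes this way is what makes the resulting pattern stable against small location errors.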

  8. Inverse seismic interferometry: can we observe seismic data at greater depth?

    Science.gov (United States)

    Koelemeijer, Paula; Fichtner, Andreas; Kimman, Wouter

    2015-04-01

    By the very nature of our planet, seismological recordings are limited to the Earth's surface, with some deployments in boreholes and, more recently, the placement of seismometers on the sea floor. Therefore, only travelling and standing waves that are excited and oscillate at shallow depths can be observed. Seismic waves oscillating at great depth with zero amplitude near the surface, e.g. higher-frequency core-mantle boundary Stoneley modes, remain practically invisible to us. Seismic interferometry based on background noise has become a standard method for obtaining information regarding shallow and, more recently, also deeper Earth structure. Noise cross-correlations between a set of stations located on the surface of the Earth provide, in theory, information on the inter-station Green's functions, in the case of an equipartitioned wave field or an isotropic source distribution. Using reciprocity, similar techniques can be employed to obtain the Green's function between two events for a distribution of receivers. In this contribution, we propose to use the concept of inverse interferometry for observing seismic data with non-zero amplitude only at depth. As an initial step, cross-correlation measurements between two deep events, recorded at stations over the globe, will be analysed. Numerical wave field simulations will enable us to investigate the sensitivity of these measurements to Earth structure. Important contributing factors are possibly the source mechanisms of the events, the inter-source distance and the distribution of receivers over the surface of the Earth.
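
    The interferometric principle invoked here can be sketched in a toy form: two receivers record the same random wavefield with a propagation delay, and the peak of their cross-correlation recovers that delay (a stand-in for the inter-station travel time in the Green's function). This is an illustration of the general idea, not the paper's inverse scheme:

```python
import random

# Two receivers see the same diffuse (random) wavefield, the far one
# delayed by `true_delay` samples; the cross-correlation peaks at that lag.
random.seed(0)
n, true_delay = 2000, 7
source = [random.gauss(0.0, 1.0) for _ in range(n + true_delay)]
trace_a = source[:n]                          # near receiver
trace_b = source[true_delay:true_delay + n]   # far receiver (delayed copy)

def xcorr_peak_lag(a, b, max_lag):
    """Lag (in samples) maximizing sum a[i] * b[i + lag]."""
    best_lag, best_val = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        s = sum(a[i] * b[i + lag] for i in range(max_lag, len(a) - max_lag))
        if s > best_val:
            best_lag, best_val = lag, s
    return best_lag

recovered = xcorr_peak_lag(trace_b, trace_a, 20)   # recovers true_delay
```

    In the proposed inverse setting, the roles of sources and receivers are swapped via reciprocity: the correlation is taken between two deep events over a global receiver distribution.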

  9. More than meets the eye---A study in seismic visualization

    Science.gov (United States)

    Lynch, Steven

    This thesis is primarily concerned with examining the properties of SeisScape displays, which render seismic data as a three-dimensional surface. SeisScape displays are fundamentally different from conventional seismic displays in that they fully engage the visual system and produce sensations of perception. These perceptions are the goal of scientific visualization. Visualization itself is placed into context with respect to seismic data by discussing how the display acts as a filter upon seismic resolution. There are two levels of seismic resolution: absolute resolution, which is a product of spatial and temporal resolution, and apparent resolution, which is a product of the display. It is established that the apparent resolution of conventional displays is significantly lower than the absolute resolution of the data. The primate visual system is the second, immutable, stage of the seismic display filter. It is not, however, a general purpose tool. To learn how to use it appropriately, the evolution and properties of the primate visual system are discussed in the context of determining how primates establish their perceptions of form and color. Two terms that describe the structure of a seismic section are introduced. The first is macrostructure, which is the collection of strong amplitude events that are visible on any seismic section. The second is the microstructure, which is the collection of weak amplitude events that are often only observed as perturbations upon the macrostructure. Several techniques for tessellating the seismic surface are developed. Examples are presented to illustrate the effect that tessellation has on the ability to perceive both macrostructure and microstructure. Various techniques are developed to calculate the reflectance of the seismic surface and examples show how reflectance is primarily responsible for our ability to perceive microstructure. The use of color on seismic data is examined from the perspective of the evolution of

  10. Romanian Educational Seismic Network Project

    Science.gov (United States)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most active seismic countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high and thus understanding the earthquake phenomena and their effects at the earth surface represents an important step toward the education of population in earthquake affected regions of the country and aims to raise the awareness about the earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development " URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing of several comprehensive educational materials, designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. Thus a large amount of such data will be used by students and teachers for educational purposes. For the social objectives, the project represents an effective instrument for informing and creating an awareness of the seismic risk, for experimentation into the efficacy of scientific communication, and for an increase in the direct involvement of schools and the general public. 
A network of nine seismic stations with SEP seismometers

  11. EMSE: Synergizing EM and seismic data attributes for enhanced forecasts of reservoirs

    KAUST Repository

    Katterbauer, Klemens

    2014-10-01

    Recent developments in electromagnetic and seismic techniques have revolutionized the oil and gas industry. Time-lapse seismic data provide engineers with tools to more accurately track the dynamics of multi-phase reservoir fluid flows. Given the challenges of distinguishing between hydrocarbons and water via seismic methods, the industry has been looking at electromagnetic techniques in order to exploit the strong contrast in conductivity between hydrocarbons and water. Incorporating this information into reservoir simulation is expected to considerably enhance reservoir forecasting, optimizing production and reducing costs. Conventional approaches typically invert the seismic and electromagnetic data in order to transform them into production parameters before incorporating them as constraints in the history matching process and reservoir simulations. This makes automation difficult and computationally expensive, owing to the necessity of manual processing, and may introduce artifacts. Here we introduce a new approach that incorporates seismic and electromagnetic data attributes directly into the history matching process. To avoid solving inverse problems and to exploit information in the dynamics of the flow, we use petrophysical transformations to simultaneously incorporate time-lapse seismic and electromagnetic data attributes using different ensemble Kalman-based history matching techniques. Our simulation results show enhanced predictability of the critical reservoir parameters and reduced uncertainties in model simulations, outperforming runs that use only production data or that include either seismic or electromagnetic data alone. A statistical test is performed to confirm the significance of the results. © 2014 Elsevier B.V. All rights reserved.
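
    The ensemble Kalman machinery referenced here can be illustrated in its simplest scalar form. This is a generic, hedged sketch of one perturbed-observation ensemble update (a toy model, not the paper's reservoir workflow): a parameter such as a permeability multiplier is corrected using a data attribute such as a time-lapse amplitude, without inverting the attribute back to production parameters.

```python
import random

# One scalar ensemble-Kalman-style update with perturbed observations.
random.seed(1)

def enkf_update(params, predicted, observed, obs_var):
    m = len(params)
    pm, dm = sum(params) / m, sum(predicted) / m
    cov_pd = sum((p - pm) * (d - dm) for p, d in zip(params, predicted)) / (m - 1)
    var_d = sum((d - dm) ** 2 for d in predicted) / (m - 1)
    gain = cov_pd / (var_d + obs_var)            # Kalman gain
    return [p + gain * (observed + random.gauss(0.0, obs_var ** 0.5) - d)
            for p, d in zip(params, predicted)]

truth = 2.0                                       # unknown "true" parameter
ensemble = [random.gauss(1.0, 0.5) for _ in range(50)]   # prior ensemble
forward = lambda p: 3.0 * p                       # toy forward model (attribute)
obs = forward(truth)                              # observed data attribute
posterior = enkf_update(ensemble, [forward(p) for p in ensemble], obs, 0.01)
```

    The posterior ensemble mean moves toward the true value, and its spread quantifies the remaining uncertainty, which is the sense in which attribute data "reduce uncertainties" in the abstract.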

  12. Principle and application of high density spatial sampling in seismic migration

    Science.gov (United States)

    Li, Zi-Shun

    2012-06-01

    To avoid spatial aliasing problems in broadband, high-resolution seismic sections, I present a high-density migration processing solution. I first analyze the definition of spatial aliasing for stacked and migrated seismic sections and point out the differences between the two, recognizing that migrated sections show spatial aliasing more often than stacked sections. Second, wave propagation theory shows that migration output is a new spatial sampling process, and seismic prestack time migration can provide the high-density sampling needed to prevent spatial aliasing on high-resolution migrated sections. Using a 2D seismic forward modeling analysis, I found that seismic spatial aliasing noise can be eliminated by high-density spatial sampling in prestack migration. In a 3D seismic data study for the Daqing Oilfield in the Songliao Basin, I also found that seismic sections obtained by high-density spatial sampling (10 × 10 m) in prestack migration have less spatial aliasing noise than those obtained by conventional low-density spatial sampling (20 × 40 m) in prestack migration.
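
    The benefit of denser spatial sampling can be quantified with the standard textbook alias criterion for a dipping event (a general relation, not a formula taken from this paper): the maximum unaliased frequency is f_max = v / (4 · Δx · sin θ) for velocity v, trace spacing Δx, and dip θ.

```python
import math

# Standard dip-dependent spatial-alias criterion (textbook relation):
# a dipping event is unaliased up to f_max = v / (4 * dx * sin(theta)).
def max_unaliased_freq(velocity_mps, trace_spacing_m, dip_deg):
    return velocity_mps / (4.0 * trace_spacing_m * math.sin(math.radians(dip_deg)))

# Densifying from 40 m to 10 m spacing quadruples the unaliased bandwidth.
f40 = max_unaliased_freq(2500.0, 40.0, 30.0)   # ~31 Hz at 40 m spacing
f10 = max_unaliased_freq(2500.0, 10.0, 30.0)   # ~125 Hz at 10 m spacing
```

    This is consistent with the abstract's observation that 10 × 10 m sampling suppresses aliasing noise that survives 20 × 40 m sampling.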

  13. Overcoming barriers to high performance seismic design using lessons learned from the green building industry

    Science.gov (United States)

    Glezil, Dorothy

    NEHRP's Provisions currently govern conventional seismic-resistant design. Although these provisions ensure the life safety of building occupants, extensive damage and economic losses may still occur in structures. This minimum performance can be enhanced using the Performance-Based Earthquake Engineering (PBEE) methodology and passive control systems such as base isolation and energy dissipation systems. Even though these technologies and the PBEE methodology are effective in reducing economic losses and fatalities during earthquakes, getting them implemented into seismic-resistant design has been challenging. One of the many barriers to their implementation has been their upfront costs. The green building community has faced some of the same challenges that the high performance seismic design community currently faces. The goal of this thesis is to draw on the success of the green building industry to provide recommendations that may be used to overcome the barriers that high performance seismic design (HPSD) currently faces.

  14. Seismic spatial effects on long-span bridge response in nonstationary inhomogeneous random fields

    Science.gov (United States)

    Jiahao, Lin; Yahui, Zhang; Yan, Zhao

    2005-06-01

    The long-span bridge response to nonstationary multiple seismic random excitations is investigated using the PEM (pseudo excitation method). This method transforms the nonstationary random response analysis into ordinary direct dynamic analysis, and therefore, the analysis can be solved conveniently using the Newmark, Wilson-θ schemes or the precise integration method. Numerical results of the seismic response for an actual long-span bridge using the proposed PEM are given and compared with the results based on the conventional stationary analysis. From the numerical comparisons, it was found that both the seismic spatial effect and the nonstationary effect are quite important, and that both stationary and nonstationary seismic analysis should pay special attention to the wave passage effect.
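
    The pseudo excitation method (PEM) referenced above can be illustrated on a single-degree-of-freedom oscillator (a minimal sketch with assumed parameters, not the bridge model of the paper): a stationary random excitation with PSD S_xx is replaced by the deterministic harmonic "pseudo excitation" √S_xx·e^{iωt}, one ordinary harmonic analysis is performed, and the squared modulus of the pseudo response reproduces the response PSD |H(ω)|²·S_xx of classical random vibration theory.

```python
import math

# PEM on an SDOF oscillator: x'' + 2*zeta*wn*x' + wn^2 * x = f(t).
wn, zeta = 10.0, 0.05   # natural frequency (rad/s) and damping ratio (assumed)
S_xx = 0.3              # white-noise excitation PSD (assumed)

def H(w):
    # receptance (frequency response) of the SDOF oscillator
    return 1.0 / complex(wn ** 2 - w ** 2, 2.0 * zeta * wn * w)

w = 8.0
pseudo_response = H(w) * math.sqrt(S_xx)   # one ordinary harmonic solve
S_yy_pem = abs(pseudo_response) ** 2       # response PSD via PEM
S_yy_classical = abs(H(w)) ** 2 * S_xx     # classical random-vibration result
```

    For nonstationary, multi-support excitation, as in the paper, the same substitution turns the random response analysis into a sequence of deterministic time-history analyses solvable by Newmark, Wilson-θ, or precise integration schemes.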

  15. Seismic spatial effects on long-span bridge response in nonstationary inhomogeneous random fields

    Institute of Scientific and Technical Information of China (English)

    Lin Jiahao; Zhang Yahui; Zhao Yan

    2005-01-01

    The long-span bridge response to nonstationary multiple seismic random excitations is investigated using the PEM (pseudo excitation method). This method transforms the nonstationary random response analysis into ordinary direct dynamic analysis, and therefore, the analysis can be solved conveniently using the Newmark, Wilson-θ schemes or the precise integration method. Numerical results of the seismic response for an actual long-span bridge using the proposed PEM are given and compared with the results based on the conventional stationary analysis. From the numerical comparisons, it was found that both the seismic spatial effect and the nonstationary effect are quite important, and that both stationary and nonstationary seismic analysis should pay special attention to the wave passage effect.

  16. Seismic qualification of PWR plant auxiliary feedwater systems

    Energy Technology Data Exchange (ETDEWEB)

    Lu, S.C.; Tsai, N.C.

    1983-08-01

    The NRC Standard Review Plan specifies that the auxiliary feedwater (AFW) system of a pressurized water reactor (PWR) is a safeguard system that functions in the event of a Safe Shutdown Earthquake (SSE) to remove the decay heat via the steam generator. Only recently licensed PWR plants have an AFW system designed to the current Standard Review Plan specifications. The NRC devised the Multiplant Action Plan C-14 in order to survey the seismic capability of the AFW systems of operating PWR plants. The purpose of this survey is to enable the NRC to make decisions regarding the need to require licensees to upgrade their AFW systems to an SSE level of seismic capability. To implement the first phase of the C-14 plan, the NRC issued Generic Letter (GL) 81-14 to all operating PWR licensees requesting information on the seismic capability of their AFW systems. This report summarizes Lawrence Livermore National Laboratory's efforts to assist the NRC in evaluating the status of seismic qualification of the AFW systems in 40 PWR plants by reviewing the licensees' responses to GL 81-14.

  17. Monitoring El Hierro submarine volcanic eruption events with a submarine seismic array

    Science.gov (United States)

    Jurado, Maria Jose; Molino, Erik; Lopez, Carmen

    2013-04-01

    A submarine volcanic eruption took place near the southernmost emerged land of El Hierro Island (Canary Islands, Spain) from October 2011 to February 2012. The Instituto Geografico Nacional (IGN) seismic station network had evidenced seismic unrest since July 2011 and was also a reference for following the evolution of the seismic activity associated with the volcanic eruption. From the beginning of the eruption, a geophone string was installed less than 2 km away from the new volcano, next to the La Restinga village shore, to record seismic activity related to the volcanic activity continuously, with special interest in high-frequency events. The seismic array consisted of a cable string of eight high-frequency, three-component, 250 Hz geophones with a separation of 6 m between them. The analysis of the dataset using spectral techniques allows the characterization of the different phases of the eruption and the study of its dynamics. The correlation of the data analysis results with the observed sea surface activity (ash and lava emission and degassing), and also with the seismic activity recorded by the IGN field seismic monitoring system, allows the identification of different stages, suggesting the existence of different signal sources during the volcanic eruption, and also the post-eruptive record of the degassing activity. The study shows that the high-frequency capability of the geophone array allows the study of important features that cannot be registered by the standard seismic stations. The cumulative spectral amplitude shows features related to eruptive changes.

  18. Drill bit seismic, vertical seismic profiling, and seismic depth imaging to aid drilling decisions in the Tho Tinh structure, Nam Con Son Basin, Vietnam

    Energy Technology Data Exchange (ETDEWEB)

    Borland, W.; Leaney, W.; Nakanishi, S.; Kusaka, H.

    1998-02-01

    Rapid deposition in the Nam Con Son Basin during the Miocene resulted in under-compacted shales. These under-compacted shales are often associated with over-pressured formations. As these shales have excess water and tend to be mechanically weak, the safe mud window for drilling the under-compacted interval can be quite narrow. Efficient and safe drilling operations require accurate depth predictions of these over-pressured formations as well as knowledge of the magnitude of the over-pressure. In this paper we describe a technique which combines the best aspects of conventional Vertical Seismic Profiles (VSP) and Reverse Vertical Seismic Profiles (RVSP) to detect under-compacted shales and predict formation pressures to locate drilling hazards below TD. Under-compacted shales with excess water will have a lower acoustic impedance than expected from the compaction trend. Shales that depart from the compaction trend may indicate potential drilling hazards below. Conventional VSPs provide high quality reflection data at discrete intervals in the well, and can be used to accurately predict acoustic impedance below the bit. This acoustic impedance is then interpreted to provide both the location (in time and depth) of the drilling hazard and the mud weight necessary to contain it. The two-way time estimate of the hazard location is usually quite accurate, but the depth estimate is less certain due to the estimation error in formation velocities below TD. The RVSP, using the drill bit as a source, provides a continuous time-versus-depth relationship while drilling. This relationship is used to continually update the conventional VSP depth prediction of the drilling hazard and thus provide the most accurate depth of the hazard prior to its penetration. It is also used to update a depth-indexed display of existing surface seismic at the wellsite. 10 refs., 22 figs.

  19. The effect of source's shape for seismic wave propagation

    Science.gov (United States)

    Tanaka, S.; Mikada, H.; Goto, T.; Takekawa, J.; Onishi, K.; Kasahara, J.; Kuroda, T.

    2009-12-01

    In conventional simulations of seismic wave propagation, the source that generates the signal is usually represented by a point force or by a particle velocity at a point. In practice, seismic waves are generated by signal generators with finite volume and width. Since seismic lines span distances of hundreds of meters to several kilometers, seismic surveys and data processing have commonly assumed that the size of the signal generator is negligible compared with the survey scale. However, no studies tell us how the size of the baseplate influences the generated seismic waves; such estimations are therefore meaningful for considering the scale of the generator. In this sense, current seismic processing might require a theoretical background about the seismic source for further detailed analysis. The main purpose of this study is to investigate the impact of the seismic source's shape on the resultant wave properties, and then to estimate how effective consideration of the signal generator's scale is for analyzing seismic data. To evaluate the source-scale effect, we performed finite element analysis with a 3D model including the baseplate of the source and the heterogeneous ground medium. We adopted a finite element method (FEM) and chose the code named "MD Nastran" (MSC Software Ver. 2008) to calculate seismic wave propagation. To verify the reliability of the calculation, we compared the result of the FEM with that of a finite-difference method (FDM) for wave propagation in an isotropic, homogeneous model with a point source. The amplitudes and phases of the two were nearly equal, so we considered the FEM calculation accurate enough for the following calculations. As the first step, we developed a simple point-source model and a baseplate model. The point-source model contains only the ground, represented by an elastic medium; the force generating the signal is given at a nodal point of the surface in this case. On the other

  20. Seismic failure modes and seismic safety of Hardfill dam

    Directory of Open Access Journals (Sweden)

    Kun XIONG

    2013-04-01

    Full Text Available Based on microscopic damage theory and the finite element method, and using the Weibull distribution to characterize the random distribution of the mechanical properties of materials, the seismic response of a typical Hardfill dam during earthquakes of intensity 8 degrees and greater was analyzed through numerical simulation. The seismic failure modes and failure mechanism of the dam were explored as well. Numerical results show that the Hardfill dam remains at a low stress level and undamaged or only slightly damaged during an earthquake of intensity 8 degrees. During overload earthquakes, tensile cracks occur at the dam surfaces and extend into the dam body, and the upstream dam body experiences more serious damage than the downstream dam body. Under seismic conditions, therefore, the failure pattern of the Hardfill dam is tensile fracture of the upstream regions and the dam toe. Compared with traditional gravity dams, Hardfill dams have better seismic performance and greater seismic safety.
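
    The statistical ingredient used above, assigning each finite element a random material strength drawn from a Weibull distribution, can be sketched via inverse-CDF sampling. The shape k and scale λ below are illustrative values, not the paper's:

```python
import math
import random

# Draw Weibull-distributed element strengths by inverting the CDF
# F(x) = 1 - exp(-(x/lam)^k):  x = lam * (-ln(1 - u))^(1/k), u ~ U(0,1).
random.seed(3)

def weibull_sample(k, lam):
    u = random.random()
    return lam * (-math.log(1.0 - u)) ** (1.0 / k)

# assign a random strength (Pa) to each of 1000 finite elements
strengths = [weibull_sample(5.0, 2.0e6) for _ in range(1000)]
mean_strength = sum(strengths) / len(strengths)   # ~ lam * Gamma(1 + 1/k)
```

    Python's standard library also provides random.weibullvariate(lam, k) for the same purpose; the explicit inverse-CDF form is shown here to make the sampling mechanism visible.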

  1. Seismic design of equipment and piping systems for nuclear power plants in Japan

    Energy Technology Data Exchange (ETDEWEB)

    Minematsu, Akiyoshi [Tokyo Electric Power Co., Inc. (Japan)

    1997-03-01

    The philosophy of seismic design for nuclear power plant facilities in Japan is based on "Examination Guide for Seismic Design of Nuclear Power Reactor Facilities" (Nuclear Power Safety Committee, July 20, 1981; referred to as the "Examination Guide" hereinafter), and the present design criteria have been established based on the survey of the governmental improvement and standardization program. The detailed design implementation procedure is further described in "Technical Guidelines for Aseismic Design of Nuclear Power Plants, JEAG4601-1987" (Japan Electric Association). This report describes the principles and design procedure of the seismic design of equipment/piping systems for nuclear power plants in Japan. (J.P.N.)

  2. Assessment of the Metrological Performance of Seismic Tables for a QMS Recognition

    Science.gov (United States)

    Silva Ribeiro, A.; Campos Costa, A.; Candeias, P.; Sousa, J. Alves e.; Lages Martins, L.; Freitas Martins, A. C.; Ferreira, A. C.

    2016-11-01

    Seismic testing and analysis using large infrastructures, such as shaking tables and reaction walls, is performed worldwide requiring the use of complex instrumentation systems. To assure the accuracy of these systems, conformity assessment is needed to verify the compliance with standards and applications, and the Quality Management Systems (QMS) is being increasingly applied to domains where risk analysis is critical as a way to provide a formal recognition. This paper describes an approach to the assessment of the metrological performance of seismic shake tables as part of a QMS recognition, with the analysis of a case study of LNEC Seismic shake table.

  3. First level seismic microzonation map of Chennai city – a GIS approach

    OpenAIRE

    Ganapathy, G. P.

    2011-01-01

    Chennai, the capital of the State of Tamil Nadu, is the fourth largest metropolis in India and a focus of economic, social and cultural development. The city has experienced multi-dimensional growth in the development of its infrastructure and population. The Chennai area has experienced moderate earthquakes in the historical past. The Bureau of Indian Standards has also upgraded the seismic status of Chennai from Low Seismic Hazard (Zone II) to Moderate Seismic Hazard (Zone III)–(BIS: 1893 (...

  4. Are seismic hazard assessment errors and earthquake surprises unavoidable?

    Science.gov (United States)

    Kossobokov, Vladimir

    2013-04-01

    demonstrated and sufficient justification of hazard assessment protocols; (b) a more complete learning of the actual range of earthquake hazards to local communities and populations, and (c) a more ethically responsible control over how seismic hazard and seismic risk are applied to protect public safety. It follows that the international project GEM is on the wrong track if it continues to base seismic risk estimates on the standard method of assessing seismic hazard. The situation is not hopeless and could be improved dramatically thanks to the available geological, geomorphological, seismic, and tectonic evidence and data, combined with deterministic pattern recognition methodologies, specifically when intending to PREDICT THE PREDICTABLE, though not the exact size, site, date, and probability of a target event. Understanding the complexity of the non-linear dynamics of hierarchically organized systems of blocks and faults has already led to methodologies of neo-deterministic seismic hazard analysis and intermediate-term, middle- to narrow-range earthquake prediction algorithms tested in real-time applications over the last decades. This proves that contemporary science can do a better job in disclosing natural hazards, assessing risks, and delivering such information in advance of extreme catastrophes, which are LOW PROBABILITY EVENTS THAT HAPPEN WITH CERTAINTY. Geoscientists must initiate a shift of the community's mindset from pessimistic disbelief to the optimistic challenge of neo-deterministic hazard predictability.

  5. Global Seismicity: Three New Maps Compiled with Geographic Information Systems

    Science.gov (United States)

    Lowman, Paul D., Jr.; Montgomery, Brian C.

    1996-01-01

    This paper presents three new maps of global seismicity compiled from NOAA digital data, covering the interval 1963-1998, with three different magnitude ranges (mb): greater than 3.5, less than 3.5, and all detectable magnitudes. A commercially available geographic information system (GIS) was used as the database manager. Epicenter locations were acquired from a CD-ROM supplied by the National Geophysical Data Center. A methodology is presented that can be followed by general users. The implications of the maps are discussed, including the limitations of conventional plate models, and the different tectonic behavior of continental vs. oceanic lithosphere. Several little-known areas of intraplate or passive margin seismicity are also discussed, possibly expressing horizontal compression generated by ridge push.

  6. A study on seismicity and seismic hazard for Karnataka State

    Indian Academy of Sciences (India)

    T G Sitharam; Naveen James; K S Vipin; K Ganesha Raj

    2012-04-01

    This paper presents a detailed study on the seismic pattern of the state of Karnataka and also quantifies the seismic hazard for the entire state. In the present work, historical and instrumental seismicity data for Karnataka (within 300 km from Karnataka political boundary) were compiled and hazard analysis was done based on this data. Geographically, Karnataka forms a part of peninsular India which is tectonically identified as an intraplate region of Indian plate. Due to the convergent movement of the Indian plate with the Eurasian plate, movements are occurring along major intraplate faults resulting in seismic activity of the region and hence the hazard assessment of this region is very important. Apart from referring to seismotectonic atlas for identifying faults and fractures, major lineaments in the study area were also mapped using satellite data. The earthquake events reported by various national and international agencies were collected until 2009. Declustering of earthquake events was done to remove foreshocks and aftershocks. Seismic hazard analysis was done for the state of Karnataka using both deterministic and probabilistic approaches incorporating logic tree methodology. The peak ground acceleration (PGA) at rock level was evaluated for the entire state considering a grid size of 0.05° × 0.05°. The attenuation relations proposed for stable continental shield region were used in evaluating the seismic hazard with appropriate weightage factors. Response spectra at rock level for important Tier II cities and Bangalore were evaluated. The contour maps showing the spatial variation of PGA values at bedrock are presented in this work.
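
    The deterministic branch of such a hazard analysis reduces, at each grid node, to evaluating an attenuation relation for every source and keeping the controlling scenario. The sketch below uses a generic functional form ln(PGA) = c1 + c2·M − c3·ln(R + c4) with hypothetical placeholder coefficients, not the stable-continental-shield relations weighted in the study:

```python
import math

# Generic attenuation relation (illustrative coefficients, in g).
def pga_g(magnitude, distance_km, c=(-1.5, 0.9, 1.2, 10.0)):
    c1, c2, c3, c4 = c
    return math.exp(c1 + c2 * magnitude - c3 * math.log(distance_km + c4))

# DSHA at one site: the hazard is set by the controlling (worst) scenario.
sources = [(6.0, 40.0), (5.0, 15.0)]   # (magnitude, distance in km) pairs
pga = max(pga_g(m, r) for m, r in sources)
```

    Repeating this over a 0.05° × 0.05° grid of sites, and combining relations with logic-tree weights, yields the kind of PGA contour map the abstract describes; the probabilistic branch additionally integrates over magnitude-recurrence and distance distributions.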

  7. Performance-Based Seismic Design of Steel Frames Utilizing Colliding Bodies Algorithm

    Directory of Open Access Journals (Sweden)

    H. Veladi

    2014-01-01

    Full Text Available A pushover analysis method based on the semirigid connection concept is developed, and the colliding bodies optimization algorithm is employed to find optimum seismic designs of frame structures. Two numerical examples from the literature are studied. The results of the new algorithm are compared with conventional design methods to demonstrate the strengths and weaknesses of the algorithm.

  8. Application of the Radon-FCL approach to seismic random noise suppression and signal preservation

    Science.gov (United States)

    Meng, Fanlei; Li, Yue; Liu, Yanping; Tian, Yanan; Wu, Ning

    2016-08-01

    The fractal conservation law (FCL) is a linear partial differential equation that is modified by an anti-diffusive term of lower order. The analysis indicated that this algorithm could eliminate high frequencies and preserve or amplify low- and medium-frequency components. Thus, this method is quite suitable for the simultaneous noise suppression and enhancement or preservation of seismic signals. However, the conventional FCL filters seismic data only along the time direction, thereby ignoring the spatial coherence between neighbouring traces, which leads to the loss of directional information. Therefore, we consider the development of the conventional FCL into the time-space domain and propose a Radon-FCL approach. We applied a Radon transform to implement the FCL method in this article; performing FCL filtering in the Radon domain achieves a higher level of noise attenuation. Using this method, seismic reflection events can be recovered with the sacrifice of fewer frequency components while effectively attenuating more random noise than conventional FCL filtering. Experiments using both synthetic and common shot point data demonstrate the advantages of the Radon-FCL approach versus the conventional FCL method with regard to both random noise attenuation and seismic signal preservation.

  9. 3-D imaging of seismic data from a physical model of a salt structure

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, P. M. (Peter M.); Huang, L. (Lianjie); House, L. S. (Leigh S.); Wiley, R. (Robert)

    2001-01-01

    Seismic data from a physical model of the SEG/EAGE salt structure were imaged to evaluate the quality of imaging of a complex structure and benchmark imaging codes. The physical model was constructed at the University of Houston. Two simulated marine surveys were collected from it: a conventional towed streamer survey, and a vertical receiver cable survey.

  10. Seismic behaviour of geotechnical structures

    Directory of Open Access Journals (Sweden)

    F. Vinale

    2002-06-01

    Full Text Available This paper deals with some fundamental considerations regarding the behaviour of geotechnical structures under seismic loading. First a complete definition of earthquake disaster risk is provided, followed by the importance of performing site-specific hazard analysis. Then some suggestions are provided regarding adequate assessment of soil parameters, a crucial point for properly analysing the seismic behaviour of geotechnical structures. The core of the paper is centered on a critical review of the analysis methods available for studying geotechnical structures under seismic loading. All of the available methods can be classified into three main classes: the pseudo-static, pseudo-dynamic and dynamic approaches, each of which is reviewed for applicability. A more advanced analysis procedure, suitable for a so-called performance-based design approach, is also described. Finally, the seismic behaviour of the El Infiernillo Dam was investigated. It was shown that coupled elastoplastic dynamic analyses disclose some of the important features of dam behaviour under seismic loading, as confirmed by comparing analytical computations with experimental measurements on the dam body during and after a past earthquake.
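    Of the three analysis classes named in the abstract, the pseudo-static approach is the simplest to illustrate: the earthquake is replaced by a static horizontal force equal to a seismic coefficient times the weight. The sketch below computes the factor of safety of a dry, cohesionless rigid block on an inclined plane; the slope angle, friction angle, and coefficient are illustrative assumptions, not values from the paper.

```python
import math

def pseudo_static_fs(slope_deg, phi_deg, kh):
    """Pseudo-static factor of safety for a dry, cohesionless rigid block
    on an inclined plane (slope angle beta, friction angle phi).
    The earthquake is replaced by a static horizontal force kh*W:
      driving   = W*sin(beta) + kh*W*cos(beta)
      resisting = (W*cos(beta) - kh*W*sin(beta)) * tan(phi)
    """
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    driving = math.sin(beta) + kh * math.cos(beta)      # per unit weight
    resisting = (math.cos(beta) - kh * math.sin(beta)) * math.tan(phi)
    return resisting / driving

# Static case reduces to tan(phi)/tan(beta); a seismic coefficient lowers FS.
print(pseudo_static_fs(20.0, 35.0, 0.0))   # ~1.92
print(pseudo_static_fs(20.0, 35.0, 0.15))  # ~1.29
```

    The pseudo-dynamic and fully dynamic classes reviewed in the paper refine exactly this simplification by reintroducing time- and frequency-dependence of the inertial forces.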

  11. Seismic Response and Performance Evaluation of Self-Centering LRB Isolators Installed on the CBF Building under NF Ground Motions

    OpenAIRE

    Junwon Seo; Jong Wan Hu

    2016-01-01

    This paper mainly treats the seismic behavior of lead-rubber bearing (LRB) isolation systems with superelastic shape memory alloy (SMA) bending bars functioning as damper and self-centering devices. The conventional LRB isolators that are usually installed at the column bases supply extra flexibility to the concentrically braced frame (CBF) building so as to elongate its vibration period, and thus help mitigate the seismic acceleration transferred from ground to structure. ...

  12. Differences between conventional and non-conventional MRI techniques in Parkinson’s disease

    Science.gov (United States)

    Baglieri, Annalisa; Marino, Maria Adele; Morabito, Rosa; Di Lorenzo, Giuseppe; Bramanti, Placido; Marino, Silvia

    2013-01-01

    Summary Magnetic resonance imaging (MRI) provides an in vivo assessment of cortical and subcortical regions affected in Parkinson’s disease (PD). This review summarizes the most important conventional and non-conventional MRI techniques applied in this field. Standard neuroimaging techniques have played a marginal role in the diagnosis and follow-up of PD, essentially being used only to discriminate atypical syndromes from PD, to exclude secondary causes such as vascular lesions, and to confirm the absence of specific imaging features found in atypical parkinsonisms. However, non-conventional MRI techniques, i.e. new neuroimaging approaches such as magnetic resonance spectroscopy, diffusion tensor imaging, and functional MRI, may allow the detection of structural, functional and metabolic changes useful not only for differential diagnosis, but also for early diagnosis and outcome and treatment monitoring in PD. In addition, we illustrate the advantages of high-field MRI over lower magnetic fields, highlighting the great potential of advanced neuroimaging techniques. PMID:24125556

  13. CONVENTIONAL DEVELOPMENT OF ENVIRONMENTAL PREOCCUPATIONS

    OpenAIRE

    2011-01-01

    A great number of the conventions concerning nature, even if they do not refer to particular species, were limited in geographic and territorial scope: examples include the convention for the protection of flora, fauna and panoramic beauties of America and the African convention for nature and natural resources… With the Stockholm Conference of 5 June 1972, we entered a “dynamic of globalization”. Article 1 of the Declaration that followed the conference...

  14. Conventional Armaments for coming decades

    Directory of Open Access Journals (Sweden)

    S.K. Salwan

    1997-10-01

    Full Text Available Conventional armaments have continued to play a decisive role even in the present scenario of nuclear weapons and electronic warfare. As a war-fighting technology, they are low cost, reliable, highly effective and proven in several battlefield situations. With the application of advances in electronics, materials and manufacturing technologies, computers and propulsion to conventional weapon systems, they are capable of greater flexibility, lethality, accuracy and effectiveness. This communication gives an overview of advances in conventional armament systems, emerging trends in weapon technologies and modern enabling technologies for advanced weapon systems.

  15. Application of Seismic Array Processing to Tsunami Early Warning

    Science.gov (United States)

    An, C.; Meng, L.

    2015-12-01

    Tsunami wave predictions of current tsunami warning systems rely on accurate earthquake source inversions of wave height data. They are of limited effectiveness for near-field areas, since the tsunami waves arrive before the data are collected. Recent seismic and tsunami disasters have revealed the need for early warning to protect near-source coastal populations. In this work we developed the basis for a tsunami warning system based on rapid earthquake source characterisation through regional seismic array back-projections. We explored rapid earthquake source imaging using onshore dense seismic arrays located at regional distances on the order of 1000 km, which provide faster source images than conventional teleseismic back-projections. We implemented this method in a simulated real-time environment and analysed the 2011 Tohoku earthquake rupture with two clusters of Hi-net stations in Kyushu and Northern Hokkaido, and the 2014 Iquique event with the EarthScope USArray Transportable Array. The results yield reasonable estimates of rupture area, which is approximated by an ellipse and leads to the construction of simple slip models based on empirical scaling of the rupture area, seismic moment and average slip. The slip model is then used as the input of the tsunami simulation package COMCOT to predict the tsunami waves. In the example of the Tohoku event, the earthquake source model can be acquired within 6 minutes from the start of rupture and the simulation of tsunami waves takes less than 2 minutes, which could facilitate a timely tsunami warning. The predicted arrival times and wave amplitudes reasonably fit observations. Based on this method, we propose to develop an automatic warning mechanism that provides rapid near-field warning for areas of high tsunami risk.
The initial focus will be Japan, Pacific Northwest and Alaska, where dense seismic networks with the capability of real-time data telemetry and open data accessibility, such as the Japanese HiNet (>800
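    The scaling step described above (rupture area plus seismic moment giving average slip) follows from the standard definition M0 = mu * A * D together with the Hanks-Kanamori moment-magnitude relation. A minimal sketch, in which the rigidity and the ellipse half-axes are illustrative assumptions rather than values from the study:

```python
import math

RIGIDITY = 3.0e10  # Pa; assumed crustal shear modulus (mu)

def moment_from_mw(mw):
    """Hanks-Kanamori relation: seismic moment M0 in N*m from Mw."""
    return 10.0 ** (1.5 * mw + 9.05)

def average_slip(mw, semi_major_km, semi_minor_km):
    """Average slip D = M0 / (mu * A) over an elliptical rupture area."""
    area_m2 = math.pi * semi_major_km * semi_minor_km * 1.0e6
    return moment_from_mw(mw) / (RIGIDITY * area_m2)

# Illustrative Tohoku-like numbers: Mw 9.0, a ~500 km x 200 km rupture ellipse
print(average_slip(9.0, 250.0, 100.0))  # average slip of roughly 15 m
```

    A uniform-slip model built this way is deliberately crude; its virtue in a warning context is that it is available minutes after origin time, early enough to drive the tsunami simulation.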

  16. Towards Improved Considerations of Risk in Seismic Design (Plinius Medal Lecture)

    Science.gov (United States)

    Sullivan, T. J.

    2012-04-01

    The aftermath of recent earthquakes is a reminder that seismic risk is a very relevant issue for our communities. Implicit within the seismic design standards currently in place around the world is the assumption that minimum acceptable levels of seismic risk will be ensured through design in accordance with the codes. All the same, none of the design standards specifies what the minimum acceptable level of seismic risk actually is. Instead, a series of deterministic limit states are set, which engineers then demonstrate are satisfied for their structure, typically through the use of elastic dynamic analyses adjusted to account for non-linear response using a set of empirical correction factors. Since the early 1990s the seismic engineering community has recognised numerous fundamental shortcomings of such seismic design procedures in modern codes. Deficiencies include the use of elastic dynamic analysis for the prediction of inelastic force distributions, the assignment of uniform behaviour factors for structural typologies irrespective of the structural proportions and expected deformation demands, and the assumption that the hysteretic properties of a structure do not affect its seismic displacement demands, amongst other things. In light of this, a number of possibilities have emerged for improved control of risk through seismic design, with several innovative displacement-based seismic design methods now well developed. For a specific seismic design intensity, such methods provide a more rational means of controlling the response of a structure to satisfy performance limit states. While the development of such methodologies does mark a significant step forward for the control of seismic risk, they do not, on their own, identify the seismic risk of a newly designed structure. In the U.S. a rather elaborate performance-based earthquake engineering (PBEE) framework is under development, with the aim of providing seismic loss estimates for new buildings.
The PBEE framework

  17. Advancing New 3D Seismic Interpretation Methods for Exploration and Development of Fractured Tight Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    James Reeves

    2005-01-31

    In a study funded by the U.S. Department of Energy and GeoSpectrum, Inc., new P-wave 3D seismic interpretation methods to characterize fractured gas reservoirs are developed. A data-driven exploratory approach is used to determine empirical relationships for reservoir properties. Fractures are predicted using seismic lineament mapping through a series of horizon and time slices in the reservoir zone. A seismic lineament is a linear feature seen in a slice through the seismic volume that has negligible vertical offset. We interpret that in regions of high seismic lineament density there is a greater likelihood of fractured reservoir. Seismic AVO attributes are developed to map brittle reservoir rock (low clay) and gas content. Brittle rocks are interpreted to be more fractured when seismic lineaments are present. The most important attribute developed in this study is the gas-sensitive phase gradient (a new AVO attribute), as reservoir fractures may provide a plumbing system for both water and gas. Success is obtained when economic gas and oil discoveries are made. In a gas field previously plagued with poor drilling results, four new wells were spotted using the new methodology and recently drilled. The wells have estimated best-of-12-months production indicators of 2106, 1652, 941, and 227 MCFGPD. The last well was drilled in a region of swarming seismic lineaments but has poor gas-sensitive phase gradient (AVO) and clay volume attributes; GeoSpectrum advised the unit operators that this location did not appear to have significant Lower Dakota gas before the well was drilled. The other three wells are considered good wells in this part of the basin and among the best in the area. These new drilling results have nearly doubled the gas production and the value of the field. The interpretation method is ready for commercialization and gas exploration and development. The new technology is adaptable to conventional lower-cost 3D seismic surveys.
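    The gas-sensitive phase gradient itself is proprietary to the study, but the conventional AVO intercept-gradient estimation that such attributes build on can be sketched as a least-squares fit of the two-term Shuey approximation, R(theta) ≈ A + B·sin²(theta). The synthetic angle gather below is an assumed illustration, not data from the project.

```python
import math

def avo_intercept_gradient(angles_deg, reflectivities):
    """Least-squares fit of the two-term Shuey approximation
    R(theta) ~ A + B*sin(theta)**2 to an angle gather.
    Returns (A, B): the AVO intercept and gradient."""
    xs = [math.sin(math.radians(a)) ** 2 for a in angles_deg]
    ys = list(reflectivities)
    n = float(len(xs))
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Synthetic gather built from assumed A = 0.05, B = -0.20 (gas-like gradient)
angles = [0.0, 10.0, 20.0, 30.0]
refl = [0.05 - 0.20 * math.sin(math.radians(t)) ** 2 for t in angles]
print(avo_intercept_gradient(angles, refl))  # recovers (0.05, -0.20)
```

    In practice the fit is applied trace by trace over moveout-corrected gathers, and maps of A and B (or combinations of them) become the attribute volumes interpreted alongside the lineament maps.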

  18. TECHNICAL NOTES SEISMIC SOIL-STRUCTURE INTERACTION ...

    African Journals Online (AJOL)

    dell

    SEISMIC SOIL-STRUCTURE INTERACTION AS A POTENTIAL TOOL FOR ECONOMICAL SEISMIC ... inherent in the system as in any other material like the superstructure itself. ... [9] Gazetas, G., “Analysis of Machine Foundation Vibration: ...

  19. SEG Advances in Rotational Seismic Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Pierson, Robert; Laughlin, Darren; Brune, Bob

    2016-10-17

    Significant advancements in the development of sensors to enable rotational seismic measurements have been achieved. Prototypes are available now to support experiments that help validate the utility of rotational seismic measurements.

  20. Estimating the detectability of faults in 3D-seismic data - A valuable input to Induced Seismic Hazard Assessment (ISHA)

    Science.gov (United States)

    Goertz, A.; Kraft, T.; Wiemer, S.; Spada, M.

    2012-12-01

    In the past several years, some geotechnical operations that inject fluid into the deep subsurface, such as oil and gas development, waste disposal, and geothermal energy development, have been found or suspected to cause small to moderate sized earthquakes. In several cases the largest events occurred on previously unmapped faults, within or in close vicinity to the operated reservoirs. The obvious conclusion drawn from this finding, also expressed in most recently published best practice guidelines and recommendations, is to avoid injecting into faults. Yet, how certain can we be that all faults relevant to induced seismic hazard have been identified, even around well studied sites? Here we present a probabilistic approach to assess the capability of detecting faults by means of 3D seismic imaging. First, we populate a model reservoir with seed faults of random orientation and slip direction. Drawing random samples from a Gutenberg-Richter distribution, each seed fault is assigned a magnitude and corresponding size using standard scaling relations based on a circular rupture model. We then compute the minimum resolution of a 3D seismic survey for given acquisition parameters and frequency bandwidth. Assuming a random distribution of medium properties and distribution of image frequencies, we obtain a probability that a fault of a given size is detected, or respectively overlooked, by the 3D seismic. Weighting the initial Gutenberg-Richter fault size distribution with the probability of imaging a fault, we obtain a modified fault size distribution in the imaged volume from which we can constrain the maximum magnitude to be considered in the seismic hazard assessment of the operation. We can further quantify the value of information associated with the seismic image by comparing the expected insured value loss between the image-weighted and the unweighted hazard estimates.
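    A minimal sketch of the probabilistic workflow described above, under loudly stated assumptions: magnitudes are drawn from a truncated Gutenberg-Richter distribution, fault size follows the Eshelby circular-crack relation M0 = (16/7)·Δσ·r³, and a fault counts as imaged when its diameter exceeds a λ/4 resolution limit. All parameter values below are illustrative, not those of the study.

```python
import math
import random

def rupture_radius_m(mw, stress_drop_pa=3.0e6):
    """Eshelby circular crack: M0 = (16/7)*dsigma*r**3, M0 via Hanks-Kanamori."""
    m0 = 10.0 ** (1.5 * mw + 9.05)
    return (7.0 * m0 / (16.0 * stress_drop_pa)) ** (1.0 / 3.0)

def detected_fraction(b=1.0, m_min=0.5, m_max=5.0,
                      velocity=3500.0, freq_hz=40.0, n=20000, seed=1):
    """Monte Carlo estimate of the fraction of seed faults whose diameter
    exceeds a lambda/4 resolution limit of the 3D seismic image."""
    random.seed(seed)
    resolution_m = velocity / freq_hz / 4.0
    lo, hi = 10.0 ** (-b * m_min), 10.0 ** (-b * m_max)
    detected = 0
    for _ in range(n):
        # inverse-CDF sample of a truncated Gutenberg-Richter distribution
        u = random.random()
        m = -math.log10(lo - u * (lo - hi)) / b
        if 2.0 * rupture_radius_m(m) >= resolution_m:
            detected += 1
    return detected / n

print(detected_fraction())
```

    Weighting the sampled size distribution by this detection probability, as the abstract describes, yields the modified fault-size distribution of the imaged volume; a real implementation would also randomize medium properties and image bandwidth rather than fixing a single velocity and frequency.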

  1. Seismic scanning tunneling macroscope - Theory

    KAUST Repository

    Schuster, Gerard T.

    2012-09-01

    We propose a seismic scanning tunneling macroscope (SSTM) that can detect the presence of sub-wavelength scatterers in the near-field of either the source or the receivers. Analytic formulas for the time reverse mirror (TRM) profile associated with a single scatterer model show the spatial resolution limit to be, unlike the Abbe limit of λ/2, independent of wavelength and linearly proportional to the source-scatterer separation as long as the point scatterer is in the near-field region; if the sub-wavelength scatterer is a spherical impedance discontinuity then the resolution will also be limited by the radius of the sphere. Therefore, superresolution imaging can be achieved as the scatterer approaches the source. This is analogous to an optical scanning tunneling microscope that has sub-wavelength resolution. Scaled to seismic frequencies, it is theoretically possible to extract 100 Hz information from 20 Hz data by imaging of near-field seismic energy.

  2. The Apollo passive seismic experiment

    Science.gov (United States)

    Latham, G. V.; Dorman, H. J.; Horvath, P.; Ibrahim, A. K.; Koyama, J.; Nakamura, Y.

    1979-01-01

    The completed data set obtained from the 4-station Apollo seismic network includes signals from approximately 11,800 events of various types. Four data sets for use by other investigators, through the NSSDC, are in preparation. Some refinement of the lunar model based on seismic data can be expected, but its gross features remain as presented two years ago. The existence of a small, molten core remains dependent upon the analysis of signals from a single, far-side impact. Analysis of secondary arrivals from other sources may eventually resolve this issue, as may continued refinement of the magnetic field measurements. Evidence of considerable lateral heterogeneity within the moon continues to build. The mystery of the much lower meteoroid flux estimate derived from lunar seismic measurements, compared with earth-based estimates, remains, although significant correlations between terrestrial and lunar observations are beginning to emerge.

  3. USGS National Seismic Hazard Maps

    Science.gov (United States)

    Frankel, A.D.; Mueller, C.S.; Barnhard, T.P.; Leyendecker, E.V.; Wesson, R.L.; Harmsen, S.C.; Klein, F.W.; Perkins, D.M.; Dickman, N.C.; Hanson, S.L.; Hopper, M.G.

    2000-01-01

    The U.S. Geological Survey (USGS) recently completed new probabilistic seismic hazard maps for the United States, including Alaska and Hawaii. These hazard maps form the basis of the probabilistic component of the design maps used in the 1997 edition of the NEHRP Recommended Provisions for Seismic Regulations for New Buildings and Other Structures, prepared by the Building Seismic Safety Council and published by FEMA. The hazard maps depict peak horizontal ground acceleration and spectral response at 0.2, 0.3, and 1.0 sec periods, with 10%, 5%, and 2% probabilities of exceedance in 50 years, corresponding to return times of about 500, 1000, and 2500 years, respectively. In this paper we outline the methodology used to construct the hazard maps. There are three basic components to the maps. First, we use spatially smoothed historic seismicity as one portion of the hazard calculation. In this model, we apply the general observation that moderate and large earthquakes tend to occur near areas of previous small or moderate events, with some notable exceptions. Second, we consider large background source zones based on broad geologic criteria to quantify hazard in areas with little or no historic seismicity, but with the potential for generating large events. Third, we include the hazard from specific fault sources. We use about 450 faults in the western United States (WUS) and derive recurrence times from either geologic slip rates or the dating of pre-historic earthquakes from trenching of faults or other paleoseismic methods. Recurrence estimates for large earthquakes in New Madrid and Charleston, South Carolina, were taken from recent paleoliquefaction studies. We used logic trees to incorporate different seismicity models, fault recurrence models, Cascadia great earthquake scenarios, and ground-motion attenuation relations. We present disaggregation plots showing the contribution to hazard at four cities from potential earthquakes with various magnitudes and
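    The correspondence between exceedance probabilities and return times quoted in the abstract follows from a Poisson occurrence model, P = 1 − exp(−t/T), solved for the mean return period T:

```python
import math

def return_period_years(p_exceed, window_years=50.0):
    """Poisson occurrence model: P = 1 - exp(-t/T)  =>  T = -t / ln(1 - P)."""
    return -window_years / math.log(1.0 - p_exceed)

for p in (0.10, 0.05, 0.02):
    print(f"{p:.0%} in 50 years -> return period ~{return_period_years(p):.0f} years")
# 10% -> ~475, 5% -> ~975, 2% -> ~2475, i.e. "about 500, 1000, and 2500"
```

    The often-quoted "2475-year" ground motion of modern U.S. design maps is exactly the 2%-in-50-years level computed here.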

  4. Historical Seismicity of Central Panama

    Science.gov (United States)

    Camacho, E.

    2013-05-01

    Central Panama lies in the Panama microplate, neighboring the seismically active regions of Costa Rica and Colombia. This region, crossed by the Panama Canal, concentrates most of the population and economic activity of the Republic of Panama. Instrumental observation of earthquakes in Panama began in 1882 with the Compagnie Universelle du Canal Interocéanique de Panama and continued from 1904 to 1977 under the Panama Canal Company. From October 1997 to March 1998 the USGS deployed a temporary digital seismic network. Since 2003 this region has been monitored by a digital seismic network operated by the Panama Canal Authority, complemented by the broadband stations of the University of Panama seismic network. The seismicity in this region is very diffuse, and the few events that are recorded have magnitudes less than 3.0. Historical archives and antique newspapers from Spain, Colombia, Panama and the United States were searched for historical earthquake information that could provide a better estimate of the seismicity in this region. We find that Panama City has been shaken by two destructive earthquakes in historical times: one on a local fault (the Pedro Miguel fault) on May 2, 1621 (I=VIII MM), and a subduction event from the North Panama Deformed Belt (NPDB) on September 7, 1882 (I=VII MM). To test these findings, two earthquake scenarios were generated, using SELENA, for Panama City's Old Quarter. Panama City was rebuilt on January 21, 1673, on a rocky point facing the Pacific Ocean, after the sack by the pirate Morgan on January 28, 1671. The pattern of damage to calicanto (unreinforced colonial masonry) and wood structures for a local crustal event is higher than that for an event from the NPDB, and seems to confirm that the city has not been shaken by a major local event since May 2, 1621 or by a subduction event since September 7, 1882.

  5. A Review of Seismicity in 2008

    Institute of Scientific and Technical Information of China (English)

    Li Gang; Liu Jie; Yu Surong

    2009-01-01

    1 SURVEY OF GLOBAL SEISMICITY IN 2008 A total of 19 strong earthquakes with Ms≥7.0 occurred in the world in 2008 according to the Chinese Seismic Station Network (Table 1). The strongest earthquake was the Wenchuan earthquake with Ms8.0 on May 12, 2008 (Fig.1). Earthquake frequency was apparently lower and the energy release remarkably attenuated in 2008 compared to seismicity in 2007. The characteristics of seismicity are as follows:
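    The standard Gutenberg-Richter energy-magnitude relation, log10 E = 1.5·Ms + 4.8 (E in joules), illustrates why a single Ms 8.0 event such as Wenchuan dominates a year's energy release; the relation is textbook seismology, not taken from this review.

```python
def seismic_energy_joules(ms):
    """Gutenberg-Richter energy relation: log10 E = 1.5 * Ms + 4.8 (E in J)."""
    return 10.0 ** (1.5 * ms + 4.8)

ratio = seismic_energy_joules(8.0) / seismic_energy_joules(7.0)
print(ratio)  # one magnitude unit is ~31.6x the radiated energy
print(seismic_energy_joules(8.0))  # an Ms 8.0 radiates roughly 6.3e16 J
```

    By this relation, the 19 Ms≥7.0 events of 2008 together radiate on the order of what one or two Ms 8-class events would, which is why annual energy budgets swing strongly with the largest single earthquake.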

  6. Seismic detection of meteorite impacts on Mars

    OpenAIRE

    Teanby, N.A.; Wookey, J.

    2011-01-01

    Abstract Meteorite impacts provide a potentially important seismic source for probing Mars' interior. It has recently been shown that new craters can be detected from orbit using high resolution imaging, which means the location of any impact-related seismic event could be accurately determined thus improving the constraints that could be placed on internal structure using a single seismic station. This is not true of other seismic sources on Mars such as sub-surface faulting, whic...

  7. Time-lapse seismic within reservoir engineering

    OpenAIRE

    Oldenziel, T.

    2003-01-01

    Time-lapse 3D seismic is a fairly new technology allowing dynamic reservoir characterisation in a true volumetric sense. By investigating the differences between multiple seismic surveys, valuable information about changes in the oil/gas reservoir state can be captured. Its interpretation involves different disciplines, of which the main three are: reservoir management, rock physics, and seismics. The main challenge is expressed as "How to optimally benefit from time-lapse seismic". The chall...

  8. Robustness of timber structures in seismic areas

    OpenAIRE

    Neves, Luís A.C.; Branco, Jorge M.

    2011-01-01

    Major similarities between robustness assessment and seismic design exist, and significant information can be brought from seismic design to robustness design. As will be discussed, although some methods and limitations considered in seismic design can improve robustness, the capacity of the structure to sustain limited damage without disproportionate effects is significantly more complex. In fact, seismic design can either improve or reduce the resistance of structures to unfo...

  9. NULL Convention Floating Point Multiplier

    OpenAIRE

    Anitha Juliette Albert; Seshasayanan Ramachandran

    2015-01-01

    Floating point multiplication is a critical part in high dynamic range and computational intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using asynchronous NULL convention logic paradigm. Rounding has not been implemented to suit high precision applications. The novelty of the research is that it is the first ever NULL convention logic multiplier, designed to p...
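    The datapath of such a multiplier can be sketched in software: decompose the IEEE 754 single-precision operands into sign, exponent and significand, multiply the 24-bit significands, normalize, and truncate the result (the paper's design omits rounding). This is a hedged illustration of the standard binary32 multiply algorithm, not the NULL-convention-logic circuit itself, and it handles only normal, finite values.

```python
import struct

def f2b(x):
    """Float -> 32-bit big-endian integer pattern."""
    return struct.unpack('>I', struct.pack('>f', x))[0]

def b2f(b):
    """32-bit integer pattern -> float."""
    return struct.unpack('>f', struct.pack('>I', b))[0]

def fp32_mul(a, b):
    """Bit-level IEEE 754 single-precision multiply for normal, finite
    operands; the fraction is truncated rather than rounded."""
    ba, bb = f2b(a), f2b(b)
    sign = (ba >> 31) ^ (bb >> 31)
    ea, eb = (ba >> 23) & 0xFF, (bb >> 23) & 0xFF
    ma = (ba & 0x7FFFFF) | 0x800000          # restore implicit leading 1
    mb = (bb & 0x7FFFFF) | 0x800000
    prod = ma * mb                           # up to 48-bit significand product
    exp = ea + eb - 127                      # re-bias the exponent sum
    if prod & (1 << 47):                     # product in [2, 4): normalize
        prod >>= 1
        exp += 1
    frac = (prod >> 23) & 0x7FFFFF           # drop implicit bit, truncate
    return b2f((sign << 31) | (exp << 23) | frac)

print(fp32_mul(1.5, 2.5))   # 3.75
print(fp32_mul(-3.0, 7.0))  # -21.0
```

    Products whose exact value fits in 24 significand bits come out identical to hardware multiplication; for longer products the truncation can differ from round-to-nearest by one unit in the last place, the trade-off the paper accepts.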

  10. Simulating a guitar with a conventional sonometer

    CERN Document Server

    Burstein, Zily; Varieschi, Gabriele U

    2011-01-01

    Musical acoustics is an interesting sub-field of physics which is usually able to engage students in a dual perspective, by combining science and art together. The physics principles involved in most musical instruments can be easily demonstrated with standard laboratory equipment and can become part of lecture or lab activities. In particular, we will show in this paper how to simulate a guitar using a conventional sonometer, in relation to the problem of the instrument intonation, i.e., how to obtain correctly tuned notes on a guitar or similar string instruments.

  11. Simulating a Guitar with a Conventional Sonometer

    Science.gov (United States)

    Burstein, Zily; Gower, Christina M.; Varieschi, Gabriele U.

    Musical acoustics is an interesting sub-field of physics which is usually able to engage students in a dual perspective, by combining science and art together. The physics principles involved in most musical instruments can be easily demonstrated with standard laboratory equipment and can become part of lecture or lab activities. In particular, we will show in this paper how to simulate a guitar using a conventional sonometer, in relation to the problem of the instrument intonation, i.e., how to obtain correctly tuned notes on a guitar or similar string instruments.
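    The intonation problem mentioned in the abstract comes down to twelve-tone equal temperament: fret n must sit at a distance d_n = L·(1 − 2^(−n/12)) from the nut. A quick sketch, where the 650 mm scale length is an assumed common value, not one from the paper:

```python
def fret_position_mm(scale_length_mm, n):
    """Distance from the nut to fret n in twelve-tone equal temperament:
    d_n = L * (1 - 2**(-n/12))."""
    return scale_length_mm * (1.0 - 2.0 ** (-n / 12.0))

L = 650.0  # assumed classical-guitar scale length in mm
print(fret_position_mm(L, 12))  # 325.0 -- the octave fret halves the string
print(round(fret_position_mm(L, 1), 1))  # first fret ~36.5 mm from the nut
```

    On a sonometer the same check is direct: stopping the string at half its length should double the measured frequency, and deviations reveal the intonation errors the paper investigates.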

  12. A new design strategy based on a deterministic definition of the seismic input to overcome the limits of design procedures based on probabilistic approaches

    CERN Document Server

    Fasan, Marco; Noè, Salvatore; Panza, Giuliano; Magrin, Andrea; Romanelli, Fabio; Vaccari, Franco

    2015-01-01

    In this paper, a new seismic Performance Based Design (PBD) process based on a deterministic definition of the seismic input is presented. The proposed procedure aims to address the following considerations, arising from the analysis of seismic phenomena, which cannot be taken into account using a standard probabilistic definition of the seismic input (PSHA): a) any structure at a given location, regardless of its importance, is subject to the same shaking as a result of a given earthquake; b) it is impossible to determine when a future earthquake of a given intensity/magnitude will occur; c) insufficient data are available to develop reliable statistics with regard to earthquakes. On the basis of these considerations, the seismic input at a given site - determined on the basis of the seismic history, the seismogenic zones and the seismogenic nodes - is defined using the Neo-Deterministic Seismic Hazard Assessment (NDSHA). Two different analyses are carried out at different levels of detail. The first one (RSA) provides the Maximu...

  13. Development of seismic analysis model of LMFBR and seismic time history response analysis

    Energy Technology Data Exchange (ETDEWEB)

    Koo, K. H.; Lee, J. H.; Yoo, B. [KAERI, Taejon (Korea, Republic of)

    2001-05-01

    The main objective of this paper is to develop the seismic analysis model of the KALIMER reactor structures, including the primary sodium coolant, and to evaluate the seismic responses of maximum peak acceleration and relative displacement by time history seismic response analysis. The seismic time history response analyses were carried out for both the seismically isolated and the non-isolated designs to verify the seismic isolation performance. The results of the seismic response analysis using the developed model clearly verify that the seismic isolation design gives significantly reduced seismic responses compared with the non-isolation design. All design criteria for the relative displacement response were satisfied for the KALIMER reactor structures.

  14. Ground truth : vertical seismic profile data enables geophysicists to image ahead of the drill bit

    Energy Technology Data Exchange (ETDEWEB)

    Eaton, S. [SR ECO Consultants Inc., Calgary, AB (Canada)

    2001-08-01

    This paper presented a new technology which makes it possible to obtain a vertical seismic profile (VSP) of a wellbore via a wireline tool. Downhole seismic is of extreme importance in cases when there is a discrepancy between the geology in the well and surface seismic data and when drilling has gone deeper than the prognosis for oil or gas. Once VSP data are interpreted, the decision can be made to either abandon the well or sidetrack it to an optimum target position. The VSP data give the geophysicist the opportunity to recalibrate the processing of conventional 2-D or 3-D surface seismic data while drilling. Crucial assumptions for the velocity fields can be tested. This new technology links geology and geophysics, making it possible to quantify subsurface reservoir parameters and to obtain downhole seismic that provides a higher frequency and spatial resolution than conventional surface seismic surveys. The energy source for downhole seismic is situated at ground level. The signal then travels down into the earth where it is recorded in the subsurface by a vertical array of geophones situated in the wellbore. Some of the signal travels past the bottom of the borehole, through the underlying layers that still have to be drilled. Geophysicists with PanCanadian Petroleum Ltd. and Baker Atlas state that a VSP gives ground truth because the acquired data enables the geophysicist to image ahead of the drill bit. VSP is the ultimate tool in interval velocity and time to depth conversion. Downhole seismic has 25 per cent higher frequencies than surface seismic. The technology has been successfully used by Talisman Energy Inc., to drill Foothills wells in the Monkman Pass area of northeastern British Columbia. VSP data can be used to predict formation pressures, porosities, lithologies or rock types, and fluid content. The technology has been useful in the drilling of hostile holes offshore Sable Island in Nova Scotia where wells can cost up to $30 million. VSPs are
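    The interval-velocity and time-to-depth role of VSP data described above reduces to v_int = Δz/Δt between successive geophone levels. A minimal sketch; the depths and one-way times below are hypothetical values for illustration only.

```python
def interval_velocities(depths_m, times_s):
    """Interval velocities v = dz/dt between successive VSP/checkshot
    geophone levels (depths in m, vertical one-way times in s)."""
    pairs = list(zip(depths_m, times_s))
    return [(z2 - z1) / (t2 - t1)
            for (z1, t1), (z2, t2) in zip(pairs, pairs[1:])]

# Hypothetical checkshot levels, for illustration only
depths = [500.0, 1000.0, 1500.0, 2000.0]
times = [0.250, 0.450, 0.617, 0.742]
print(interval_velocities(depths, times))  # velocities increasing with depth
```

    Because the geophones are at known depths, these velocities are measured rather than inverted, which is what lets a VSP recalibrate the velocity field assumed in processing the surface seismic.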

  15. A study of seismic wave propagation in heterogeneous crust

    Science.gov (United States)

    Akerberg, Peeter Michael

    Three different aspects of estimating properties from seismic data are treated in this thesis: (1) deterministic processing of a high-resolution shallow seismic data set with good geologic control, (2) traveltime estimation from complicated models described statistically, and (3) estimation of the vertical autocorrelation length of such models. The first part of this thesis is the processing and interpretation of a shallow seismic dataset collected in an open pit copper mine near Tyrone, New Mexico. The seismic image is compared with the outcrop in the open pit mine wall along which the seismic line was collected, and with drill data obtained from the mine operators. Specific features imaged by the experiment include the base of the overlying sediment, the base of the leached capping, and fractures and shear zones that control local ground water flow. The features in the migrated section compare well with outcrop and drill data. The second part of the thesis studies the systematic bias of velocities estimated from first-arrival traveltimes measured from a class of very complicated velocity models. Traveltimes were computed for statistically described velocity models with anisotropic von Karman correlation functions. The results of a finite difference eikonal solver, corresponding to very small wavelength experiments, are compared to results from picking first arrivals of full wavefield finite difference simulations. The eikonal solver results show the largest systematic bias, corresponding to the ray theoretical limit, and the results from the full wavefield experiments are smaller, but with very similar dependence on the aspect ratio of the anisotropic correlation function. The third part defines two methods to obtain the vertical correlation length from seismic data approximated by the primary reflectivity series, which conventionally is used as the ideal result of seismic imaging. The first method is based on fitting a theoretical power spectrum based on the

  16. Urban shear-wave reflection seismics: Reconstruction support by combined shallow seismic and engineering geology investigations

    Science.gov (United States)

    Polom, U.; Guenther, A.; Arsyad, I.; Wiyono, P.; Krawczyk, C. M.

    2009-12-01

    After the great 2004 Sumatra-Andaman earthquake, massive reconstruction activities in the Aceh province (Northern Sumatra) were promoted by the Republic of Indonesia and the Federal Ministry of Economic Cooperation and Development. The aims of the project MANGEONAD (Management of Georisk Nanggroe Aceh Darussalam) are to establish geoscientific on-the-ground support for the sustainable development and management of safe building construction, lifelines, infrastructure, and natural resources. Therefore, shallow shear-wave reflection seismics was applied in close combination with engineering geology investigations between 2005 and 2009, since the depth and internal structure of the Krueng Aceh River delta (mainly young alluvial sediments) were widely unknown. Due to the requirements of the densely populated Banda Aceh region, which also lacked traffic infrastructure, a small and lightweight engineering seismic setup of high mobility and high subsurface resolution capability was chosen. The S-wave land-streamer system with 48 channels was applied successfully, together with the ELVIS vibratory source using S- and P-waves, on paved roads within the city of Banda Aceh. The performance of the S-wave system enabled detailed seismic investigation of the shallow subsurface down to 50-150 m depth, generating shaking frequencies between 20 Hz and 200 Hz. This also provides depth information beyond the maximum depth of boreholes and Standard Penetration Tests (SPT), which could only be applied down to about 20 m. To integrate the results gained from all three methods, and further to provide a fast statistical analysis tool for engineering use, the Information System Engineering Geology (ISEG, BGR) was developed. This geospatial information tool includes the seismic data, all borehole information, and the geotechnical SPT and laboratory results from samples available in the investigation area. Thereby, geotechnical 3D analysis of the subsurface units is enabled. The

  17. Seismic Physical Modeling Technology and Its Applications

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper introduces the seismic physical modeling technology in the CNPC Key Lab of Geophysical Exploration. It includes the seismic physical model positioning system, the data acquisition system, sources, transducers, model materials, model building techniques, precision measurements of model geometry, the basic principles of seismic physical modeling and experimental methods, and two physical model examples.

  18. Seismic processing in the inverse data space

    NARCIS (Netherlands)

    Berkhout, A.J.

    2006-01-01

    Until now, seismic processing has been carried out by applying inverse filters in the forward data space. Because the acquired data of a seismic survey is always discrete, seismic measurements in the forward data space can be arranged conveniently in a data matrix (P). Each column in the data matrix

  19. Simplified seismic performance assessment and implications for seismic design

    Science.gov (United States)

    Sullivan, Timothy J.; Welch, David P.; Calvi, Gian Michele

    2014-08-01

    The last decade or so has seen the development of refined performance-based earthquake engineering (PBEE) approaches that now provide a framework for estimation of a range of important decision variables, such as repair costs, repair time and number of casualties. This paper reviews current tools for PBEE, including the PACT software, and examines the possibility of extending the innovative displacement-based assessment approach as a simplified structural analysis option for performance assessment. Details of the displacement-based seismic assessment method are reviewed and a simple means of quickly assessing multiple hazard levels is proposed. Furthermore, proposals for a simple definition of collapse fragility and relations between equivalent single-degree-of-freedom characteristics and multi-degree-of-freedom story drift and floor acceleration demands are discussed, highlighting needs for future research. To illustrate the potential of the methodology, performance measures obtained from the simplified method are compared with those computed using the results of incremental dynamic analyses within the PEER performance-based earthquake engineering framework, applied to a benchmark building. The comparison illustrates that the simplified method could be a very effective conceptual seismic design tool. The advantages and disadvantages of the simplified approach are discussed and potential implications of advanced seismic performance assessments for conceptual seismic design are highlighted through examination of different case study scenarios including different structural configurations.
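A "simple definition of collapse fragility" of the kind proposed above is conventionally a lognormal function of ground-motion intensity. As a hedged sketch (the median `theta` and dispersion `beta` below are illustrative values, not the paper's):

```python
import math

def collapse_fragility(sa, theta, beta):
    """Lognormal collapse fragility: P(collapse | spectral acceleration = sa).

    theta: median collapse intensity [g]; beta: lognormal dispersion.
    Both parameter values used below are assumptions for illustration."""
    z = math.log(sa / theta) / beta
    # Standard normal CDF evaluated at z, via the error function
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# By construction the collapse probability is exactly 50% at the median:
p_median = collapse_fragility(sa=1.2, theta=1.2, beta=0.45)
```

In a PBEE loss calculation this curve would be integrated against the site hazard curve to obtain an annual collapse rate.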

  20. Critical assessment of seismic and geomechanics literature related to a high-level nuclear waste underground repository

    Energy Technology Data Exchange (ETDEWEB)

    Kana, D.D.; Vanzant, B.W.; Nair, P.K. [Southwest Research Inst., San Antonio, TX (USA). Center for Nuclear Waste Regulatory Analyses; Brady, B.H.G. [ITASCA Consulting Group, Inc., Minneapolis, MN (USA)

    1991-06-01

    A comprehensive literature assessment has been conducted to determine the nature and scope of technical information available to characterize the seismic performance of an underground repository and associated facilities. Significant deficiencies were identified in current practices for prediction of seismic response of underground excavations in jointed rock. Conventional analytical methods are based on a continuum representation of the host rock mass. Field observations and laboratory experiments indicate that, in jointed rock, the behavior of the joints controls the overall performance of underground excavations. Further, under repetitive seismic loading, shear displacement develops progressively at block boundaries. Field observations correlating seismicity and groundwater conditions have provided significant information on hydrological response to seismic events. However, the lack of a comprehensive model of geohydrological response to seismicity has limited the transportability of conclusions from field observations. Based on the literature study, matters requiring further research in relation to the Yucca Mountain repository are identified. The report focuses on understanding seismic processes in fractured tuff, and provides a basis for work on the geohydrologic response of a seismically disturbed rock mass. 220 refs., 43 figs., 11 tabs.

  1. Physical mechanism of seismic attenuation in a two-phase medium%双相介质中地震波衰减的物理机制

    Institute of Scientific and Technical Information of China (English)

    李子顺

    2008-01-01

    High-frequency seismic attenuation is conventionally attributed to anelastic absorption. In this paper, I present three studies on high-frequency seismic attenuation and propose that the physical mechanism results from the interference of elastic microscopic multiple scattering waves. First, I propose a new theory on wave propagation in a two-phase medium which is based on the concept that the basic unit for wave propagation is a nano-mass point. As a result of the elasticity variations of pore fluid and rock framework, micro multiple scattering waves would emerge at the wavelength of the seismic waves passing through the two-phase medium and their interference and overlap would generate high-frequency seismic attenuation. Second, I present a study of the frequency response of seismic transmitted waves by modeling thin-layers with thicknesses no larger than pore diameters. Results indicate that high-frequency seismic waves attenuate slightly in a near-surface water zone but decay significantly in a near-surface gas zone. Third, I analyze the seismic attenuation characteristics in near-surface water and gas zones using dual-well shots in the Songliao Basin, and demonstrate that the high-frequency seismic waves attenuate slightly in water zones but in gas zones the 160-1600 Hz propagating waves decay significantly. The seismic attenuation characteristics from field observations coincide with the modeling results. Conclusions drawn from these studies theoretically support seismic attenuation recovery.
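The strong high-frequency decay in gas zones versus the mild decay in water zones can be compared against the standard constant-Q attenuation model, in which amplitude falls off exponentially with frequency. A small sketch; the Q values and travel time below are assumed for illustration (gas-charged sediments typically have much lower Q than water-saturated ones):

```python
import math

def attenuated_amplitude(a0, freq, travel_time, q):
    """Constant-Q amplitude decay: A = A0 * exp(-pi * f * t / Q)."""
    return a0 * math.exp(-math.pi * freq * travel_time / q)

# A 500 Hz wave after 20 ms of propagation, with illustrative Q values:
a_water = attenuated_amplitude(1.0, 500.0, 0.02, q=80.0)  # mild decay
a_gas = attenuated_amplitude(1.0, 500.0, 0.02, q=10.0)    # strong decay
```

Whatever the underlying physical mechanism (anelastic absorption or the elastic microscopic scattering proposed here), the observable frequency-dependent decay takes this general exponential form.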

  2. AutoCAD discipline layering convention. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, B.L.

    1995-05-17

    This document is a user's guide to establishing layering standards for drawing development. Uniform layering standards are established to facilitate the exchange of AutoCAD datasets between organizations and companies. Consistency in the layering conventions assists the user through logical separation and identification of drawing data. This allows the user to view and plot related aspects of a drawing separately or in combination. The use of color and linetype by layer is the preferred layering convention method; however, to accommodate specific needs, colors and linetypes can also be assigned on an entity basis. New drawing setup files (also identified in AutoCAD documentation as Prototype drawings) use this layering convention to establish discipline drawing layers that are routinely used. Additions, deletions, or revisions to the layering conventions are encouraged.

  3. The Self-Organising Seismic Early Warning Information Network

    Science.gov (United States)

    Eveslage, Ingmar; Fischer, Joachim; Kühnlenz, Frank; Lichtblau, Björn; Milkereit, Claus; Picozzi, Matteo

    2010-05-01

    The Self-Organising Seismic Early Warning Information Network (SOSEWIN) represents a new approach for Earthquake Early Warning Systems (EEWS), which takes advantage of novel wireless communication technologies without the need for a planned, centralised infrastructure. It also sets out to overcome the problem of insufficient node density, which typically affects existing early warning systems, by building the SOSEWIN seismological sensing units from low-cost components (generally bought "off-the-shelf"), with each unit initially costing 100's of Euros, in contrast to 1,000's to 10,000's for standard seismological stations. The reduced sensitivity of the new sensing units arising from the use of lower-cost components will be compensated by the network's density, which in the future is expected to number 100's to 1000's of units over areas currently served by on the order of 10's of standard stations. Its robustness, independence of infrastructure, and spontaneous extensibility, due to its self-healing/self-organizing character when sensors are removed, fail, or are added, make SOSEWIN potentially useful for various use cases, e.g. monitoring of building structures (as we could demonstrate during the L'Aquila earthquake) or technical systems, and most recently for seismic microzonation. Nevertheless, the main purpose for which SOSEWIN was initially invented is earthquake early warning and rapid response, for which the ground motion is continuously monitored by conventional accelerometers (3-component) and processed within each station. Based on this, the network itself decides cooperatively whether an event is detected, using a two-level hierarchical alarming protocol. Experiences and experimental results with the SOSEWIN prototype installation in the Ataköy district of Istanbul (Turkey) are presented. The limited size of this installation, currently 20 nodes, does not allow answering certain questions regarding the useful or possible size of a SOSEWIN installation
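A two-level detection scheme of the kind described (single-station triggering plus a cooperative network decision) might be sketched as follows. This is a simplified illustration built on a classic STA/LTA trigger, not the actual SOSEWIN alarming protocol; the window lengths, threshold, and vote count are all assumed:

```python
import numpy as np

def sta_lta_trigger(trace, sta=10, lta=100, threshold=5.0):
    """Level 1 (single node): simplified STA/LTA ratio trigger on one trace."""
    energy = trace ** 2
    sta_avg = np.convolve(energy, np.ones(sta) / sta, mode="same")
    lta_avg = np.convolve(energy, np.ones(lta) / lta, mode="same") + 1e-12
    return bool(np.any(sta_avg / lta_avg > threshold))

def network_alarm(traces, min_votes=3):
    """Level 2 (network): declare an event only if enough nodes trigger."""
    return sum(sta_lta_trigger(t) for t in traces) >= min_votes

rng = np.random.default_rng(1)
noise = [0.1 * rng.standard_normal(1000) for _ in range(5)]
event = [t.copy() for t in noise]
for t in event:
    t[600:610] += 5.0               # impulsive arrival seen by every node

quiet = network_alarm(noise)        # noise alone: no coordinated alarm
alarmed = network_alarm(event)      # all nodes see the arrival
```

The vote requirement is what makes the decision cooperative: a single noisy node cannot raise a network-wide alarm by itself.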

  4. Einstein Synchronisation and the Conventionality of Simultaneity

    Directory of Open Access Journals (Sweden)

    Mladen Domazet

    2006-06-01

    Full Text Available Despite a broad-range title the paper settles for the related issue of whether the Special Theory of Relativity (STR) necessarily advocates the demise of an ontological difference between past and future events, between past and future in general. In the jargon of H. Stein: are we forced, within the framework of the STR, to choose only between ‘solipsism’ and ‘determinism’ exclusively? A special emphasis is placed on the role that the conventionality of simultaneity plays in the STR with regard to this question. The standard arguments rely on the relativity of simultaneity, the claim that the STR negates the existence of a universal ‘present’ that divides the ‘past’ and the ‘future’, so as to conclude that there is no ontological difference between past and future events, that both are equally determined/real (‘determinism’). This often neglects the fact that to establish the ontological claims related to the relativity of simultaneity, one must first resolve the issue of the conventionality of simultaneity within the STR. The paper aims to show that by addressing the issue of conventionality from Dummett’s ‘purely philosophical’ determination of the difference between the past and the future, we develop an understanding of that difference, within the framework of the STR, beyond the (unwanted) strict ontological dichotomy of ‘solipsism/determinism’, given that the criterion provided by the STR is understood as epistemic and not ontological.

  5. Communicating novel and conventional scientific metaphors

    DEFF Research Database (Denmark)

    Knudsen, Sanne

    2005-01-01

    Metaphors are more popular than ever in the study of scientific reasoning and culture because of their innovative and generative powers. It is assumed that novel scientific metaphors become more clear and well-defined as they become more established and conventional within the relevant discourses. But we still need empirical studies of the career of metaphors in scientific discourse and of the communicative strategies identifying a given metaphor as either novel or conventional. The dominant genre changes, too, during the career of a metaphor: whereas the standard scientific article is central in experimentally researching and explaining the metaphor, a mixture of more popular scientific genres dominates in its innovative conceptual development. This paper presents a case study of the discursive development of the metaphor of "the genetic code" from...

  6. Seismic link at plate boundary

    Indian Academy of Sciences (India)

    Faical Ramdani; Omar Kettani; Benaissa Tadili

    2015-06-01

    Seismic triggering at plate boundaries has a very complex nature that includes seismic events at varying distances. The spatial orientation of triggering cannot be reduced to sequences from the main shocks. Seismic waves propagate at all times in all directions, particularly in highly active zones. No direct evidence can be obtained regarding which earthquakes trigger which shocks. The first step is to determine the potential linked zones where triggering may occur. The second step is to determine the causality between the events and their triggered shocks. The spatial orientation of the links between events is established from pre-ordered networks and the adapted dependence of the spatio-temporal occurrence of earthquakes. Based on a coefficient of synchronous seismic activity applied to grid couples, we derive a link network at each threshold. The links at high thresholds are tested using the coherence of time series to determine the causality and related orientation. The resulting link orientations at the plate boundary indicate that causal triggering seems to be localized along a major fault, as a stress transfer between two major faults, and parallel to the extension of the geothermal area.
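The idea of orienting a link by testing the temporal dependence between two zones can be illustrated with a simple lagged cross-correlation of event-rate series. This is an assumption-laden stand-in for the coherence analysis the study uses; the synthetic rates and lag are invented for the example:

```python
import numpy as np

def lead_lag(rate_a, rate_b, max_lag=10):
    """Lag (in samples) at which two event-rate series correlate best.

    A positive result means activity in series A tends to precede series B,
    suggesting a directed link A -> B. Illustrative only."""
    a = (rate_a - rate_a.mean()) / rate_a.std()
    b = (rate_b - rate_b.mean()) / rate_b.std()
    lags = list(range(-max_lag, max_lag + 1))
    cc = []
    for k in lags:
        if k >= 0:
            cc.append(np.mean(a[:len(a) - k] * b[k:]))
        else:
            cc.append(np.mean(a[-k:] * b[:len(b) + k]))
    return lags[int(np.argmax(cc))]

rng = np.random.default_rng(0)
zone_a = rng.poisson(5, 500).astype(float)
zone_b = np.roll(zone_a, 3) + rng.poisson(1, 500)  # B echoes A, 3 steps later
```

Here `lead_lag(zone_a, zone_b)` recovers the built-in 3-step delay, orienting the link from zone A to zone B.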

  7. Southern Appalachian Regional Seismic Network

    Energy Technology Data Exchange (ETDEWEB)

    Chiu, S.C.C.; Johnston, A.C.; Chiu, J.M. [Memphis State Univ., TN (United States). Center for Earthquake Research and Information

    1994-08-01

    Seismic activity in the southern Appalachian area has been monitored since late 1979 by the Southern Appalachian Regional Seismic Network (SARSN), operated by the Center for Earthquake Research and Information (CERI) at Memphis State University. This network provides good spatial coverage for earthquake locations, especially in east Tennessee. The activity concentrates more heavily in the Valley and Ridge province of eastern Tennessee, as opposed to the Blue Ridge or Inner Piedmont. The large majority of these events lie between the New York-Alabama lineament and the Clingman/Ocoee lineament, magnetic anomalies produced by deep-seated basement structures. Therefore SARSN, even with its wide station spacing, has been able to define the essential first-order seismological characteristics of the Southern Appalachian seismic zone. The focal depths of the southeastern U.S. earthquakes concentrate between 8 and 16 km, occurring principally beneath the Appalachian overthrust. In cross-sectional views, the average seismicity is shallower to the east beneath the Blue Ridge and Piedmont provinces and deeper to the west beneath the Valley and Ridge province and the North American craton. Results of recent focal mechanism studies, using the CERI digital earthquake catalog between October 1986 and December 1991, indicate that the basement of the Valley and Ridge province is under a horizontal, NE-SW compressive stress. Right-lateral strike-slip faulting on nearly north-south fault planes is preferred because it agrees with the trend of the regional magnetic anomaly pattern.

  8. Seismic hazard studies in Egypt

    Science.gov (United States)

    Mohamed, Abuo El-Ela A.; El-Hadidy, M.; Deif, A.; Abou Elenean, K.

    2012-12-01

    The study of earthquake activity and seismic hazard assessment in Egypt is very important due to the rapid spread of large investments in national projects, especially the nuclear power plant planned in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba-Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the Northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez District should be considered. The seismic hazard for Egypt is calculated using a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground-motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground-motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites at 25 different periods, and the probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The peak ground acceleration (PGA) values were found to be highest close to the Gulf of Aqaba, about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were found in the western part of the Western Desert.
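The 475-year return period quoted above follows from the standard Poisson exceedance model, where it corresponds to the conventional design level of a 10% probability of exceedance in 50 years:

```python
import math

def exceedance_probability(annual_rate, years):
    """Poisson model: P(at least one exceedance in `years`)."""
    return 1.0 - math.exp(-annual_rate * years)

def return_period(prob, years):
    """Return period matching an exceedance probability over `years`."""
    return -years / math.log(1.0 - prob)

rp = return_period(0.10, 50.0)                     # ~475 years
p_check = exceedance_probability(1.0 / rp, 50.0)   # recovers the 10%
```

The same two functions convert any hazard-curve exceedance rate into the return-period maps described in the abstract.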

  9. Seismic amplitude recovery with curvelets

    NARCIS (Netherlands)

    Moghaddam, P.P.; Herrmann, F.J.; Stolk, C.C.

    2007-01-01

    A non-linear singularity-preserving solution to the least-squares seismic imaging problem with sparseness and continuity constraints is proposed. The applied formalism explores curvelets as a directional frame that, by their sparsity on the image, and their invariance under the imaging operators,

  10. Quantifying Similarity in Seismic Polarizations

    Science.gov (United States)

    Eaton, D. W. S.; Jones, J. P.; Caffagni, E.

    2015-12-01

    Measuring similarity in seismic attributes can help identify tremor, low S/N signals, and converted or reflected phases, in addition to diagnosing site noise and sensor misalignment in arrays. Polarization analysis is a widely accepted method for studying the orientation and directional characteristics of seismic phases via computed attributes, but similarity is ordinarily discussed using qualitative comparisons with reference values. Here we introduce a technique for quantitative polarization similarity that uses weighted histograms computed in short, overlapping time windows, drawing on methods adapted from the image processing and computer vision literature. Our method accounts for ambiguity in azimuth and incidence angle and variations in signal-to-noise (S/N) ratio. Using records of the Mw=8.3 Sea of Okhotsk earthquake from CNSN broadband sensors in British Columbia and Yukon Territory, Canada, and vertical borehole array data from a monitoring experiment at Hoadley gas field, central Alberta, Canada, we demonstrate that our method is robust to station spacing. Discrete wavelet analysis extends polarization similarity to the time-frequency domain in a straightforward way. Because histogram distance metrics are bounded by [0, 1], clustering allows empirical time-frequency separation of seismic phase arrivals on single-station three-component records. Array processing for automatic seismic phase classification may be possible using subspace clustering of polarization similarity, but efficient algorithms are required to reduce the dimensionality.
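A histogram distance bounded by [0, 1], of the kind the method relies on, can be sketched with histogram intersection on folded azimuth histograms. This is an illustrative stand-in, not the authors' implementation; the bin count and test distributions are assumptions:

```python
import numpy as np

def polarization_histogram(azimuths, weights, bins=36):
    """Weighted azimuth histogram over [0, 180) deg, folding the
    180-degree ambiguity of polarization azimuths."""
    h, _ = np.histogram(azimuths % 180.0, bins=bins, range=(0.0, 180.0),
                        weights=weights)
    return h / h.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]: 1 for identical, 0 for disjoint histograms."""
    return float(np.minimum(h1, h2).sum())

rng = np.random.default_rng(0)
az1 = rng.normal(45.0, 5.0, 1000)    # window 1: phases arriving near N45E
az2 = rng.normal(45.0, 5.0, 1000)    # window 2: same orientation
az3 = rng.normal(135.0, 5.0, 1000)   # window 3: orthogonal orientation
w = np.ones(1000)
s_same = histogram_intersection(polarization_histogram(az1, w),
                                polarization_histogram(az2, w))
s_diff = histogram_intersection(polarization_histogram(az1, w),
                                polarization_histogram(az3, w))
```

Because the score is bounded, thresholds and clustering can be applied directly to it, as the abstract notes.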

  11. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-12-01

    We have developed and tested technology for a new type of direct hydrocarbon detection. The method uses inelastic rock properties to greatly enhance the sensitivity of surface seismic methods to the presence of oil and gas saturation. These methods include use of energy absorption, dispersion, and attenuation (Q) along with traditional seismic attributes like velocity, impedance, and AVO. Our approach is to combine three elements: (1) a synthesis of the latest rock physics understanding of how rock inelasticity is related to rock type, pore fluid types, and pore microstructure, (2) synthetic seismic modeling that will help identify the relative contributions of scattering and intrinsic inelasticity to apparent Q attributes, and (3) robust algorithms that extract relative wave attenuation attributes from seismic data. This project provides: (1) additional petrophysical insight from acquired data; (2) increased understanding of rock and fluid properties; (3) new techniques to measure reservoir properties that are not currently available; and (4) tools to more accurately describe the reservoir and predict oil location and volumes. These methodologies will improve the industry's ability to predict and quantify oil and gas saturation distribution, and to apply this information through geologic models to enhance reservoir simulation. We have applied for two separate patents relating to work that was completed as part of this project.
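One standard way to extract an attenuation attribute of the kind listed above is the spectral-ratio method: for two arrivals separated by travel time `dt`, ln(A2/A1) = const − πf·dt/Q, so Q follows from the slope of the log spectral ratio versus frequency. A self-checking sketch (the spectra and Q value below are synthetic, not project data):

```python
import numpy as np

def spectral_ratio_q(freqs, amp1, amp2, dt):
    """Estimate interval Q from the spectral ratio of two arrivals.

    dt is the travel time between the arrivals; Q comes from the slope of
    a straight-line fit of ln(amp2/amp1) against frequency."""
    log_ratio = np.log(amp2 / amp1)
    slope, _ = np.polyfit(freqs, log_ratio, 1)
    return -np.pi * dt / slope

# Forward-model a known Q of 50, then recover it:
f = np.linspace(10.0, 100.0, 50)
a1 = np.exp(-f / 40.0)                        # arbitrary source spectrum
dt, q_true = 0.5, 50.0
a2 = a1 * 0.7 * np.exp(-np.pi * f * dt / q_true)  # 0.7: frequency-independent loss
q_est = spectral_ratio_q(f, a1, a2, dt)
```

The frequency-independent factor (geometric spreading, transmission loss) lands in the intercept and so does not bias the Q estimate.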

  12. Seismic isolation for Advanced LIGO

    CERN Document Server

    Abbott, R; Allen, G; Cowley, S; Daw, E; Debra, D; Giaime, J; Hammond, G; Hammond, M; Hardham, C; How, J; Hua, W; Johnson, W; Lantz, B; Mason, K; Mittleman, R; Nichol, J; Richman, S; Rollins, J; Shoemaker, D; Stapfer, G; Stebbins, R

    2002-01-01

    The baseline design concept for a seismic isolation component of the proposed 'Advanced LIGO' detector upgrade has been developed with proof-of-principle experiments and computer models. It consists of a two-stage in-vacuum active isolation platform that is supported by an external hydraulic actuation stage. Construction is underway for prototype testing of a full-scale preliminary design.

  13. Seismicity dynamics and earthquake predictability

    Directory of Open Access Journals (Sweden)

    G. A. Sobolev

    2011-02-01

    Full Text Available Many factors complicate earthquake sequences, including the heterogeneity and self-similarity of the geological medium, the hierarchical structure of faults and stresses, and small-scale variations in the stresses from different sources. A seismic process is a type of nonlinear dissipative system demonstrating opposing trends towards order and chaos. Transitions from equilibrium to unstable equilibrium and local dynamic instability appear when there is an inflow of energy; reverse transitions appear when energy is dissipating. Several metastable areas of different scales exist in the seismically active region before an earthquake. Some earthquakes are preceded by precursory phenomena of different scales in space and time. These include long-term activation, seismic quiescence, foreshocks in the broad and narrow sense, hidden periodic vibrations, effects of the synchronization of seismic activity, and others. Such phenomena indicate that the dynamic system of the lithosphere is moving to a new state – catastrophe. A number of examples of medium-term and short-term precursors are shown in this paper. However, no precursors identified to date are clear and unambiguous: the percentage of missed targets and false alarms is high. Weak fluctuations from external and internal sources play a great role on the eve of an earthquake, and the occurrence time of the future event depends on the collective behavior of triggers. The main task is to improve the methods of metastable zone detection and probabilistic forecasting.

  14. Improving the Monitoring, Verification, and Accounting of CO{sub 2} Sequestered in Geologic Systems with Multicomponent Seismic Technology and Rock Physics Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Alkan, Engin; DeAngelo, Michael; Hardage, Bob; Sava, Diana; Sullivan, Charlotte; Wagner, Donald

    2012-12-31

    concept that the same weight must be given to S-wave sequences and facies as is given to P-wave sequences and facies. This philosophy differs from the standard practice of depending on only conventional P-wave seismic stratigraphy to characterize reservoir units. The fundamental physics of elastic wavefield seismic stratigraphy is that S- wave modes sense different sequences and facies across some intervals than does a P-wave mode because S-wave displacement vectors are orthogonal to P- wave displacement vectors and thus react to a different rock fabric than do P waves. Although P and S images are different, both images can still be correct in terms of the rock fabric information they reveal.

  15. Seismic Imaging of Sandbox Models

    Science.gov (United States)

    Buddensiek, M. L.; Krawczyk, C. M.; Kukowski, N.; Oncken, O.

    2009-04-01

    Analog sandbox simulations have been applied to study structural geological processes to provide qualitative and quantitative insights into the evolution of mountain belts and basins. These sandbox simulations provide either two-dimensional and dynamic or pseudo-three-dimensional and static information. To extend the dynamic simulations to three dimensions, we combine the analog sandbox simulation techniques with seismic physical modeling of these sandbox models. The long-term objective of this approach is to image seismic and seismological events of static and actively deforming 3D analog models. To achieve this objective, a small-scale seismic apparatus, composed of a water tank, a PC control unit including piezo-electric transducers, and a positioning system, was built for laboratory use. For the models, we use granular material such as sand and glass beads, so that the simulations can evolve dynamically. The granular models are required to be completely water saturated so that the sources and receivers are directly and well coupled to the propagating medium. Ultrasonic source frequencies (˜500 kHz), corresponding to wavelengths ˜5 times the grain diameter, are necessary to be able to resolve small-scale structures. In three experiments on different two-layer models, we show that (1) interfaces between layers of granular materials can be resolved, depending more on the interface preparation than on the material itself; (2) the dilation between the sand grains caused by a string that has been pulled through the grains, simulating a shear zone, causes a reflection that can be detected in the seismic data; and (3) in a third model containing both the prepared interface and a shear zone, we perform a seismic reflection survey and apply 2D seismic reflection processing to improve the resolution.
Especially for more complex models, the clarity and penetration depth need to be improved to study the evolution of geological structures in dynamic
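The ~5-grain-diameter rule of thumb above fixes the required source frequency through the basic wavelength relation lambda = v/f. A quick check, assuming a P-wave velocity of about 1700 m/s in water-saturated sand (an illustrative value, not a measurement from these experiments):

```python
def wavelength(velocity, frequency):
    """Seismic wavelength: lambda = v / f (SI units)."""
    return velocity / frequency

# 500 kHz ultrasonic pulse in water-saturated sand (assumed v ~ 1700 m/s):
lam = wavelength(1700.0, 500e3)      # 3.4 mm wavelength
grain_diameter = lam / 5.0           # ~0.7 mm grains per the ~5x rule
```

This is why ultrasonic frequencies are needed: at ordinary exploration frequencies the wavelength would be meters, far coarser than any structure in a sandbox model.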

  16. Statistical Seismology and Induced Seismicity

    Science.gov (United States)

    Tiampo, K. F.; González, P. J.; Kazemian, J.

    2014-12-01

    While seismicity triggered or induced by natural resources production such as mining or water impoundment in large dams has long been recognized, the recent increase in the unconventional production of oil and gas has been linked to a rapid rise in seismicity in many places, including central North America (Ellsworth et al., 2012; Ellsworth, 2013). Worldwide, induced events of M~5 have occurred and, although rare, have resulted in both damage and public concern (Horton, 2012; Keranen et al., 2013). In addition, over the past twenty years, the increase in both the number and coverage of seismic stations has resulted in an unprecedented ability to precisely record the magnitudes and locations of large numbers of small-magnitude events. The increase in the number and type of seismic sequences available for detailed study has revealed differences in their statistics that were previously difficult to quantify. For example, seismic swarms that produce significant numbers of foreshocks as well as aftershocks have been observed in different tectonic settings, including California, Iceland, and the East Pacific Rise (McGuire et al., 2005; Shearer, 2012; Kazemian et al., 2014). Similarly, smaller events have been observed prior to larger induced events in several occurrences from energy production. The field of statistical seismology has long focused on the question of triggering and the mechanisms responsible (Stein et al., 1992; Hill et al., 1993; Steacy et al., 2005; Parsons, 2005; Main et al., 2006). For example, in most cases the associated stress perturbations are much smaller than the earthquake stress drop, suggesting an inherent sensitivity to relatively small stress changes (Nalbant et al., 2005). Induced seismicity provides the opportunity to investigate triggering and, in particular, the differences between long- and short-range triggering. 
Here we investigate the statistics of induced seismicity sequences from around the world, including central North America and Spain, and

  17. Development of Deep-tow Autonomous Cable Seismic (ACS) for Seafloor Massive Sulfides (SMSs) Exploration.

    Science.gov (United States)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hitoshi; Saito, Shutaro; Lee, Sangkyun; Tara, Kenji; Kato, Masafumi; Jamali Hondori, Ehsan; Sumi, Tomonori; Kadoshima, Kazuyuki; Kose, Masami

    2017-04-01

    Within the EEZ of Japan, numerous surveys exploring ocean-floor resources have been conducted. The exploration targets are gas hydrates, mineral resources (manganese, cobalt or rare earths) and especially seafloor massive sulphide (SMS) deposits. These resources exist in shallow subsurface areas in deep waters (>1500 m). Seismic exploration here requires very high resolution images, which cannot be effectively obtained with conventional marine seismic techniques. Therefore we have been developing autonomous seismic survey systems which record the data close to the seafloor to preserve high-frequency seismic energy. A very high sampling rate (10 kHz) and highly accurate synchronization between the recording systems and the shot times are necessary; we adopted a Cs-based atomic clock considering its power consumption. At first, we developed a Vertical Cable Seismic (VCS) system that uses hydrophone arrays moored vertically from the ocean bottom to record close to the target area. This system has been successfully applied to SMS exploration; specifically, it was fixed over known sites to assess the amount of reserves with the resultant 3D volume. Based on the success of VCS, we modified the VCS system for use as a more efficient deep-tow seismic survey system. Although there are other examples of deep-tow seismic systems, signal transmission cables present challenges in deep waters. We use our autonomous recording system to avoid these problems. Combining a high-frequency piezoelectric source (Sub-Bottom Profiler: SBP) that shoots automatically at a constant interval, we achieve high-resolution deep-tow seismics without a data transmission/power cable to the ship. Although the data cannot be monitored in real time, the towing system becomes very simple. We have carried out a survey trial, which showed the system's utility as a high-resolution deep-tow seismic survey system. Furthermore, the frequency ranges of the deep-towed source (SBP) and the surface-towed sparker are 700-2300 Hz and 10-200 Hz

  18. Recovering physical property information from subduction plate boundaries using 3D full-waveform seismic inversion

    Science.gov (United States)

    Bell, R. E.; Morgan, J. V.; Warner, M.

    2013-12-01

    Our understanding of subduction margin seismogenesis has been revolutionised in the last couple of decades with the discovery that the size of the seismogenic zone may not be controlled simply by temperature, and that a broad spectrum of seismic behaviour exists, from stick-slip to stable sliding. Laboratory and numerical experiments suggest that physical properties, particularly fluid pressure, may play an important role in controlling the seismic behaviour of subduction margins. Although drilling can provide information on physical properties along subduction thrust faults at point locations at relatively shallow depths, correlations between physical properties and seismic velocity using rock-physics relationships are required to resolve physical properties along the margin and down-dip. Therefore, high-resolution seismic velocity models are key to recovering physical property information at subduction plate boundaries away from drill sites. 3D full waveform inversion (FWI) is a technique pioneered by the oil industry to obtain high-resolution, high-fidelity models of physical properties in the subsurface. 3D FWI involves the inversion of low-frequency seismic data. In a synthetic test we added noise and inverted the windowed transmitted arrivals only. We also ran a suite of resolution tests across the model. The results show that 3D FWI of conventionally collected 3D seismic data across the Muroto Basin would be capable of resolving variations in P-wave velocity along the décollement of the order of half the seismic wavelength at the plate boundary. This is a significant improvement on conventional travel-time tomography, which resolves to the Fresnel width. In this presentation we will also discuss the optimal 3D FWI experiment design for the next generation of 3D seismic surveys across subduction margins, as a guide for those embarking on new data collection.
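    The optimization at the heart of FWI can be sketched in standard notation (this formulation is assumed, not taken from the abstract): a least-squares waveform misfit is reduced by iterative model updates, with the gradient computed by the adjoint-state method.

```latex
% Misfit between observed and modeled waveforms, summed over sources s and receivers r
J(\mathbf{m}) = \tfrac{1}{2}\sum_{s,r}\int \left| d_{\mathrm{obs}}(t) - F_{s,r}(\mathbf{m})(t) \right|^{2}\,\mathrm{d}t

% Gradient-based model update (step length \alpha_k, gradient via the adjoint-state method)
\mathbf{m}_{k+1} = \mathbf{m}_{k} - \alpha_{k}\,\nabla_{\mathbf{m}} J(\mathbf{m}_{k})
```

    Starting from sufficiently low frequencies keeps the misfit surface convex enough for this local update to converge, which is why low-frequency data are emphasized in the abstract.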

  19. Robust seismicity forecasting based on Bayesian parameter estimation for epidemiological spatio-temporal aftershock clustering models.

    Science.gov (United States)

    Ebrahimian, Hossein; Jalayer, Fatemeh

    2017-08-29

    In the immediate aftermath of a strong earthquake and in the presence of an ongoing aftershock sequence, scientific advisories in terms of seismicity forecasts play a crucial role in emergency decision-making and risk mitigation. Epidemic Type Aftershock Sequence (ETAS) models are frequently used for forecasting the spatio-temporal evolution of seismicity in the short term. We propose robust forecasting of seismicity based on the ETAS model, by exploiting the link between Bayesian inference and Markov chain Monte Carlo simulation. The methodology considers not only the uncertainty in the model parameters, conditioned on the available catalogue of events that occurred before the forecasting interval, but also the uncertainty in the sequence of events that are going to happen during the forecasting interval. We demonstrate the methodology by retrospective early forecasting of seismicity associated with the 2016 Amatrice seismic sequence in central Italy. We provide robust spatio-temporal short-term seismicity forecasts for various time intervals in the first few days elapsed after each of the three main events within the sequence; these forecasts predict the seismicity to within plus/minus two standard deviations of the mean estimate within the few hours elapsed after each main event.
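    The temporal core of an ETAS model is its conditional intensity; a minimal sketch in the standard Omori-Utsu parameterization is below (the parameter values are illustrative, not taken from the paper):

```python
import math

def etas_intensity(t, catalog, mu, K, alpha, c, p, m0):
    """Temporal ETAS conditional intensity lambda(t | H_t): a constant
    background rate mu plus an Omori-Utsu aftershock contribution from
    every past event (t_i, m_i) in the catalog with t_i < t."""
    rate = mu
    for t_i, m_i in catalog:
        if t_i < t:
            rate += K * math.exp(alpha * (m_i - m0)) / (t - t_i + c) ** p
    return rate

# Illustrative values: background rate 0.1 events/day and one M5 mainshock at t = 0.
catalog = [(0.0, 5.0)]
params = dict(mu=0.1, K=0.02, alpha=1.0, c=0.01, p=1.1, m0=3.0)
rate_soon = etas_intensity(0.1, catalog, **params)    # shortly after the event
rate_later = etas_intensity(10.0, catalog, **params)  # triggering has decayed
```

    In the Bayesian scheme described above, the parameter vector (mu, K, alpha, c, p) is not fixed but sampled, and forecasts are averaged over those samples.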

  20. The development of seismic guidelines for the Stanford Linear Accelerator Center

    Energy Technology Data Exchange (ETDEWEB)

    Huggins, R.

    1996-08-01

    This paper describes the development of Seismic Guidelines for the Stanford Linear Accelerator Center (SLAC). Although structures have always been built conservatively, SLAC management decided to review and update their seismic guidelines. SLAC is about mid-way between the epicenters of the 8.3 Richter magnitude 1906 San Francisco and the 7.2 Loma Prieta Earthquakes. The west end of the two mile long electron/positron particle accelerator lies a half mile from the large San Andreas Fault. Suggestions for seismic planning processes were solicited from local computer manufacturing firms, universities, and federal laboratories. A Committee of the various stakeholders in SLAC`s seismic planning retained an internationally known Seismic Planning Consultant and reviewed relevant standards and drafted Guidelines. A panel of seismic experts was convened to help define the hazard, site response spectra, probabilistic analysis of shaking, and near field effects. The Facility`s structures were assigned to seismic classes of importance, and an initial assessment of a sample of a dozen buildings conducted. This assessment resulted in emergency repairs to one structure, and provided a {open_quotes}reality basis{close_quotes} for establishing the final Guidelines and Administrative Procedures, and a program to evaluate remaining buildings, shielding walls, tunnels, and other special structures.

  1. Seismic acquisition parameters analysis for deep weak reflectors in the South Yellow Sea

    Science.gov (United States)

    Liu, Kai; Liu, Huaishan; Wu, Zhiqiang; Yue, Long

    2016-10-01

    The Mesozoic-Paleozoic marine residual basin in the South Yellow Sea (SYS) is a significant deep potential hydrocarbon reservoir. However, imaging the deep prospecting target is quite challenging due to the specific seismic-geological conditions. In the Central and Wunansha Uplifts, the penetration of the seismic wavefield is limited by the shallow high-velocity layers (HVLs) and the weak reflections in the deep carbonate rocks. With the conventional marine seismic acquisition technique, the deep weak reflection is difficult to image and identify. In this paper, we confirm through numerical simulation that the combination of a multi-level air-gun array and an extended cable in seismic acquisition is crucial for improving imaging quality. Based on the velocity model derived from the geological interpretation, we performed two-dimensional finite-difference forward modeling. The numerical simulation results show that the use of the multi-level air-gun array can enhance low-frequency energy and that the wide-angle reflection received at far offsets of the extended cable has a higher signal-to-noise ratio (SNR) and higher energy. Therefore, we have demonstrated that the unconventional wide-angle seismic acquisition technique mentioned above can overcome the difficulty in imaging the deep weak reflectors of the SYS, and it may be useful for the design of practical seismic acquisition schemes in this region.
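    To illustrate the kind of forward modeling used in such studies, here is a minimal 1D (rather than the paper's 2D) acoustic finite-difference sketch with a high-velocity layer; the grid, wavelet, and velocity values are invented for the example and are not the authors' model:

```python
import math

def ricker(t, f0):
    """Ricker wavelet with peak frequency f0 (Hz), delayed by 1/f0."""
    a = (math.pi * f0 * (t - 1.0 / f0)) ** 2
    return (1.0 - 2.0 * a) * math.exp(-a)

def fd_1d(vel, dz, dt, nt, f0, isrc, irec):
    """Second-order finite differences for the 1D acoustic wave equation
    u_tt = v^2 u_zz with fixed (zero) end points; returns the trace
    recorded at grid index irec. Stable when max(v)*dt/dz <= 1 (CFL)."""
    n = len(vel)
    prev = [0.0] * n
    curr = [0.0] * n
    trace = []
    for it in range(nt):
        nxt = [0.0] * n
        for i in range(1, n - 1):
            lap = (curr[i + 1] - 2.0 * curr[i] + curr[i - 1]) / dz ** 2
            nxt[i] = 2.0 * curr[i] - prev[i] + (vel[i] * dt) ** 2 * lap
        nxt[isrc] += dt ** 2 * ricker(it * dt, f0)  # inject the source
        prev, curr = curr, nxt
        trace.append(curr[irec])
    return trace

# Low-velocity overburden above a high-velocity layer (HVL analogue).
vel = [1500.0] * 100 + [3500.0] * 100
trace = fd_1d(vel, dz=5.0, dt=0.0005, nt=800, f0=25.0, isrc=5, irec=2)
```

    Here max(v)*dt/dz = 0.35, comfortably inside the stability limit; in practice absorbing boundaries replace the fixed end points.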

  2. Back azimuth constrained double-difference seismic location and tomography for downhole microseismic monitoring

    Science.gov (United States)

    Chen, Yukuan; Zhang, Haijiang; Miao, Yuanyuan; Zhang, Yinsheng; Liu, Qiang

    2017-03-01

    We have developed a new seismic tomography method, back azimuth constrained double-difference (DD) seismic tomography, which is suitable for downhole microseismic monitoring of hydraulic fracturing. The new method simultaneously locates microseismic events and determines three-dimensional (3D) Vp and Vs models for the fracturing zone using differential arrival times from pairs of events and event back azimuths, in addition to absolute arrival times. Compared to the existing DD location and tomography method, our method incorporates back azimuth information to better constrain microseismic event locations in the case of poor spatial station coverage, such as the linear downhole seismic array generally used for microseismic monitoring. By incorporating the relative arrival time and back azimuth information of events, the extended DD method can provide better relative event locations, and thus can better characterize the fracture distribution. In addition to microseismic locations, seismic velocity anomalies determined around the fracturing zone may also provide valuable information on fracture development. Owing to the presence of fractures and fluids, the seismic velocity is expected to be lower in the fractured zone than in the surrounding regions; the area of low seismic velocity anomaly may therefore be used as a proxy for the stimulated reservoir volume. We have applied the new method to a downhole microseismic dataset from shale gas hydraulic fracturing. The microseismic events are relocated more accurately than with the conventional grid-search location method, and they are generally associated with low-velocity anomalies.
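    A toy illustration of why back azimuths help with one-sided station geometry: a 2D grid-search locator (not the authors' DD tomography code; all names, geometry, and values below are invented) whose misfit combines arrival-time residuals with a weighted back-azimuth term:

```python
import math

def locate(stations, obs_times, obs_baz, v, grid, w_baz=1.0):
    """Grid-search epicenter from absolute arrival times plus
    back-azimuth observations (radians, measured from the +x axis).
    The back-azimuth misfit constrains directions that a nearly
    linear array cannot resolve from times alone."""
    best, best_cost = None, float("inf")
    for x, y in grid:
        tt = [math.hypot(x - sx, y - sy) / v for sx, sy in stations]
        # origin time that best fits the travel times at this node
        t0 = sum(o - t for o, t in zip(obs_times, tt)) / len(tt)
        cost = sum((o - (t0 + t)) ** 2 for o, t in zip(obs_times, tt))
        for (sx, sy), baz in zip(stations, obs_baz):
            pred = math.atan2(y - sy, x - sx)
            diff = math.atan2(math.sin(pred - baz), math.cos(pred - baz))
            cost += w_baz * diff ** 2  # wrapped angular residual
        if cost < best_cost:
            best, best_cost = (x, y), cost
    return best

# Synthetic test: a collinear three-station "array" (downhole analogue)
# and an event at (3, 2); noise-free times and back azimuths.
stations = [(0.0, 0.0), (0.0, 1.0), (0.0, 2.0)]
true_xy = (3.0, 2.0)
v = 2.0
obs_times = [math.hypot(true_xy[0] - sx, true_xy[1] - sy) / v
             for sx, sy in stations]
obs_baz = [math.atan2(true_xy[1] - sy, true_xy[0] - sx)
           for sx, sy in stations]
grid = [(0.5 * i, 0.5 * j) for i in range(13) for j in range(9)]
best = locate(stations, obs_times, obs_baz, v, grid)
```

    The published method goes much further (pairwise differential times, simultaneous velocity inversion), but the cost-function structure is analogous.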

  3. Application of a New Wavelet Threshold Method in Unconventional Oil and Gas Reservoir Seismic Data Denoising

    Directory of Open Access Journals (Sweden)

    Guxi Wang

    2015-01-01

    Seismic data processing is an important means of improving the signal-to-noise ratio. The main work of this paper is to combine the characteristics of seismic data with the wavelet transform method to eliminate and control random noise, aiming to improve the signal-to-noise ratio and to provide technical methods usable in large data systems, so that the approach can be better promoted and applied. In recent years, prestack denoising of all-digital three-dimensional seismic data has been key to data processing. Considering the characteristics of all-digital three-dimensional seismic data, and building on previous studies, a new threshold function is proposed. Compared with the conventional hard and soft threshold functions, this function is not only easy to compute but also has excellent mathematical properties and a clear physical meaning. Simulation results show that this method removes random noise effectively. Applying this threshold function in actual seismic processing of an unconventional lithologic gas reservoir with low porosity, low permeability, low abundance, and strong heterogeneity, the results show that the denoising method can effectively improve seismic processing and enhance the signal-to-noise ratio (SNR).
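    To make the thresholding idea concrete, here is a minimal one-level Haar wavelet denoiser with the classical hard and soft threshold rules; the paper's new threshold function is not reproduced here, so this is a generic sketch of the framework it improves on:

```python
import math

def haar_fwd(x):
    """One-level orthonormal Haar transform: pairwise sums (approximation)
    and differences (detail), each scaled by 1/sqrt(2)."""
    s = 1.0 / math.sqrt(2.0)
    approx = [(a + b) * s for a, b in zip(x[0::2], x[1::2])]
    detail = [(a - b) * s for a, b in zip(x[0::2], x[1::2])]
    return approx, detail

def haar_inv(approx, detail):
    """Exact inverse of haar_fwd."""
    s = 1.0 / math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

def soft(c, thr):
    """Soft threshold: shrink every coefficient toward zero by thr."""
    return math.copysign(max(abs(c) - thr, 0.0), c)

def hard(c, thr):
    """Hard threshold: keep-or-kill."""
    return c if abs(c) > thr else 0.0

def denoise(x, thr, rule=soft):
    """Threshold only the detail coefficients, then reconstruct."""
    approx, detail = haar_fwd(x)
    return haar_inv(approx, [rule(d, thr) for d in detail])
```

    New threshold functions of the kind the paper proposes typically interpolate between `hard` (unbiased but discontinuous) and `soft` (continuous but biased), keeping continuity while reducing shrinkage of large coefficients.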

  4. Changes in mining-induced seismicity before and after the 2007 Crandall Canyon Mine collapse

    Science.gov (United States)

    Kubacki, Tex; Koper, Keith D.; Pankow, Kristine L.; McCarter, Michael K.

    2014-06-01

    On 6 August 2007, the Crandall Canyon Mine in central Utah experienced a major collapse that was recorded as an Mw 4.1 seismic event. Application of waveform cross-correlation detection techniques to data recorded at permanent seismic stations located within ~30 km of the mine has resulted in the discovery of 1494 previously unknown microseismic events related to the collapse. These events occurred between 26 July 2007 and 19 August 2007 and were detected with a magnitude threshold of completeness of 0.0, about 1.6 magnitude units smaller than the threshold associated with conventional techniques. Relative locations for the events were determined using a double-difference approach that incorporated absolute and differential arrival times. Absolute locations were determined using ground-truth reported in mine logbooks. Lineations apparent in the newly detected events have strikes similar to those of known vertical joints in the mine region, which may have played a role in the collapse. Prior to the collapse, seismicity occurred mostly in close proximity to active mining, though several distinct seismogenic hot spots within the mine were also apparent. In the 48 h before the collapse, changes in b value and event locations were observed. The collapse appears to have occurred when the migrating seismicity associated with direct mining activity intersected one of the areas identified as a seismic hot spot. Following the collapse, b values decreased and seismicity clustered farther to the east.
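    The b-value changes reported above are typically estimated with the Aki maximum-likelihood formula; a small sketch on a synthetic catalog (illustrative values, not the Crandall Canyon data) is:

```python
import math
import random

def b_value(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value for events with M >= mc;
    dm is the optional Utsu correction for binned magnitudes
    (use dm = 0 for continuous magnitudes)."""
    m = [x for x in mags if x >= mc]
    mean = sum(m) / len(m)
    return math.log10(math.e) / (mean - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with true b = 1: magnitudes above
# the completeness threshold mc are exponential with rate b*ln(10).
random.seed(42)
mc = 0.0
sample = [mc + random.expovariate(math.log(10.0)) for _ in range(5000)]
b = b_value(sample, mc)
```

    Lowering the completeness threshold to 0.0, as the waveform cross-correlation detections did, is what makes such b-value estimates stable enough to track changes in the 48 h before the collapse.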

  5. Resolution enhancement of non-stationary seismic data using amplitude-frequency partition

    Science.gov (United States)

    Xie, Yujiang; Liu, Gao

    2015-02-01

    Owing to the Earth's inhomogeneous and viscoelastic properties, seismic signal attenuation, which we are trying to mitigate, is a long-standing problem facing high-resolution techniques. To address this problem in the time-frequency domain, Gabor-transform methods such as the atom-window method (AWM) and the molecular window method (MWM) have recently been reported. However, we observed that these methods might perform much better if we partition the non-stationary seismic data into adaptively stationary segments based on the amplitude and frequency information of the seismic signal. In this study, we present a new method, called amplitude-frequency partition (AFP), to implement this process in the time-frequency domain. Tests on synthetic and field seismic data indicate that the AFP method can partition non-stationary seismic data into approximately stationary segments and, significantly, that a high-resolution result is achieved by combining the AFP method with a conventional spectral-whitening method; the combination can be considered superior to previous resolution-enhancement methods such as the time-variant spectral-whitening method, the AWM, and the MWM. The AFP method presented in this study is an effective resolution-enhancement tool for non-stationary seismic data within adaptive time-frequency transforms.
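    Spectral whitening, the method AFP is combined with, flattens the amplitude spectrum of a (locally stationary) segment while keeping the phase. A minimal sketch using a naive DFT (O(n^2), fine for illustration; the test signal is invented) is:

```python
import cmath
import math

def dft(x):
    """Naive discrete Fourier transform."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]

def idft(X):
    """Inverse DFT; returns the real part (input assumed real)."""
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]

def whiten(x, eps=1e-12):
    """Divide each spectral coefficient by its own magnitude, which sets
    every amplitude to ~1 while preserving phase; eps guards empty bins."""
    X = dft(x)
    return idft([c / (abs(c) + eps) for c in X])

# A decaying 3-cycle sinusoid: a peaked spectrum before whitening,
# a flat one after.
sig = [math.cos(2 * math.pi * 3 * t / 32) * math.exp(-t / 16.0)
       for t in range(32)]
flat = [abs(c) for c in dft(whiten(sig))]
```

    The gain of AFP-style partitioning is that each segment handed to the whitener is approximately stationary, so a single flattening gain per segment is appropriate.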

  6. Exterior beam-column joint study with non-conventional reinforcement detailing using mechanical anchorage under reversal loading

    Indian Academy of Sciences (India)

    S Rajagopal; S Prabavathy

    2014-10-01

    Beam-column joints are among the most critical regions of reinforced concrete structures in seismically active areas. Proper reinforcement anchorage is essential to enhance the performance of the joints. An attempt has been made to appraise the performance of the anchorages and joints. The anchorages are detailed as per ACI-352 (mechanical anchorage), ACI-318 (conventional bent hooks) and IS-456 (conventional full anchorage). The joints are detailed without confinement in group I and with an additional X-cross bar in group II. To assess seismic performance, the specimens were assembled into two groups of three specimens each and were tested under reversal loading. The specimens with T-type mechanical anchorage (headed bar), and with T-type mechanical anchorage combined with the X-cross bar, exhibited significant improvement in seismic performance: load-displacement capacity, displacement ductility, stiffness degradation, controlled crack capacity in the joint shear panel, and also reduced congestion of reinforcement in the joint core.

  7. Time-dependent seismic tomography

    Science.gov (United States)

    Julian, B.R.; Foulger, G.R.

    2010-01-01

    Of methods for measuring temporal changes in seismic-wave speeds in the Earth, seismic tomography is among those that offer the highest spatial resolution. 3-D tomographic methods are commonly applied in this context by inverting seismic wave arrival time data sets from different epochs independently and assuming that differences in the derived structures represent real temporal variations. This assumption is dangerous because the results of independent inversions would differ even if the structure in the Earth did not change, due to observational errors and differences in the seismic ray distributions. The latter effect may be especially severe when data sets include earthquake swarms or aftershock sequences, and may produce the appearance of correlation between structural changes and seismicity when the wave speeds are actually temporally invariant. A better approach, which makes it possible to assess what changes are truly required by the data, is to invert multiple data sets simultaneously, minimizing the difference between models for different epochs as well as the rms arrival-time residuals. This problem leads, in the case of two epochs, to a system of normal equations whose order is twice as great as for a single epoch. The direct solution of this system would require twice as much memory and four times as much computational effort as independent inversions. We present an algorithm, tomo4d, that takes advantage of the structure and sparseness of the system to obtain the solution with essentially no more effort than independent inversions require.
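    The simultaneous two-epoch inversion can be written, in assumed notation, as a damped least-squares problem whose block normal equations have twice the single-epoch order, with the coupling confined to the off-diagonal damping blocks:

```latex
\min_{\mathbf{m}_1,\mathbf{m}_2}\;
\|A_1\mathbf{m}_1-\mathbf{d}_1\|^{2} + \|A_2\mathbf{m}_2-\mathbf{d}_2\|^{2}
+ \mu^{2}\,\|\mathbf{m}_2-\mathbf{m}_1\|^{2}

\begin{pmatrix}
A_1^{T}A_1+\mu^{2} I & -\mu^{2} I \\
-\mu^{2} I & A_2^{T}A_2+\mu^{2} I
\end{pmatrix}
\begin{pmatrix}\mathbf{m}_1\\ \mathbf{m}_2\end{pmatrix}
=
\begin{pmatrix}A_1^{T}\mathbf{d}_1\\ A_2^{T}\mathbf{d}_2\end{pmatrix}
```

    The off-diagonal blocks are simply scaled identities, which is the sparse structure an algorithm like tomo4d can exploit to avoid the nominal fourfold cost of the direct solution.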

  8. EMERALD: Coping with the Explosion of Seismic Data

    Science.gov (United States)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2009-12-01

    The geosciences are currently generating an unparalleled quantity of new public broadband seismic data with the establishment of large-scale seismic arrays such as the EarthScope USArray, which are enabling new and transformative scientific discoveries of the structure and dynamics of the Earth's interior. Much of this explosion of data is a direct result of the formation of the IRIS consortium, which has enabled an unparalleled level of open exchange of seismic instrumentation, data, and methods. The production of these massive volumes of data has generated new and serious data management challenges for the seismological community. A significant challenge is the maintenance and updating of seismic metadata, which include information such as station location, sensor orientation, instrument response, and clock timing data. This key information changes at unknown intervals, and the changes are not generally communicated to data users who have already downloaded and processed data. Another basic challenge is the ability to handle massive seismic datasets when waveform file volumes exceed the fundamental limitations of a computer's operating system. A third, long-standing challenge is the difficulty of exchanging seismic processing codes between researchers; each scientist typically develops his or her own unique directory structure and file naming convention, requiring that codes developed by another researcher be rewritten before they can be used. To address these challenges, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The overarching goal of the EMERALD project is to enable more efficient and effective use of seismic datasets ranging from just a few hundred to millions of waveforms with a complete database-driven system, leading to higher quality seismic datasets for scientific analysis and enabling faster, more efficient scientific research. We will present a preliminary (beta) version of EMERALD, an integrated database-driven system.

  9. NULL Convention Floating Point Multiplier

    Directory of Open Access Journals (Sweden)

    Anitha Juliette Albert

    2015-01-01

    Floating point multiplication is a critical part of high dynamic range and computationally intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using the asynchronous NULL convention logic paradigm. Rounding has not been implemented, to suit high precision applications. The novelty of the research is that it is the first NULL convention logic multiplier designed to perform floating point multiplication. The proposed multiplier offers a substantial decrease in power consumption when compared with its synchronous version. Performance attributes of the NULL convention logic floating point multiplier, obtained from Xilinx simulation and Cadence, are compared with its equivalent synchronous implementation.

  10. NULL convention floating point multiplier.

    Science.gov (United States)

    Albert, Anitha Juliette; Ramachandran, Seshasayanan

    2015-01-01

    Floating point multiplication is a critical part of high dynamic range and computationally intensive digital signal processing applications which require high precision and low power. This paper presents the design of an IEEE 754 single precision floating point multiplier using the asynchronous NULL convention logic paradigm. Rounding has not been implemented, to suit high precision applications. The novelty of the research is that it is the first NULL convention logic multiplier designed to perform floating point multiplication. The proposed multiplier offers a substantial decrease in power consumption when compared with its synchronous version. Performance attributes of the NULL convention logic floating point multiplier, obtained from Xilinx simulation and Cadence, are compared with its equivalent synchronous implementation.
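    Independent of the NULL convention logic implementation, the multiplier's datapath can be sketched behaviorally. This Python model handles normal numbers only and truncates the product mantissa instead of rounding, matching the paper's stated design choice; zero, subnormal, infinity, and NaN handling are omitted:

```python
import struct

def f32_fields(x):
    """Unpack an IEEE 754 single-precision value into its
    (sign, exponent, mantissa) bit fields."""
    bits = struct.unpack(">I", struct.pack(">f", x))[0]
    return bits >> 31, (bits >> 23) & 0xFF, bits & 0x7FFFFF

def f32_mul(a, b):
    """Multiply two normal float32 values field by field:
    XOR the signs, add the biased exponents, multiply the 24-bit
    significands, then normalize and truncate (no rounding)."""
    sa, ea, ma = f32_fields(a)
    sb, eb, mb = f32_fields(b)
    sign = sa ^ sb
    # significands with the implicit leading 1 (24 bits each)
    prod = ((1 << 23) | ma) * ((1 << 23) | mb)  # 48-bit product
    exp = ea + eb - 127                          # remove one bias
    if prod >> 47:              # product in [2, 4): shift right, bump exponent
        exp += 1
        mant = (prod >> 24) & 0x7FFFFF
    else:                       # product in [1, 2): already normalized
        mant = (prod >> 23) & 0x7FFFFF
    bits = (sign << 31) | ((exp & 0xFF) << 23) | mant
    return struct.unpack(">f", struct.pack(">I", bits))[0]
```

    In the NCL design, each of these stages (sign, exponent adder, significand multiplier, normalizer) becomes a delay-insensitive pipeline stage signalling completion with NULL/DATA wavefronts rather than a clock.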

  11. Towards a Theory of Convention

    DEFF Research Database (Denmark)

    Hansen, Pelle Guldborg

    2006-01-01

    …theory. Like for the study of common knowledge, much has happened in this latter field since then. The theory of convention has been developed and extended so as to include multiple types as well as a basis for the study of social norms. However, classical game theory is currently undergoing a severe crisis as a tool for understanding and explaining social phenomena; a crisis emerging from the problem of equilibrium selection, around which any theory of convention must revolve. The so-called evolutionary turn in game theory marks a transition from the classical assumptions of rationality and common knowledge of such to evolutionary game-theoretical frameworks inspired by the models of (Maynard Smith & Price 1973), (Taylor & Jonker 1978) and (Maynard Smith 1982). By providing an account of equilibrium selection, these are thought to work as well-defined metaphors of learning processes upon which a revised theory of convention…
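    The evolutionary frameworks cited (notably Taylor & Jonker's replicator dynamics) can be illustrated in a few lines: in a pure coordination game the dynamics select one convention as the equilibrium. The payoff matrix, initial population, and step size below are illustrative, not taken from the abstract:

```python
def replicator_step(pop, payoff, dt=0.01):
    """One Euler step of the replicator dynamics
    x_i' = x_i * ((A x)_i - x^T A x) for payoff matrix A."""
    n = len(pop)
    fitness = [sum(payoff[i][j] * pop[j] for j in range(n)) for i in range(n)]
    avg = sum(p * f for p, f in zip(pop, fitness))
    return [p + dt * p * (f - avg) for p, f in zip(pop, fitness)]

# Pure coordination game: two conventions, payoff 1 for matching.
A = [[1.0, 0.0], [0.0, 1.0]]
pop = [0.6, 0.4]          # convention 1 starts with a slight majority
for _ in range(2000):
    pop = replicator_step(pop, A)
```

    Starting from any majority for one convention, the population converges to everyone playing it, which is the evolutionary account of equilibrium selection the abstract alludes to.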

  12. Seismic evaluation methods for existing buildings

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, B.J.

    1995-07-01

    Recent US Department of Energy natural phenomena hazards mitigation directives require the earthquake reassessment of existing hazardous facilities and general-use structures. This applies also to structures located, in accordance with the Uniform Building Code, in Seismic Zone 0, where usually no consideration is given to seismic design but where DOE specifies seismic hazard levels. An economical approach for performing such a seismic evaluation, which relies heavily on the use of preexisting structural analysis results, is outlined below. Specifically, three different methods are used to estimate the seismic capacity of a building that is one unit of a building complex located on a site considered at low risk from earthquakes. For structures not originally designed for seismic loads, which may not have, or be able to demonstrate, sufficient capacity to meet new and arbitrarily high seismic design requirements, and which are located on low-seismicity sites, it may be very cost-effective to perform detailed site-specific seismic hazard studies in order to establish the true seismic threat. This is particularly beneficial for sites with many buildings and facilities to be seismically evaluated.

  13. Assessment of seismic loss dependence using copula.

    Science.gov (United States)

    Goda, Katsuichiro; Ren, Jiandong

    2010-07-01

    The catastrophic nature of seismic risk is attributed to spatiotemporal correlation of seismic losses of buildings and infrastructure. For seismic risk management, such correlated seismic effects must be adequately taken into account, since they affect the probability distribution of aggregate seismic losses of spatially distributed structures significantly, and its upper tail behavior can be of particular importance. To investigate seismic loss dependence for two closely located portfolios of buildings, simulated seismic loss samples, which are obtained from a seismic risk model of spatially distributed buildings by taking spatiotemporally correlated ground motions into account, are employed. The characterization considers a loss frequency model that incorporates one dependent random component acting as a common shock to all buildings, and a copula-based loss severity model, which facilitates the separate construction of marginal loss distribution functions and nonlinear copula function with upper tail dependence. The proposed method is applied to groups of wood-frame buildings located in southwestern British Columbia. Analysis results indicate that the dependence structure of aggregate seismic losses can be adequately modeled by the right heavy tail copula or Gumbel copula, and that for the considered example, overall accuracy of the proposed method is satisfactory at probability levels of practical interest (at most 10% estimation error of fractiles of aggregate seismic loss). The developed statistical seismic loss model may be adopted in dynamic financial analysis for achieving faster evaluation with reasonable accuracy.
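    The Gumbel copula identified as a good fit for aggregate losses has a closed-form CDF and upper-tail dependence coefficient; a small sketch of the standard formulas (the value of θ below is illustrative) is:

```python
import math

def gumbel_cdf(u, v, theta):
    """Gumbel copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))
    for theta >= 1; theta = 1 reduces to the independence copula C = u*v."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def upper_tail_dependence(theta):
    """lambda_U = 2 - 2^(1/theta): the limiting probability that one
    portfolio suffers an extreme loss given that the other does."""
    return 2.0 - 2.0 ** (1.0 / theta)
```

    The upper-tail dependence is exactly the property the abstract emphasizes: for θ = 1 it vanishes (independent extremes), and it approaches 1 as θ grows, so joint catastrophic losses for the two building portfolios become increasingly likely.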

  14. Seismic Risk Perception compared with seismic Risk Factors

    Science.gov (United States)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the most relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by people's risk perception. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, several perspectives and their interactions must be considered: social, psychological and cultural. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing the perception of hazard, vulnerability and exposure with the real data for the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) was designed using the semantic differential method, with opposite terms rated on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by telephone interview (CATI) to a nationally representative statistical sample of over 4,000 people in January-February 2015. Results show that risk perception appears to be underestimated for all the indicators considered. In particular, scores on the seismic Vulnerability factor are extremely low compared with the housing information provided by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence with respect to other natural hazards, participation in risk reduction activities, and level of involvement. Research on risk perception aims to aid risk analysis and policy-making.

  15. Towards a Theory of Convention

    DEFF Research Database (Denmark)

    Hansen, Pelle Guldborg

    2006-01-01

    …theory. Like for the study of common knowledge, much has happened in this latter field since then. The theory of convention has been developed and extended so as to include multiple types as well as a basis for the study of social norms. However, classical game theory is currently undergoing a severe crisis as a tool for understanding and explaining social phenomena; a crisis emerging from the problem of equilibrium selection, around which any theory of convention must revolve. The so-called evolutionary turn in game theory marks a transition from the classical assumptions of rationality and common knowledge…

  16. A modified symplectic PRK scheme for seismic wave modeling

    Science.gov (United States)

    Liu, Shaolin; Yang, Dinghui; Ma, Jian

    2017-02-01

    A new scheme for the temporal discretization of the seismic wave equation is constructed based on symplectic geometric theory and a modified strategy. The ordinary differential equation in terms of time, which is obtained after spatial discretization via the spectral-element method, is transformed into a Hamiltonian system. A symplectic partitioned Runge-Kutta (PRK) scheme is used to solve the Hamiltonian system. A term related to the multiplication of the spatial discretization operator with the seismic wave velocity vector is added into the symplectic PRK scheme to create a modified symplectic PRK scheme. The symplectic coefficients of the new scheme are determined via Taylor series expansion. The positive coefficients of the scheme indicate that its long-term computational capability is more powerful than that of conventional symplectic schemes. An exhaustive theoretical analysis reveals that the new scheme is highly stable and has low numerical dispersion. The results of three numerical experiments demonstrate the high efficiency of this method for seismic wave modeling.
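    The simplest symplectic partitioned Runge-Kutta scheme is the Störmer-Verlet (leapfrog) method; the sketch below applies it to a harmonic oscillator (illustrative parameters) and exhibits the bounded long-term energy error that motivates symplectic schemes for long wave-propagation runs:

```python
def leapfrog(q, p, omega, dt, steps):
    """Stormer-Verlet: the simplest symplectic partitioned RK scheme,
    here for the harmonic oscillator H = p^2/2 + omega^2 q^2 / 2."""
    for _ in range(steps):
        p -= 0.5 * dt * omega ** 2 * q  # half kick
        q += dt * p                      # drift
        p -= 0.5 * dt * omega ** 2 * q  # half kick
    return q, p

def energy(q, p, omega):
    return 0.5 * p ** 2 + 0.5 * omega ** 2 * q ** 2

q0, p0, w = 1.0, 0.0, 2.0
q, p = leapfrog(q0, p0, w, dt=0.01, steps=100_000)
drift = abs(energy(q, p, w) - energy(q0, p0, w))
```

    After 100,000 steps the energy error stays of order (omega*dt)^2 rather than growing secularly, which is the long-term stability property the modified symplectic PRK scheme is designed to retain while also reducing numerical dispersion.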

  17. Integrating population dynamics into mapping human exposure to seismic hazard

    Directory of Open Access Journals (Sweden)

    S. Freire

    2012-11-01

    Disaster risk is not fully characterized without taking into account vulnerability and population exposure. Assessment of earthquake risk in urban areas would benefit from considering the variation of population distribution at more detailed spatial and temporal scales, and from a more explicit integration of this improved demographic data with existing seismic hazard maps. In the present work, "intelligent" dasymetric mapping is used to model population dynamics at high spatial resolution in order to benefit the analysis of spatio-temporal exposure to earthquake hazard in a metropolitan area. These night- and daytime-specific population densities are then classified and combined with seismic intensity levels to derive new spatially-explicit four-class-composite maps of human exposure. The presented approach enables a more thorough assessment of population exposure to earthquake hazard. Results show that there are significantly more people potentially at risk in the daytime period, demonstrating the shifting nature of population exposure in the daily cycle and the need to move beyond conventional residence-based demographic data sources to improve risk analyses. The proposed fine-scale maps of human exposure to seismic intensity are mainly aimed at benefiting visualization and communication of earthquake risk, but can be valuable in all phases of the disaster management process where knowledge of population densities is relevant for decision-making.

  18. Seismic Loading for FAST: May 2011 - August 2011

    Energy Technology Data Exchange (ETDEWEB)

    Asareh, M. A.; Prowell, I.

    2012-08-01

    As more wind farms are constructed in seismically active regions, earthquake loading increases in prominence in the design and analysis of wind turbines. Early investigations of seismic loading tended to simplify the rotor and nacelle as a lumped mass on top of the turbine tower. This simplification allowed techniques developed for conventional civil structures, such as buildings, to be easily applied to wind turbines. However, interest is shifting to more detailed models that consider loads for turbine components other than the tower. These improved models offer three key capabilities for the consideration of base shaking: 1) the inclusion of aerodynamics and turbine control; 2) the ability to consider component loads other than just tower loads; and 3) an improved representation of turbine response in higher modes by reducing modeling simplifications. Both experimental and numerical investigations have shown that, especially for large modern turbines, it is important to consider interaction between earthquake input, aerodynamics, and operational loads. These investigations further show that consideration of higher-mode activity may be necessary in the analysis of the seismic response of turbines. Since the FAST code is already capable of considering these factors, modifications were developed that allow simulation of base shaking. This approach allows consideration of this additional load source within a framework, the FAST code, that is already familiar to many researchers and practitioners.

  19. Performance-based concept on seismic evaluation of existing bridges

    Institute of Scientific and Technical Information of China (English)

    Yu-Chi Sung; Wen-I Liao; W.Phillip Yen

    2009-01-01

    Conventional seismic evaluation of existing bridges explores the ability of a bridge to survive significant earthquake excitations. This approach has several major drawbacks: only a single structural performance state, near collapse, is considered, and the simplified strength-based approach used to indirectly estimate the nonlinear behavior of a structure lacks accuracy. As a result, the engineering community needs performance-based concepts that cover a wider variety of structural performance states of a given bridge excited by different levels of earthquake intensity. This paper introduces an improved process for the seismic evaluation of existing bridges, in which the overall structural performance is successfully linked to earthquakes with varying levels of peak ground acceleration (PGA). A universal perspective on the seismic evaluation of bridges over their entire life-cycle can be easily obtained to investigate multiple performance objectives. The accuracy of the proposed method, based on pushover analysis, is demonstrated in a case study that compares the results from the proposed procedure with additional nonlinear time history analyses.

  20. Linearized inversion of two components seismic data; Inversion linearisee de donnees sismiques a deux composantes

    Energy Technology Data Exchange (ETDEWEB)

    Lebrun, D.

    1997-05-22

    The aim of the dissertation is the linearized inversion of multicomponent seismic data for 3D elastic horizontally stratified media, using the Born approximation. A Jacobian matrix is constructed; it is used to model seismic data from elastic parameters. The inversion technique, relying on singular value decomposition (SVD) of the Jacobian matrix, is described. Next, the resolution of the inverted elastic parameters is studied quantitatively. A first application of the technique is shown in the context of an evaluation of a sea-bottom acquisition (synthetic data). Finally, a real data set acquired with a conventional marine technique is inverted. (author) 70 refs.
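The SVD-based inversion step can be illustrated with a toy linear problem; the random Jacobian below is a stand-in for the true elastic-parameter sensitivity matrix, and the truncation threshold is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
J = rng.standard_normal((20, 6))     # stand-in Jacobian (n_data x n_params)
m_true = np.array([2.0, 1.0, 0.5, 0.2, 0.1, 0.05])
d = J @ m_true                       # noise-free synthetic data

# Truncated-SVD solve of J m = d, discarding tiny singular values.
U, s, Vt = np.linalg.svd(J, full_matrices=False)
keep = s > 1e-10 * s[0]
m_est = Vt[keep].T @ ((U[:, keep].T @ d) / s[keep])
```

In practice the singular-value spectrum of the Jacobian also quantifies which elastic parameters are well resolved, which is the resolution analysis the abstract refers to.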

  1. Delineation of seismic source zones based on seismicity parameters and probabilistic evaluation of seismic hazard using logic tree approach

    Indian Academy of Sciences (India)

    K S Vipin; T G Sitharam

    2013-06-01

    The delineation of seismic source zones plays an important role in the evaluation of seismic hazard. In most of the studies the seismic source delineation is done based on geological features. In the present study, an attempt has been made to delineate seismic source zones in the study area (south India) based on the seismicity parameters. Seismicity parameters and the maximum probable earthquake for these source zones were evaluated and were used in the hazard evaluation. The probabilistic evaluation of seismic hazard for south India was carried out using a logic tree approach. Two different types of seismic sources, linear and areal, were considered in the present study to model the seismic sources in the region more precisely. In order to properly account for the attenuation characteristics of the region, three different attenuation relations were used with different weightage factors. Seismic hazard evaluation was done for the probability of exceedance (PE) of 10% and 2% in 50 years. The spatial variation of rock level peak horizontal acceleration (PHA) and spectral acceleration (Sa) values corresponding to return periods of 475 and 2500 years for the entire study area are presented in this work. The peak ground acceleration (PGA) values at ground surface level were estimated based on different NEHRP site classes by considering local site effects.
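The two hazard levels quoted above correspond to the stated return periods under the usual Poisson occurrence assumption, as a quick calculation shows (the 2% in 50 years level is commonly rounded from about 2475 to 2500 years):

```python
import math

def return_period(pe, t_years):
    """Return period implied by probability of exceedance `pe` within
    `t_years`, assuming Poisson-distributed occurrences."""
    return -t_years / math.log(1.0 - pe)

rp_10pc = return_period(0.10, 50.0)   # ~475 years
rp_2pc = return_period(0.02, 50.0)    # ~2475 years
```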

  2. Introspection on improper seismic retrofit of Basilica Santa Maria di Collemaggio after 2009 Italian earthquake

    Science.gov (United States)

    Cimellaro, Gian Paolo; Reinhorn, Andrei M.; de Stefano, Alessandro

    2011-03-01

    The 2009 L'Aquila, Italy earthquake highlighted the seismic vulnerability of historic masonry building structures due to improper "strengthening" retrofit work that has been done in the last 50 years. Italian seismic standards recommend the use of traditional reinforcement techniques such as replacing the original wooden roof structure with new reinforced concrete (RC) or steel elements, inserting RC tie-beams in the masonry and new RC floors, and using RC jacketing on the shear walls. The L'Aquila earthquake revealed the numerous limitations of these interventions, because they led to increased seismic forces (due to greater additional weight) and to deformation incompatibilities of the incorporated elements with the existing masonry walls. This paper provides a discussion of technical issues pertaining to the seismic retrofit of the Santa Maria di Collemaggio Basilica and in particular, the limitations of the last (2000) retrofit intervention. Considerable damage was caused to the church because of questionable actions and incorrect and improper technical choices.

  3. Application of Catastrophe Theory in 3D Seismic Data Interpretation of Coal Mine

    Institute of Scientific and Technical Information of China (English)

    ZHAO Mu-hua; YANG Wen-qiang; CUI Hui-xia

    2005-01-01

    In order to detect faults accurately and quickly, cusp catastrophe theory is used to interpret 3D coal seismic data in this paper. By establishing a cusp model, the seismic signal is transformed into the standard form of the cusp catastrophe, and catastrophe parameters, including time-domain catastrophe potential, time-domain catastrophe time, frequency-domain catastrophe potential and frequency-domain catastrophe degree, are calculated. Catastrophe theory is applied to 3D seismic structural interpretation in coal mines. The results show that the position of an abnormality in a catastrophe parameter profile or curve is related to the location of a fault, and that cusp catastrophe theory is effective for automatically picking up geological information and improving interpretation precision in 3D seismic data.
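The cusp model referred to above can be illustrated with the standard cusp potential V(x) = x^4/4 + a*x^2/2 + b*x, whose equilibria change abruptly as the control parameters (a, b) cross the bifurcation set; the sketch below is a generic illustration of that standard form, not the paper's signal transformation.

```python
def cusp_discriminant(a, b):
    """Discriminant of V'(x) = x**3 + a*x + b, where V is the cusp
    potential V(x) = x**4/4 + a*x**2/2 + b*x; positive means three
    equilibria (a bistable state liable to a sudden jump)."""
    return -4.0 * a**3 - 27.0 * b**2

bistable = cusp_discriminant(-1.0, 0.0)    # inside the cusp: two stable states
monostable = cusp_discriminant(1.0, 0.0)   # outside the cusp: one equilibrium
```

An abrupt change in a seismic attribute, such as the one a fault produces, corresponds to the discriminant changing sign as the control parameters vary along the profile.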

  4. Seismic evaluation of the U1a complex at the Nevada Test Site

    Energy Technology Data Exchange (ETDEWEB)

    McCamant, R R; Davito, A M; Hahn, K R; Murray, R C; Ng, D S; Sahni, V K; Schnechter, K M; Van Dyke, M

    1998-10-16

    As part of an overall safety evaluation of the U1a Complex, a seismic evaluation of structures, systems, and components (SSC) was conducted. A team of seismic, safety, and operation engineers from Los Alamos National Laboratory (LANL), Bechtel Nevada (BN) and Lawrence Livermore National Laboratory (LLNL) was chartered to perform the seismic evaluation. The U1a Complex is located in Area 1 of the Nevada Test Site (NTS) in Nevada. The complex is a test facility for physics experiments in support of the Science Based Stockpile Stewardship Program. The U1a Complex consists of surface and subsurface facilities. The subsurface facility is a tunnel complex located 963 feet below the surface. The seismic evaluation of the U1a Complex is required to comply with the DOE Natural Phenomena Policy. This policy consists of an order, an implementing guide, and standards which provide guidance for design and evaluation of SSCs, categorization of SSCs, characterization of site, and hazard level definition.

  5. Cursory seismic drift assessment for buildings in moderate seismicity regions

    Institute of Scientific and Technical Information of China (English)

    Zhu Yong; R.K.L. Su; Zhou Fulin

    2007-01-01

    This paper outlines a methodology to assess the seismic drift of reinforced concrete buildings with limited structural and geotechnical information. Based on the latest and the most advanced research on predicting potential near-field and far field earthquakes affecting Hong Kong, the engineering response spectra for both rock and soil sites are derived. A new step-by-step procedure for displacement-based seismic hazard assessment of building structures is proposed to determine the maximum inter-storey drift demand for reinforced concrete buildings. The primary information required for this assessment is only the depth of the soft soil above bedrock and the height of the building. This procedure is further extended to assess the maximum chord rotation angle demand for the coupling beam of coupled shear wall or frame wall structures, which may be very critical when subjected to earthquake forces. An example is provided to illustrate calibration of the assessment procedure by using actual engineering structural models.
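A minimal sketch of a displacement-based drift check follows; the period formula, participation factor, and drift-concentration factor are generic textbook-style assumptions, not the paper's calibrated relations for Hong Kong.

```python
def fundamental_period(height_m, coeff=0.05):
    """Empirical period estimate T = coeff * H**0.75 (coeff assumed)."""
    return coeff * height_m ** 0.75

def max_interstorey_drift(sd_m, height_m, participation=1.3, concentration=1.5):
    """Drift demand from spectral displacement Sd: roof displacement is
    about participation * Sd, amplified by a drift-concentration factor."""
    return participation * concentration * sd_m / height_m

T = fundamental_period(60.0)                # period of a 60 m building
drift = max_interstorey_drift(0.15, 60.0)   # Sd = 0.15 m read from a spectrum
```

The appeal of such a procedure is visible even in this sketch: only the building height (plus the soil depth, through the choice of spectrum) is needed to obtain a drift demand.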

  6. Seismic damage and destructive potential of seismic events

    Directory of Open Access Journals (Sweden)

    S. M. Petrazzuoli

    1995-06-01

    Full Text Available This paper has been written within a research framework investigating the destructive potential of seismic events. The elastic response spectra seem insufficient to explain the behaviour of structures subject to large earthquakes, in which they experience extensive plastic deformations. Recent works emphasise that there were many difficulties in the definition of a single parameter linked to the destructive potential of an earthquake. In this work a study on the effect of frequency content on structural damage has been carried out. The behaviour of two different elastoplastic oscillators has been analysed, considering several artificial earthquakes. The results obtained suggest a method for evaluating the destructive seismic potential of an earthquake through the response spectra and the frequency content of the signal, and through the mechanical characteristics of the structures within the analysed area.
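The elastoplastic oscillators mentioned above can be sketched as an elastic-perfectly-plastic SDOF system integrated with a simple explicit scheme; the stiffness, damping, yield force, and synthetic base excitation below are illustrative assumptions, not the paper's artificial earthquakes.

```python
import numpy as np

def epp_response(ag, dt, m=1.0, k=400.0, c=2.0, fy=1.5):
    """Displacement history of an elastic-perfectly-plastic SDOF system
    under base acceleration ag, via semi-implicit Euler integration."""
    u = np.zeros(len(ag))
    v, fs = 0.0, 0.0                 # velocity, restoring force
    for i in range(len(ag) - 1):
        a = (-m * ag[i] - c * v - fs) / m
        v += a * dt
        du = v * dt
        u[i + 1] = u[i] + du
        fs = float(np.clip(fs + k * du, -fy, fy))   # yield at +/- fy
    return u

dt = 0.005
t = np.arange(0.0, 4.0, dt)
ag = 2.0 * np.sin(2.0 * np.pi * 2.0 * t)        # synthetic 2 Hz base excitation
u = epp_response(ag, dt)
mu_demand = np.max(np.abs(u)) / (1.5 / 400.0)   # ductility demand u_max/u_yield
```

Comparing the ductility demand of such oscillators across excitations with different frequency content is one way to probe the destructive potential the abstract discusses.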

  7. Comparative Tests Between Shallow Downhole Installation and Classical Seismic Vaults

    Science.gov (United States)

    Charade, Olivier; Vergne, Jérôme; Bonaimé, Sébastien; Bonnin, Mickaël; Louis-Xavier, Thierry; Beucler, Eric; Manhaval, Bertrand; Arnold, Benoît

    2016-04-01

    The French permanent broadband network is engaged in a major evolution with the installation of a hundred new stations within the forthcoming years. Since most of them will be located in open-field environments, we are looking for a standardized installation method able to provide good noise-level performance at a reasonable cost. Nowadays, the use of posthole seismometers that can be deployed at the bottom of shallow boreholes appears to be an affordable alternative to more traditional installation methods such as seismic vaults or dedicated underground cellars. Here we present comparative tests performed at different sites (including two GEOSCOPE stations) spanning various geological conditions. On each site, posthole sensors were deployed for several weeks to months at various depths from 1.5 m up to 20 m. We compare the seismic noise levels measured in the different boreholes with those of a reference sensor either directly buried or installed in a tunnel, a cellar or a seismic vault. Apart from the microseism frequency band, the seismic noise level in most of the boreholes equals or outperforms that of the reference sensors. At periods longer than 20 s we observe a strong reduction of the seismic noise on the horizontal components in the deepest boreholes compared to near-surface installations. This improvement can reach up to 30 dB and appears to be mostly due to a reduction in tilt noise induced by wind or local pressure variations. However, the absolute noise level that can be achieved strongly depends on the local geology.

  8. Hydrogen storage: beyond conventional methods.

    Science.gov (United States)

    Dalebrook, Andrew F; Gan, Weijia; Grasemann, Martin; Moret, Séverine; Laurenczy, Gábor

    2013-10-09

    The efficient storage of hydrogen is one of three major hurdles towards a potential hydrogen economy. This report begins with conventional storage methods for hydrogen and broadly covers new technology, ranging from physical media involving solid adsorbents, to chemical materials including metal hydrides, ammonia borane and liquid precursors such as alcohols and formic acid.

  9. Grounding Damage to Conventional Vessels

    DEFF Research Database (Denmark)

    Lützen, Marie; Simonsen, Bo Cerup

    2003-01-01

    The present paper is concerned with rational design of conventional vessels with regard to bottom damage generated in grounding accidents. The aim of the work described here is to improve the design basis, primarily through analysis of new statistical data for grounding damage. The current...

  10. Inventory non-conventional gas

    Energy Technology Data Exchange (ETDEWEB)

    Muntendam-Bos, A.G.; Wassing, B.B.T.; Ter Heege, J.H.; Van Bergen, F.; Schavemaker, Y.A.; Van Gessel, S.F.; De Jong, M.L.; Nelskamp, S.; Van Thienen-Visser, K.; Guasti, E.; Van den Belt; Marges, V.C. [TNO Built Environment and Geosciences, Utrecht (Netherlands)

    2009-10-15

    This report describes the results of the inventory for each non-conventional gas resource expected to be present in the Netherlands, which are: Tight Gas, Shallow gas, Coal bed Methane (CBM), Shale gas, Basin Centered Gas, Aquifer Gas and Stratigraphic traps.

  11. A bayesian approach for determining velocity and uncertainty estimates from seismic cone penetrometer testing or vertical seismic profiling data

    Science.gov (United States)

    Pidlisecky, A.; Haines, S.S.

    2011-01-01

    Conventional processing methods for seismic cone penetrometer data present several shortcomings, most notably the absence of a robust velocity model uncertainty estimate. We propose a new seismic cone penetrometer testing (SCPT) data-processing approach that employs Bayesian methods to map measured data errors into quantitative estimates of model uncertainty. We first calculate travel-time differences for all permutations of seismic trace pairs. That is, we cross-correlate each trace at each measurement location with every trace at every other measurement location to determine travel-time differences that are not biased by the choice of any particular reference trace and to thoroughly characterize data error. We calculate a forward operator that accounts for the different ray paths for each measurement location, including refraction at layer boundaries. We then use a Bayesian inversion scheme to obtain the most likely slowness (the reciprocal of velocity) and a distribution of probable slowness values for each model layer. The result is a velocity model that is based on correct ray paths, with uncertainty bounds that are based on the data error. © NRC Research Press 2011.
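The travel-time differencing step can be sketched with two synthetic traces: the lag for a trace pair is taken from the peak of their cross-correlation (the pulse shape and sampling below are illustrative).

```python
import numpy as np

def lag_seconds(trace_a, trace_b, dt):
    """Time by which trace_b is delayed relative to trace_a, from the
    peak of their full cross-correlation."""
    xc = np.correlate(trace_b, trace_a, mode="full")
    return (np.argmax(xc) - (len(trace_a) - 1)) * dt

dt = 0.001
t = np.arange(0.0, 0.5, dt)
pulse = np.exp(-((t - 0.10) / 0.01) ** 2)     # Gaussian arrival at 0.10 s
delayed = np.exp(-((t - 0.13) / 0.01) ** 2)   # same arrival at 0.13 s
tau = lag_seconds(pulse, delayed, dt)         # recovers the 0.030 s delay
```

Applying this to every pair of measurement depths, as the abstract describes, yields an over-determined set of differences that characterizes the data error without singling out a reference trace.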

  12. Reconstruction of a 2D seismic wavefield by seismic gradiometry

    Science.gov (United States)

    Maeda, Takuto; Nishida, Kiwamu; Takagi, Ryota; Obara, Kazushige

    2016-12-01

    We reconstructed a 2D seismic wavefield and obtained its propagation properties by using the seismic gradiometry method together with dense observations of the Hi-net seismograph network in Japan. The seismic gradiometry method estimates the wave amplitude and its spatial derivative coefficients at any location from a discrete station record by using a Taylor series approximation. From the spatial derivatives in horizontal directions, the properties of a propagating wave packet, including the arrival direction, slowness, geometrical spreading, and radiation pattern can be obtained. In addition, by using spatial derivatives together with free-surface boundary conditions, the 2D vector elastic wavefield can be decomposed into divergence and rotation components. First, as a feasibility test, we performed an analysis with a synthetic seismogram dataset computed by a numerical simulation for a realistic 3D medium and the actual Hi-net station layout. We confirmed that the wave amplitude and its spatial derivatives were very well reproduced for period bands longer than 25 s. Applications to a real large earthquake showed that the amplitude and phase of the wavefield were well reconstructed, along with the slowness vector. The slowness of the reconstructed wavefield showed a clear contrast between body and surface waves and regional non-great-circle-path wave propagation, possibly owing to scattering. Slowness vectors together with divergence and rotation decomposition are expected to be useful for determining constituents of observed wavefields in inhomogeneous media.
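The core gradiometry estimate, a truncated Taylor expansion fitted to amplitudes at nearby stations, can be sketched as a least-squares plane fit; the station offsets and locally planar wavefield below are synthetic assumptions.

```python
import numpy as np

def gradiometry_fit(dx, dy, amps):
    """Least-squares estimate of amplitude and its spatial gradient from
    a first-order Taylor expansion u ~ u0 + ux*dx + uy*dy."""
    G = np.column_stack([np.ones_like(dx), dx, dy])
    coeffs, *_ = np.linalg.lstsq(G, amps, rcond=None)
    return coeffs

rng = np.random.default_rng(1)
dx = rng.uniform(-10.0, 10.0, 8)             # station offsets from target (km)
dy = rng.uniform(-10.0, 10.0, 8)
u0_true, ux_true, uy_true = 2.0, 0.3, -0.1   # locally planar wavefield
amps = u0_true + ux_true * dx + uy_true * dy
u0, ux, uy = gradiometry_fit(dx, dy, amps)   # recovers the three coefficients
```

Dividing the estimated spatial gradient by the local time derivative of the wavefield is what yields the slowness vector of the passing wave packet.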

  13. Seismic monitoring of rockfalls at Spitz quarry (NÖ, Austria)

    Science.gov (United States)

    del Puy Papí Isaba, María; Brückl, Ewald; Roncat, Andreas; Schweigl, Joachim

    2016-04-01

    In the recent past, significant rockfalls, which pose a danger to persons, railways and roads, occurred in the quarry of Spitz (NÖ, Austria). An existing seismic warning system did not fulfill the expected efficiency and reliability standards, since the ratio of well-detected events to undetected events or false alarms was not satisfactory. Our aim was to analyze how a seismic warning system must be designed in order to overcome these deficiencies. A small-scale seismic network was deployed in the Spitz quarry to evaluate the possibility of improving the early-warning rockfall monitoring network by means of seismic observations. A new methodology based on seismic methods, which enables the detection and location of rockfalls above a critical size, was developed. To perform this task, a small-scale (200 m × 200 m) passive seismic network comprising 7 seismic monitoring stations acquiring data in continuous mode was established in the quarry of Spitz so that it covered the rockfall hazard area. On the 2nd of October 2015, an induced rockfall experiment was performed. It began at 09:00 a.m. local time (07:00 UTC) and lasted about 1.5 hours. The entire data set was analyzed using the pSysmon software. In order to locate the impact points of the rockfalls, we used a procedure based on the back-projection of the maximum resultant amplitude recorded at each station of the network, within a time window, to every grid point covering the whole area of interest. To verify the performance of the employed algorithm for detection and localization, we performed man-induced rockfalls. We also used a terrestrial laser scanner and a camera, not only to draw the rockfall block trajectories, but also to determine the volume of rock lost or gained in the different areas of the quarry. This allowed us to relate the lost mass to the strength of the collision (pseudo-magnitude) of the rockfall, and to draw and rebuild the associated trajectories. 
The location test performed
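The back-projection localization described above can be sketched as a grid search; here, instead of stacking time-shifted waveforms, a simplified amplitude-consistency criterion under an assumed 1/r decay is used, and the station layout and amplitudes are synthetic.

```python
import numpy as np

def back_project(stations, amps, grid_x, grid_y, eps=0.1):
    """Locate an impact as the grid point where station amplitudes,
    corrected for an assumed 1/r geometric decay, are most consistent."""
    best, best_xy = np.inf, None
    for gx in grid_x:
        for gy in grid_y:
            r = np.hypot(stations[:, 0] - gx, stations[:, 1] - gy) + eps
            spread = np.std(amps * r)   # inconsistency of implied source amps
            if spread < best:
                best, best_xy = spread, (gx, gy)
    return best_xy

stations = np.array([[0.0, 0.0], [200.0, 0.0], [0.0, 200.0],
                     [200.0, 200.0], [100.0, 0.0]])   # metres, hypothetical
source = np.array([120.0, 80.0])
r_true = np.hypot(*(stations - source).T) + 0.1
amps = 1000.0 / r_true                    # synthetic 1/r-decayed amplitudes
gx = gy = np.arange(0.0, 201.0, 10.0)
loc = back_project(stations, amps, gx, gy)
```

The real procedure back-projects the maximum resultant amplitude per time window, but the grid-search structure is the same.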

  14. Crustal and deep seismicity in Italy (30 years after

    Directory of Open Access Journals (Sweden)

    G. Selvaggi

    1997-06-01

    Full Text Available The first modern studies of seismicity in Italy date back to the late 60's and early 70's. Although with a sparse seismic network available and only a few telemetered short-period stations, significant studies were carried out that outlined the main features of Italian seismicity (see, e.g., Boschi et al., 1969). Among these studies, one of the most important achievements was the reconnaissance of a Wadati-Benioff zone in the Southern Tyrrhenian, described for the first time in detail in the papers of Caputo et al. (1970, 1973). Today, after three decades of more and more detailed seismological monitoring of the Italian region and tens of thousands of earthquakes located since then, the knowledge of the earthquake generation processes in our country is much improved, although some of the conclusions reached in these early papers still hold. These improvements were made possible by the efforts of many institutions and seismologists who have been working hard to bring seismological research in Italy to standards of absolute quality, under the pivoting role of the Istituto Nazionale di Geofisica (ING). From the relocation of about 30000 crustal earthquakes and detailed studies on intermediate and deep shocks carried out in the last few years, we show that seismic release in peninsular Italy is only weakly related to the Africa-Eurasia convergence, but rather is best explained by the existence of two separate subduction/collision arcs (Northern Apennines and Southern Apennines-Calabria-Sicily). The width of the deforming belt running along peninsular Italy is 30 to 60 km; it is broader in the north than in the south, and the two arcs are separated by a region of more distributed deformation and stress rotations in the Central Apennines. Along the belt, the reconnaissance of regions of continuous and weak release of seismic energy, adjacent to fault areas which are currently «locked» (and therefore are the best candidates for future earthquakes) is another

  15. Reassessment of the Seismic Design, by means of Fluid Simulation and Finite Element Calculation, of the Condensate Storage Tank of Cofrentes NPP according to standard API-650 11th Ed; Reevaluacion del diseno Sismico mediante Simulacion de Fluidos y Calculo por Elementos Finitos del Deposito del Almacenamiento de Condensado de Central Nuclear de Cofrentes conforme a la norma API-650 11th Ed

    Energy Technology Data Exchange (ETDEWEB)

    Sarti Fernandez, F.; Gavilan Moreno, C.; Paez Ortega, E.

    2012-07-01

    Several dynamic simulations were performed, analyzing the fluid-structure interaction, the effect of the sloshing wave, the stresses, the vibration modes and possible structural instability effects. Based on these analyses, the modifications needed for the tank to comply with the new standard and the updated seismic conditions were designed.

  16. A Preliminary Feasibility Study On Seismic Monitoring Of Polymer Flooding

    Science.gov (United States)

    Nguyen, P. K.; Park, C.; Lim, B.; Nam, M.

    2012-12-01

    Polymer flooding, which uses water with soluble polymers, is an enhanced oil recovery technique that aims to maximize oil-recovery sweep efficiency by minimizing fingering effects and thereby creating a smooth flood front; polymer flooding decreases the flow rates within high-permeability zones while enhancing flow in zones of lower permeability. Understanding of fluid fronts and saturations is critical not only to optimizing polymer flooding but also to monitoring its efficiency. Polymer flooding monitoring can be made at single-well scale with high-resolution wireline logging, at inter-well scale with tomography, and at reservoir scale with surface surveys. For reservoir-scale monitoring, this study makes a preliminary feasibility assessment based on constructing rock physics models (RPMs), which can bridge variations in reservoir parameters to the changes in seismic responses. For constructing RPMs, we vary the parameters of a reservoir to represent polymer flooding. Time-lapse seismic data for the corresponding RPMs are simulated using time-domain staggered-grid finite-difference modeling with a conventional perfectly matched layer (PML) boundary condition. Analysis of the time-lapse seismic data with respect to the changes in fluid front and saturation can give insight into the feasibility of surface seismic surveys for monitoring polymer flooding. Acknowledgements: This work was supported by the Energy Efficiency & Resources program of the Korea Institute of Energy Technology Evaluation and Planning (KETEP) grant funded by the Korea government Ministry of Knowledge Economy (No. 2012T100201588). Myung Jin Nam was partially supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MEST) (No. 2011-0014684).
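One common way to build such rock physics models is Gassmann fluid substitution; the abstract does not specify the RPM recipe, so the sketch below, with assumed moduli for brine and polymer-thickened water, is purely illustrative.

```python
def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus (GPa) via the Gassmann equation."""
    b = 1.0 - k_dry / k_min
    return k_dry + b * b / (phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min**2)

k_min, k_dry, phi = 37.0, 9.0, 0.25   # quartz matrix, dry frame, porosity (assumed)
k_brine, k_polymer = 2.8, 3.0         # fluid bulk moduli, GPa (assumed values)
ksat_brine = gassmann_ksat(k_dry, k_min, k_brine, phi)
ksat_polymer = gassmann_ksat(k_dry, k_min, k_polymer, phi)
dk = ksat_polymer - ksat_brine        # the (small) modulus change to monitor
```

The small size of such modulus changes is exactly why a feasibility study is needed before committing to time-lapse surface surveys.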

  17. The Role of Synthetic Reconstruction Tests in Seismic Tomography

    Science.gov (United States)

    Rawlinson, N.; Spakman, W.

    2015-12-01

    Synthetic reconstruction tests are widely used in seismic tomography as a means for assessing the robustness of solutions produced by linear or iterative non-linear inversion schemes. The most common test is the so-called checkerboard resolution test, which uses an alternating pattern of high and low wavespeeds (or some other seismic property such as attenuation). However, checkerboard tests have a number of limitations, including that they (1) only provide indirect evidence of quantitative measures of reliability such as resolution and uncertainty; (2) give a potentially misleading impression of the range of scale-lengths that can be resolved; (3) don't give a true picture of the structural distortion or smearing caused by the data coverage; and (4) result in an inverse problem that is biased towards an accurate reconstruction. The widespread use of synthetic reconstruction tests in seismic tomography is likely to continue for some time yet, so it is important to implement best practice where possible. The goal here is to provide a general set of guidelines, derived from the underlying theory and illustrated by a series of numerical experiments, on their implementation in seismic tomography. In particular, we recommend (1) using a sparse distribution of spikes, rather than the more conventional tightly-spaced checkerboard; (2) using the identical data coverage (e.g. geometric rays) for the synthetic model that was computed for the observation-based model; (3) carrying out multiple tests using anomalies of different scale length; (4) exercising caution when analysing synthetic recovery tests that use anomaly patterns that closely mimic the observation-based model; (5) investigating the trade-off between data noise levels and the minimum wavelength of recovered structure; (6) testing, where possible, the extent to which preconditioning (e.g. identical parameterization for input and output models) influences the recovery of anomalies.
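Recommendation (1) can be illustrated by contrasting a sparse spike pattern with a tight checkerboard on a synthetic perturbation grid (grid size, spacing and amplitude are arbitrary choices):

```python
import numpy as np

def spike_model(nx, nz, spacing, amplitude=0.05):
    """Perturbation grid with isolated spikes every `spacing` nodes."""
    model = np.zeros((nz, nx))
    model[spacing // 2::spacing, spacing // 2::spacing] = amplitude
    return model

spikes = spike_model(60, 40, 10)    # 24 isolated anomalies on a 40x60 grid
checker = 0.05 * (np.indices((40, 60)).sum(axis=0) % 2)  # tight checkerboard
```

The spikes perturb about 1% of the nodes, so smearing of each anomaly by the data coverage is directly visible, whereas in the half-perturbed checkerboard neighbouring anomalies can mask it.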

  18. Seismic force modification factor for ductile structures

    Institute of Scientific and Technical Information of China (English)

    TONG Gen-shu; HUANG Jin-qiao

    2005-01-01

    The earthquake forces used in building design codes should be theoretically determinable. This work examines the seismic force modification factor R based on elastic-plastic time-history earthquake analysis of SDOF systems, wherein the hysteresis models are elastic-perfectly-plastic (EPP), elastic-linearly-hardening (ELH), shear-slipped and bilinear-elastic. The latter two models are analysed to separate the effect of ductility from that of energy-dissipating capacity. Three hundred eighty-eight earthquake records from different site conditions are used in the analysis. The ductility is taken to be 2, 3, 4, 5 and 6, with the damping ratio being 0.02, 0.035 and 0.05 respectively. The post-yield stiffness ratios 0.0, 0.1 and 0.2 are used in the analysis. The R spectra are standardized by the characteristic period of the earthquake records, which leads to a much smaller scatter in the averaged numerical results. It was found that the most important factor determining R is the ductility; R increases more than linearly with ductility. The energy-dissipating capacity, damping and post-yield stiffness are less important factors. The energy-dissipating capacity is important only for structures with short and moderate periods (0.3≤T/Tg<5.0). For the EPP and ELH models, R for 0.05 damping is 10% to 15% smaller than for 0.02 damping. For the EPP and ELH models, greater post-yield stiffness leads to greater R, but the influence of post-yield stiffness is obvious only when the post-yield stiffness is less than 10% of the initial stiffness. By means of statistical regression analysis, the relation of the seismic force modification factor R to the natural period of the system and the ductility was established for the EPP and ELH models for each site and soil condition.
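The paper's regressed R relations are not reproduced in the abstract; the classic Newmark-Hall-style rules below merely illustrate how R grows with ductility mu and how the characteristic period Tg separates the short- and long-period regimes.

```python
import math

def r_factor(mu, period, tg):
    """Newmark-Hall-style R: equal-energy rule below the characteristic
    period tg, equal-displacement rule above it (illustrative only)."""
    if period < tg:
        return math.sqrt(2.0 * mu - 1.0)   # equal-energy approximation
    return float(mu)                        # equal-displacement approximation

r_short = r_factor(4.0, 0.3, 0.6)   # short-period system: sqrt(7)
r_long = r_factor(4.0, 1.2, 0.6)    # long-period system: R equals mu
```

Normalizing the period axis by Tg, as the paper does for its R spectra, is what makes such regime boundaries line up across different records.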

  19. Discussion about the relationship between seismic belt and seismic statistical zone

    Institute of Scientific and Technical Information of China (English)

    潘华; 金严; 胡聿贤

    2003-01-01

    This paper first summarizes the status of the delimitation of seismic zones and belts in China, covering its history, purpose, usage, delimiting principles, forms of presentation and main features. It then emphasizes that the primary purpose of delimiting seismic belts is geographical division by seismicity, and that the concept of a seismic belt is quite different from that of the seismic statistical zone used in the CPSHA method. The concept of the seismic statistical zone and the history of its evolution are also introduced. The two concepts differ greatly in their statistical properties, actual meaning, gradation, required scale, mutual exclusivity, and in the aims and usage of the delimitation. In current engineering practice, however, these two concepts are confused. On the one hand, this has prevented a proper theory for delimiting seismic statistical zones in PSHA from being established; on the other hand, research on the delimitation of seismic belts for the purposes of seismicity zoning and of studying the structural environment and earthquake-generating mechanisms has also stalled. The paper concludes that the seismic statistical zone is based on the result of seismic belt delimitation, that it arises in and can be used only in China's particular PSHA method, which considers spatially and temporally inhomogeneous seismic activity, and that its concept should be clearly differentiated from that of the seismic belt.

  20. Seismic attribute-based characterization of coalbed methane reservoirs: An example from the Fruitland Formation, San Juan basin, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Marroquin, I.D.; Hart, B.S. [McGill University, Montreal, PQ (Canada)

    2004-11-01

    The Fruitland Formation of the San Juan basin is the largest producer of coalbed methane in the world. Production patterns vary from one well to another throughout the basin, reflecting factors such as coal thickness and fracture and cleat density. In this study, we integrated conventional P-wave three-dimensional (3-D) seismic and well data to investigate geological controls on production from a thick, continuous coal seam in the lower part of the Fruitland Formation. Our objective was to show the potential of using 3-D seismic data to predict coal thickness, as well as the distribution and orientation of subtle structures that may be associated with enhanced permeability zones. To do this, we first derived a seismic attribute-based model that predicts coal thickness. We then used curvature attributes derived from seismic horizons to detect subtle structural features that might be associated with zones of enhanced permeability. Production data show that the best producing wells are associated with seismically definable structural features and thick coal. Although other factors (e.g., completion practices and coal type) affect coalbed methane production, our results suggest that conventional 3-D seismic data, integrated with wire-line logs and production data, are useful for characterizing coalbed methane reservoirs.
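Curvature attributes like those used here are typically computed from second derivatives of a gridded horizon; the finite-difference sketch below (with periodic edges for brevity and a synthetic dome, both assumptions) illustrates most-positive and most-negative curvature.

```python
import numpy as np

def principal_curvatures(z, dx=1.0):
    """Most-positive and most-negative curvature of a gridded horizon from
    finite-difference second derivatives (small-slope approximation)."""
    zxx = (np.roll(z, -1, 1) - 2.0 * z + np.roll(z, 1, 1)) / dx**2
    zyy = (np.roll(z, -1, 0) - 2.0 * z + np.roll(z, 1, 0)) / dx**2
    zxy = (np.roll(np.roll(z, -1, 0), -1, 1) - np.roll(np.roll(z, -1, 0), 1, 1)
           - np.roll(np.roll(z, 1, 0), -1, 1)
           + np.roll(np.roll(z, 1, 0), 1, 1)) / (4.0 * dx**2)
    mean = 0.5 * (zxx + zyy)
    dev = np.sqrt((0.5 * (zxx - zyy)) ** 2 + zxy**2)
    return mean + dev, mean - dev    # k_most_positive, k_most_negative

y, x = np.mgrid[-10:11, -10:11].astype(float)
dome = -0.01 * (x**2 + y**2)         # synthetic anticline
k_pos, k_neg = principal_curvatures(dome)
```

Lineations in such curvature grids are the "seismically definable structural features" that the study correlates with the best-producing wells.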

  1. Adaptive finite difference for seismic wavefield modelling in acoustic media.

    Science.gov (United States)

    Yao, Gang; Wu, Di; Debens, Henry Alexander

    2016-08-05

    Efficient numerical seismic wavefield modelling is a key component of modern seismic imaging techniques, such as reverse-time migration and full-waveform inversion. Finite difference methods are perhaps the most widely used numerical approach for forward modelling, and here we present a novel scheme for computing finite difference coefficients based on a time-to-space wavelet mapping. Finite difference coefficients are computed by minimising the difference between the spatial derivatives of the mapped wavelet and the finite difference operator over all propagation angles. Since the coefficients vary adaptively with different velocities and source wavelet bandwidths, the method is capable of maximising the accuracy of the finite difference operator. Numerical examples demonstrate that this method is superior to standard finite difference methods, and comparable to Zhang's optimised finite difference scheme.
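For contrast with the adaptive scheme, the conventional Taylor-series finite difference coefficients that it improves upon follow from a small linear system; the sketch below reproduces the standard centred first-derivative weights.

```python
import numpy as np

def taylor_fd_coeffs(m):
    """Standard centred FD weights c_1..c_m for
    f'(x) ~ sum_j c_j * (f(x + j*h) - f(x - j*h)) / h."""
    j = np.arange(1.0, m + 1.0)
    k = np.arange(1, m + 1)
    # Cancel odd Taylor terms: sum_j c_j * j**(2k-1) = 1/2 for k=1, else 0.
    A = j[None, :] ** (2 * k[:, None] - 1)
    b = np.zeros(m)
    b[0] = 0.5
    return np.linalg.solve(A, b)

c2 = taylor_fd_coeffs(2)   # fourth-order weights: [2/3, -1/12]
```

These Taylor weights are exact only near zero wavenumber; the paper's scheme instead tunes the weights to the mapped source wavelet and to all propagation angles, which is where its accuracy gain comes from.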

  2. Seismic techniques in coal mining

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharyya, A.K.; Belleza, G.V.

    1983-01-01

The aim of this study is to investigate the peripheral fracture zones in coal pillars left for support in underground mines. The fracture zones are caused by the redistribution of stresses in strata resulting from the process of excavation and from blasting, if used. The extent and degree of these fracture zones, in turn, have a direct influence on the ability of pillars to provide stable support to the overlying strata. Seismic methods such as the refraction, uphole, and collinear techniques outlined in previous reports are being used to investigate the extent and degree of the peripheral fracture zones. Some of the work carried out, described in this report, relates to the study of peripheral fracture zones in coal pillars using seismic techniques.

  3. Statistical Physics Approaches to Seismicity

    CERN Document Server

    Sornette, D

    2008-01-01

This entry in the Encyclopedia of Complexity and Systems Science, Springer, presents a summary of some of the concepts and calculational tools that have been developed in attempts to apply statistical physics approaches to seismology. We summarize the leading theoretical physical models of the space-time organization of earthquakes. We present a general discussion and several examples of the new metrics proposed by statistical physicists, underlining their strengths and weaknesses. The entry concludes by briefly outlining future directions. The presentation is organized as follows. I Glossary II Definition and Importance of the Subject III Introduction IV Concepts and Calculational Tools IV.1 Renormalization, Scaling and the Role of Small Earthquakes in Models of Triggered Seismicity IV.2 Universality IV.3 Intermittent Periodicity and Chaos IV.4 Turbulence IV.5 Self-Organized Criticality V Competing mechanisms and models V.1 Roots of complexity in seismicity: dynamics or heterogeneity? V.2 Critical earthquakes ...

  4. Tube-wave seismic imaging

    Science.gov (United States)

    Korneev, Valeri A [LaFayette, CA

    2009-05-05

The detailed analysis of cross-well seismic data for a gas reservoir in Texas revealed two newly detected seismic wave effects, recorded approximately 2000 feet above the reservoir. A tube-wave (150) is initiated in a source well (110) by a source (111), travels in the source well (110), is coupled to a geological feature (140), propagates (151) through the geological feature (140), is coupled back to a tube-wave (152) at a receiver well (120), and is received by receiver(s) (121) in either the same well (110) or a different receiving well (120). The tube-wave has been shown to be extremely sensitive to changes in reservoir characteristics. Tube-waves appear to couple most effectively to reservoirs where the well casing is perforated, allowing direct fluid contact from the interior of the well casing to the reservoir.

  5. An economical educational seismic system

    Science.gov (United States)

    Lehman, J. D.

    1980-01-01

There is considerable interest in seismology from the nonprofessional or amateur standpoint. The operation of a seismic system can be satisfying and educational, especially when you have built and operated the system yourself. A long-period indoor-type sensor and recording system that works extremely well has been developed in the James Madison University Physics Department. The system can be built quite economically, and any educational institution that cannot commit to a professional installation need not be without first-hand seismic information. The system design approach has been selected by college students working on a project or senior thesis, by several elementary and secondary science teachers, and by the more ambitious tinkerer or hobbyist at home.

  6. Seismic hazard studies in Egypt

    Directory of Open Access Journals (Sweden)

    Abuo El-Ela A. Mohamed

    2012-12-01

Full Text Available The study of earthquake activity and seismic hazard assessment in Egypt is very important due to the great and rapid spread of large investments in national projects, especially the nuclear power plant to be built in the northern part of Egypt. Although Egypt is characterized by low seismicity, it has experienced damaging earthquakes throughout its history. The seismotectonic setting of Egypt suggests that large earthquakes are possible, particularly along the Gulf of Aqaba–Dead Sea transform, the subduction zone along the Hellenic and Cyprean Arcs, and the northern Red Sea triple junction point. In addition, some significant inland sources at Aswan, Dahshour, and the Cairo-Suez district should be considered. The seismic hazard for Egypt is calculated using a probabilistic approach (for a grid of 0.5° × 0.5°) within a logic-tree framework. Alternative seismogenic models and ground motion scaling relationships are selected to account for the epistemic uncertainty. Seismic hazard values on rock were calculated to create contour maps for four ground motion spectral periods and for different return periods. In addition, the uniform hazard spectra for rock sites for 25 different periods, and the probabilistic hazard curves for the cities of Cairo and Alexandria, are graphed. The highest peak ground acceleration (PGA) values were found close to the Gulf of Aqaba, at about 220 gal for a 475-year return period, while the lowest PGA values, less than 25 gal, were detected in the western part of the Western Desert.

  7. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-10-01

In this report we show the fundamental concepts of two different methods for computing seismic energy absorption. The first method gives an absolute value of Q and is based on computation with minimum-phase operators. The second method gives a relative energy loss compared to a background trend. This method is a rapid, qualitative indicator of anomalous absorption and can be combined with other attributes, such as band-limited acoustic impedance, to indicate areas of likely gas saturation.
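For context, the classic spectral-ratio technique for an absolute Q estimate can be sketched as follows. This is a standard textbook method, related in spirit to, but not the same as, the report's minimum-phase-operator approach; the band limits and function name are illustrative choices.

```python
import numpy as np

def spectral_ratio_q(w1, w2, dt, travel_time, fmin=10.0, fmax=60.0):
    """Spectral-ratio Q estimate between an early trace window w1 and a
    later window w2: constant-Q attenuation predicts
        ln|A2/A1|(f) = const - pi * travel_time * f / Q,
    so Q follows from the slope of the log spectral ratio over a
    usable frequency band."""
    n = max(len(w1), len(w2))
    f = np.fft.rfftfreq(n, dt)
    A1 = np.abs(np.fft.rfft(w1, n))
    A2 = np.abs(np.fft.rfft(w2, n))
    band = (f >= fmin) & (f <= fmax)
    slope, _ = np.polyfit(f[band], np.log(A2[band] / A1[band]), 1)
    return -np.pi * travel_time / slope
```

In practice the band must be chosen where both spectra are well above the noise floor, since the log ratio blows up wherever the reference amplitude is small.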

  8. Research on the effect estimation of seismic safety evaluation

    Institute of Scientific and Technical Information of China (English)

    邹其嘉; 陶裕禄

    2004-01-01

Seismic safety evaluation is basic work for determining the seismic resistance requirements of major construction projects. The effect, especially the economic effect, of seismic safety evaluation has been a matter of general concern. This paper gives a model for estimating the effect of seismic safety evaluation and roughly calculates its economic effect with some examples.

  9. Seismicity of Afghanistan and vicinity

    Science.gov (United States)

    Dewey, James W.

    2006-01-01

    This publication describes the seismicity of Afghanistan and vicinity and is intended for use in seismic hazard studies of that nation. Included are digital files with information on earthquakes that have been recorded in Afghanistan and vicinity through mid-December 2004. Chapter A provides an overview of the seismicity and tectonics of Afghanistan and defines the earthquake parameters included in the 'Summary Catalog' and the 'Summary of Macroseismic Effects.' Chapter B summarizes compilation of the 'Master Catalog' and 'Sub-Threshold Catalog' and documents their formats. The 'Summary Catalog' itself is presented as a comma-delimited ASCII file, the 'Summary of Macroseismic Effects' is presented as an html file, and the 'Master Catalog' and 'Sub-Threshold Catalog' are presented as flat ASCII files. Finally, this report includes as separate plates a digital image of a map of epicenters of earthquakes occurring since 1964 (Plate 1) and a representation of areas of damage or strong shaking from selected past earthquakes in Afghanistan and vicinity (Plate 2).

  10. Seismic risk mapping for Germany

    Science.gov (United States)

    Tyagunov, S.; Grünthal, G.; Wahlström, R.; Stempniewski, L.; Zschau, J.

    2006-06-01

The aim of this study is to assess and map the seismic risk for Germany, restricted to the expected losses from damage to residential buildings. There are several earthquake-prone regions in the country which have produced Mw magnitudes above 6 and up to 6.7, corresponding to observed ground shaking intensities up to VIII-IX (EMS-98). Since some of the earthquake-prone areas are densely populated and highly industrialized, so that the hazard coincides with a high concentration of exposed assets, the damaging implications of earthquakes must be taken seriously. In this study a methodology is presented and pursued to calculate the seismic risk from (1) intensity-based probabilistic seismic hazard, (2) vulnerability composition models, which are based on the distribution of residential buildings of various structural types in representative communities, and (3) the distribution of assets in terms of replacement costs for residential buildings. The estimates of the risk are treated as primary economic losses due to structural damage to residential buildings. The obtained results are presented as maps of the damage and risk distributions. For a probability level of 90% non-exceedance in 50 years (corresponding to a mean return period of 475 years) the mean damage ratio is up to 20% and the risk up to hundreds of millions of euros in the most endangered communities. The developed models have been calibrated with observed data from several damaging earthquakes in Germany and the nearby area over the past 30 years.
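The hazard-vulnerability-exposure chain described in such risk studies can be illustrated with a toy calculation. The intensity levels, probabilities, damage ratios, and cost below are invented for illustration and are not the study's values.

```python
def expected_loss(intensity_probs, mean_damage_ratio, replacement_cost):
    """Annualized expected loss for one community: for each macroseismic
    intensity level, multiply its annual occurrence probability (hazard)
    by the mean damage ratio of the building stock at that intensity
    (vulnerability) and by the exposed replacement cost (assets),
    then sum over intensity levels."""
    return sum(p * mean_damage_ratio[i] * replacement_cost
               for i, p in intensity_probs.items())
```

Mapping risk then amounts to repeating this per community, with hazard curves, vulnerability composition, and asset values varying spatially.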

  11. Building a Smartphone Seismic Network

    Science.gov (United States)

    Kong, Q.; Allen, R. M.

    2013-12-01

We are exploring building a new type of seismic network using smartphones. The accelerometers in smartphones can be used to record earthquakes, the GPS unit can give an accurate location, and the built-in communication unit makes communication easy for this network. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. In order to build this network, we developed an application for Android phones and a server to record acceleration in real time. These records can be sent back to the server in real time and analyzed there. We evaluated the performance of the smartphone as a seismic recording instrument by comparing it with a high-quality accelerometer while located on controlled shake tables for a variety of tests, and we also ran noise-floor tests. Based on the daily human-activity data recorded by volunteers and the shake-table test data, we also developed an algorithm for the smartphones to distinguish earthquakes from daily human activities. These all form the basis for setting up a new prototype smartphone seismic network in the near future.
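The abstract does not spell out the detection algorithm. A common baseline for separating earthquake shaking from background activity on a noisy sensor is the STA/LTA trigger, sketched here; the window lengths, threshold, and function name are arbitrary choices for illustration, not the authors' method.

```python
import numpy as np

def sta_lta_trigger(a, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
    """Classic STA/LTA trigger on an acceleration trace: flag samples
    where the short-term average of the signal energy exceeds
    `threshold` times the long-term average (the long window tracks
    the background level set by ambient and human-activity noise)."""
    e = np.asarray(a, dtype=float)**2
    ns, nl = int(sta_win * fs), int(lta_win * fs)
    sta = np.convolve(e, np.ones(ns) / ns, mode="same")
    lta = np.convolve(e, np.ones(nl) / nl, mode="same")
    ratio = sta / np.maximum(lta, 1e-12)
    return ratio > threshold, ratio
```

On a phone, a trigger like this would typically be only a first stage, with the server-side comparison across many phones rejecting the human-activity false alarms a single device cannot.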

  12. Evolutionary Games and Social Conventions

    DEFF Research Database (Denmark)

    Hansen, Pelle Guldborg

    2007-01-01

Some thirty years ago Lewis published his Convention: A Philosophical Study (Lewis, 2002). This laid the foundation for a game-theoretic approach to social conventions, but became more famously known for its seminal analysis of common knowledge, the concept receiving its canonical analysis in Aumann (1976), which, together with the assumptions of perfect rationality, came to be defining of classical game theory. However, classical game theory is currently undergoing a severe crisis as a tool for exploring social phenomena; a crisis emerging from the problem of equilibrium selection around ... knowledge to assumptions characterising agents as conditioned for playing certain strategies, upon the population of which evolutionary processes operate. By providing accounts of equilibrium selection and stability properties of behaviours, the resulting frameworks have been brought to work as well...

  13. Conventional treatments for ankylosing spondylitis

    OpenAIRE

    Dougados, M; Dijkmans, B; Khan, M.; Maksymowych, W; van der Linden, S; Brandt, J

    2002-01-01

    Management of ankylosing spondylitis (AS) is challenged by the progressive nature of the disease. To date, no intervention is available that alters the underlying mechanism of inflammation in AS. Currently available conventional treatments are palliative at best, and often fail to control symptoms in the long term. Current drug treatment may perhaps induce a spurious state of "disease remission," which is merely a low level of disease activity. Non-steroidal anti-inflammatory drugs are first ...

  14. CONVENTIONAL DEVELOPMENT OF ENVIRONMENTAL PREOCCUPATIONS

    Directory of Open Access Journals (Sweden)

    Claudia ANDRITOI

    2011-12-01

    Full Text Available A great number of the conventions referring to nature, even if they do not refer to particular species, were limited from the point of view of geography and territory: we may cite as examples the convention for the protection of flora, fauna and panoramic beauties of America, and the African convention for nature and natural resources. With the Stockholm Conference of 5 June 1972, we entered a "dynamic of globalization". Article 1 of the Declaration that followed the conference is important for global awareness: "Human beings have the basic right to freedom, equality and conditions of a satisfying life, in an environment with a quality that allows them to live with dignity and well-being. They have the solemn duty to protect and improve the environment for present and future generations (...)". This article proclaims a right to the environment. A new law seems to have arisen with the appearance of this convention: the right of a healthy human being and of a healthy environment. This law is bipolar because it associates human beings with nature. Human beings have the right to live in a healthy environment, and this is why they must protect nature. This does not represent a right of human beings in a strict sense; it is a right that has a universal value. The right to a healthy environment cannot be put in the same category as the right to live or the right to be healthy, because this right contains the latter.

  15. Catching up on conventions grammar lessons for middle school writers

    CERN Document Server

    Francois, Chantal

    2009-01-01

Are Chantal Francois and Elisa Zonana's students like yours? Economically, linguistically, and culturally diverse; excited to write; yet underprepared for the kinds of writing demanded in middle school and beyond? For success in school, Standard English grammar isn't optional; it's an option every student must have. Don't be daunted. Francois and Zonana found a solution, and in Catching Up on Conventions they share lessons that help kids quickly master Standard English grammar.

  16. Active Fault Exploration and Seismic Hazard Assessment in Fuzhou City

    Institute of Scientific and Technical Information of China (English)

    Zhu Jinfang; Han Zhujun; Huang Zonglin; Xu Xiwei; Zheng Rongzhang; Fang Shengmin; Bai Denghai; Wang Guangcai; Min Wei; Wen Xueze

    2005-01-01

It has been proven by a number of earthquake case studies that an active-fault-induced earthquake beneath a city can be devastating. It is an urgent issue for seismic hazard reduction to explore the distribution of active faults beneath urban areas and to identify the seismic sources and risks underneath. As a pilot project of active fault exploration in China, the project entitled "Active fault exploration and seismic hazard assessment in Fuzhou City" started in early 2001 and passed the acceptance check of the China Earthquake Administration in August 2004. The project aimed to solve a series of scientific issues, such as fault location, dating, movement nature, deep settings, seismic risk and hazard, and preparedness for earthquake prevention and disaster reduction, by means of exploration and assessment of active faults by stages, i.e., the preliminary survey and identification of active faults in the target area, the exploration of deep seismotectonic settings, the risk evaluation of active seismogenic faults, the construction of a geographic information system of active faults, and so on. Many exploration methods were employed in the project, such as the detection of absorbed mercury, free mercury and radon in soil, geological radar, the multi-channel DC electrical method, the transient electromagnetic method, shallow seismic refraction and reflection, effect contrast of explored sources, and various sounding experiments, to establish the buried Quaternary standard section of the Fuzhou basin.
To sum up, the above explorations and experiments have achieved the following results and conclusions: (1) The results of the synthetic pilot project of active fault exploration in Fuzhou City demonstrate that, on the basis of sufficient collection, sorting out and analysis of geological, geophysical and borehole data, the best method for active fault exploration (location) and seismic risk assessment (dating and characterizing) in urban areas is the combination

  17. Importance of direct and indirect triggered seismicity

    CERN Document Server

    Helmstetter, A; Helmstetter, Agnes; Sornette, Didier

    2003-01-01

    Using the simple ETAS branching model of seismicity, which assumes that each earthquake can trigger other earthquakes, we quantify the role played by the cascade of triggered seismicity in controlling the rate of aftershock decay as well as in the overall level of seismicity in the presence of a constant external seismicity source. We show that, in this model, the proportion of triggered seismicity is equal to the proportion of secondary plus later-generation aftershocks, and is given by the average number of triggered events per earthquake. Based on these results and on the observation that a large fraction of seismicity are triggered earthquakes, we conclude that similarly a large fraction of aftershocks occurring a few hours or days after a mainshock are triggered indirectly by the mainshock.
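The stated result, that the proportion of triggered seismicity equals the average number of directly triggered events per earthquake (the branching ratio), can be checked with a minimal Monte-Carlo branching simulation. This sketch strips out the time and magnitude structure of ETAS and keeps only the offspring cascade; the Poisson offspring law and function name are illustrative assumptions.

```python
import numpy as np

def triggered_fraction(n_branching, n_sources=200000, seed=0):
    """Monte-Carlo check of the branching result: if each earthquake
    directly triggers on average n < 1 events (Poisson offspring),
    then the fraction of all events that are triggered, directly or in
    any later generation, tends to n. Generation 0 events come from
    the constant external source."""
    rng = np.random.default_rng(seed)
    total = current = n_sources
    triggered = 0
    while current > 0:
        offspring = int(rng.poisson(n_branching, current).sum())
        triggered += offspring
        total += offspring
        current = offspring
    return triggered / total
```

Analytically, the cascade gives a total of N0 / (1 - n) events for N0 sources, of which all but the N0 sources are triggered, hence the fraction n.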

  18. Seismic activity at the western Pyrenean edge

    Science.gov (United States)

    Ruiz, M.; Gallart, J.; Díaz, J.; Olivera, C.; Pedreira, D.; López, C.; González-Cortina, J. M.; Pulgar, J. A.

    2006-01-01

    The present-day seismicity at the westernmost part of the Pyrenean domain reported from permanent networks is of low to moderate magnitude. However, it is poorly constrained due to the scarce station coverage of the area. We present new seismic data collected from a temporary network deployed there for 17 months that provides an enhanced image of the seismic activity and its tectonic implications. Our results delineate the westward continuity of the E-W Pyrenean band of seismicity, through the Variscan Basque Massifs along the Leiza Fault, ending up at the Hendaya Fault. This seismicity belt is distributed on a crustal scale, dipping northward to almost 30 km depth. Other relevant seismic events located in the area can be related to the central segment of the Pamplona fault, and to different E-W thrust structures.

  19. Seismic spatial effects on dynamic response of long-span bridges in stationary inhomogeneous random fields

    Institute of Scientific and Technical Information of China (English)

    林家浩; 张亚辉; 赵岩

    2004-01-01

The seismic analysis of long-span bridges subjected to multiple ground excitations is an important problem.The conventional response spectrum method neglects the spatial effects of ground motion, and therefore may result in questionable conclusions. The random vibration approach has been regarded as more reliable. Unfortunately, so far,computational difficulties have not yet been satisfactorily resolved. In this paper, an accurate and efficient random vibration approach - pseudo excitation method (PEM), by which the above difficulties are overcome, is presented. It has been successfully used in the three dimensional seismic analysis of a number of long-span bridges with thousands of degrees of freedom and dozens of supports. The numerical results of a typical bridge show that the seismic spatial effects, particularly the wave passage effect, are sometimes quite important in evaluating the safety of long-span bridges.

  20. Seismic spatial effects on dynamic response of long-span bridges in stationary inhomogeneous random fields

    Science.gov (United States)

    Jiahao, Lin; Yahui, Zhang; Yan, Zhao

    2004-12-01

    The seismic analysis of long-span bridges subjected to multiple ground excitations is an important problem. The conventional response spectrum method neglects the spatial effects of ground motion, and therefore may result in questionable conclusions. The random vibration approach has been regarded as more reliable. Unfortunately, so far, computational difficulties have not yet been satisfactorily resolved. In this paper, an accurate and efficient random vibration approach — pseudo excitation method (PEM), by which the above difficulties are overcome, is presented. It has been successfully used in the three dimensional seismic analysis of a number of long-span bridges with thousands of degrees of freedom and dozens of supports. The numerical results of a typical bridge show that the seismic spatial effects, particularly the wave passage effect, are sometimes quite important in evaluating the safety of long-span bridges.
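The core trick of the pseudo excitation method, turning a random-vibration PSD computation into a deterministic harmonic analysis, can be sketched for a single-degree-of-freedom oscillator. The oscillator parameters and function name here are arbitrary illustrative choices; the actual bridge analyses involve thousands of degrees of freedom and multiple correlated supports.

```python
import numpy as np

def pem_response_psd(S_in, omega, omega_n=10.0, zeta=0.05):
    """Pseudo excitation method for a SDOF oscillator: a stationary
    input with PSD S(w) is replaced by the deterministic harmonic
    pseudo-excitation sqrt(S(w)) * exp(i*w*t). The squared modulus of
    the steady-state pseudo-response is then exactly the response PSD,
    |H(w)|**2 * S(w), with no stochastic averaging required."""
    H = 1.0 / (omega_n**2 - omega**2 + 2j * zeta * omega_n * omega)
    pseudo_response = H * np.sqrt(S_in)   # deterministic harmonic solve
    return np.abs(pseudo_response)**2
```

The efficiency gain for large structures comes from the same reduction: each frequency requires only one (or a few) deterministic harmonic solves instead of a full covariance computation.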

  1. pSIN: A scalable, Parallel algorithm for Seismic INterferometry of large-N ambient-noise data

    Science.gov (United States)

    Chen, Po; Taylor, Nicholas J.; Dueker, Ken G.; Keifer, Ian S.; Wilson, Andra K.; McGuffy, Casey L.; Novitsky, Christopher G.; Spears, Alec J.; Holbrook, W. Steven

    2016-08-01

Seismic interferometry is a technique for extracting deterministic signals (i.e., ambient-noise Green's functions) from recordings of ambient-noise wavefields through cross-correlation and other related signal processing techniques. The extracted ambient-noise Green's functions can be used in ambient-noise tomography for constructing seismic structure models of the Earth's interior. The amount of computation involved in the seismic interferometry procedure can be significant, especially for ambient-noise datasets collected by large seismic sensor arrays (i.e., "large-N" data). We present an efficient parallel algorithm, named pSIN (Parallel Seismic INterferometry), for solving seismic interferometry problems on conventional distributed-memory computer clusters. The design of the algorithm is based on a two-dimensional partition of the ambient-noise data recorded by a seismic sensor array. We pay special attention to balancing the computational load, inter-process communication overhead, and memory usage across all MPI processes, and we minimize the total number of I/O operations. We have tested the algorithm using a real ambient-noise dataset and obtained significant savings in processing time. Scaling tests have shown excellent strong scalability from 80 cores to over 2000 cores.
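The cross-correlation kernel at the heart of such a workflow might look like the following single-pair sketch. The phase-only whitening, window handling, and function name are illustrative assumptions, not pSIN's actual implementation; a large-N code applies this to every station pair and stacks over many time windows.

```python
import numpy as np

def noise_crosscorrelation(u_a, u_b, fs, max_lag=50.0):
    """Cross-correlate two ambient-noise records in the frequency
    domain with phase-only (spectral whitening) normalization; stacked
    over many windows, the result approximates the inter-station
    Green's function. Returns lags from -max_lag to +max_lag seconds."""
    n = 2 * max(len(u_a), len(u_b))        # zero-pad against wrap-around
    Ua = np.fft.rfft(u_a, n)
    Ub = np.fft.rfft(u_b, n)
    cross = (Ua / np.maximum(np.abs(Ua), 1e-12)) * \
            np.conj(Ub / np.maximum(np.abs(Ub), 1e-12))
    cc = np.fft.fftshift(np.fft.irfft(cross, n))   # zero lag at centre
    m, mid = int(max_lag * fs), n // 2
    return cc[mid - m: mid + m + 1]
```

The two-dimensional (station-pair by time-window) structure of exactly this computation is what a partitioning scheme like pSIN's distributes across MPI processes.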

  2. Seismic imaging of sandbox experiments – laboratory hardware setup and first reflection seismic sections

    OpenAIRE

    Kukowski, N.; Oncken, O.; M.-L. Buddensiek; C. M. Krawczyk

    2012-01-01

With the study and technical development introduced here, we combine analogue sandbox simulation techniques with seismic physical modelling of sandbox models. For that purpose, we designed and developed a new mini-seismic facility for laboratory use, comprising a seismic tank, a PC-driven control unit, a positioning system, and piezo-electric transducers used here for the first time in an array mode. To assess the possibilities and limits of seismic imaging of small-scale structures in sandbox mo...

  3. Seismic imaging of sandbox experiments – laboratory hardware setup and first reflection seismic sections

    OpenAIRE

    C. M. Krawczyk; Buddensiek, M.-L.; Oncken, O.; Kukowski, N.

    2013-01-01

    With the study and technical development introduced here, we combine analogue sandbox simulation techniques with seismic physical modelling of sandbox models. For that purpose, we designed and developed a new mini-seismic facility for laboratory use, comprising a seismic tank, a PC-driven control unit, a positioning system, and piezoelectric transducers used here for the first time in an array mode. To assess the possibilities and limits of seismic imaging of small-scale str...

  4. A Review of Seismicity in 2010

    Institute of Scientific and Technical Information of China (English)

    Ji Ping; Li Gang; Liu Jie; Ni Sidao

    2011-01-01

1 SURVEY OF GLOBAL SEISMICITY IN 2010: A total of 28 strong earthquakes with Ms ≥ 7.0 occurred in 2010 throughout the world, according to the China Seismic Network (Table 1). The strongest was the Chile earthquake, measuring Ms 8.8, on February 27, 2010 (Fig. 1). There was an apparent increase in the frequency and energy release of earthquakes in 2010 compared with seismicity in 2009.

  5. LANL seismic screening method for existing buildings

    Energy Technology Data Exchange (ETDEWEB)

    Dickson, S.L.; Feller, K.C.; Fritz de la Orta, G.O. [and others]

    1997-01-01

    The purpose of the Los Alamos National Laboratory (LANL) Seismic Screening Method is to provide a comprehensive, rational, and inexpensive method for evaluating the relative seismic integrity of a large building inventory using substantial life-safety as the minimum goal. The substantial life-safety goal is deemed to be satisfied if the extent of structural damage or nonstructural component damage does not pose a significant risk to human life. The screening is limited to Performance Category (PC) -0, -1, and -2 buildings and structures. Because of their higher performance objectives, PC-3 and PC-4 buildings automatically fail the LANL Seismic Screening Method and will be subject to a more detailed seismic analysis. The Laboratory has also designated that PC-0, PC-1, and PC-2 unreinforced masonry bearing wall and masonry infill shear wall buildings fail the LANL Seismic Screening Method because of their historically poor seismic performance or complex behavior. These building types are also recommended for a more detailed seismic analysis. The results of the LANL Seismic Screening Method are expressed in terms of separate scores for potential configuration or physical hazards (Phase One) and calculated capacity/demand ratios (Phase Two). This two-phase method allows the user to quickly identify buildings that have adequate seismic characteristics and structural capacity and screen them out from further evaluation. The resulting scores also provide a ranking of those buildings found to be inadequate. Thus, buildings not passing the screening can be rationally prioritized for further evaluation. For the purpose of complying with Executive Order 12941, the buildings failing the LANL Seismic Screening Method are deemed to have seismic deficiencies, and cost estimates for mitigation must be prepared. Mitigation techniques and cost-estimate guidelines are not included in the LANL Seismic Screening Method.

  6. Using Composites in Seismic Retrofit Applications

    Science.gov (United States)

    2007-11-02

...composite shells/wraps/jackets for the seismic retrofit of concrete columns, a number of these could potentially be used, and these are briefly discussed... Use of Concrete and Steel Jackets for Seismic Retrofit... For reinforced concrete columns with these substandard reinforcement details, retrofit systems... an overview of the variations possible for the application of FRP composite jackets for purposes of seismic retrofit of columns. Wherever possible...

  7. New Methodology for Rapid Seismic Risk Assessment

    Science.gov (United States)

    Melikyan, A. E.; Balassanian, S. Y.

    2002-05-01

Seismic risk is growing worldwide and is, increasingly, a problem of developing countries. Along with growing urbanization, future earthquakes will have more disastrous social and economic consequences. Seismic risk assessment and reduction are important goals for every country located in a seismically active zone. For Armenia these goals are of primary importance, because studies carried out by the Armenian NSSP on the losses caused by various types of disasters have shown that earthquakes are the most disastrous hazard for Armenia. The strategy for seismic risk reduction was adopted in 1999 by the Government of Armenia as a high-priority state program. World experience demonstrates that rapid assessment of seismic losses is necessary for an efficient response. There are several state-of-the-art approaches for seismic risk assessment (RADIUS, HAZUS, etc.). All of them require a large amount of varied input data, which is impossible to collect in many developing countries, in particular in Armenia. Taking into account this very serious problem for developing countries, as well as the need for rapid seismic risk assessment immediately after a strong earthquake, the author undertook to contribute to a new approach for rapid seismic risk assessment under the supervision of Prof. S. Balassanian. The analysis of numerous factors influencing seismic risk in Armenia shows that the following elements contribute most significantly to the possible losses: seismic hazard, density of population, and vulnerability of structures. The proposed approach for rapid seismic risk assessment based on these three factors has been tested for several seismic events. These tests have shown that such an approach may represent from 80 to 90 percent of real losses.

  8. Location, Reprocessing, and Analysis of Two Dimensional Seismic Reflection Data on the Jicarilla Apache Indian Reservation, New Mexico, Final Report, September 1, 1997-February 1, 2000

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie; Taylor, David J.; Huffman, Jr., A. Curtis

    2000-06-08

    Multichannel surface seismic reflection data recording is a standard industry tool used to examine various aspects of geology, especially the stratigraphic characteristics and structural style of sedimentary formations in the subsurface. With the help of the Jicarilla Apache Tribe and the Bureau of Indian Affairs we were able to locate over 800 kilometers (500 miles) of multichannel seismic reflection data located on the Jicarilla Apache Indian reservation. Most of the data was received in hardcopy form, but there were data sets where either the demultiplexed digital field data or the processed data accompanied the hardcopy sections. The seismic data was acquired from the mid 1960's to the early 1990's. The most extensive seismic coverage is in the southern part of the reservation, although there are two good surveys located on the northeastern and northwestern parts of the reservation. Most of the data show that subsurface formations are generally flat-lying in the southern and western portion of the reservation. There is, however, a significant amount of structure imaged on seismic data located over the San Juan Basin margin along the east-central and northern part of the reservation. Several west to east trending lines in these areas show a highly faulted monoclinal structure from the deep basin in the west up onto the basin margin to the east. Hydrocarbon exploration in flat lying formations is mostly stratigraphic in nature. Where there is structure in the subsurface and indications are that rocks have been folded, faulted, and fractured, exploration has concentrated on structural traps and porosity/permeability "sweet spots" caused by fracturing. Therefore, an understanding of the tectonics influencing the entire section is critical in understanding mechanisms for generating faults and fractures in the Cretaceous. It is apparent that much of the hydrocarbon production on the reservation is from fracture porosity in either source or reservoir

  9. Causality between expansion of seismic cloud and maximum magnitude of induced seismicity in geothermal field

    Science.gov (United States)

    Mukuhira, Yusuke; Asanuma, Hiroshi; Ito, Takatoshi; Häring, Markus

    2016-04-01

    The occurrence of induced seismicity with large magnitude is a critical environmental issue associated with fluid injection for shale gas/oil extraction, waste water disposal, carbon capture and storage, and engineered geothermal systems (EGS). Studies on the prediction of hazardous seismicity and the risk assessment of induced seismicity have intensified recently. Many of these studies are based on seismological statistics, and their models use information on occurrence time and event magnitude. We have developed a physics-based model, named the "possible seismic moment model", to evaluate seismic activity and assess the seismic moment that is ready to be released. This model is based entirely on microseismic information: occurrence time, hypocenter location, and magnitude (seismic moment). The model assumes the existence of a physically meaningful representative parameter, the releasable seismic moment per unit rock volume (seismic moment density), for a given field. The seismic moment density is estimated from the microseismic distribution and the events' seismic moments. In addition, the stimulated rock volume is inferred from the progress of the microseismic cloud at a given time; this quantity can be interpreted as the rock volume able to release seismic energy owing to the weakening of normal stress by the injected fluid. The product of these two parameters (equation (1)) provides the possible seismic moment that can be released from the currently stimulated zone as the model output. The difference between this model output and the observed cumulative seismic moment corresponds to the seismic moment that may be released in the future under the current stimulation conditions. This value can be translated into the possible maximum magnitude of future induced seismicity. In this way, the possible seismic moment can be fed back to the hydraulic stimulation operation in real time as an index that is easy to interpret intuitively. 
Possible seismic moment is defined as equation (1), where D
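
    As a rough illustration of the arithmetic behind the model, the product in equation (1) and the conversion of the remaining moment to a moment magnitude can be sketched as follows. All function names and numerical values here are illustrative assumptions, not the authors' implementation or the field data:

```python
import math

def moment_to_magnitude(m0):
    """Moment magnitude Mw from seismic moment m0 in N*m (Hanks-Kanamori)."""
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

def possible_maximum_magnitude(stimulated_volume_m3,
                               moment_density,
                               observed_cumulative_moment):
    """Seismic moment still releasable from the stimulated zone.

    possible moment  = moment density * stimulated volume      (equation (1))
    remaining moment = possible moment - observed cumulative moment
    """
    possible = moment_density * stimulated_volume_m3
    remaining = max(possible - observed_cumulative_moment, 0.0)
    return remaining, (moment_to_magnitude(remaining) if remaining > 0 else None)

# Illustrative numbers only:
remaining, mw = possible_maximum_magnitude(
    stimulated_volume_m3=2e8,         # ~0.2 km^3 microseismic cloud
    moment_density=1e6,               # releasable moment per m^3, in N*m/m^3
    observed_cumulative_moment=5e13)  # N*m already released
print(remaining, round(mw, 2))
```

    The remaining moment shrinks as observed seismicity catches up with the possible moment, which is what allows the index to be tracked in real time during stimulation.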

  10. Seismic Response Control Of Structures Using Semi-Active and Passive Variable Stiffness Devices

    Science.gov (United States)

    Salem, Mohamed M. A.

    Controllable devices such as Magneto-Rheological Fluid Dampers, Electro-Rheological Dampers, and controllable friction devices have been studied extensively, with limited implementation in real structures. Such devices have shown great potential in reducing seismic demands, either as smart base isolation systems or as smart devices for multistory structures. Although variable stiffness devices can be used for seismic control of structures, the vast majority of research effort has been devoted to the control of damping. The primary focus of this dissertation is to evaluate the seismic control of structures using semi-active and passive variable stiffness characteristics. Smart base isolation systems employing variable stiffness devices have been studied, and two semi-active control strategies are proposed. The control algorithms were designed to reduce the superstructure and base accelerations of seismically isolated structures subject to near-fault and far-field ground motions. Computational simulations of the proposed control algorithms on the benchmark structure have shown that excessive base displacements associated with near-fault ground motions may be better mitigated with the use of variable stiffness devices. However, the device properties must be controllable to produce a wide range of stiffness changes for effective control of the base displacements. The potential of controllable stiffness devices in limiting the base displacement due to near-fault excitation without compromising the performance of conventionally isolated structures is illustrated. The application of passive variable stiffness devices for seismic response mitigation of multistory structures is also investigated. A stiffening bracing system (SBS) is proposed to replace the conventional bracing systems of braced frames. An optimization process for the SBS parameters has been developed. The main objective of the design process is to maintain a uniform inter-story drift angle over the

  11. The character and amplitude of 'discontinuous' bottom-simulating reflections in marine seismic data

    Science.gov (United States)

    Hillman, Jess I. T.; Cook, Ann E.; Sawyer, Derek E.; Küçük, H. Mert; Goldberg, David S.

    2017-02-01

    Bottom-simulating reflections (BSRs) identified in seismic data are well documented and are commonly interpreted to indicate the presence of gas hydrates along continental margins, as well as to estimate regional volumes of gas hydrate. A BSR is defined as a reflection that sub-parallels the seafloor but is opposite in polarity and cross-cuts dipping sedimentary strata. BSRs form as a result of a strong negative acoustic impedance contrast. BSRs, however, are diverse seismic phenomena that manifest in strikingly contrasting ways in different geological settings and in different seismic data types. We investigate the characteristics of BSRs using conventional and high-resolution, 2D and 3D seismic data sets in three locations: the Terrebonne and Orca Basins in the Gulf of Mexico, and Blake Ridge on the US Atlantic Margin. The acquisition geometry and frequency content of the seismic data significantly impact the resultant character of BSRs, as observed with depth and amplitude maps of the BSRs. Furthermore, our amplitude maps reinforce the concept that the BSR represents a zone over which the transition from hydrate to free gas occurs, as opposed to the conventional model of the BSR occurring at a single interface. Our results show that a BSR can be mapped in three dimensions but it is not spatially continuous, at least not at the basin scale. Rather, a BSR manifests itself as a discontinuous, or patchy, reflection that is continuous only at local scales. We suggest the discontinuous nature of BSRs is the result of variable saturation and distribution of free gas and hydrate, and of the acquisition geometry and frequency content of the recorded seismic data. The commonly accepted definition of a BSR should be broadened, with careful consideration of these factors, to represent the uppermost extent of enhanced amplitude at the shallowest occurrence of free gas trapped by overlying hydrate-bearing sediments.

  12. Infrasound Generation from the HH Seismic Hammer.

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Kyle Richard

    2014-10-01

    The HH Seismic hammer is a large, "weight-drop" source for active source seismic experiments. This system provides a repetitive source that can be stacked for subsurface imaging and exploration studies. Although the seismic hammer was designed for seismological studies, it was surmised that it might produce energy in the infrasonic frequency range due to the ground motion generated by the 13-metric-ton drop mass. This study demonstrates that the seismic hammer generates a consistent acoustic source that could be used for in-situ sensor characterization, array evaluation and surface-air coupling studies for source characterization.

  13. Issues on the seismic performance of embankments

    DEFF Research Database (Denmark)

    Zania, Varvara; Tsompanakis, Y.; Psarropoulos, P.N.

    2011-01-01

    Seismic vulnerability of embankments with reinforcement at their base is strongly related to the slip displacements, which may accumulate along the interface between soil and geosynthetic. The inertial accelerations within the embankment, due to the propagation of seismic waves and the subsequent...... performed. The stability of the soil mass was estimated in terms of seismic slip deformations along low-shear-strength interfaces, while the response of the embankment is assessed through the acceleration time histories at the top surface of the soil. This investigation presents also the effect of the most......’t be neglected during the seismic design of embankments....

  14. Seismic Structure of Southern African Cratons

    DEFF Research Database (Denmark)

    Soliman, Mohammad Youssof Ahmad; Artemieva, Irina; Levander, Alan

    2014-01-01

    Cratons are extremely stable continental crustal areas above thick depleted lithosphere. These regions have remained largely unchanged for more than 2.5 Ga. This study presents a new seismic model of the seismic structure of the crust and lithospheric mantle constrained by seismic receiver...... functions and finite-frequency tomography based on data from the South Africa Seismic Experiment (SASE). Combining the two methods provides high vertical and lateral resolution. The main results obtained are (1) the presence of a highly heterogeneous crustal structure, in terms of thickness, composition (as...

  16. A Review of Seismicity in 2004

    Institute of Scientific and Technical Information of China (English)

    Li Gang; Liu Jie; Yu Surong

    2005-01-01

    1. SURVEY OF GLOBAL SEISMICITY IN 2004 A total of 19 strong earthquakes with Ms≥7.0 occurred in the world according to the Chinese Seismic Station Network in 2004 (Table 1). The strongest earthquake was the Sumatra earthquake with Ms 8.7 near the northwest coast of Sumatra on December 26 (Fig. 1). Global seismicity maintained the same patterns as in recent years, being distributed mainly along the western part of the circum-Pacific seismic zone. Remarkable macroseismic activity was seen in the India-Australian plate and in the Japan region. The macroseismic activities of Ms≥7.0 in 2004 were as follows:

  17. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. Probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.
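
    The stated correspondence between a 10% probability of exceedance in 50 years and a 475-year return period follows from the usual Poisson occurrence assumption, P = 1 - exp(-t/T). A minimal sketch (the function name is ours, not from the paper):

```python
import math

def return_period(p_exceed, t_years):
    """Return period T solving P = 1 - exp(-t / T) for Poisson occurrences."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))   # 10% in 50 years -> 475-year return period
print(round(return_period(0.50, 50)))   # 50% in 50 years -> 72-year return period
```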

  18. SISYPHUS: A high performance seismic inversion factory

    Science.gov (United States)

    Gokhberg, Alexey; Simutė, Saulė; Boehm, Christian; Fichtner, Andreas

    2016-04-01

    In recent years, massively parallel high-performance computers have become the standard instruments for solving forward and inverse problems in seismology. The software packages dedicated to forward and inverse waveform modelling specially designed for such computers (SPECFEM3D, SES3D) have matured and become widely available. These packages achieve significant computational performance and provide researchers with an opportunity to solve problems of bigger size at higher resolution within a shorter time. However, a typical seismic inversion process contains various activities that are beyond the common solver functionality. They include management of information on seismic events and stations, 3D models, observed and synthetic seismograms, pre-processing of the observed signals, computation of misfits and adjoint sources, minimization of misfits, and process workflow management. These activities are time consuming, seldom sufficiently automated, and therefore represent a bottleneck that can substantially offset the performance benefits provided by even the most powerful modern supercomputers. Furthermore, the typical system architecture of modern supercomputing platforms is oriented towards maximum computational performance and provides limited standard facilities for automation of the supporting activities. We present a prototype solution that automates all aspects of the seismic inversion process and is tuned for modern massively parallel high-performance computing systems. We address several major aspects of the solution architecture, which include (1) design of an inversion state database for tracing all relevant aspects of the entire solution process, (2) design of an extensible workflow management framework, (3) integration with wave propagation solvers, (4) integration with optimization packages, (5) computation of misfits and adjoint sources, and (6) process monitoring. The inversion state database represents a hierarchical structure with

  19. Detailed seismic modeling of induced seismicity at the Groningen gas field

    NARCIS (Netherlands)

    Paap, B.F.; Steeghs, T.P.H.; Kraaijpoel, D.A.

    2016-01-01

    We present the results of a detailed seismic modeling study of induced seismicity observed at the Groningen gas field, situated in the North-eastern part of the Netherlands. Seismic simulations are valuable to support the interpretation of observed earthquake waveform recordings and to increase the

  20. ['Gold standard', not 'golden standard']

    NARCIS (Netherlands)

    Claassen, J.A.H.R.

    2005-01-01

    In medical literature, both 'gold standard' and 'golden standard' are employed to describe a reference test used for comparison with a novel method. The term 'gold standard' in its current sense in medical research was coined by Rudd in 1979, in reference to the monetary gold standard. In the same w

  2. High Resolution Seismic Imaging of the Brawley Seismic Fault Zone

    Science.gov (United States)

    Goldman, M.; Catchings, R. D.; Rymer, M. J.; Lohman, R. B.; McGuire, J. J.; Sickler, R. R.; Criley, C.; Rosa, C.

    2011-12-01

    In March 2010, we acquired a series of high-resolution P-wave seismic reflection and refraction data sets across faults in the Brawley seismic zone (BSZ) within the Salton Sea Geothermal Field (SSGF). Our objectives were to determine the dip, possible structural complexities, and seismic velocities within the BSZ. One dataset was 3.4 km long, trending east-west, and consisted of 334 shots recorded by a 2.4 km spread of 40 Hz geophones placed every 10 meters. The spread was initially laid out from the first station at the eastern end of the profile to roughly 2/3 into the profile. After about half the shots, the spread was shifted from roughly 1/3 into the profile to the last station at the western end of the profile. P-waves were generated by Betsy-Seisgun 'shots' spaced every 10 meters. Initial analysis of first breaks indicates near-surface velocities of ~500-600 meters/sec and deeper velocities of around 2000 meters/sec. Preliminary investigation of shot gathers indicates a prominent fault that extends to the ground surface. This fault is on a projection of the Kalin fault from about 40 m to the south; it broke the surface down to the west with an approximately north-south strike during a local swarm of earthquakes in 2005, and also slipped at the surface in association with the 2010 El Mayor-Cucapah earthquake in Baja California. The dataset is part of the combined Obsidian Creep data set and provides the most detailed, publicly available subsurface images of fault structures in the BSZ and SSGF.

  3. Mesoscopics of ultrasound and seismic waves: application to passive imaging

    Science.gov (United States)

    Larose, É.

    2006-05-01

    This manuscript deals with different aspects of the propagation of acoustic and seismic waves in heterogeneous media, both simply and multiply scattering ones. After a short introduction on conventional imaging techniques, we describe two observations that demonstrate the presence of multiple scattering in seismic records: the equipartition principle, and the coherent backscattering effect (Chap. 2). Multiple scattering is related to the mesoscopic nature of seismic and acoustic waves, and is a strong limitation for conventional techniques like medical or seismic imaging. In the following part of the manuscript (Chaps. 3-5), we present an application of mesoscopic physics to acoustic and seismic waves: the principle of passive imaging. By correlating records of ambient noise or diffuse waves obtained at two passive sensors, it is possible to reconstruct the impulse response of the medium as if a source were placed at one sensor. This provides the opportunity of doing acoustics and seismology without a source. Several aspects of this technique are presented here, starting with theoretical considerations and numerical simulations (Chaps. 3, 4). Then we present experimental applications (Chap. 5) to ultrasound (passive tomography of a layered medium) and to seismic waves (passive imaging of California, and the Moon, with micro-seismic noise).

  4. Integral anomalous effect of an oil and gas deposit in a seismic wave field

    Energy Technology Data Exchange (ETDEWEB)

    Korostyshevskiy, M.B.; Nabokov, G.N.

    1981-01-01

    The basic precepts of an elaborated version of a procedure for forecasting (direct exploration of) oil and gas deposits from MOV seismic prospecting data are examined. This procedure was previously called the procedure of analysis of the integral effect of an oil and gas deposit in a seismic wave field (MIIEZ-VP). The procedure is implemented as an automated system, ASOM-VP, for the BESM-4 computer in a standard configuration equipped with standard input-output devices for seismic information ("Potok", MVU, "Atlas"). The entire processing sequence, from input of data into the computer to output of the resulting maps and graphs on the "Atlas" graph plotter, is automated. Results of testing the MIIEZ-VP procedure and the ASOM-VP system on drilled areas of Kazakhstan, Azerbaydzhan and Uzbekistan are cited.

  5. Investigations on Local Seismic Phases and Modeling of Seismic Signals

    Science.gov (United States)

    1993-10-31

    Brocher, T. M., 1987. Coincident seismic reflection/refraction studies of the continental lithosphere: a global review, Rev. Geophys., 25, 723-742. [The remainder of this record is a garbled extraction of an earthquake-location table (events at Laza, Nazare, Camero and Aldea, with dates, coordinates, depths and magnitudes) and the section heading "Near-source site effects expected at Yucca Flat", with fragments noting that the models used might be accurate enough to describe the global waveforms recorded and referring to a map of the Paleozoic basement.]

  6. Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event

    Energy Technology Data Exchange (ETDEWEB)

    S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante

    2012-06-01

    This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal events at-power SPAR model. This is accomplished by combining the modified system fault trees from the full power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated using the geometric mean of the lower and upper values of that particular bin, and the associated frequency for each bin is estimated by taking the difference between the exceedance frequencies at the upper and lower bounds of that bin. The components' fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration and uncertainty values for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires the analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate human error probabilities (HEPs) of interest. This work is expected to improve the NRC's ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., significance determination process).
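
    The binning arithmetic described above can be sketched as follows, assuming the bin frequency is the difference of the hazard-curve exceedance frequencies at the bin edges. The hazard curve and all names here are hypothetical illustrations, not the SPAR model itself:

```python
import math

def make_bins(edges_g, hazard_curve):
    """Discretize a PGA hazard curve into bins.

    edges_g: bin boundaries in g; hazard_curve(a) -> annual frequency of
    exceeding acceleration a. Each bin's g-value is the geometric mean of
    its edges; its frequency is the difference of the edge exceedance
    frequencies.
    """
    bins = []
    for lo, hi in zip(edges_g, edges_g[1:]):
        g = math.sqrt(lo * hi)                    # geometric mean of the edges
        freq = hazard_curve(lo) - hazard_curve(hi)
        bins.append((g, freq))
    return bins

# Hypothetical power-law hazard curve: exceedance frequency = 1e-4 * a^-2
curve = lambda a: 1e-4 * a ** -2
for g, f in make_bins([0.1, 0.2, 0.4, 0.8, 1.6, 3.2], curve):
    print(round(g, 3), f"{f:.2e}")
```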

  7. Revision of seismic design codes corresponding to building damages in the "5.12" Wenchuan earthquake

    Science.gov (United States)

    Wang, Yayong

    2010-06-01

    A large number of buildings were seriously damaged or collapsed in the “5.12” Wenchuan earthquake. Based on field surveys and studies of damage to different types of buildings, seismic design codes have been updated. This paper briefly summarizes some of the major revisions that have been incorporated into the “Standard for classification of seismic protection of building constructions GB50223-2008” and “Code for Seismic Design of Buildings GB50011-2001.” The definition of seismic fortification class for buildings has been revisited, and as a result, the seismic classifications for schools, hospitals and other buildings that hold large populations such as evacuation shelters and information centers have been upgraded in the GB50223-2008 Code. The main aspects of the revised GB50011-2001 code include: (a) modification of the seismic intensity specified for the Provinces of Sichuan, Shanxi and Gansu; (b) basic conceptual design for retaining walls and building foundations in mountainous areas; (c) regularity of building configuration; (d) integration of masonry structures and pre-cast RC floors; (e) requirements for calculating and detailing stair shafts; and (f) limiting the use of single-bay RC frame structures. Some significant examples of damage in the epicenter areas are provided as a reference in the discussion on the consequences of collapse, the importance of duplicate structural systems, and the integration of RC and masonry structures.

  8. A GIS-based time-dependent seismic source modeling of Northern Iran

    Science.gov (United States)

    Hashemi, Mahdi; Alesheikh, Ali Asghar; Zolfaghari, Mohammad Reza

    2017-01-01

    The first step in any seismic hazard study is the definition of seismogenic sources and the estimation of magnitude-frequency relationships for each source. There is as yet no standard methodology for source modeling and many researchers have worked on this topic. This study is an effort to define linear and area seismic sources for Northern Iran. The linear, or fault, sources are developed based on tectonic features and characteristic earthquakes, while the area sources are developed based on the spatial distribution of small to moderate earthquakes. Time-dependent recurrence relationships are developed for fault sources using a renewal approach, while time-independent frequency-magnitude relationships are proposed for area sources based on a Poisson process. GIS functionalities are used in this study to introduce and incorporate spatial-temporal and geostatistical indices in delineating area seismic sources. The proposed methodology is used to model seismic sources for an area of about 500 km by 400 km around Tehran. Previous researches and reports are studied to compile an earthquake/fault catalog that is as complete as possible. All events are transformed to a uniform magnitude scale; duplicate events and dependent shocks are removed. Completeness and time distribution of the compiled catalog are taken into account. The proposed area and linear seismic sources, in conjunction with the defined recurrence relationships, can be used to develop a time-dependent probabilistic seismic hazard analysis of Northern Iran.
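
    For area sources, time-independent frequency-magnitude behaviour of the Gutenberg-Richter form is typically fitted to the declustered catalog. A sketch under that assumption (the paper does not give its fitting code; the tiny catalog and all names below are invented for illustration):

```python
import math

def gr_b_value(magnitudes, m_min):
    """Maximum-likelihood Gutenberg-Richter b-value (Aki, 1965):
    b = log10(e) / (mean(M) - Mmin), for events with M >= Mmin."""
    ms = [m for m in magnitudes if m >= m_min]
    return math.log10(math.e) / (sum(ms) / len(ms) - m_min)

def annual_rate_above(magnitudes, m, m_min, years, b):
    """Annual rate of events with magnitude >= m, extrapolated from the
    catalog rate above Mmin via N(M) = N(Mmin) * 10**(-b * (M - Mmin))."""
    n_min = sum(1 for x in magnitudes if x >= m_min) / years
    return n_min * 10.0 ** (-b * (m - m_min))

cat = [4.0, 4.2, 4.1, 4.9, 4.3, 5.1, 4.0, 4.4]   # invented magnitudes
b = gr_b_value(cat, 4.0)
rate_m5 = annual_rate_above(cat, 5.0, 4.0, years=10.0, b=b)
print(round(b, 2), round(rate_m5, 3))
```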

  9. Simplified Assessment of R3 Nominal Assurance Degree to Seismic Action of the Existing Masonry Dwellings

    Directory of Open Access Journals (Sweden)

    Teodor Broşteanu

    2008-01-01

    This paper addresses the assessment of the performance level of a building for a given seismic hazard level. The building performance level describes the expected seismic performance given by the computation of the R3 nominal assurance degree to seismic action of existing masonry dwellings and monumental buildings according to the Romanian norm P100:1992 [1] (modified in 1996 by chapters 11 and 12) and, subsequently, Part 3 of P100-1:2006 [2], applied to the assessment and strengthening structural design of seismically vulnerable existing buildings in the frame of SR EN 1998-1:2004 EC8 [3]. The framing of damages into potential risk degrees has a social and economic impact. Assessment and retrofitting of existing buildings represent a huge engineering challenge, a problem distinct from the design of a new building. The performance level of a vulnerable existing building indicates the expected seismic performance in terms of the classified damages, the pattern of cracks, the interruption of function, the economic losses, and the needed interventions, all as a function of the building's importance class over its remaining life span of use. The computation of R (R3, the nominal assurance degree to seismic action of vulnerable dwellings) for assessment and strengthening design is recommended with respect to both norms, because the conventional seismic bearing load computed by [1] results in less than the value computed by Part 3 of P100-1:2006, i.e. the norm P100:1992 is more severe. In the case of brittle fracture probability of the existing structural masonry members, a larger value of the reduction factor is recommended than the values given by [1] for a new structure with high ductility, especially for the calibration of deflections at the same limit state.

  10. Global thermochemical inversion of seismic waveforms, gravity satellite data, and topography

    Science.gov (United States)

    Fullea, J.; Lebedev, S.; Martinec, Z.

    2016-12-01

    Conventional methods of seismic tomography, topography, gravity and electromagnetic data analysis and geodynamic modelling constrain distributions of seismic velocity, density, electrical conductivity, and viscosity at depth, all depending on the temperature and composition of Earth's rocks. However, modelling and interpretation of multiple data provide a multifaceted image of the true thermochemical structure of the Earth that needs to be consistently integrated. A simple combination of gravity, electromagnetic, geodynamic, petrological and seismic models alone is insufficient due to the non-uniqueness and different sensitivities of these models, and the internal consistency relationships that must connect all the intermediate parameters describing the Earth. In fact, global Earth models based on different observables often lead to rather different images of the Earth. A breakthrough in global and consistent imaging of the fine-scale thermochemical, hydrous and rheological structure of the Earth's lithosphere and underlying mantle is needed. Thermodynamic and petrological links between seismic velocities, density, electrical conductivity, viscosity, melt, water, temperature, pressure and composition within the Earth can now be modelled accurately using new methods of computational petrology and data from laboratory experiments. The growth of very large terrestrial and satellite geophysical data sets over the last few years, together with the advancement of petrological and geophysical modelling techniques, now presents an opportunity for global, thermochemical and deformation 3D imaging of the lithosphere and underlying upper mantle with unprecedented resolution. Here we present a method for self-consistent joint inversion of multiple data sets, including seismic, satellite gravity and surface topography data, applied to obtain a detailed and robust global thermochemical image of the lithosphere and underlying upper mantle. This project combines state-of-the-art seismic

  11. Reduction of Large Seismic Deformations using Elasto-plastic Passive Energy Dissipaters

    Directory of Open Access Journals (Sweden)

    K. Sathish Kumar

    2003-01-01

    The design of supporting systems for pipelines carrying highly toxic or radioactive liquids at very high temperature is an important safety issue for a nuclear power installation. These pipeline systems are normally designed to be held rigid by conventional snubber supports for protection from earthquakes. The pipeline system design must balance the seismic deformations and other deformations due to thermal effects. A rigid pipeline system using conventional snubber supports always leads to an increase in thermal stresses, hence a rational seismic design for pipeline supporting systems becomes essential. Contrary to this rigid design, it is possible to design a flexible pipeline system and to decrease the seismic response by increasing the damping using a passive energy absorbing (PEA) element, which dissipates vibration energy. An X-shaped or an hourglass-shaped metal element is a classic example of an elasto-plastic passive energy absorber of the metallic yielding type. The inherent ductile property of metals like steel, which undergo stable energy dissipation in the plastic region, is made use of in achieving energy loss. This paper presents the experimental and analytical studies carried out on yielding-type elasto-plastic PEA elements to be used in a passive energy dissipating device for the control of large seismic deformations of pipelines subjected to earthquake loading.

  12. USING THE KARHUNEN-LOÈVE TRANSFORM TO SUPPRESS GROUND ROLL IN SEISMIC DATA

    Directory of Open Access Journals (Sweden)

    Kazmierczak Thaís de Souza

    2005-08-01

    Full Text Available ABSTRACT Sacchi's algorithm (2002), based on the Karhunen-Loève (K-L) Transform, was modified and implemented to suppress ground roll without distorting the reflection signals; it provided better results than conventional noise-removal techniques such as f-k, high-pass and band-pass filters. The K-L Transform is well known in other fields such as image processing (Levy and Lindenbaum, 2000) and face, iris and fingerprint identification. A seismic section is an image of the subsurface, so the K-L Transform can be useful in seismic processing because spatially uncorrelated signals can be removed, providing a clearer and more coherent image. The algorithm was applied to seismic data generated with hammer, thumper and explosive sources. Conventional processing flows were used, but with the filters replaced by the K-L Transform, to produce stacked sections. The K-L Transform recovers reflector amplitudes better than the other filters; it also removes refractions that cause spurious shallow events and increases the lateral coherence of seismic events, yielding a more interpretable image of the geology.
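The core K-L (eigenimage) step can be sketched with an SVD: the leading singular components capture the laterally coherent part of a section, and keeping or subtracting them is the basis of the filter. A minimal sketch on a toy section (a real flow first aligns the ground roll so it maps onto the leading eigenimages before subtraction):

```python
import numpy as np

def kl_filter(section, k):
    """Karhunen-Loeve (eigenimage) filter: keep the k largest
    singular components of a section (traces x samples).
    Minimal sketch of the technique, not Sacchi's exact code."""
    U, s, Vt = np.linalg.svd(section, full_matrices=False)
    coherent = (U[:, :k] * s[:k]) @ Vt[:k, :]
    return coherent, section - coherent

# toy section: one laterally coherent event plus random noise
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
event = np.sin(2 * np.pi * 10 * t)          # identical on every trace
section = np.tile(event, (24, 1)) + 0.3 * rng.standard_normal((24, 200))
coherent, residual = kl_filter(section, k=1)
```

Here the rank-1 reconstruction recovers the coherent event, while the residual holds the spatially uncorrelated noise.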

  13. The spatial data-adaptive minimum-variance distortionless-response beamformer on seismic single-sensor data

    NARCIS (Netherlands)

    Panea, I.; Drijkoningen, G.G.

    2008-01-01

    Coherent noise generated by surface waves or ground roll within a heterogeneous near surface is a major problem in land seismic data. Array forming based on single-sensor recordings might reduce such noise more robustly than conventional hardwired arrays. We use the minimum-variance
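The minimum-variance distortionless-response (MVDR) beamformer named in the title computes weights w = R^-1 a / (a^H R^-1 a), which pass the desired arrival undistorted while minimizing output power from noise. A sketch with an invented 8-sensor array and diagonal loading (both assumptions, not the authors' parameters):

```python
import numpy as np

def mvdr_weights(R, a):
    """MVDR weights: w = R^-1 a / (a^H R^-1 a), with R the data
    covariance across the array and a the steering vector of the
    desired arrival. Illustrative sketch."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

n = 8
a = np.ones(n)                          # zero-delay (broadside) steering
rng = np.random.default_rng(1)
X = rng.standard_normal((n, 500))       # noise snapshots
R = X @ X.T / 500 + 0.01 * np.eye(n)    # sample covariance + loading
w = mvdr_weights(R, a)
```

The defining distortionless constraint is w^H a = 1, so the steered signal passes with unit gain.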

  14. Seismic Structural Setting of Western Farallon Basin, Southern Gulf of California, Mexico.

    Science.gov (United States)

    Pinero-Lajas, D.; Gonzalez-Fernandez, A.; Lopez-Martinez, M.; Lonsdale, P.

    2007-05-01

    Data from a number of high-resolution 2D multichannel seismic (MCS) lines were used to investigate the structure and stratigraphy of the western Farallon basin in the southern Gulf of California. A Generator-Injector air gun provided a clean seismic source, shooting every 12 s at a speed of 6 kts. Each signal was recorded for 6-8 s, at a sampling interval of 1 ms, by a 600 m long digital streamer with 48 channels at a spacing of 12.5 m. The MCS system was installed aboard CICESE's (Centro de Investigacion Cientifica y de Educacion Superior de Ensenada) 28 m research vessel Francisco de Ulloa. The MCS data were conventionally processed to obtain post-stack time-migrated seismic sections. The sections show a very detailed image of the sub-bottom structure up to 2-3 s two-way travel time (approx. 2 km). We present detailed images of faulting based on the high resolution and quality of these data. Our results show distributed faulting with many active and inactive faults. Our study also constrains the depth to basement near the southern Baja California eastern coast. The acoustic basement appears as a continuous feature in the western part of the study area and can be correlated with granite outcrops located on the southern Gulf of California islands. To the east, near the center of the Farallon basin, the acoustic basement changes: it is more discontinuous, and the seismic sections show a number of diffracted waves.

  15. SEISMIC FRAGILITY ANALYSIS OF IMPROVED RC FRAMES USING DIFFERENT TYPES OF BRACING

    Directory of Open Access Journals (Sweden)

    HAMED HAMIDI JAMNANI

    2017-04-01

    Full Text Available The application of bracing to increase the lateral stiffness of building structures is a seismic improvement technique that engineers frequently have recourse to. Accordingly, investigating the role of bracing in concrete structures, along with the development of seismic fragility curves, is of overriding concern to civil engineers. In this research, an ordinary RC building, designed according to the 1st edition of the Iranian seismic code, was selected for examination. According to FEMA 356, this building is considered vulnerable. To improve its seismic performance, three different types of bracing (Concentrically Braced Frames, Eccentrically Braced Frames and Buckling Restrained Frames) were employed, and each bracing element was distributed in three different locations in the building. The researchers developed fragility curves and used 30 earthquake records on the Peak Ground Acceleration seismic intensity scale to carry out time history analyses. Two damage scales, Inter-Story Drift and Plastic Axial Deformation, were also used. The numerical results obtained from this investigation confirm that Plastic Axial Deformation is more reliable than conventional approaches for developing fragility curves for retrofitted frames. Accordingly, the researchers selected the suitable damage scale and developed and compared log-normal fragility curves, first for the original and then for the retrofitted building.
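A log-normal fragility curve, the form developed in the study, gives the probability of exceeding a damage state as a function of PGA. The median capacity and dispersion below are assumed for illustration, not the values fitted in the paper:

```python
import math

def fragility(pga, theta, beta):
    """Lognormal fragility curve: P(exceedance | PGA = pga)
    = Phi(ln(pga/theta) / beta), with theta the median capacity
    and beta the lognormal dispersion. Illustrative parameters."""
    z = math.log(pga / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# at the median capacity the exceedance probability is 50 %
p_median = fragility(0.4, theta=0.4, beta=0.6)
p_strong = fragility(0.8, theta=0.4, beta=0.6)
```

Fitting theta and beta to the 30-record time-history results is what turns such a curve into the retrofit comparison described above.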

  16. Passive monitoring for near surface void detection using traffic as a seismic source

    Science.gov (United States)

    Zhao, Y.; Kuzma, H. A.; Rector, J.; Nazari, S.

    2009-12-01

    In this poster we present preliminary results from several field experiments in which we study seismic detection of voids using a passive array of surface geophones. The source of seismic excitation is vehicle traffic on nearby roads, which we model as a continuous line source of seismic energy. Our passive seismic technique is based on cross-correlation of surface wave fields and examination of the resulting power spectra, looking for "shadows" caused by the scattering effect of a void. High-frequency noise masks this effect in the time domain, so it is difficult to see on conventional traces. Our technique does not rely on phase distortions caused by small voids because they are generally too tiny to measure. Unlike traditional impulsive seismic sources, which generate highly coherent broadband signals, perfect for resolving phase but too weak for resolving amplitude, vehicle traffic affords a high-power signal in a frequency range that is optimal for finding shallow structures. Our technique results in clear detections of an abandoned railroad tunnel and a septic tank. The ultimate goal of this project is to develop a technology for the simultaneous imaging of shallow underground structures and traffic monitoring near these structures.
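The cross-correlation step at the heart of such passive processing can be sketched in a few lines; the synthetic "traffic" noise and 5-sample delay below are assumptions for illustration:

```python
import numpy as np

def xcorr(x, y):
    """Frequency-domain cross-correlation of two noise records,
    the core step of passive (ambient-noise) processing; the
    study then inspects power spectra of such correlations for
    a void's spectral shadow. Minimal sketch."""
    n = len(x)
    X = np.fft.rfft(x, 2 * n)
    Y = np.fft.rfft(y, 2 * n)
    return np.fft.irfft(X * np.conj(Y))   # peak index = lag of x vs y

rng = np.random.default_rng(2)
src = rng.standard_normal(1024)   # stand-in for continuous traffic noise
near = src                        # geophone close to the road
far = np.roll(src, 5)             # same wavefield, arriving 5 samples later
cc = xcorr(far, near)
lag = int(np.argmax(cc))          # recovered travel-time difference
```

With long noise records, stacking many such correlations builds the stable wavefield estimate whose spectrum is then searched for scattering shadows.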

  17. Near-surface 3D reflection seismic survey; Sanjigen senso hanshaho jishin tansa

    Energy Technology Data Exchange (ETDEWEB)

    Nakahigashi, H.; Mitsui, H.; Nakano, O.; Kobayashi, T. [DIA Consultants Co. Ltd., Tokyo (Japan)

    1997-05-27

    Faults have been actively investigated across Japan since the Great Hanshin-Awaji Earthquake. Discussed in this report is the application of the 3D near-surface reflection seismic survey in big cities. Data from trenching and drilling are used for the geological interpretation of the surroundings of a fault, and the reflection seismic survey is used to identify the position, etc., of the fault. Examination of the results obtained from the experimental field shows that the conventional 2D reflection imaging survey reaches the limit of its capability when the geological structure is complicated; that the 3D reflection seismic survey, by contrast, is capable of high-precision imaging and, when augmented by drilling, etc., becomes capable of a more detailed interpretation; and that it also contributes effectively to the improvement of local disaster prevention in big cities. Using as the model the Tachikawa fault that runs near JR Tachikawa Station, implementation of the 3D reflection seismic survey is reviewed. To acquire high-quality data in a 3D reflection seismic survey conducted along the roads of the urban sector chosen for the experiment, the source points and receiver points should be positioned by taking into account the parameters of the binning process, so that the midpoints are regularly distributed on the surface. 3 refs., 11 figs., 1 tab.
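The midpoint-binning criterion at the end of the abstract can be made concrete: for every source-receiver pair the common midpoint is computed and snapped to a bin grid, and the survey geometry is adjusted until the bins fill evenly. A toy 2D sketch with invented coordinates:

```python
import numpy as np

def cmp_bins(sources, receivers, bin_size):
    """Common-midpoint binning: the midpoint of every
    source-receiver pair, snapped to a regular bin grid.
    Toy sketch in 2D map coordinates (metres), not the
    survey-design software used in the study."""
    s = np.asarray(sources)[:, None, :]
    r = np.asarray(receivers)[None, :, :]
    mid = (s + r) / 2.0                          # all pair midpoints
    return np.floor(mid / bin_size).astype(int).reshape(-1, 2)

srcs = [[0.0, 0.0], [10.0, 0.0]]
recs = [[0.0, 20.0], [10.0, 20.0]]
bins = cmp_bins(srcs, recs, bin_size=5.0)
```

Counting pairs per bin (the fold map) is the standard check that midpoints are regularly distributed before shooting.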

  18. Seismic monitoring of torrential and fluvial processes

    Science.gov (United States)

    Burtin, Arnaud; Hovius, Niels; Turowski, Jens M.

    2016-04-01

    In seismology, the signal is usually analysed for earthquake data, but earthquakes represent less than 1 % of continuous recording. The remaining data are considered as seismic noise and were for a long time ignored. Over the past decades, the analysis of seismic noise has constantly increased in popularity, and this has led to the development of new approaches and applications in geophysics. The study of continuous seismic records is now open to other disciplines, like geomorphology. The motion of mass at the Earth's surface generates seismic waves that are recorded by nearby seismometers and can be used to monitor mass transfer throughout the landscape. Surface processes vary in nature, mechanism, magnitude, space and time, and this variability can be observed in the seismic signals. This contribution gives an overview of the development and current opportunities for the seismic monitoring of geomorphic processes. We first describe the common principles of seismic signal monitoring and introduce time-frequency analysis for the purpose of identification and differentiation of surface processes. Second, we present techniques to detect, locate and quantify geomorphic events. Third, we review the diverse layout of seismic arrays and highlight their advantages and limitations for specific processes, like slope or channel activity. Finally, we illustrate all these characteristics with the analysis of seismic data acquired in a small debris-flow catchment where geomorphic events show interactions and feedbacks. Further developments must aim to fully understand the richness of the continuous seismic signals, to better quantify the geomorphic activity and to improve the performance of warning systems. Seismic monitoring may ultimately allow the continuous survey of erosion and transfer of sediments in the landscape on the scales of external forcing.
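The time-frequency analysis introduced above for identifying and differentiating surface processes is, at its core, a short-time Fourier transform of the continuous record. A plain-NumPy sketch; the window length, overlap and 5 Hz test signal are arbitrary choices, not values from the text:

```python
import numpy as np

def spectrogram(signal, fs, nwin=256, noverlap=128):
    """Short-time Fourier transform magnitude, the basic
    time-frequency tool used to tell surface processes apart
    in continuous seismic records. Hann-windowed sketch."""
    step = nwin - noverlap
    win = np.hanning(nwin)
    frames = [signal[i:i + nwin] * win
              for i in range(0, len(signal) - nwin + 1, step)]
    S = np.abs(np.fft.rfft(np.array(frames), axis=1)).T  # (freq, time)
    freqs = np.fft.rfftfreq(nwin, d=1.0 / fs)
    return freqs, S

fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
sig = np.sin(2 * np.pi * 5.0 * t)      # stand-in for a 5 Hz tremor band
freqs, S = spectrogram(sig, fs)
peak_freq = freqs[np.argmax(S[:, 0])]
```

In practice the distinct spectral signatures of, e.g., debris flows versus bedload transport appear as different bands and durations in S.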

  19. Shallow shear-wave reflection seismics in the tsunami struck Krueng Aceh River Basin, Sumatra

    Directory of Open Access Journals (Sweden)

    U. Polom

    2008-01-01

    Full Text Available As part of the project "Management of Georisk" (MANGEONAD) of the Federal Institute for Geosciences and Natural Resources (BGR), Hanover, high-resolution shallow shear-wave reflection seismics was applied in the Indonesian province Nanggroe Aceh Darussalam, North Sumatra, in cooperation with the Government of Indonesia, local counterparts, and the Leibniz Institute for Applied Geosciences, Hanover. The investigations were expected to support classification of earthquake site effects for the reconstruction of buildings and infrastructure as well as for groundwater exploration. The study focussed on the city of Banda Aceh and the surroundings of Aceh Besar. The shear-wave seismic surveys were done parallel to standard geoengineering investigations like cone penetrometer tests to support subsequent site-specific statistical calibration. They were also partly supplemented by shallow P-wave seismics for the identification of (a) elastic subsurface parameters and (b) zones with abundance of groundwater. Evaluation of seismic site effects based on shallow reflection seismics has in fact been found to be a highly useful method in Aceh province. In particular, use of a vibratory seismic source was essential for successful application of shear-wave seismics in the city of Banda Aceh and in areas with compacted ground, like on farm tracks in the surroundings, presenting mostly agricultural land use areas. We were thus able to explore the mechanical stiffness of the subsurface down to 100 m depth, occasionally even deeper, with remarkably high resolution. The results were transferred into geotechnical site classification in terms of the International Building Code (IBC, 2003). The seismic images also give insights into the history of the basin sedimentation processes of the Krueng Aceh River delta, which is relevant for the exploration of new areas for construction of safe foundations of buildings and for identification of fresh water aquifers in the tsunami
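The geotechnical site classification mentioned above maps the measured subsurface stiffness, commonly expressed as Vs30 (the average shear-wave velocity of the top 30 m), onto standard NEHRP/IBC site classes. A sketch using the standard class boundaries (class F, special soils, is omitted):

```python
def ibc_site_class(vs30):
    """NEHRP/IBC site class from Vs30 (m/s), the quantity the
    shear-wave surveys constrain. Standard boundaries; class F
    (special soils requiring site-specific study) is not handled
    in this sketch."""
    if vs30 > 1500.0:
        return "A"   # hard rock
    if vs30 > 760.0:
        return "B"   # rock
    if vs30 > 360.0:
        return "C"   # very dense soil / soft rock
    if vs30 > 180.0:
        return "D"   # stiff soil
    return "E"       # soft clay soil
```

Such a mapping is how reflection-derived stiffness profiles feed directly into building-code design spectra.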

  20. Accounting standards

    NARCIS (Netherlands)

    Stellinga, B.; Mügge, D.

    2014-01-01

    The European and global regulation of accounting standards has witnessed remarkable changes over the past twenty years. In the early 1990s, EU accounting practices were fragmented along national lines and US accounting standards were the de facto global standards. Since 2005, all EU listed companie

  1. Seismic monitoring at The Geysers

    Energy Technology Data Exchange (ETDEWEB)

    Majer, E.L.; Romero, A.; Vasco, D.; Kirkpatrick, A.; Peterson, J.E. [Lawrence Berkeley Lab., CA (United States); Zucca, J.J.; Hutchings, L.J.; Kasameyer, P.W. [Lawrence Livermore National Lab., CA (United States)

    1993-04-01

    During the last several years Lawrence Berkeley Laboratory (LBL) and Lawrence Livermore National Laboratory (LLNL) have been working with industry partners at The Geysers geothermal field to evaluate and develop methods for applying the results of microearthquake (MEQ) monitoring. It is a well-known fact that seismicity at The Geysers is a common occurrence; accordingly, there have been many studies and papers written on the origin and significance of the seismicity. Attitudes toward MEQ data range from regarding it as nothing more than a curious artifact of the production activities to regarding it as a critical tool in evaluating reservoir performance. The purpose of the work undertaken by LBL and LLNL is to evaluate the utility, as well as the methods and procedures, of MEQ monitoring, to recommend the most cost-effective implementation of the methods, and, if possible, to link physical processes and parameters to the generation of MEQ activity. To address these objectives the MEQ work can be categorized into two types of studies. The first type is the direct analysis of the spatial and temporal distribution of MEQ activity and the study of the nature of the source function relative to the physical or chemical processes causing the seismicity. The second broad area of study is imaging the reservoir/geothermal areas with the energy created by the MEQ activity and inferring the physical and/or chemical properties within the zone of imaging. The two types of studies have obvious overlap and, for a complete evaluation and development, require high-quality data from arrays of multicomponent stations. Much of the effort to date at The Geysers by both DOE and the producers has concentrated on establishing a high-quality database. It is only within the last several years that this database has been fully evaluated for the proper and cost-effective use of MEQ activity. Presented here are the results to date of DOE's effort in the acquisition and analysis of the MEQ data.

  2. The Italian National Seismic Network

    Science.gov (United States)

    Michelini, Alberto

    2016-04-01

    The Italian National Seismic Network is composed of about 400 stations, mainly broadband, installed in the country and in the surrounding regions. About 110 stations also feature collocated strong-motion instruments. The Centro Nazionale Terremoti (National Earthquake Center), CNT, has installed and operates most of these stations, although a considerable number of stations contributing to the INGV surveillance have been installed and are maintained by other INGV sections (Napoli, Catania, Bologna, Milano) or by other Italian or European institutions. The important technological upgrades carried out in recent years have allowed for significant improvements in the seismic monitoring of Italy and of the Euro-Mediterranean countries. The adopted data transmission systems include satellite links, wireless connections and wired lines. The SeedLink protocol has been adopted for data transmission. INGV is a primary node of EIDA (European Integrated Data Archive) for archiving and distributing continuous, quality-checked data. The data acquisition system was designed to accomplish, in near-real-time, automatic earthquake detection and hypocenter and magnitude determination (moment tensors, shake maps, etc.). Database archiving of all parametric results is closely linked to the existing procedures of the INGV seismic monitoring environment. Overall, the Italian earthquake surveillance service provides, in quasi-real-time, hypocenter parameters which are then revised routinely by the analysts of the Bollettino Sismico Nazionale. The results are published on the web page http://cnt.rm.ingv.it/ and are publicly available to both the scientific community and the general public. This presentation will describe the various activities and resulting products of the Centro Nazionale Terremoti, spanning from data acquisition to archiving, distribution and specialised products.

  3. Seismic hazard assessment of Chennai city considering local site effects

    Indian Academy of Sciences (India)

    A Boominathan; G R Dodagoudar; A Suganthi; R Uma Maheswari

    2008-11-01

    Chennai city suffered moderate tremors during the 2001 Bhuj and Pondicherry earthquakes and the 2004 Sumatra earthquake. After the Bhuj earthquake, Indian Standard IS: 1893 was revised and Chennai city was upgraded from zone II to zone III, which led to a substantial increase in the design ground motion parameters. Therefore, a comprehensive study was carried out to assess the seismic hazard of Chennai city based on a deterministic approach. The seismicity and seismotectonic details within a 100 km radius of the study area have been considered. One-dimensional ground response analyses were carried out for 38 representative sites by the equivalent linear method using the SHAKE91 program to estimate the ground motion parameters considering local site effects. The shear wave velocity profile was inferred from the corrected blow counts and was verified with a Multichannel Analysis of Surface Waves (MASW) test performed for a representative site. The seismic hazard is represented in terms of characteristic site period and Spectral Acceleration Ratio (SAR) contours for the entire city. It is found that structures with low natural period undergo significant amplification, mostly in the central and southern parts of Chennai city, due to the presence of deep soil sites with clayey or sandy deposits, while the remaining parts undergo marginal amplification.

  4. Probabilistic seismic hazard assessment of Italy using kernel estimation methods

    Science.gov (United States)

    Zuccolo, Elisa; Corigliano, Mirko; Lai, Carlo G.

    2013-07-01

    A representation of seismic hazard is proposed for Italy based on the zone-free approach developed by Woo (BSSA 86(2):353-362, 1996a), which relies on a kernel estimation method governed by concepts of fractal geometry and self-organized seismicity and does not require the definition of seismogenic zoning. The purpose is to assess the influence of seismogenic zoning on the results obtained for the probabilistic seismic hazard analysis (PSHA) of Italy using the standard Cornell method. The hazard has been estimated for outcropping rock site conditions in terms of maps and uniform hazard spectra for a selected site, with 10 % probability of exceedance in 50 years. Both spectral acceleration and spectral displacement have been considered as ground motion parameters. Differences between the results of the two PSHA methods are compared and discussed. The analysis shows that, in areas such as Italy, characterized by a reliable earthquake catalog and by faults that are generally not easily identifiable, a zone-free approach can be considered a valuable tool to address epistemic uncertainty within a logic tree framework.
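The zone-free idea can be sketched simply: instead of drawing seismogenic zones, the observed epicentres are smoothed with a kernel to give an activity-rate field. Woo's method uses magnitude-dependent fractal kernels; a fixed Gaussian bandwidth is assumed here purely for illustration:

```python
import numpy as np

def kernel_rate(grid_xy, epicenters, bandwidth_km, years):
    """Zone-free seismicity rate: Gaussian-kernel smoothing of
    observed epicentres, normalized to events / km^2 / yr at
    each grid node. Simplified stand-in for Woo's fractal,
    magnitude-dependent kernels."""
    d2 = ((grid_xy[:, None, :] - epicenters[None, :, :]) ** 2).sum(-1)
    k = np.exp(-0.5 * d2 / bandwidth_km ** 2)
    k /= 2.0 * np.pi * bandwidth_km ** 2        # 2D Gaussian norm
    return k.sum(axis=1) / years

eq = np.array([[0.0, 0.0], [10.0, 0.0]])        # two epicentres (km)
grid = np.array([[0.0, 0.0], [100.0, 100.0]])   # a near and a far node
rate = kernel_rate(grid, eq, bandwidth_km=20.0, years=50.0)
```

The resulting rate field then replaces the per-zone rates in an otherwise standard Cornell-type hazard integral.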

  5. Rescaled range (R/S) analysis on seismic activity parameters

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The rescaled range (R/S) analysis, proposed by Hurst, is a relatively new statistical method. Unlike traditional statistical methods, R/S analysis can provide information on the maximum fluctuation (range) of statistical parameters. In the present paper, several modern instrumental earthquake catalogues of different spatial scale, temporal scale, and seismic activity background are studied, and the R/S method is used to analyze the variation of the range of seismic parameters such as earthquake frequency and earthquake time interval. For different seismic parameters, the ratio of range to standard deviation (R/S) is a power-law function of the length of time, and the exponent H of the power law is always greater than 0.5. As is known, H=0.5 is characteristic of all ideal random processes. Our results indicate that an earthquake series is not an ideal Poisson process; on the contrary, earthquakes as a phenomenon bear the dual characteristics of randomness and regularity, and the further H departs from 0.5, the more regularity the time series shows, and vice versa. With the time scale changing, one can give a conservative estimate of the fluctuation that might occur over a relatively long time scale, using only the limited known time records.
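The classical Hurst procedure behind the abstract can be sketched directly: compute R/S per window, then fit the power law R/S ~ n^H in log-log space. The white-noise test series below is an illustration of the ideal-random baseline, not catalogue data:

```python
import numpy as np

def rescaled_range(x):
    """R/S statistic of one window: range of the cumulative
    mean-adjusted series divided by its standard deviation."""
    y = np.cumsum(x - np.mean(x))
    return (y.max() - y.min()) / np.std(x)

def hurst_exponent(series, window_sizes):
    """Estimate H from R/S ~ n^H by a log-log least-squares fit.
    Minimal sketch of the classical Hurst procedure."""
    rs = []
    for n in window_sizes:
        chunks = [series[i:i + n]
                  for i in range(0, len(series) - n + 1, n)]
        rs.append(np.mean([rescaled_range(c) for c in chunks]))
    H, _ = np.polyfit(np.log(window_sizes), np.log(rs), 1)
    return H

rng = np.random.default_rng(3)
white = rng.standard_normal(4096)   # ideal random process: H near 0.5
H = hurst_exponent(white, [16, 32, 64, 128, 256])
```

On real catalogue parameters, H persistently above this baseline is what the paper reads as regularity superposed on randomness. (Note that short-window R/S estimates are biased slightly above 0.5 even for pure noise.)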

  6. Seismic hazard assessment of Chennai city considering local site effects

    Science.gov (United States)

    Boominathan, A.; Dodagoudar, G. R.; Suganthi, A.; Uma Maheswari, R.

    2008-11-01

    Chennai city suffered moderate tremors during the 2001 Bhuj and Pondicherry earthquakes and the 2004 Sumatra earthquake. After the Bhuj earthquake, Indian Standard IS: 1893 was revised and Chennai city was upgraded from zone II to zone III, which led to a substantial increase in the design ground motion parameters. Therefore, a comprehensive study was carried out to assess the seismic hazard of Chennai city based on a deterministic approach. The seismicity and seismotectonic details within a 100 km radius of the study area have been considered. One-dimensional ground response analyses were carried out for 38 representative sites by the equivalent linear method using the SHAKE91 program to estimate the ground motion parameters considering local site effects. The shear wave velocity profile was inferred from the corrected blow counts and was verified with a Multichannel Analysis of Surface Waves (MASW) test performed for a representative site. The seismic hazard is represented in terms of characteristic site period and Spectral Acceleration Ratio (SAR) contours for the entire city. It is found that structures with low natural period undergo significant amplification, mostly in the central and southern parts of Chennai city, due to the presence of deep soil sites with clayey or sandy deposits, while the remaining parts undergo marginal amplification.

  7. PARAMETERS OF KAMCHATKA SEISMICITY IN 2008

    Directory of Open Access Journals (Sweden)

    Vadim A. Saltykov

    2015-09-01

    Full Text Available The paper describes the seismicity of Kamchatka for 2008 and presents 2D distributions of background seismicity parameters calculated from data published in the Regional Catalogue of Kamchatka Earthquakes. The parameters under study are the total released seismic energy, seismic activity A10, the slope of the recurrence graph γ, the parameters of the RTL, ΔS and Z-function methods, and the clustering of earthquakes. Estimations of seismicity are obtained for a region bounded by latitudes 50.5–56.5°N and longitudes 156°E–167°E, with depths to 300 km. Earthquakes of energy class not less than 8.5 as per Fedotov's classification are considered. The total seismic energy released in 2008 is estimated. According to the distribution function of annual seismic energy, the amount of seismic energy released in 2008 was close to the median level (Fig. 1). Over 2/3 of the total seismic energy released in 2008 resulted from the three largest earthquakes (Mw ≥ 5.9). About 5 percent of the total number of seismic events are grouped earthquakes, i.e. aftershocks and swarms. A schematic map of the largest earthquakes (Mw ≥ 5.9) and grouped seismic events which occurred in 2008 is given in Fig. 2; their parameters are listed in Table 1. Grouped earthquakes are excluded from the catalogue. A map showing the epicenters of independent earthquakes is given in Fig. 3. The slope of the recurrence graph γ and the seismic activity A10 are based on the Gutenberg-Richter law, which states a fundamental property of the seismic process. The recurrence graph slope is calculated from the continuous exponential distribution of earthquakes over energy classes. The use of γ is motivated by observations that in some cases the slope of the recurrence graph decreases prior to a large earthquake. Activity A10 is calculated from the number of earthquakes N and the recurrence graph slope γ. Average slopes of the recurrence graph γ and seismic activity A10 for the area under study in 2008 are calculated; our
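The recurrence-graph slope is the energy-class analogue of the Gutenberg-Richter b-value, which for magnitudes has a simple maximum-likelihood estimator (Aki, 1965). A sketch on a synthetic catalogue (the study itself works in Fedotov energy classes; magnitudes are used here for illustration):

```python
import numpy as np

def b_value(mags, m_min, dm=0.1):
    """Aki (1965) maximum-likelihood Gutenberg-Richter slope:
    b = log10(e) / (mean(M) - (m_min - dm/2)), using magnitudes
    at or above the completeness level m_min, with dm the
    magnitude binning width."""
    m = np.asarray(mags)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - (m_min - dm / 2.0))

# synthetic catalogue with true b = 1: exponential excess above M 3.0
rng = np.random.default_rng(4)
beta = 1.0 * np.log(10.0)                       # beta = b * ln(10)
mags = 3.0 + rng.exponential(1.0 / beta, size=20000)
b = b_value(mags, m_min=3.0, dm=0.0)
```

A temporal decrease of this slope in a sliding window is the precursory signal the abstract refers to.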

  8. Implementing the chemical weapons convention

    Energy Technology Data Exchange (ETDEWEB)

    Kellman, B.; Tanzman, E. A.

    1999-12-07

    In 1993, as the CWC ratification process was beginning, concerns arose that the complexity of integrating the CWC with national law could cause each nation to implement the Convention without regard to what other nations were doing, thereby causing inconsistencies among States as to how the CWC would be carried out. As a result, the author and colleagues prepared the Manual for National Implementation of the Chemical Weapons Convention and presented it to each national delegation at the December 1993 meeting of the Preparatory Commission in The Hague. During its preparation, the Committee of CWC Legal Experts, a group of distinguished international jurists, law professors, legally-trained diplomats, government officials, and Parliamentarians from every region of the world, including Central Europe, reviewed the Manual. In February 1998, they finished the second edition of the Manual in order to update it in light of developments since the CWC entered into force on 29 April 1997. The Manual tries to increase understanding of the Convention by identifying its obligations and suggesting methods of meeting them. Education about CWC obligations and the available alternatives for complying with these requirements can facilitate national responses that are consistent among States Parties. Thus, the Manual offers options that can strengthen international realization of the Convention's goals if States Parties act compatibly in implementing them. Equally important, it is intended to build confidence that the legal issues raised by the Convention are finite and addressable. They are now nearing completion of an internet version of this document so that interested persons can access it electronically and view the full text of all of the national implementing legislation it cites. The internet address, or URL, for the internet version of the Manual is http://www.cwc.ard.gov. This paper draws from the Manual. It comparatively addresses approximately thirty

  9. Seismic structure of the oceanic lithosphere inferred from guided wave

    Science.gov (United States)

    Shito, A.; Suetsugu, D.; Furumura, T.; Sugioka, H.; Ito, A.

    2012-12-01

    simulation of seismic wave propagation up to 5 Hz by using the finite difference method. The 2D model area covers 1600 km in horizontal distance and 400 km in depth with a uniform grid interval of 0.04 km. The parallel simulation is conducted on the supercomputer system at JAMSTEC. Synthetic waveforms from earthquakes in the Pacific plate at depths of 205 km and 15 km are computed for a variety of oceanic lithosphere models. We satisfactorily reproduce the characteristic seismograms having low-frequency first arrivals and high-frequency later phases. The resulting preferred model includes small-scale random heterogeneity in both the subducting and the horizontal part of the oceanic lithosphere. The small-scale random heterogeneity consists of elongated scatterers described by a von Karman function with correlation lengths of 10 km in the elongated direction and 0.5 km in thickness. The standard deviation of the seismic wave velocity fluctuation from the background model is 2 %.
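A von Karman random medium of the kind described, elongated 10 km by 0.5 km with 2 % velocity fluctuation, can be generated by spectral filtering of white noise. The Hurst number kappa and the exact power-spectrum exponent below are illustrative assumptions, not values stated in the abstract:

```python
import numpy as np

def von_karman_field(nx, nz, dx, ax, az, kappa=0.5, sigma=0.02, seed=5):
    """Random velocity-perturbation field with an (assumed 2D)
    von Karman power spectrum, elongated via correlation lengths
    ax (horizontal) and az (vertical), scaled to standard
    deviation sigma (2 % in the study). FFT-filtering sketch."""
    rng = np.random.default_rng(seed)
    kx = 2.0 * np.pi * np.fft.fftfreq(nx, dx)
    kz = 2.0 * np.pi * np.fft.fftfreq(nz, dx)
    KX, KZ = np.meshgrid(kx, kz, indexing="ij")
    psd = (1.0 + (KX * ax) ** 2 + (KZ * az) ** 2) ** (-(kappa + 1.0))
    noise = np.fft.fft2(rng.standard_normal((nx, nz)))
    field = np.real(np.fft.ifft2(noise * np.sqrt(psd)))
    return sigma * field / field.std()   # enforce the target std

dv = von_karman_field(nx=256, nz=128, dx=0.04, ax=10.0, az=0.5)
```

Superimposing such a field on the background model is what produces the high-frequency guided later phases in the finite-difference synthetics.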

  10. Seismic Hazard Assessment in Stable Continental Regions of Northen Eurasia

    Science.gov (United States)

    Levshenko, V.; Yunga, S.

    2009-04-01

    critical infrastructures of the Novovoronezh, Kursk, Smolensk, Kalinin and Leningrad regions are located on the East European craton. The Bilibino critical infrastructure is located in the Mesozoic Verkhoyansk-Chukotka fold belt. For capable faults within the investigated territories, microearthquake registration is carried out. Automatic phase pickers designed for seismogram processing are based on STA/LTA ratios jointly with polarization analysis. The neotectonic bending strain rate and the seismotectonic strain rate, which depend on Mmax and the seismicity parameters, are determined. Maximum magnitudes for all the above-mentioned sites are found to be in the range 4-4.5. The intensity I of strong shaking is less than 5 on the macroseismic intensity scale. Three different approaches were applied in estimating the target response spectrum. A generalized response spectrum was calculated based on West European seismic records. The parametrisation of Eurocode 8 (1998) as well as the standard spectra from the Russian Guideline NP-031-01 (2002) were used. Ground acceleration time histories were synthesized on the basis of the standard generalized ground spectra. Acknowledgments. This work was partly supported by RFBR, № 07-05-00436.
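The STA/LTA detector named above is the classic ratio of short-term to long-term average energy, with a pick declared where the ratio crosses a threshold. A minimal sketch of the standard algorithm (the study combines it with polarization analysis, which is omitted here; window lengths and the synthetic trace are illustrative):

```python
import numpy as np

def sta_lta(trace, fs, sta_s=0.5, lta_s=5.0):
    """Classic STA/LTA detection function: ratio of short-term
    to long-term average of the squared trace, computed with
    cumulative sums. Minimal sketch."""
    nsta, nlta = int(sta_s * fs), int(lta_s * fs)
    e = trace ** 2
    csum = np.cumsum(np.concatenate(([0.0], e)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    m = min(len(sta), len(lta))          # align both at the trace end
    return sta[len(sta) - m:] / (lta[len(lta) - m:] + 1e-12)

fs = 100.0
rng = np.random.default_rng(6)
trace = 0.1 * rng.standard_normal(3000)
trace[1500:1700] += rng.standard_normal(200)   # a microearthquake "event"
ratio = sta_lta(trace, fs)
```

A threshold of 3 to 5 on this ratio is a common trigger level for microearthquake registration.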

  11. Quantitative estimation of lithofacies from seismic data in a tertiary turbidite system in the North Sea

    Energy Technology Data Exchange (ETDEWEB)

    Joerstad, A.K.; Avseth, P.Aa; Mukerji, T.; Mavko, G.; Granli, J.R.

    1998-12-31

    Deep-water clastic systems and associated turbidite reservoirs are often characterized by very complex sand distributions, and reservoir description based on conventional seismic and well-log stratigraphic analysis may be very uncertain in these depositional environments. It has been shown that reservoirs in turbidite systems have been produced very inefficiently under conventional development: more than 70% of the mobile oil is commonly left behind because of the heterogeneous nature of these reservoirs. In this study a turbidite system in the North Sea, with five available wells and a 3-D seismic near- and far-offset stack, is examined to establish the most likely estimates of facies and pore fluid within the cube. 5 figs.

  12. P-wave seismic imaging through dipping transversely isotropic media

    Science.gov (United States)

    Leslie, Jennifer Meryl

    2000-10-01

    P-wave seismic anisotropy is of growing concern to the exploration industry. Transmission effects through dipping anisotropic strata, such as shales, cause substantial depth and lateral positioning errors when imaging subsurface targets. Using anisotropic physical models, the limitations of conventional isotropic migration routines were determined to be significant. In addition, these models were used to validate both anisotropic depth migration routines and an anisotropic numerical raytracer. In order to include anisotropy in these processes, one must be able to quantify it using two parameters, epsilon and delta. These parameters were determined from headwave velocity measurements on anisotropic strata in the parallel-, perpendicular- and 45°-to-bedding directions. This new method was developed using refraction seismic techniques to measure the necessary velocities in the Wapiabi Formation shales, the Brazeau Group interbedded sandstones and shales, the Cardium Formation sandstones and the Palliser Formation limestones. The Wapiabi Formation and Brazeau Group rocks were determined to be anisotropic, with epsilon = 0.23 +/- 0.05, delta = -0.05 +/- 0.07 and epsilon = 0.11 +/- 0.04, delta = 0.42 +/- 0.06, respectively. The sandstones and limestones of the Cardium and Palliser formations were both determined to be isotropic in these studies. In a complementary experiment, a new procedure using vertical seismic profiling (VSP) techniques was developed to measure the anisotropic headwave velocities. Using a multi-offset source configuration on an appropriately dipping, uniform panel of anisotropic strata, the required velocities were measured directly and modelled. In this study, the geologic model was modelled using an anisotropic raytracer developed for the experiment. The anisotropy was successfully modelled using anisotropic parameters based on the refraction seismic results. With a firm idea of the anisotropic parameters from the
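The role of epsilon and delta can be made explicit with Thomsen's weak-anisotropy approximation for the P-wave phase velocity, V(theta) = Vp0 (1 + delta sin^2(theta) cos^2(theta) + epsilon sin^4(theta)), with theta measured from the symmetry axis (bedding normal). The background velocity Vp0 below is assumed for illustration; epsilon and delta are the Wapiabi shale values quoted in the abstract:

```python
import math

def p_phase_velocity(theta_deg, vp0, epsilon, delta):
    """Thomsen weak-anisotropy P-wave phase velocity in a
    transversely isotropic medium; theta is measured from the
    symmetry axis. Sketch of the standard approximation."""
    th = math.radians(theta_deg)
    s, c = math.sin(th), math.cos(th)
    return vp0 * (1.0 + delta * s * s * c * c + epsilon * s ** 4)

vp0 = 3.8  # km/s, assumed background velocity (not from the study)
v_normal = p_phase_velocity(0.0, vp0, epsilon=0.23, delta=-0.05)
v_parallel = p_phase_velocity(90.0, vp0, epsilon=0.23, delta=-0.05)
```

Epsilon thus controls the bedding-parallel versus bedding-normal velocity contrast, while delta governs near-vertical propagation, which is why both are needed for depth migration through dipping shales.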

  13. Toward seismic source imaging using seismo-ionospheric data

    Science.gov (United States)

    Rolland, L.; Larmat, C. S.; Mikesell, D.; Sladen, A.; Khelfi, K.; Astafyeva, E.; Lognonne, P. H.

    2014-12-01

    The worldwide coverage offered by global navigation satellite systems (GNSS) such as GPS, GLONASS or Galileo allows seismological measurements of a new kind. GNSS-derived total electron content (TEC) measurements can be especially useful for imaging seismically active zones that are not covered by conventional instruments. For instance, it has been shown that the dense Japanese GPS network GEONET was able to record images of the ionospheric response to the initial coseismic sea-surface motion induced by the great Mw 9.0 2011 Tohoku-Oki earthquake less than 10 minutes after the rupture initiation (Astafyeva et al., 2013). Earthquakes of lower magnitude, down to about 6.5, also induce measurable ionospheric perturbations when GNSS stations are located less than 250 km from the epicenter. In order to make use of these new data, ionospheric seismology needs to develop accurate forward models so that we can invert for quantitative seismic source parameters. We will present our current understanding of the coupling mechanisms between the solid Earth, the ocean, the atmosphere and the ionosphere. We will also present the state of the art in the modeling of coseismic ionospheric disturbances using acoustic ray theory and a new 3D modeling method based on the Spectral Element Method (SEM). This numerical tool will allow us to incorporate lateral variations in solid Earth properties, bathymetry and the atmosphere, as well as realistic seismic source parameters. Furthermore, seismo-acoustic waves propagate in the atmosphere at a much slower speed (from 0.3 to ~1 km/s) than seismic waves in the solid Earth. We are therefore exploring the application of back-projection and time-reversal methods to TEC observations in order to retrieve the time and space characteristics of the acoustic emission in the seismic source area. We will first show modeling and inversion results with synthetic data. Finally, we will illustrate the imaging capability of our approach
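    The back-projection idea mentioned above can be sketched simply: shift each station's waveform by the modelled travel time from a candidate source point and stack; the candidate that maximizes the stacked energy is the inferred source location. The sketch below uses synthetic impulse "waveforms" and hypothetical travel times, not the authors' data or implementation:

```python
def backproject(waveforms, travel_times, dt):
    """Stack waveforms after removing per-station travel-time shifts.

    waveforms: list of equal-length sample lists; travel_times: seconds from
    the candidate source to each station; dt: sample interval in seconds.
    Returns the stacked energy (sum of squared stack samples).
    """
    n = len(waveforms[0])
    stack = [0.0] * n
    for w, t in zip(waveforms, travel_times):
        shift = int(round(t / dt))  # align the arrival back to source time
        for i in range(n):
            j = i + shift
            if 0 <= j < n:
                stack[i] += w[j]
    return sum(s * s for s in stack)

# Synthetic example: an impulse emitted at t = 0, recorded with 2 s and 3 s delays.
dt = 1.0
w1 = [0, 0, 1, 0, 0, 0]  # arrival at sample 2
w2 = [0, 0, 0, 1, 0, 0]  # arrival at sample 3
correct = backproject([w1, w2], [2.0, 3.0], dt)  # right candidate: coherent stack
wrong = backproject([w1, w2], [3.0, 2.0], dt)    # wrong candidate: incoherent
```

With the correct travel times the two impulses align and the stacked energy is larger than for the wrong candidate, which is the criterion a grid search over candidate sources would maximize.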

  14. The GEOSCOPE broadband seismic observatory

    Science.gov (United States)

    Douet, Vincent; Vallée, Martin; Zigone, Dimitri; Bonaimé, Sébastien; Stutzmann, Eléonore; Maggi, Alessia; Pardo, Constanza; Bernard, Armelle; Leroy, Nicolas; Pesqueira, Frédéric; Lévêque, Jean-Jacques; Thoré, Jean-Yves; Bes de Berc, Maxime; Sayadi, Jihane

    2016-04-01

    The GEOSCOPE observatory has provided continuous broadband data to the scientific community for the past 34 years. The 31 operational GEOSCOPE stations are installed in 17 countries, across all continents and on islands throughout the oceans. They are equipped with three-component very broadband seismometers (STS1, T240 or STS2) and 24- or 26-bit digitizers (Q330HR). Seismometers are installed with warpless base plates, which decrease long-period noise on horizontal components by up to 15 dB. All stations send data in real time to the IPGP data center, which transmits them automatically to other data centers (FDSN/IRIS-DMC and RESIF) and to tsunami warning centers. In 2016, three stations are expected to be installed or re-installed: in western China (WUS station), on Saint Pierre and Miquelon Island (off the east coast of Canada) and on Wallis and Futuna (southwest Pacific Ocean). The waveform data are technically validated by IPGP (25 stations) or EOST (6 stations) in order to check their continuity and integrity. Scientific data validation is also performed by analyzing the seismic noise level of the continuous data and by comparing real and synthetic earthquake waveforms (body waves). After these validations, data are archived by the IPGP data center in Paris. They are made available to the international scientific community through different interfaces (see details on http://geoscope.ipgp.fr). Data are duplicated at the FDSN/IRIS-DMC data center, and a similar duplication at the French national data center RESIF will be operational in 2016. The GEOSCOPE broadband seismic observatory also provides near-real-time information on global moderate-to-large seismicity (above magnitude 5.5-6) through the automated application of the SCARDEC method (Vallée et al., 2011). By using global data from the FDSN, in particular from GEOSCOPE and IRIS/USGS stations, earthquake source parameters (depth, moment magnitude, focal mechanism, source time function) are determined about 45

  15. Pre-Seismic Electromagnetic Effects

    Institute of Scientific and Technical Information of China (English)

    Guo Yahong

    2007-01-01

    Along with intense rock strain and rock bursting processes at the late stage of earthquake preparation, mechanical-electrical energy conversion appears in the seismogenic region and its nearby rock formations, correspondingly stimulating certain electromagnetic effects. The paper mainly analyzes the pre-seismic electromagnetic effects in the ionosphere and proposes, among other methods, the monitoring of VLF radio waves over the additionally ionized region. It is deemed that the method is of significance for short-term and imminent prediction of strong earthquakes.

  16. Uncertainty analysis in seismic tomography

    Science.gov (United States)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors, such as regularization, inversion path and model parameterization. The result also depends strongly on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, our analysis shows that for manual travel-time picking the uncertainty distribution is asymmetric, which shifts the results toward faster velocities. For the calculations we use the JIVE3D travel-time tomographic code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  17. A Preliminary Study on Seismicity and Stages of Seismic Energy Accumulation in Seismotectonic Regions of Tianshan

    Institute of Scientific and Technical Information of China (English)

    Li Yingzhen; Shen Jun; Wang Haitao

    2006-01-01

    Using seismic parameters, the characteristics of seismic activity in various seismotectonic regions of Tianshan were studied in this paper. These regions are going through different stages of seismic energy accumulation. Current seismic risk levels of these areas were analyzed synthetically from the tectonic movement rates, the characteristics of seismic activity and the recurrence intervals of strong earthquakes. We also made a preliminary study of the characteristics of seismic activity in different stages of seismic energy accumulation. The results show that the characteristics of seismic activity in the various seismotectonic regions of the Tianshan area are influenced not only by the regional tectonic movement, but also by the energy accumulation stage of the various seismic tectonics. In areas of intense tectonic movement, it is important to estimate the stage of energy accumulation in order to predict the upper limit of the potential earthquake magnitude. In areas of less intense tectonic movement, estimating the stage of energy accumulation helps to recognize the danger level of the potential strong earthquake. The study shows that the seismotectonic regions in southern Tianshan have reached the mid- and late stages of energy accumulation, with higher seismic activity and thus a higher seismic danger level than those in northern and middle Tianshan. The earthquake risk of southern Tianshan is up to Ms7.0, while that of middle Tianshan is up to Ms6.0 and that of northern Tianshan is only around Ms5.0~6.0.

  18. Full Wavefield Recordings of Oklahoma Seismicity from an IRIS-led Community Experiment

    Science.gov (United States)

    Anderson, K. R.; Woodward, R.; Sweet, J. R.; Bilek, S. L.; Brudzinski, M.; Chen, X.; DeShon, H. R.; Karplus, M. S.; Keranen, K. M.; Langston, C. A.; Lin, F. C.; Magnani, M. B.; Stump, B. W.

    2016-12-01

    In June 2016, a field crew of students, faculty, industry personnel and IRIS staff deployed several hundred stations above an active seismic lineament in north-central Oklahoma, with the goal of advancing our understanding of general seismicity and earthquake source processes using arrays designed to capture full-wavefield seismic data. We also used the deployment as an educational opportunity to provide experience with nodal-type experiment planning and execution. IRIS selected 30 graduate students from 18 different US and foreign institutions to participate in the deployment, and was pleased to have the assistance of several individuals from the Oklahoma Geological Survey. The crew deployed 363 three-component (3C) 5 Hz Generation 2 Fairfield Z-Land nodes along three seismic lines and in a seven-layer nested gradiometer array. The seismic lines spanned a region 13 km long by 5 km wide. The nested gradiometer was designed to measure the full seismic wavefield using standard frequency-wavenumber techniques and spatial wave gradients. A broadband, 18-station "Golay 3x6" array was deployed around the gradiometer and seismic lines, with an aperture of approximately 5 km, to collect waveform data from local and regional events. In addition, 9 infrasound stations were deployed to capture and identify acoustic events that might be recorded by the seismic arrays and to quantify the wind-noise effect on co-located broadband stations. The variety of instrumentation used in this deployment was chosen to capture the full seismic wavefield generated by local and regional seismicity beneath the array and the surrounding region. A demobilization team returned to the sites in mid-July to recover the nodes after a full month of deployment. The broadband and infrasound stations will remain in place through September to capture additional local and regional seismicity. This experiment was designed by and for the seismological community. The experiment was

  19. Ability Estimation for Conventional Tests.

    Science.gov (United States)

    Kim, Jwa K.; Nicewander, W. Alan

    1993-01-01

    Bias, standard error, and reliability of five ability estimators were evaluated using Monte Carlo estimates of the unknown conditional means and variances of the estimators. Results indicate that estimates based on Bayesian modal, expected a posteriori, and weighted likelihood estimators were reasonably unbiased with relatively small standard…

  20. College Admissions: Beyond Conventional Testing

    Science.gov (United States)

    Sternberg, Robert J.

    2012-01-01

    Standardized admissions tests such as the SAT (originally stood for "Scholastic Aptitude Test") and the ACT measure only a narrow segment of the skills needed to become an active citizen and possibly a leader who makes a positive, meaningful, and enduring difference to the world. The problem with these tests is that they promised, under…

  1. Site-specific probabilistic seismic hazard analyses for the Idaho National Engineering Laboratory. Volume 1: Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-05-01

    This report describes and summarizes a probabilistic evaluation of ground motions for the Idaho National Engineering Laboratory (INEL). The purpose of this evaluation is to provide a basis for updating the seismic design criteria for the INEL. In this study, site-specific seismic hazard curves were developed for seven facility sites as prescribed by DOE Standards 1022-93 and 1023-96. These sites include: the Advanced Test Reactor (ATR); Argonne National Laboratory West (ANL); the Idaho Chemical Processing Plant (ICPP or CPP); the Power Burst Facility (PBF); the Radioactive Waste Management Complex (RWMC); the Naval Reactor Facility (NRF); and Test Area North (TAN). The results contained in this report, probabilistic peak ground accelerations and uniform hazard spectra, are not to be used for purposes of seismic design at the INEL. A subsequent study will be performed to translate the results of this probabilistic seismic hazard analysis into site-specific seismic design values for the INEL, as per the requirements of DOE Standard 1020-94. These site-specific seismic design values will be incorporated into the INEL Architectural and Engineering Standards.

  2. Time-lapse seismic within reservoir engineering

    NARCIS (Netherlands)

    Oldenziel, T.

    2003-01-01

    Time-lapse 3D seismic is a fairly new technology allowing dynamic reservoir characterisation in a true volumetric sense. By investigating the differences between multiple seismic surveys, valuable information about changes in the oil/gas reservoir state can be captured. Its interpretation involves d

  3. Seismic design of reactors in NUCEF

    Energy Technology Data Exchange (ETDEWEB)

    Kurosaki, Akira [Mitsui Shipbuilding and Engineering Co. Ltd., Tokyo (Japan); Kuchiya, Masao; Yasuda, Naomitsu; Kitanaka, Tsutomu; Ogawa, Kazuhiko; Sakuraba, Koichi; Izawa, Naoki; Takeshita, Isao

    1997-03-01

    The basic concept and calculation method for the seismic design of the main equipment of the reactors in NUCEF (Nuclear Fuel Cycle Safety Engineering Research Facility) are described with actual calculation examples. The present paper is published to assist the seismic design of the equipment and the application for authorization of the design and construction of the facilities. (author)

  5. Making Waves: Seismic Waves Activities and Demonstrations

    Science.gov (United States)

    Braile, S. J.; Braile, L. W.

    2011-12-01

    The nature and propagation of seismic waves are fundamental concepts necessary for understanding the exploration of Earth's interior structure and properties, plate tectonics, earthquakes, and seismic hazards. Investigating seismic waves is also an engaging approach to learning basic principles of the physics of waves and wave propagation. Several effective educational activities and demonstrations are available for teaching about seismic waves, including the stretching of a spring to demonstrate elasticity; slinky wave propagation activities for compressional, shear, Rayleigh and Love waves; the human wave activity to demonstrate P- and S- waves in solids and liquids; waves in water in a simple wave tank; seismic wave computer animations; simple shake table demonstrations of model building responses to seismic waves to illustrate earthquake damage to structures; processing and analysis of seismograms using free and easy to use software; and seismic wave simulation software for viewing wave propagation in a spherical Earth. The use of multiple methods for teaching about seismic waves is useful because it provides reinforcement of the fundamental concepts, is adaptable to variable classroom situations and diverse learning styles, and allows one or more methods to be used for authentic assessment. The methods described here have been used effectively with a broad range of audiences, including K-12 students and teachers, undergraduate students in introductory geosciences courses, and geosciences majors.

  7. Global Seismic Hazard Assessment Program - GSHAP legacy

    Directory of Open Access Journals (Sweden)

    Laurentiu Danciu

    2015-04-01

    The Global Seismic Hazard Assessment Program (GSHAP), when launched almost two decades ago, aimed at establishing a common framework to evaluate seismic hazard over large geographical scales, i.e. countries, regions, continents and, finally, the globe. Its main product, the global seismic hazard map, was a milestone, unique at that time, and for a decade served as the main reference worldwide. Today, for most of the Earth's seismically active regions, such as Europe, North and South America, Central and South-East Asia, Japan, Australia and New Zealand, the GSHAP seismic hazard map is outdated. The rapid increase of new data, advances in knowledge of the earthquake process, and technological progress, in both hardware and software, have all contributed to updates of the seismic hazard models. We present herein a short retrospective overview of the achievements as well as the pitfalls of GSHAP. Further, we describe the next generation of seismic hazard models, as elaborated within the Global Earthquake Model regional programs: the 2013 European Seismic Hazard Model, the 2014 Earthquake Model of the Middle East, and the 2015 Earthquake Model of Central Asia. Finally, the main characteristics of these regional models are summarized, and the new datasets, fully harmonized across national borders, are illustrated for the first time since the completion of GSHAP.

  8. Permanent downhole seismic sensors in flowing wells

    NARCIS (Netherlands)

    Jaques, P.; Ong, H.; Jupe, A.; Brown, I.; Jansenns, M.

    2003-01-01

    It is generally accepted that the 'Oilfield of the Future' will incorporate distributed permanent downhole seismic sensors in flowing wells. However the effectiveness of these sensors will be limited by the extent to which seismic signals can be discriminated, or de-coupled, from flow induced

  9. Robust seismic images amplitude recovery using curvelets

    NARCIS (Netherlands)

    Moghaddam, Peyman P.; Herrmann, Felix J.; Stolk, C.C.

    2007-01-01

    In this paper, we recover the amplitude of a seismic image by approximating the normal (demigration-migration) operator. In this approximation, we make use of the property that curvelets remain invariant under the action of the normal operator. We propose a seismic amplitude recovery method that

  10. Seismic surveying for coal mine planning

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, B. [CMTE/CSIRO Exploration and Mining, Kenmore, Qld. (Australia)

    2002-07-01

    More and more coal in Australia is extracted by underground mining methods, especially longwall mining. These methods can be particularly sensitive to relatively small-scale structural discontinuities and to variations in roof and floor rock character. Traditionally, information on these features has been obtained through drilling. However, this is an expensive process and its relevance is limited to the immediate neighbourhood of the boreholes. Seismic surveying, especially 3D seismic, is an alternative tool for delineating geological structure. It is one of the most effective geophysical methods available for identifying geological structures such as faults, folds, washouts, seam splits and thickness changes, which are normally associated with potential mining hazards. Seismic data can even be used for stratigraphic identification. The information extracted from seismic data can be integrated into mine planning and design. In this paper, computer-aided interpretation techniques for maximising the information from seismic data are demonstrated, and the ability of seismic reflection methods to resolve localised geological features is illustrated. Both synthetic and real seismic data obtained in recent 2D and 3D seismic surveys from Australian coal mines are used. 7 refs., 9 figs.

  11. Seismic Design Guidelines For Port Structures

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Bernal, Alberto; Blazquez, Rafael

    -balance approach, in which structures are designed to resist a prescribed level of seismic force specified as a fraction of gravity. These methodologies have contributed to the acceptable seismic performance of port structures, particularly when the earthquake motions are more or less within the prescribed design...

  12. Synergizing Crosswell Seismic and Electromagnetic Techniques for Enhancing Reservoir Characterization

    KAUST Repository

    Katterbauer, Klemens

    2015-11-18

    The increasing complexity of hydrocarbon projects and the demand for higher recovery rates have driven the oil-and-gas industry to seek a more detailed understanding of the subsurface formation to optimize oil recovery and profitability. Despite the significant successes of geophysical techniques in determining changes within the reservoir, the benefits of mapping the information individually are limited. Although seismic techniques have been the main approach for imaging the subsurface, the weak density contrast between water and oil has made electromagnetic (EM) technology an attractive complement to improve fluid distinction, especially for high-salinity water. This crosswell technology assumes greater importance for obtaining higher-resolution images of the interwell regions to characterize the reservoir more accurately and track fluid-front developments. In this study, an ensemble-Kalman-based history-matching framework is proposed for directly incorporating crosswell time-lapse seismic and EM data into the history-matching process. The direct incorporation of the time-lapse seismic and EM data into the history-matching process exploits the complementarity of these data to enhance subsurface characterization, to incorporate interwell information, and to avoid biases that may be incurred from separate inversions of the geophysical data for attributes. An extensive analysis with 2D and realistic 3D reservoirs illustrates the robustness and enhanced forecastability of critical reservoir variables. The 2D reservoir provides a better understanding of the connection between fluid discrimination and enhanced history matches, and the 3D reservoir demonstrates applicability to a realistic reservoir. History-matching enhancements (in terms of reduction in the history-matching error) when incorporating both seismic and EM data averaged approximately 50% for the 2D case and approximately 30% for the 3D case, and permeability estimates were approximately 25
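    The analysis step at the core of an ensemble-Kalman-based history match can be sketched in scalar form: the ensemble cross-covariance between state and predicted data supplies the Kalman gain that pulls each member toward the observation. The sketch below uses toy numbers and an identity forward model, not the paper's reservoir implementation:

```python
def enkf_update(ensemble, predicted, observation, obs_var):
    """One ensemble-Kalman analysis step for a scalar state and observation.

    ensemble: prior state samples (e.g. a permeability attribute);
    predicted: the modelled observation for each member (e.g. a time-lapse
    seismic or EM attribute); observation: the measured value.
    """
    n = len(ensemble)
    xm = sum(ensemble) / n
    dm = sum(predicted) / n
    cov_xd = sum((x - xm) * (d - dm) for x, d in zip(ensemble, predicted)) / (n - 1)
    var_d = sum((d - dm) ** 2 for d in predicted) / (n - 1)
    gain = cov_xd / (var_d + obs_var)  # Kalman gain
    return [x + gain * (observation - d) for x, d in zip(ensemble, predicted)]

# Toy example with an identity forward model: the posterior ensemble mean
# moves from the prior mean toward the observed datum.
prior = [1.0, 2.0, 3.0, 4.0]
posterior = enkf_update(prior, prior, observation=3.0, obs_var=0.5)
```

Incorporating both seismic and EM data amounts to stacking such updates (or using a vector observation), which is what lets the two data types jointly constrain the state.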

  13. Properties of induced seismicity at the geothermal reservoir Insheim, Germany

    Science.gov (United States)

    Olbert, Kai; Küperkoch, Ludger; Meier, Thomas

    2017-04-01

    Within the framework of the German MAGS2 project, the processing of induced events at the geothermal power plant Insheim, Germany, has been reassessed and evaluated. The power plant is located close to the western rim of the Upper Rhine Graben in a region with a strongly heterogeneous subsurface. The location of seismic events, particularly the depth estimation, is therefore challenging. The seismic network, consisting of up to 50 stations, has an aperture of approximately 15 km around the power plant; consequently, manual processing is time consuming. Using a waveform-similarity detection algorithm, the existing dataset from 2012 to 2016 has been reprocessed to complete the catalog of induced seismic events, and clusters of similar events have been detected based on waveform similarity. Automated P- and S-arrival time determination using an improved multi-component autoregressive prediction algorithm yields approximately 14,000 P- and S-arrivals for 758 events. Using a dataset of manual picks as reference, the automated picking algorithm has been optimized, resulting in a standard deviation of the residuals between automated and manual picks of about 0.02 s. The automated locations show uncertainties comparable to those of the manual reference dataset: 90% of the automated relocations fall within the error ellipsoid of the manual locations. The remaining locations are either poorly resolved due to low numbers of picks, or so well resolved that the automatic location lies outside the error ellipsoid although close to the manual location. The developed automated processing scheme proved to be a useful tool to supplement real-time monitoring. The event clusters are located at small patches of faults known from reflection seismic studies, close to both the injection and the production wells.
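    A waveform-similarity detector of the kind described is commonly built on normalized cross-correlation of a template event against continuous data; the sketch below illustrates the principle on hypothetical synthetic traces (it is not the MAGS2 implementation):

```python
import math

def ncc(template, data):
    """Normalized cross-correlation of a template over a longer trace.

    Returns one correlation coefficient in [-1, 1] per alignment position;
    a value near 1 flags a waveform similar to the template event.
    """
    m = len(template)
    tm = sum(template) / m
    t0 = [x - tm for x in template]
    tn = math.sqrt(sum(x * x for x in t0))
    out = []
    for i in range(len(data) - m + 1):
        win = data[i:i + m]
        wm = sum(win) / m
        w0 = [x - wm for x in win]
        wn = math.sqrt(sum(x * x for x in w0))
        out.append(sum(a * b for a, b in zip(t0, w0)) / (tn * wn)
                   if tn * wn > 0 else 0.0)
    return out

# Hypothetical template embedded in a synthetic trace at sample offset 2.
template = [0.0, 1.0, -1.0, 0.5]
trace = [0.1, 0.0, 0.0, 1.0, -1.0, 0.5, 0.0, 0.1]
cc = ncc(template, trace)
best = cc.index(max(cc))  # detection offset
```

Detections above a correlation threshold are then grouped into clusters of similar events, as in the catalog completion described above.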

  14. A description of seismic amplitude techniques

    Science.gov (United States)

    Shadlow, James

    2014-02-01

    The acquisition of seismic data is a non-invasive technique used for determining the sub surface geology. Changes in lithology and fluid fill affect the seismic wavelet. Analysing seismic data for direct hydrocarbon indicators (DHIs), such as full stack amplitude anomalies, or amplitude variation with offset (AVO), can help a seismic interpreter relate the geophysical response to real geology and, more importantly, to distinguish the presence of hydrocarbons. Inversion is another commonly used technique that attempts to tie the seismic data back to the geology. Much has been written about these techniques, and attempting to gain an understanding on the theory and application of them by reading through various journals can be quite daunting. The purpose of this paper is to briefly outline DHI analysis, including full stack amplitude anomalies, AVO and inversion and show the relationship between all three. The equations presented have been included for completeness, but the reader can pass over the mathematical detail.
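    The AVO analysis mentioned above is commonly summarized by Shuey's two-term approximation, R(θ) ≈ A + B sin²θ, where A is the intercept (zero-offset reflectivity) and B the gradient. A minimal sketch, with intercept and gradient values that are purely illustrative:

```python
import math

def shuey_two_term(intercept, gradient, theta_deg):
    """Two-term Shuey approximation of P-wave reflectivity vs. incidence angle."""
    s = math.sin(math.radians(theta_deg))
    return intercept + gradient * s * s

# Illustrative class-III gas-sand-style response: negative intercept and
# negative gradient, so the reflection brightens (grows more negative)
# with increasing offset.
A, B = -0.05, -0.15
for theta in (0, 15, 30):
    print(theta, round(shuey_two_term(A, B, theta), 4))
```

Crossplotting the fitted intercept A against the gradient B is one standard way such amplitude anomalies are screened for the hydrocarbon indicators the abstract describes.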

  15. Seismic analysis of nuclear power plant structures

    Science.gov (United States)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist the expected earthquakes of the site. Two design intensities are referred to as the Operating Basis Earthquake and the Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity; thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismically induced structural dynamic problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismically induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis with the NASTRAN system.

  16. Digitized seismic station for colliery use

    Energy Technology Data Exchange (ETDEWEB)

    Proskuryakov, V.M.; Blyakhman, A.S.

    1979-09-01

    Seismic tests used to investigate stress conditions in rock, conducted in different deposits throughout the USSR, have made it possible to determine the optimum variants for the application of such tests. The VNIMI Institute has developed a seismic station for mines which features 7-channel numerical control and takes measurements in three operating modes: determination of the propagation speed of artificially emitted seismic waves; measurement of the coordinates of seismic-wave emission sources; and recording of the number of natural seismic signals in a given area of solid rock. The Institute provides a plan of the station together with operational data. The supply voltage is 5 V and the current consumption 500 mA. (In Russian)

  17. Acquisition Footprint Attenuation Driven by Seismic Attributes

    Directory of Open Access Journals (Sweden)

    Cuellar-Urbano Mayra

    2014-04-01

    Acquisition footprint, one of the major problems that PEMEX faces in seismic imaging, is noise highly correlated with the geometric array of sources and receivers used for onshore and offshore seismic acquisition. It persists in spite of measures taken during acquisition and data processing. This pattern, present throughout the image, is easily confused with geological features and misleads seismic attribute computation. In this work, we use seismic data from PEMEX Exploración y Producción to show the conditioning process for removing random and coherent noise using linear filters. Geometric attributes computed in a workflow were used to obtain an acquisition-footprint noise model and adaptively subtract it from the seismic data.

  18. Climatic changes, streamflow, and long-term forecasting of intraplate seismicity

    Science.gov (United States)

    Costain, J. K.; Bollinger, G. A.

    1996-10-01

    bisected by the Mississippi River, Illinois, and James River, Virginia, in the period range of 11-13 years that might be associated with sunspot activity. In addition, there is positive correlation between periods of above-average values of the standard deviation of streamflow time series and periods of seismicity in the central Virginia seismic zone. Many aspects of the weather appear to be modulated by a 20-year cycle. We observe a similar periodicity (18-20 years) in seismicity in the central Virginia seismic zone. A good agreement is observed when a streamflow time series is superimposed on the record of the earthquake strain factor if a value of 50 km²/year is assumed for crustal hydraulic diffusivity. In the central Virginia seismic zone, it is found that the number of earthquakes versus depth, ψ, is directly proportional to pressure fluctuations at the depth ψ. In addition, the fractal dimension determined from downward-continued streamflow is approximately the same as the fractal dimension of intraplate seismicity. Furthermore, using the Gutenberg-Richter relation and assuming that the earthquake data sets in the New Madrid and central Virginia seismic zones are complete for all magnitudes m ⩾ 2, the ratio of the number of earthquakes occurring per year in the New Madrid zone to the central Virginia zone is about 40. The ratio of the standard deviations of downward-continued Mississippi River streamflow (at Thebes, Illinois) to the James River streamflow is also about 40. One interpretation of this common ratio is that the number of intraplate earthquakes generated in a seismogenic crust is directly proportional to the standard deviation of vertical variations in the elevation of the water table. If the hydroseismicity hypothesis is correct, then long-term variations in streamflow can be used to forecast long-term statistical variations in intraplate seismic activity.
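    The Gutenberg-Richter comparison above can be made concrete: with log10 N = a − b·m, the ratio of annual event counts between two zones above a common completeness magnitude depends only on the difference of their a-values when the b-values are equal. The a- and b-values below are hypothetical, chosen only so the count ratio matches the ~40 reported in the abstract:

```python
def gr_count(a, b, m_min):
    """Annual number of events with magnitude >= m_min under the
    Gutenberg-Richter relation log10 N = a - b * m."""
    return 10.0 ** (a - b * m_min)

# Hypothetical a-values with b = 1 for both zones, above completeness m >= 2.
n_new_madrid = gr_count(a=3.6, b=1.0, m_min=2.0)
n_virginia = gr_count(a=2.0, b=1.0, m_min=2.0)
ratio = n_new_madrid / n_virginia  # 10**(3.6 - 2.0) ≈ 39.8
```

Because the ratio reduces to 10**(a1 − a2), matching it to the ~40 ratio of streamflow standard deviations is a one-parameter comparison, which is what makes the hydroseismicity argument testable.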

  19. Conventionalism and integrable Weyl geometry

    Science.gov (United States)

    Pucheu, M. L.

    2015-03-01

    Since the appearance of Einstein's general relativity, gravitation has been associated with space-time curvature. This theory introduced a geometrodynamic language which became a convenient tool for predicting matter behaviour. However, the properties of space-time itself cannot be measured by experiment. Taking Poincaré's idea that the geometry of space-time is merely a convention, we show that the general theory of relativity can be completely reformulated in a more general setting, a generalization of Riemannian geometry, namely, Weyl integrable geometry. The choice of this new mathematical language implies, among other things, that the paths of particles and light rays should now correspond to Weylian geodesics. Such a modification in the dynamics of bodies brings a new perception of physical phenomena that we will explore.

  20. Laparoscopic splenectomy using conventional instruments

    Directory of Open Access Journals (Sweden)

    Dalvi A

    2005-01-01

    INTRODUCTION: Laparoscopic splenectomy (LS) is an accepted procedure for elective splenectomy. Advances in technology have extended the possibility of LS to massive splenomegaly [Choy et al., J Laparoendosc Adv Surg Tech A 14(4), 197-200 (2004)], trauma [Ren et al., Surg Endosc 15(3), 324 (2001); Mostafa et al., Surg Laparosc Endosc Percutan Tech 12(4), 283-286 (2002)], and cirrhosis with portal hypertension [Hashizume et al., Hepatogastroenterology 49(45), 847-852 (2002)]. In a developing country, these advanced gadgets may not always be available. We performed LS using conventional, reusable instruments in a public teaching hospital without the use of advanced technology. The technique of LS and the outcome in these patients are reported. MATERIALS AND METHODS: Patients undergoing LS for various hematological disorders from 1998 to 2004 were included. Electrocoagulation, clips, and intracorporeal knotting were the techniques used to tackle the short gastric vessels and the splenic pedicle. The specimen was delivered through a Pfannenstiel incision. RESULTS: A total of 26 patients underwent LS. Twenty-two patients (85%) had a spleen weighing more than 500 g (average weight 942.55 g). Mean operative time was 214 min (range 45-390 min). The conversion rate was 11.5% (n = 3). Average duration of stay was 5.65 days (range 3-30 days). An accessory spleen was detected and successfully removed in two patients. One patient developed a subphrenic abscess. There was no mortality and no recurrence of hematological disease. CONCLUSION: Laparoscopic splenectomy using conventional equipment and instruments is safe and effective. Advanced technology has a definite advantage, but its absence is not a deterrent to the practice of LS.

  1. SEISMIC RISK ASSESSMENT OF LEVEES

    Directory of Open Access Journals (Sweden)

    Dario Rosidi

    2007-01-01

    A seismic risk assessment procedure for earth embankments and levees is presented. The procedure consists of three major elements: (1) the probability of ground motion at the site, (2) the probability of levee failure given that a level of ground motion has occurred, and (3) the expected loss resulting from the failure. This paper discusses the first two elements of the risk assessment. The third element, which includes economic losses and human casualties, is not presented herein. The ground motions for risk assessment are developed using a probabilistic seismic hazard analysis. A two-dimensional finite element analysis is performed to estimate the dynamic response of the levee, and the probability of levee failure is calculated using the levee fragility curve. The overall objective of the assessment is to develop an analytical tool for assessing the failure risk and the effectiveness of various levee strengthening alternatives for risk reduction. An example of the procedure, as applied to a levee built along the perimeter of an island for flood protection and water storage, is presented. Variations in earthquake ground motion and in soil and water conditions at the site are incorporated in the risk assessment. The effects of liquefaction in the foundation soils are also considered.
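    Combining elements (1) and (2) amounts to convolving a discretized hazard curve with a fragility curve. A minimal sketch is below; the lognormal fragility parameters and the bin probabilities are illustrative assumptions, not values from the paper:

    ```python
    import math

    def fragility(pga, median=0.4, beta=0.5):
        """Lognormal fragility curve: P(levee failure | PGA), with PGA in g.
        median and beta (log-standard deviation) are assumed placeholders."""
        return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

    # Assumed discretized hazard: (PGA level in g, annual probability of that bin).
    hazard_bins = [(0.1, 0.02), (0.2, 0.008), (0.4, 0.002), (0.8, 0.0005)]

    # Total annual failure probability:
    # sum over bins of P(PGA in bin) * P(failure | PGA).
    p_fail = sum(p * fragility(pga) for pga, p in hazard_bins)
    ```

    At the assumed median (0.4 g) the fragility curve passes through 0.5 by construction, which is a common way to parameterize such curves.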

  2. Seismic waveform modeling over cloud

    Science.gov (United States)

    Luo, Cong; Friederich, Wolfgang

    2016-04-01

    With fast-growing computational technologies, numerical simulation of seismic wave propagation has achieved huge success. Obtaining synthetic waveforms through numerical simulation is receiving an increasing amount of attention from seismologists. However, computational seismology is a data-intensive research field, and the numerical packages usually come with a steep learning curve. Users are expected to master a considerable amount of computer knowledge and data processing skills. Training users to use the numerical packages and to correctly access and utilize the computational resources is a troublesome task. In addition, access to HPC is a common difficulty for many users. To solve these problems, a cloud-based solution dedicated to shallow seismic waveform modeling has been developed with state-of-the-art web technologies. It is a web platform integrating both software and hardware in a multilayer architecture: a well-designed SQL database serves as the data layer, while HPC and a dedicated pipeline for it form the business layer. Through this platform, users no longer need to compile and manipulate various packages on a local machine within a local network to perform a simulation. By providing professional access to the computational code through its interfaces and delivering computational resources to users over the cloud, the platform lets users customize simulations at an expert level and submit and run jobs through it.
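    The multilayer split described here (an SQL data layer feeding an HPC job pipeline) can be sketched with a minimal, hypothetical job table. The schema and field names below are illustrative assumptions, not the platform's actual design:

    ```python
    import sqlite3

    # Hypothetical data layer: one table tracking simulation requests.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE jobs (
            id INTEGER PRIMARY KEY,
            user TEXT NOT NULL,
            model_params TEXT NOT NULL,   -- JSON-encoded simulation setup
            status TEXT DEFAULT 'queued'  -- queued -> running -> done
        )""")

    # A user submits a waveform-modeling job through the web layer;
    # the business layer would later pick it up and dispatch it to HPC.
    conn.execute("INSERT INTO jobs (user, model_params) VALUES (?, ?)",
                 ("alice", '{"dt": 0.001, "nt": 4000}'))
    status = conn.execute("SELECT status FROM jobs WHERE user = ?",
                          ("alice",)).fetchone()[0]
    ```

    The point of the design is that the web front end only ever touches the database; the pipeline polls the table and updates `status` as jobs move through the cluster.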

  3. Seismic transducer modeling using ABAQUS

    Energy Technology Data Exchange (ETDEWEB)

    Stephen R. Novascone

    2004-05-01

    A seismic transducer, known as an orbital vibrator, consists of a rotating imbalance driven by an electric motor. When suspended in a liquid-filled wellbore, vibrations of the device are coupled to the surrounding geologic media. In this mode, an orbital vibrator can be used as an efficient rotating dipole source for seismic imaging. Alternatively, the motion of an orbital vibrator is affected by the physical properties of the surrounding media. From this point of view, an orbital vibrator can be used as a stand-alone sensor. The reaction to the surroundings can be sensed and recorded by geophones inside the orbital vibrator. These reactions are a function of the media's physical properties, such as modulus, damping, and density, thereby identifying the rock type. This presentation shows how the orbital vibrator and its surroundings were modeled with an ABAQUS acoustic FEM. The FEM is found to compare favorably with theoretical predictions. A 2D FEM and an analytical model are compared to an experimental data set. Each model compares favorably with the data set.

  4. Seismic tomography of the Moon

    Institute of Scientific and Technical Information of China (English)

    ZHAO DaPeng; LEI JianShe; LIU Lucy

    2008-01-01

    We attempted to determine the first three-dimensional P- and S-wave velocity and Poisson's ratio structures of the lunar crust and mantle down to 1000 km depth under the near side of the Moon by applying seismic tomography to the moonquake arrival-time data recorded by the Apollo seismic network, which operated from 1969 to 1977. Our results show that significant lateral heterogeneities may exist in the lunar interior. Because there is no plate tectonics on the Moon, the lateral heterogeneities may have been produced at the early stage of the Moon's formation and evolution and preserved until today. There seems to be a correlation between the distribution of deep moonquakes and lateral velocity variations in the lunar lower mantle, suggesting that the occurrence of deep moonquakes may be affected by lunar structural heterogeneity in addition to tidal stresses. Although this is an experimental work and the result is still preliminary, it indicates that tomographic imaging of the lunar interior is feasible.

  5. DISPLACEMENT BASED SEISMIC DESIGN METHODS.

    Energy Technology Data Exchange (ETDEWEB)

    HOFMAYER, C.; MILLER, C.; WANG, Y.; COSTELLO, J.

    2003-07-15

    A research effort was undertaken to determine the need for any changes to the USNRC's seismic regulatory practice to reflect the move, in the earthquake engineering community, toward using expected displacement rather than force (or stress) as the basis for assessing design adequacy. The research explored the extent to which displacement-based seismic design methods, such as those given in FEMA 273, could be useful for reviewing nuclear power stations. Two structures common to nuclear power plants were chosen to compare the results of the analysis models used. The first structure is a four-story frame structure with shear walls providing the primary lateral load system, referred to herein as the shear wall model. The second structure is the turbine building of the Diablo Canyon nuclear power plant. The models were analyzed using both displacement-based (pushover) analysis and nonlinear dynamic analysis. In addition, for the shear wall model an elastic analysis with ductility factors applied was also performed. The objectives of the work were to compare the results of the analyses and to develop insights regarding the work that would be needed before the displacement-based analysis methodology could be considered applicable to facilities licensed by the NRC. A summary of the research results, which were published in NUREG/CR-6719 in July 2001, is presented in this paper.
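    The displacement-based (pushover) approach in FEMA 273 estimates a roof target displacement from the elastic spectral demand via the coefficient method. A minimal sketch follows; the coefficient values and inputs are illustrative placeholders, not the study's numbers:

    ```python
    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def target_displacement(sa, te, c0=1.3, c1=1.0, c2=1.0, c3=1.0):
        """FEMA 273 coefficient method for the roof target displacement:
        delta_t = C0*C1*C2*C3 * Sa * (Te / (2*pi))**2 * g,
        where sa is spectral acceleration (in g) at the effective period te (s).
        The C-coefficients here are assumed, illustrative values."""
        return c0 * c1 * c2 * c3 * sa * (te / (2.0 * math.pi)) ** 2 * G

    # Example: Sa = 1.0 g at an effective period of 1.0 s.
    roof_disp = target_displacement(sa=1.0, te=1.0)  # meters
    ```

    The pushover curve is then checked to confirm the structure can sustain this displacement demand, which is the sense in which displacement replaces force as the adequacy measure.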

  6. World Trade Organization, ILO conventions, and workers' compensation.

    Science.gov (United States)

    LaDou, Joseph

    2005-01-01

    The World Trade Organization, the World Bank, and the International Monetary Fund can assist in the implementation of ILO Conventions relating to occupational safety and health in developing countries. Most countries that seek to trade globally receive permission to do so from the WTO. If the WTO required member countries to accept the core ILO Conventions relating to occupational safety and health and workers' compensation, it could accomplish something that has eluded international organizations for decades. International workers' compensation standards are seldom discussed, but may at this time be feasible. Acceptance of a minimum workers' compensation insurance system could be a requirement imposed on applicant nations by WTO member states.

  7. 2005 China- Britain Standardization Conference -Environmental Protection ·Energy Saving & Standardization

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    The 2005 China-Britain Standardization Conference was held at the Beijing International Convention Center on June 29th, 2005, jointly hosted by the Standardization Administration of the People's Republic of China (SAC) and the British Standards Institute (BSI), with "Environmental Protection · Energy Saving & Standardization" as its theme.

  8. SeismicWaveTool: Continuous and discrete wavelet analysis and filtering for multichannel seismic data

    Science.gov (United States)

    Galiana-Merino, J. J.; Rosa-Herranz, J. L.; Rosa-Cintas, S.; Martinez-Espla, J. J.

    2013-01-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of multichannel seismic data. The considered time-frequency transforms include the continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform. The developed approaches provide a fast and precise time-frequency examination of the seismograms at different frequency bands. Moreover, filtering methods for noise, transients or even baseline removal are implemented. The primary motivation is to support seismologists with a user-friendly and fast program for wavelet analysis, providing practical and understandable results. Program summary: Program title: SeismicWaveTool. Catalogue identifier: AENG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENG_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 611072. No. of bytes in distributed program, including test data, etc.: 14688355. Distribution format: tar.gz. Programming language: MATLAB (MathWorks Inc.) version 7.8.0.347 (R2009a) or higher; Wavelet Toolbox is required. Computer: Developed on a MacBook Pro; tested on Mac and PC; no computer-specific optimization was performed. Operating system: Any supporting MATLAB (MathWorks Inc.) v7.8.0.347 (R2009a) or higher; tested on Mac OS X 10.6.8, Windows XP and Vista. Classification: 13. Nature of problem: Numerous research works have developed a great number of free or commercial wavelet-based software packages, which provide specific solutions for the analysis of seismic data. On the other hand, standard toolboxes, packages or libraries, such as the MathWorks Wavelet Toolbox for MATLAB, offer command-line functions and interfaces for the wavelet analysis of one-component signals. Thus, such software is usually focused on very specific problems.
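    The wavelet-domain filtering idea behind such tools can be illustrated with a minimal single-level Haar transform in plain Python. This sketch only shows the threshold-and-reconstruct principle on one trace; it is not SeismicWaveTool's API, and real codes use deeper decompositions and better wavelets:

    ```python
    import math

    def haar_dwt(x):
        """Single-level Haar transform (assumes even length):
        returns approximation and detail coefficients."""
        s = math.sqrt(2.0)
        approx = [(x[2 * i] + x[2 * i + 1]) / s for i in range(len(x) // 2)]
        detail = [(x[2 * i] - x[2 * i + 1]) / s for i in range(len(x) // 2)]
        return approx, detail

    def haar_idwt(approx, detail):
        """Exact inverse of haar_dwt."""
        s = math.sqrt(2.0)
        x = []
        for a, d in zip(approx, detail):
            x.extend([(a + d) / s, (a - d) / s])
        return x

    def denoise(x, thresh):
        """Hard-threshold the detail coefficients, then reconstruct."""
        approx, detail = haar_dwt(x)
        detail = [d if abs(d) > thresh else 0.0 for d in detail]
        return haar_idwt(approx, detail)

    # Small-amplitude wiggles (details) below the threshold are removed,
    # while the large-scale trend (approximation) is kept.
    trace = [1.0, 1.2, 0.9, 1.1, 5.0, 5.2, 4.9, 5.1]
    clean = denoise(trace, 0.3)
    ```

    Baseline removal works the same way in reverse: the approximation coefficients are zeroed instead, keeping only the detail band.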

  9. Seismic hazard map of the western hemisphere

    Science.gov (United States)

    Shedlock, K.M.; Tanner, J.G.

    1999-01-01

    Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This